Gaze analysis often relies on hypothesized, subjectively defined regions of interest (ROIs) or on heatmaps: ROIs enable condition comparisons but reduce objectivity and exploration, while heatmaps avoid this problem but require many pixel-wise comparisons, making differences hard to detect. Here, we propose an advanced, data-driven approach for analysing gaze behaviour. We use deep neural networks (DNNs) to classify conditions from gaze patterns, paired with reverse correlation to show where and how gaze differs between conditions. We test our approach on data from an experiment investigating the effects of object-specific sound (e.g. a church bell ringing) on gaze allocation. ROI-based analysis shows a significant difference between conditions (congruent sound, no sound, phase-scrambled sound and pink noise), with more gaze allocated to sound-associated objects in the congruent-sound condition; however, as expected, significance depends on how the ROIs are defined. Heatmaps show some weak qualitative differences, but none remain significant after correcting for the pixel-wise comparisons. Our approach shows that sound alters gaze allocation in some scenes, revealing task-specific, non-trivial strategies: fixations are not always drawn to the sound source but shift away from salient features, sometimes falling between salient features and the sound source. Overall, the method is objective and data-driven, and enables clear condition comparisons.
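To make the proposed pipeline concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): a small CNN classifies per-trial fixation-density maps into experimental conditions, and a classification-image style reverse-correlation step correlates random perturbations of a map with the classifier's evidence for a target condition. All names, resolutions, and the synthetic data are illustrative assumptions.

```python
# Minimal sketch, assuming fixation data are binned into 2D fixation-density maps.
# GazeNet, MAP_SIZE, and reverse_correlation are hypothetical names for illustration.
import torch
import torch.nn as nn

N_COND = 4      # congruent sound, no sound, phase-scrambled sound, pink noise
MAP_SIZE = 64   # assumed downsampled resolution of the fixation-density maps

class GazeNet(nn.Module):
    """Tiny CNN mapping a fixation-density map to condition logits."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(16 * (MAP_SIZE // 4) ** 2, N_COND)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def reverse_correlation(model, base_map, target_cond, n_probes=2000, noise_sd=0.2):
    """Perturb the input with random noise, weight each noise field by the
    classifier's evidence for the target condition, and average the weighted
    noise to obtain a map of where evidence for that condition concentrates."""
    model.eval()
    with torch.no_grad():
        noise = noise_sd * torch.randn(n_probes, 1, MAP_SIZE, MAP_SIZE)
        logits = model(base_map + noise)                    # (n_probes, N_COND)
        evidence = logits[:, target_cond] - logits.mean(dim=1)
        weights = (evidence - evidence.mean()) / (evidence.std() + 1e-8)
        return (weights.view(-1, 1, 1, 1) * noise).mean(dim=0).squeeze(0)

if __name__ == "__main__":
    # Synthetic stand-in data; real per-trial fixation-density maps would go here.
    maps = torch.rand(200, 1, MAP_SIZE, MAP_SIZE)
    labels = torch.randint(0, N_COND, (200,))

    model = GazeNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(20):                  # short demo training loop
        opt.zero_grad()
        loss = loss_fn(model(maps), labels)
        loss.backward()
        opt.step()

    # Where does evidence for the congruent-sound condition (index 0) concentrate?
    ci = reverse_correlation(model, maps[:1], target_cond=0)
    print(ci.shape)                      # torch.Size([64, 64])
```

In this sketch the reverse-correlation map plays the role of the condition-contrast images described above: regions with large positive values are those where added gaze density pushes the classifier toward the target condition, under the stated assumptions.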