Our research seeks to link patterns of neural activity in auditory cortex to our perception of the world around us. Making sense of the complex mixture of sounds that arrives at our ears is a considerable neurocomputational challenge, and one that machine listening devices cannot yet solve. While sounds within an environment, such as a person talking, may be clearly intelligible at their source, noise and reverberation often combine to degrade the intelligibility of the sound wave arriving at the ear. The auditory brain must recover and identify the original sound source and, ultimately, derive information on which to base behaviour. This requires appropriately grouping sound elements into auditory ‘objects’, developing neural representations that are robust to background noise, and integrating behavioural variables with information from other sensory systems. Our research methods combine electrophysiological recordings (from auditory, parietal, and frontal cortex) during the performance of complex sensory discrimination tasks with causal manipulations of neural activity, animal and human psychophysics, and computational modelling.