The Noel Lab studies perception and decision-making using cognitive (EEG, virtual reality), systems (Neuropixels, optogenetics), and computational neuroscience (biologically plausible neural networks and Bayesian normative models) approaches, working across humans and rodents. Core research areas include visual, auditory, and tactile perception, multisensory integration, cognitive flexibility, expectation updating, naturalistic and ecological neuroscience, embodied cognition, consciousness, spatial navigation and planning, and computational psychiatry in Autism, Schizophrenia, and Psychosis.
A central focus of the lab is understanding how the brain infers hidden causes from sensory observations, a process often described as causal inference. Perception requires forming hypotheses about the latent structure of the world that gives rise to incoming signals; the lab investigates both the normative principles underlying this process and how it breaks down in neurodevelopmental and psychiatric conditions, with particular emphasis on Autism Spectrum Disorders. This research has direct relevance for next-generation generative artificial intelligence, where similar inferential challenges arise.
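The causal-inference problem described above is commonly formalized as Bayesian model comparison: given two noisy cues (say, visual and auditory), the observer computes the posterior probability that they arose from a single common cause versus two independent causes. The sketch below is a minimal illustration of that standard formulation, assuming Gaussian likelihoods, a zero-mean Gaussian prior over source location, and illustrative parameter names; it is not the lab's own implementation.

```python
import math

def posterior_common_cause(x_v, x_a, sigma_v, sigma_a, sigma_p, p_common):
    """Posterior probability that visual (x_v) and auditory (x_a) cues
    share one hidden cause, under Gaussian noise (sigma_v, sigma_a),
    a zero-mean Gaussian prior over source location (sigma_p), and a
    prior probability p_common of a common cause."""
    var_v, var_a, var_p = sigma_v**2, sigma_a**2, sigma_p**2
    # Likelihood of both cues given one common source (C = 1),
    # with the shared location integrated out analytically.
    denom1 = var_v * var_a + var_v * var_p + var_a * var_p
    like_c1 = math.exp(-0.5 * ((x_v - x_a)**2 * var_p
                               + x_v**2 * var_a
                               + x_a**2 * var_v) / denom1) \
              / (2 * math.pi * math.sqrt(denom1))
    # Likelihood given two independent sources (C = 2).
    like_c2 = math.exp(-0.5 * (x_v**2 / (var_v + var_p)
                               + x_a**2 / (var_a + var_p))) \
              / (2 * math.pi * math.sqrt((var_v + var_p) * (var_a + var_p)))
    # Bayes' rule over the binary causal structure.
    return like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))
```

With nearly coincident cues the model favors a common cause, while widely discrepant cues push the posterior toward independent causes; perturbing the prior p_common or the noise terms is one way such models are used to characterize atypical inference in clinical populations.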
The lab adopts an integrative, cross-species strategy that combines human cognitive neuroscience (EEG, psychophysics, VR/AR), rodent systems neuroscience (large-scale electrophysiology, optogenetics), and computational modeling (Bayesian inference, artificial neural networks). The lab welcomes inquiries from motivated individuals with strong technical skills who are interested in training at the intersection of these domains. For more information, visit www.noel-lab.org.