Two‑month‑old infants show category‑level visual processing far earlier than previously thought
Infant fMRI of 130 babies finds category and animacy distinctions at two months, suggesting neural object representations emerge before behavioral signs.

Researchers report that the brains of two‑month‑old infants already distinguish broad object categories, including an animacy split between living and nonliving things, according to a study published in Nature Neuroscience. Using awake functional magnetic resonance imaging, a team led from Trinity College Dublin imaged 130 two‑month‑old infants for 15–20 minutes as they viewed bright, colourful pictures drawn from 12 common categories while lying on a beanbag and wearing sound‑cancelling headphones.
The study, conducted in collaboration with Queen’s University Belfast and Stanford University and assisted by Dublin’s Coombe and Rotunda hospitals, mapped responses across the infant visual cortex and compared them with data from older infants and adults. In the ventral visual cortex the authors report a measurable animacy signal at two months that strengthens with age: partial Spearman correlations for animacy, controlling for four perceptual features, rose from ρ = 0.198 (95% CI: 0.164 to 0.233) in two‑month‑olds to ρ = 0.552 (95% CI: 0.522 to 0.581) at nine months and ρ = 0.657 (95% CI: 0.613 to 0.696) in adults. Figures in the paper document both the presence of category distinctions at two months and their refinement with experience (Fig. 3d, Fig. 4).
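The partial Spearman correlation reported here is a standard statistic: rank-transform all variables, regress the covariate ranks out of the two variables of interest, and correlate the residuals. The sketch below illustrates the idea under stated assumptions; the variable names (`animacy`, `neural`, `perceptual`) and the simulated data are hypothetical and do not come from the paper's actual pipeline.

```python
import numpy as np
from scipy.stats import rankdata

def partial_spearman(x, y, covariates):
    """Partial Spearman correlation between x and y, controlling for
    covariates: rank-transform everything, regress the covariate ranks
    out of x and y, then Pearson-correlate the residuals."""
    rx, ry = rankdata(x), rankdata(y)
    Z = np.column_stack([np.ones(len(rx))] +
                        [rankdata(c) for c in covariates])
    def residualise(v):
        beta, *_ = np.linalg.lstsq(Z, v, rcond=None)
        return v - Z @ beta
    ex, ey = residualise(rx), residualise(ry)
    return float(ex @ ey / np.sqrt((ex @ ex) * (ey @ ey)))

# Hypothetical demo: an "animacy model" vector over the 66 category
# pairs implied by 12 categories, a simulated "neural" vector carrying
# that signal plus noise, and four simulated perceptual covariates.
rng = np.random.default_rng(0)
animacy = rng.normal(size=66)
neural = animacy + rng.normal(scale=2.0, size=66)
perceptual = [rng.normal(size=66) for _ in range(4)]
rho = partial_spearman(animacy, neural, perceptual)
```

Because the covariates here are simulated as independent noise, `rho` stays close to the plain Spearman correlation; in the study, partialling out the four perceptual features is what isolates the animacy signal from low-level image differences.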
The study also compared infant brain data with deep neural networks. An untrained network matched early visual cortex patterns better in infants than in adults, while a fully trained supervised network outperformed the untrained model at all ages, a result the authors use to argue that features learned from the statistics of visual input help explain early representations. The paper notes that the supervised DNN was trained on labelled images, a supervisory signal infants do not receive. RTÉ summed the comparison up bluntly: "Babies learn much more quickly than AI."
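Comparisons of this kind are typically made with representational similarity analysis: build a dissimilarity matrix over the stimulus categories for the brain region and for a network layer, then correlate the two matrices. The following is a generic sketch of that technique, not the paper's actual pipeline; all variable names and simulated data are hypothetical.

```python
import numpy as np
from scipy.stats import spearmanr
from scipy.spatial.distance import pdist

def rdm(patterns):
    """Condensed representational dissimilarity matrix: correlation
    distance between every pair of condition patterns (rows)."""
    return pdist(patterns, metric="correlation")

def rdm_similarity(patterns_a, patterns_b):
    """Spearman correlation between two condensed RDMs, a standard
    statistic for comparing a network layer's representational
    geometry with a brain region's."""
    return spearmanr(rdm(patterns_a), rdm(patterns_b)).correlation

# Hypothetical demo: 12 category conditions, 100 voxels or units each.
rng = np.random.default_rng(1)
brain = rng.normal(size=(12, 100))
untrained_net = rng.normal(size=(12, 100))        # no shared structure
trained_net = brain + rng.normal(scale=0.5, size=(12, 100))  # shared structure
```

In this toy setup `rdm_similarity(trained_net, brain)` exceeds `rdm_similarity(untrained_net, brain)` by construction; in the study, the analogous comparison is between trained and untrained DNN features and measured infant or adult cortex.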
Investigators framed the work as unusually large and longitudinal. RTÉ quoted Rhodri Cusack, Thomas Mitchell Professor of Cognitive Neuroscience at Trinity and leader of the project: "This study represents the largest longitudinal study with functional magnetic resonance imaging of awake infants," a claim that reflects the paper’s multi‑age comparisons. Dr Anna Truzzi of Queen’s University Belfast, a co‑author, told The Irish Times that her daughter Maeve took part in the study at two months old, underscoring the practical challenge of scanning awake infants.
The findings contrast with other recent infant work. An eLife study cited in the source materials reported that reliable category representations emerge later, with face responses appearing at four to six months and other categories developing through the first year; that study provides model coefficients and post hoc tests showing significant growth for faces, characters and cars across infancy. A separate study indexed on PubMed, using S‑cone‑isolated stimuli, found that two‑month‑olds preferred upright faces, indicating that stimulus and method choices shape when cortical contributions become detectable.
Authors and reporting outlets emphasise that different methods probe different processes: fMRI reveals neural representational structure that may precede the behavioural preferences measured by looking time or head‑mounted cameras. The paper concludes that high‑level feature representations are present in the infant ventral stream from two months and are fine‑tuned with age and experience, though the published excerpts leave several details open, including the full stimulus list, exact sample sizes for the nine‑month and adult groups, and complete DNN specifications, which the authors say will be clarified in the full manuscript.