Out of the Mouths of Babes

A recent study published in Science shows how babies can teach AI a thing or two about language.

NYU researchers developed a machine learning model that mimics the way children learn language, utilizing 61 hours of first-person video and audio from an infant's perspective. The footage was gathered through a head-mounted camera worn by a child from 6 to 25 months (check out the adorable picture below).

The model learned from 250,000 word-image pairs and correctly identified objects a staggering 62% of the time, more than double the anticipated chance rate of 25%.

The AI model's ability to learn word-object associations directly challenges existing language acquisition theories and opens the door to new approaches to training AI.

This study highlights the potential for AI to learn far more efficiently, from far less data than is required to train today's small and large language models. Currently, AI systems need “billions, sometimes trillions of words. In contrast, humans learn from tens to hundreds of millions of words.”

This study has the potential to open new avenues for AI development, suggesting a shift toward more intuitive, naturalistic learning models that could transform how we understand and apply AI across fields. It's exciting to watch how AI research is evolving.

It also gives new meaning to the phrase “out of the mouths of babes!”

To read more about the study, or to see a video from the researchers, visit here.
