Researchers at New York University found that a simple AI program could learn basic elements of language from the sensory input of a child's experience.
The researchers used data from an Australian baby known only as Sam, who is now 11 years old, from the SAYCam database.
Trained on just 61 hours of footage of Sam, comprising 600,000 video frames paired with 37,500 transcribed words, the AI was able to match basic nouns to images on par with an AI trained on 400 million captioned images.
From The Washington Post
Abstracts Copyright © 2024 SmithBucklin, Washington, D.C., USA