New Delhi: Meta (formerly Facebook) has announced a long-term artificial intelligence (AI) research initiative to better understand how the human brain processes speech and text, and to build AI systems that learn like people do.
In collaboration with neuroimaging center Neurospin (CEA) and Inria, Meta said it is comparing how AI language models and the brain respond to the same spoken or written sentences.
\"We'll use insights from this work to guide the development of AI that processes speech and text as efficiently as people,\" the social network<\/a> said in a statement.
Over the past two years, Meta has applied deep learning techniques to public neuroimaging data sets to analyse how the brain processes words and sentences.
Children learn that \"orange\" can refer to both a fruit and colour from a few examples, but modern AI systems<\/a> can't do this as efficiently as people.
Meta's research has found that the language models that most resemble brain activity are those that best predict the next word from context (for example, "time" after "once upon a...").
\"While the brain anticipates words and ideas far ahead in time, most language models are trained to only predict the very next word,\" said the company.
Unlocking this long-range forecasting capability could help improve modern AI language models.
Meta recently revealed evidence of long-range predictions in the brain, an ability that still challenges today's language models.
For the phrase, \"Once upon a...\" most language models today would typically predict the next word, \"time,\" but they're still limited in their ability to anticipate complex ideas, plots and narratives, like people do.
In collaboration with Inria, Meta's research team compared a variety of language models with the brain responses of 345 volunteers who listened to complex narratives while being recorded with fMRI.
\"Our results showed that specific brain regions are best accounted for by language models enhanced with far-off words in the future,\" the team said.