
Bilingual AI brain implant helps stroke survivor communicate in Spanish and English

  

Category:  News & Politics

Via:  perrie-halpern  •  2 weeks ago  •  3 comments

By:   Nicole Acevedo

Scientists at the University of California, San Francisco have developed a bilingual brain implant that uses artificial intelligence to help a stroke survivor communicate in Spanish and English for the first time.

S E E D E D   C O N T E N T



Nearly a dozen scientists from the university's Center for Neural Engineering and Prostheses have worked for several years to design a decoding system that could turn the man's brain activity into sentences in both languages and display them on a screen.

An article published May 20 in Nature Biomedical Engineering outlining their research identifies the man as Pancho. At age 20, he became severely paralyzed as a result of a stroke he had in the early 2000s. Pancho can moan and grunt but can't articulate clear words. He is a native Spanish speaker who learned English as an adult.

Under the leadership of Dr. Edward Chang, a neurosurgeon who serves as co-director of the Center for Neural Engineering and Prostheses, Pancho received a neural implant in February 2019, allowing scientists to start tracking his brain activity.

By using an AI method known as a neural network, researchers were able to train Pancho's implant to decode words based on the brain activity produced when he attempted to articulate them. This AI training method basically allows the brain implant, known scientifically as a brain-computer interface device, to process data in a way that is somewhat similar to the human brain.
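The article doesn't give technical detail on how such a decoder works, so here is a toy illustration only: a tiny one-layer neural network (multinomial logistic regression) trained on entirely synthetic "brain activity" vectors, standing in for the idea of mapping attempted-speech activity to words. The word list, feature dimensions, and data are all invented for the sketch, not from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each attempted word produces a pattern of cortical
# activity, simulated here as a 16-dimensional feature vector per trial.
WORDS = ["hello", "water", "family", "yes"]
DIM = 16

# Synthetic "neural activity": one noisy cluster per word.
centers = rng.normal(size=(len(WORDS), DIM))

def record_trials(word_idx, n):
    return centers[word_idx] + 0.3 * rng.normal(size=(n, DIM))

X = np.vstack([record_trials(i, 50) for i in range(len(WORDS))])
y = np.repeat(np.arange(len(WORDS)), 50)

# A minimal one-layer network trained with gradient descent to map
# activity patterns to word probabilities.
W = np.zeros((DIM, len(WORDS)))
b = np.zeros(len(WORDS))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

for _ in range(200):
    grad = softmax(X @ W + b) - np.eye(len(WORDS))[y]
    W -= 0.1 * X.T @ grad / len(X)
    b -= 0.1 * grad.mean(axis=0)

# Decode a fresh trial of attempted speech.
trial = record_trials(2, 1)            # participant attempts "family"
pred = WORDS[int(np.argmax(trial @ W + b))]
```

The real system is far more sophisticated, but the core idea is the same: learn a function from recorded activity to intended words from labeled attempts.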

By 2021, the technology had significantly helped restore Pancho's ability to communicate, but only in English.

"Speech decoding has primarily been shown for monolinguals but half the world is bilingual with each language contributing to a person's personality and worldview," Chang's research group said on X. "There is a need to develop decoders that let bilinguals communicate with both languages."

However, the 2021 research served as the foundation to develop the decoding system that later made Pancho's brain implant bilingual in Spanish and English.

Allowing a language switch based on preference


After discovering that Pancho's brain had "cortical activity" across both languages years after he became paralyzed, the scientists realized they could leverage that to train a bilingual brain implant without the need to train separate language-specific decoding systems.

"We leveraged this finding to demonstrate transfer learning across languages. Data collected in a first language could significantly expedite training a decoder in the second language," Chang's research group said on X, because it is based on the brain activity produced by "the intended vocal-tract movements of the participant, irrespective of the language."
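The quoted idea — a decoder grounded in intended vocal-tract movements rather than in any one language — can be illustrated with a toy sketch. Everything below is an assumption made up for illustration (the words, the "articulatory feature" vectors, the synthetic activity), not the study's actual method: a map from activity to articulation is fit on English trials alone, then reused to decode Spanish words.

```python
import numpy as np

rng = np.random.default_rng(1)
ART, DIM = 6, 16   # hypothetical articulatory features, recording channels

en_words = ["hello", "water", "family", "yes", "good", "left"]
es_words = ["hola", "agua", "familia", "si", "bueno", "izquierda"]

# Assumed premise, echoing the study's reasoning: each word, in either
# language, is a point in one shared articulatory space, and recorded
# activity is a fixed (noisy) transform of that articulation.
artic = {w: rng.normal(size=ART) for w in en_words + es_words}
mixing = rng.normal(size=(ART, DIM))   # articulation -> recorded activity

def record(word, n):
    # Simulated cortical activity for n attempted productions of `word`.
    return artic[word] @ mixing + 0.3 * rng.normal(size=(n, DIM))

# Step 1: from English trials alone, fit the activity -> articulation
# map by least squares. This is the language-independent stage.
X = np.vstack([record(w, 50) for w in en_words])
A = np.vstack([np.tile(artic[w], (50, 1)) for w in en_words])
G, *_ = np.linalg.lstsq(X, A, rcond=None)

# Step 2: decode Spanish by reusing the English-trained map and matching
# each trial to the nearest Spanish articulatory template.
def decode_spanish(x):
    feats = x @ G
    dists = {w: np.linalg.norm(feats - artic[w]) for w in es_words}
    return min(dists, key=dists.get)
```

Because the learned map depends on articulation rather than language, the second language needs far less (here, none) of its own neural training data — the "transfer learning" the researchers describe.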

In 2022, the scientists set out to prove just that. They again used the artificial neural network to train Pancho's brain implant on the distinct neural activity produced by his bilingual speech.

According to their findings, Pancho was able to use the bilingual decoding system powering his brain implant to "participate in a conversation, switching between [both] languages on the basis of preference."
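The article doesn't say how the system decides which language to display. One plausible mechanism — purely an assumption for illustration, not the study's documented design — is to run both language decoders in parallel and keep whichever produces the more confident match:

```python
import numpy as np

rng = np.random.default_rng(2)
DIM = 16
vocab = {"english": ["hello", "water", "yes"],
         "spanish": ["hola", "agua", "si"]}

# Stand-ins for two fully trained per-language decoders: one synthetic
# activity template (centroid) per word.
templates = {lang: {w: rng.normal(size=DIM) for w in words}
             for lang, words in vocab.items()}

def record(lang, word):
    # Simulated activity when the participant attempts `word`.
    return templates[lang][word] + 0.3 * rng.normal(size=DIM)

def decode(x):
    # Run both language decoders in parallel and keep the closest match;
    # the winning decoder determines which language appears on screen.
    scored = [(np.linalg.norm(x - tmpl), lang, w)
              for lang, wmap in templates.items()
              for w, tmpl in wmap.items()]
    _, lang, word = min(scored)
    return lang, word
```

Under this scheme the participant never issues an explicit "switch language" command — attempting a word in either language is enough for the matching decoder to win.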

The study ultimately shows "the feasibility of a bilingual speech neuroprosthesis," or bilingual brain implant, and provides a glimpse into how this type of technology has the "potential to restore more natural communication" among bilingual speakers with paralysis, according to the May 20 article.



 
evilone
Professor Guide
1  evilone    2 weeks ago

With AI voice cloning working as well as it is, how long until they move this tech from text to natural speech? 

 
 
 
TᵢG
Professor Principal
1.1  TᵢG  replied to  evilone @1    2 weeks ago

Instantly.   

Amazing that they are able to interpret billions of electrical signals in a biological brain into anything ... much less coherent language.    What an astonishing feat!

 
 
 
Kavika
Professor Principal
2  Kavika     2 weeks ago

Simply amazing.

 
 
