
    Sci-tech

Scientists find how brain encodes speech, moving closer to a brain-speech machine interface

2018-09-27 09:16:11 Xinhua Editor: Gu Liping

American researchers have unlocked new information about how the brain encodes speech, moving closer to developing a speech-brain machine interface that can decode the commands the brain sends to the tongue, palate, lips and larynx.

    The study published on Wednesday in the Journal of Neuroscience revealed that the brain controls speech production in a similar manner to how it controls the production of arm and hand movements.

    Northwestern University researchers recorded signals from two parts of the brain and decoded what these signals represented.

    They found that the brain represented both the goals of what we are trying to say (speech sounds like "pa" and "ba") and the individual movements that we use to achieve those goals (how we move our lips, palate, tongue and larynx). The different representations occur in two different parts of the brain.

The discovery could potentially help people like the late Stephen Hawking communicate more intuitively through an effective brain machine interface (BMI), as well as people with speech disorders such as apraxia of speech.

    "This can help us build better speech decoders for BMIs, which will move us r to our goal of helping people that are locked-in speak again," said the paper's lead author Marc Slutzky, associate professor of neurology and of physiology at Northwestern.

Speech is composed of individual sounds, called phonemes, which are produced by coordinated movements of the lips, tongue, palate and larynx. However, scientists did not know exactly how the brain plans these movements, called articulatory gestures.

    Slutzky and his colleagues found speech motor areas of the brain had a similar organization to arm motor areas of the brain.

Scientists recorded brain signals from the cortical surface using electrodes placed on the brains of patients undergoing surgery to remove brain tumors; the patients were kept awake during surgery and asked to read words aloud.

    After the surgery, scientists marked the times when the patients produced phonemes and gestures. Then they used the recorded brain signals from each cortical area to decode which phonemes and gestures had been produced, and measured the decoding accuracy.

    The brain signals in the precentral cortex were more accurate at decoding gestures than phonemes, while those in the inferior frontal cortex, a higher level speech area, were equally good at decoding both phonemes and gestures, according to the study.
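The decoding-accuracy comparison described above boils down to a standard classification exercise: train a decoder on labeled trials of neural activity, predict the label of each trial, and report the fraction predicted correctly. The sketch below is a hypothetical illustration on synthetic data (it is not the researchers' actual pipeline, and the feature values and labels are invented); it uses a simple nearest-centroid rule to "decode" two speech sounds and computes decoding accuracy.

```python
import random

random.seed(0)

# Hypothetical illustration, not the study's method: simulate two-channel
# "neural" feature vectors for two speech-sound labels, then decode each
# trial with a nearest-centroid classifier and measure accuracy.

def make_trials(label, center, n=50, noise=1.0):
    # Each trial is (label, 2-d feature vector) drawn around the label's center.
    return [(label, [c + random.gauss(0, noise) for c in center])
            for _ in range(n)]

trials = make_trials("pa", [0.0, 0.0]) + make_trials("ba", [2.0, 2.0])

# "Fit": compute the per-label centroid of the feature vectors.
centroids = {}
for label in ("pa", "ba"):
    feats = [f for l, f in trials if l == label]
    centroids[label] = [sum(dim) / len(dim) for dim in zip(*feats)]

def decode(feat):
    # Predict the label whose centroid is nearest (squared Euclidean distance).
    return min(centroids,
               key=lambda l: sum((a - b) ** 2
                                 for a, b in zip(feat, centroids[l])))

# Decoding accuracy: fraction of trials whose label is recovered.
accuracy = sum(decode(f) == l for l, f in trials) / len(trials)
print(f"decoding accuracy: {accuracy:.2f}")
```

In the study, running this kind of comparison separately on signals from the precentral cortex and the inferior frontal cortex is what revealed that the two areas favor gestures and phonemes differently.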

This finding helps support linguistic models of speech production and can guide engineers in designing brain machine interfaces that decode speech from these brain areas.

    The next step for the research is to develop an algorithm for brain machine interfaces that would not only decode gestures but also combine those decoded gestures to form words.
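The paper describes this gesture-to-word step only as a future goal, so the following is a purely hypothetical sketch of one simple way such a stage could work: match a decoded sequence of articulatory gestures against a small lexicon. The gesture names and the lexicon entries are invented for illustration.

```python
# Hypothetical sketch of a gesture-sequence-to-word stage for a BMI.
# Gesture names and lexicon entries are invented, not from the study.
GESTURE_LEXICON = {
    ("lip_closure", "lip_release", "tongue_low"): "pa",
    ("lip_closure", "voiced_release", "tongue_low"): "ba",
}

def gestures_to_word(gestures):
    """Map a decoded gesture sequence to a word, or None if unrecognized."""
    return GESTURE_LEXICON.get(tuple(gestures))

print(gestures_to_word(["lip_closure", "lip_release", "tongue_low"]))
```

A real system would have to handle noisy, variable-length gesture streams, so a probabilistic sequence model would likely replace this exact-match lookup.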

      

Copyright ©1999-2018 Chinanews.com. All rights reserved.
Reproduction in whole or in part without permission is prohibited.