Interpreting brainwave information

According to experts, many people who have become quadriplegic or have lost the ability to speak for various reasons still form the words they want to say in their heads, but cannot voice them. A device that could transmit those intended words to the outside world would help, but such technology has not yet been fully realized.

Until recently, research of this kind relied on able-bodied volunteers, attempting to transcribe what they said or thought into sentences using methods such as deep learning. A recent study, however, has for the first time shown that the technology is approaching practical use in an actual patient. The patient lost the ability to speak after suffering a stroke more than a decade ago, which left him without control of the muscles needed for speech. His arms and legs were also paralyzed, so he had been communicating by slightly moving his head to select letters on a computer screen, at a rate of only about five words per minute.

A research team led by Edward Chang, a neurosurgeon at the University of California, San Francisco, studied ways to speed up this communication. After removing part of the patient's skull and attaching a small electrode array, the team observed the activity patterns of the brain region associated with speech. This data was used to train a deep-learning algorithm to analyze the association between brain activity patterns and specific words. The researchers selected 50 words, showed them to the patient one at a time, asked him to attempt to say each word, and recorded the resulting brain activity. Once the correlation between each word and its activity pattern had been learned, they measured the activity again while showing him sentences composed of the study's vocabulary, such as "Bring my glasses, please." The team explained that the algorithm also included a language model that predicts which word is most likely to come next in a sentence, significantly lowering the rate of wrong words in the predicted sentences.

After collecting 22 hours of neural recordings over a total of 48 sessions and analyzing them with the algorithm, the researchers said they reached the level of reading 15 words per minute from the patient's brain. This was three times faster than the conventional method in which the patient moved his head to select words on the screen. In addition, the researchers reported that individual words were detected with a probability of around 98 percent, and that the rate at which the algorithm selected wrong words while decoding the patient's brain signals into sentences was a comparatively modest 26 percent.
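To make the approach concrete, the sketch below illustrates the general idea in simplified form: a classifier assigns a probability to each candidate word from a window of brain activity, and a language-model prior over the next word is folded in before the best word is chosen. The vocabulary, probabilities, and random stand-in classifier are illustrative assumptions, not the study's actual model.

```python
# Minimal sketch of the decoding idea described above: a classifier
# scores each candidate word from a window of neural features, and a
# language model supplies a prior over the next word; combining the
# two reduces misclassified words in a sentence. All names and
# probabilities here are illustrative, not from the study.

import numpy as np

VOCAB = ["bring", "my", "glasses", "please", "water"]  # stand-in for the 50-word set

def classify_window(features: np.ndarray) -> np.ndarray:
    """Stand-in for the trained deep-learning classifier: returns a
    probability for each vocabulary word given one window of neural
    features. Here it is just a random softmax for demonstration."""
    rng = np.random.default_rng(int(features.sum() * 1000) % 2**32)
    logits = rng.normal(size=len(VOCAB))
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

def lm_prior(prev_word: str | None) -> np.ndarray:
    """Stand-in bigram language model: probability of each word given
    the previous one. A uniform prior is used when no history exists."""
    if prev_word == "my":
        # e.g. "my glasses" is far more likely than "my please"
        return np.array([0.05, 0.05, 0.60, 0.05, 0.25])
    return np.full(len(VOCAB), 1.0 / len(VOCAB))

def decode_sentence(windows: list[np.ndarray]) -> list[str]:
    """Greedy decoding: for each window, combine the classifier's
    likelihood with the language-model prior and keep the best word."""
    sentence: list[str] = []
    prev = None
    for w in windows:
        scores = classify_window(w) * lm_prior(prev)
        best = VOCAB[int(np.argmax(scores))]
        sentence.append(best)
        prev = best
    return sentence

if __name__ == "__main__":
    fake_windows = [np.random.rand(128) for _ in range(4)]  # 4 signal windows
    print(decode_sentence(fake_windows))
```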

However, because the patient observed in the study had been unable to speak for more than a decade, questions remain about how closely his brainwave activity resembles that of ordinary speakers. The researchers also added that in order to use the technology in daily life, the word repertoire must be made more diverse and the process of reading brain signals made technically easier.

As described above, the core of research through brain waves is to properly capture the signal and put it to use. Hyung-seob Han, CEO of HHS, developed a solution that converts brainwave signals into data and then applies artificial intelligence across various applications. Wearable devices such as the Galaxy Watch and Apple Watch are widely used these days, but for brainwave measurement the most practical form factor remains a helmet. To make the device lighter and to broaden its uses, the design has been revised from the initial prototype so that it can be comfortably attached to the outside of a helmet.
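As a rough illustration of what converting brainwave signals into data can mean in practice, the sketch below turns raw samples into band-power features that an AI model could consume. The sampling rate, band boundaries, and synthetic input are assumptions for demonstration and do not describe HHS's actual solution.

```python
# Minimal sketch of EEG-to-data conversion: raw voltage samples are
# turned into band-power features (delta, theta, alpha, beta) that
# downstream AI models can consume. All parameters are illustrative.

import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(samples: np.ndarray) -> dict[str, float]:
    """Estimate power in each EEG band from one channel of samples."""
    freqs, psd = welch(samples, fs=FS, nperseg=FS * 2)
    out = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        out[name] = float(np.trapz(psd[mask], freqs[mask]))
    return out

if __name__ == "__main__":
    fake_eeg = np.random.randn(FS * 10)  # 10 seconds of synthetic signal
    print(band_powers(fake_eeg))
```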

Sam Kim

Asia Journal


▲ HHS Co., Ltd.

▲ CEO : Hyung-seob Han

▲ Brand : HHS, PINYFINY

▲ www.hhskorea.com

▲ overhs@naver.com

▲ +82-10-6733-5112
