Principal Investigator: Jason Friedman, Postdoc Candidate: Konstantin Sonkin

Development of a multimodal (IMU and EEG) Human-Computer Interface providing a new instrument for control of devices and motor activity training for patients with motor disorders

 

The development of efficient software and hardware solutions for human-computer interaction based on non-invasive decoding of brain and body signals is a priority direction of technology development worldwide. The goal of this project is to develop a multimodal software-hardware system for device control and motor activity training based on residual movements measured by inertial measurement units (IMUs) and on motor imagery registered by electroencephalography (EEG). The system represents a novel approach to human-computer interaction that can be used for control of assistive devices (smartphones, PCs, etc.) and for the rehabilitation of millions of patients with severe movement disorders, such as those resulting from stroke, amyotrophic lateral sclerosis, traumatic brain injury, or spinal cord injury. The objective of the project is thus to provide a new affordable, portable, and efficient instrument for improving the quality of life of patients with severe movement disorders, suitable for use both in clinics and at home.
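To make the multimodal idea more concrete, the Python sketch below shows one plausible, purely illustrative way such signals could be fused: simple time-domain features from IMU windows and mu/beta band-power features from EEG windows are concatenated and fed to a single classifier. The feature choices, window sizes, sampling rates, and the use of linear discriminant analysis are assumptions made for illustration only and do not describe the project's actual pipeline; the data here are synthetic.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def imu_features(window):
    # Simple time-domain statistics per IMU axis: mean, standard deviation, range.
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])

def eeg_bandpower(window, fs=250.0, band=(8.0, 30.0)):
    # Mean spectral power per EEG channel in the mu/beta band (8-30 Hz),
    # a commonly used motor-imagery feature.
    freqs = np.fft.rfftfreq(window.shape[0], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window, axis=0)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean(axis=0)

# Synthetic stand-in data: 200 trials of two classes; each trial has an
# IMU window (128 samples x 6 axes) and an EEG window (500 samples x 8 channels).
n_trials = 200
labels = rng.integers(0, 2, size=n_trials)
X = np.array([
    np.concatenate([
        imu_features(rng.normal(labels[i] * 0.3, 1.0, size=(128, 6))),
        eeg_bandpower(rng.normal(0.0, 1.0 + 0.2 * labels[i], size=(500, 8))),
    ])
    for i in range(n_trials)
])

# The concatenated IMU + EEG feature vector is classified by a single linear model.
clf = LinearDiscriminantAnalysis()
print("5-fold CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())

In a real system, early fusion of this kind is only one option; the two modalities could also be decoded by separate classifiers whose outputs are combined, depending on signal quality and the patient's residual motor abilities.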

 

KONSTANTIN SONKIN

Research Fellow and Project Manager experienced in leading interdisciplinary projects in areas such as brain-computer interaction and artificial intelligence. Research focus: the development and implementation of high-quality human-computer interaction systems based on advanced artificial intelligence methods, intended for worldwide use and aimed at making a significant contribution to the rehabilitation of immobilized people and to a deeper understanding of brain activity. Research interests include artificial intelligence, machine learning, neuroscience, brain-computer interfaces, and human-computer interaction.