Alternative Title: 脳信号によるロボット車椅子の適応型ナビゲーションシステムの研究 (Research on an Adaptive Navigation System for Robotic Wheelchairs Using Brain Signals)
Note (General): A brain-machine interface (BMI) is a direct communication pathway between the brain and an external device (e.g., a robot). Because a BMI communicates directly with machines and bypasses the peripheral nerves and muscles, it is often used to assist or restore sensorimotor functions in severely disabled or locked-in patients. The BMI, coupled with assistive technology, is therefore seen as a promising approach to restoring mobility to disabled people. A very important application in rehabilitation technology is BMI-based navigation of robotic wheelchairs, whose aim is to restore some mobility independence to paralyzed patients.

Navigating a robotic wheelchair using only brain signals is a very challenging task. Since only brain signals, and no other means, can be used to control the wheelchair, the safety requirements are higher than for standard wheelchairs. Proximity sensing data are usually used to detect objects and avoid collisions, improving navigation safety.

A BMI is a low-bit-rate communication channel. This means the subject has to perform a large number of mental tasks over a relatively long period of time, even for simple navigation scenarios, which makes navigation tiring and slow for the BMI subject. Shared control between the subject's mental intentions and an intelligent robot is usually proposed as a solution to these problems. Shared control generally improves the navigation experience, but it also restricts the subject's control over the robot. In addition, prior training on the environment and/or knowledge of the goal location is required to assist the robot during navigation.

In this thesis, we present a novel adaptive method that improves the navigation of a brain-controlled robotic wheelchair. We employ a synchronous brain-machine interface to retrieve the subject's mental intentions at specific points in time. Furthermore, we have developed two modules to assist the robotic wheelchair navigation. The first module is capable of navigating the wheelchair autonomously by following assistive information (tactile paving for visually impaired people) on the floor, captured by the visual sensor in real time. The second module uses a laser range finder to detect and avoid objects in the navigation path. The adaptive platform integrates the brain signals, the robot sensing, and the navigation modules in order to provide the subject with context-based navigation choices through acoustic and visual queries.

Depending on the environment conditions, the subject can choose to navigate the robot turn by turn or to give high-level control commands and allow the robot to navigate autonomously by following the assistive information. The subject is able to accept or reject the assistance using only brain signals.

Experimental results show that, with the proposed adaptive navigation, the robotic wheelchair is able to follow a shorter trajectory, avoid potential collisions, and reduce the navigation time. The number of required mental tasks is reduced significantly when the assistive information is used. As a consequence of the reduced mental workload, the subject is more relaxed and performs the mental tasks better, which leads to higher BMI classification accuracy during adaptive navigation.
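The record does not include any code from the thesis; purely as an illustration of the adaptive loop the abstract describes (sensor context triggering queries, with the subject accepting or rejecting assistance or giving turn-by-turn commands via the synchronous BMI), the following is a minimal sketch. All class and method names (WheelchairStub, SynchronousBMIStub, sense, follow_paving, ask_yes_no, ask_direction) are hypothetical placeholders, not the thesis implementation.

    # Illustrative sketch only -- hypothetical names, not the code from the thesis.
    # Mirrors the adaptive flow in the abstract: obstacle avoidance from the laser
    # range finder, optional autonomous following of tactile paving offered via a
    # query, and turn-by-turn commands decoded from the synchronous BMI otherwise.

    from dataclasses import dataclass
    import random


    @dataclass
    class SensorContext:
        obstacle_ahead: bool      # from the laser-range-finder module
        paving_detected: bool     # from the visual (tactile paving) module


    class WheelchairStub:
        """Stand-in for the robotic wheelchair and its sensing modules."""

        def sense(self) -> SensorContext:
            return SensorContext(obstacle_ahead=random.random() < 0.1,
                                 paving_detected=random.random() < 0.3)

        def avoid_obstacle(self) -> None:
            print("avoiding detected obstacle")

        def follow_paving(self) -> None:
            print("following tactile paving autonomously")

        def execute(self, command: str) -> None:
            print(f"executing turn-by-turn command: {command}")


    class SynchronousBMIStub:
        """Stand-in for the synchronous BMI; intentions are decoded only when queried."""

        def ask_yes_no(self, prompt: str) -> bool:
            print(f"acoustic/visual query: {prompt}")
            return random.random() < 0.7       # placeholder for a decoded accept/reject

        def ask_direction(self) -> str:
            return random.choice(["left", "right", "forward"])


    def adaptive_navigation(robot: WheelchairStub, bmi: SynchronousBMIStub, steps: int = 20) -> None:
        for _ in range(steps):
            ctx = robot.sense()
            if ctx.obstacle_ahead:
                robot.avoid_obstacle()                                   # safety first
            elif ctx.paving_detected and bmi.ask_yes_no("Follow the tactile paving?"):
                robot.follow_paving()                                    # high-level assistance
            else:
                robot.execute(bmi.ask_direction())                       # turn-by-turn control


    if __name__ == "__main__":
        adaptive_navigation(WheelchairStub(), SynchronousBMIStub())

The sketch only shows how the choice between autonomous assistance and turn-by-turn control could be gated by context and by a BMI accept/reject decision; the actual signal processing, classification, and robot control are described in the thesis itself.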
University of Toyama · Doctoral dissertation 富理工博甲第63号 · Mano Marsel · 2013/09/27
Start page: 1
End page: 127
Collection (particular): National Diet Library Digital Collections > Digitized Materials > Doctoral Dissertations
Date Accepted (W3CDTF): 2015-02-03T05:25:05+09:00
Data Provider (Database): National Diet Library : National Diet Library Digital Collections