The full text of this material may be freely available from the degree-granting institution's website or from CiNii Dissertations, linked via the host publication (URI).
Doctoral dissertation
An Intelligent Person Following Shopping Support Robot for the Elderly
Available online only within the National Diet Library
National Diet Library Digital Collections
Digital data available
Other
Saitama University Academic Information Repository (SUCRA)
Holdings of the institutions and databases linked through the Institutional Repositories DataBase (IRDB) can be checked on the partner sites; this dissertation is available on the Saitama University Academic Information Repository (SUCRA) site.
Bibliographic information
Digital
- Material type
- Doctoral dissertation
- Author/editor
- ISLAM, MD MATIQUL
- Publication details
- Date of publication etc.
- 2020
- Year of publication (W3CDTF)
- 2020
- Parallel title etc.
- 高齢者のための知的追従する買い物支援ロボット
- Title (host publication)
- Doctoral dissertation (Graduate School of Science and Engineering (Doctoral Program), Saitama University)
- Name of degree-granting institution
- Saitama University
- Date of conferral
- 2020-09-23
- Date of conferral (W3CDTF)
- 2020-09-23
- Report number
- 甲第1177号
- Degree
- 博士(学術)
- Dissertation conferral number
- 甲第1177号
- Language code of text
- eng
- Subject headings
- Intended audience
- General
- General note
- The lack of caregivers in an aging society is a major social problem. Without assistance, many elderly and disabled people are unable to perform daily tasks. One important daily activity is shopping in supermarkets. Carrying heavy goods, or pushing a shopping cart and moving it from shelf to shelf, is tiring and laborious, especially for elderly customers or customers with certain disabilities.

Many researchers have developed person-following robots to support the elderly in shopping malls, but merely following may not be sufficient. Considering the customer's body orientation, it is better to find the appropriate positional relation between the customer and the robot. In addition to robust person-following, the robot can support the user further if it can act in advance of the user's next move. For example, when the user picks up a product from a shelf, it is convenient if the robot automatically comes to the user's right-hand side (if the user is right-handed) so that he or she can easily put the product in the basket. To realize such functions, the robot needs to recognize the user's behavior.

The first part of this work develops a body-orientation-based shopping support robot. To that end, we address the problem of real-time, human-pose-based robust person tracking. We achieve this by cropping the target person's body from the image and then applying a color-histogram matching algorithm to track a unique person. After tracking the person, we use an omnidirectional camera and a LiDAR sensor to find the target person's location and distance from the robot. When the target person stops in front of a shopping shelf, our robot estimates the person's body orientation using our proposed methodology. According to the body orientation, the robot assumes a suitable position so that the target person can easily put a product in the basket. Our proposed system was verified in real-time environments, and the results show that the robot is highly effective at following a given target person and provides proper support while that person shops.

The next step was to develop an intelligent shopping support robot that can carry a shopping cart while following its owner and provide shopping support by observing the customer's head orientation and body orientation and by recognizing different shopping behaviors. Recognizing shopping behavior, and the intensity of such actions, is important for understanding the best way to support the customer without disturbing him or her. This system also liberates elderly and disabled people from the burden of pushing shopping carts, because the proposed shopping cart is essentially an autonomous mobile robot that recognizes its owner and follows him or her. The proposed system discretizes the customer's head and body orientation into 8 directions to estimate whether the customer is looking or turning towards a merchandise shelf. From the robot's video stream, a DNN-based human pose estimator called OpenPose is used to extract a skeleton of 18 joints for each detected body. Using this extracted joint information, we built a dataset and developed a novel Gated Recurrent Unit (GRU) network topology to classify actions that are typically performed in front of shelves: reach to shelf, retract from shelf, hand in shelf, inspect product, and inspect shelf. Our GRU network model takes a series of 32 frames of skeleton data and then gives a prediction.
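The abstract fixes the classifier's interface (windows of 32 frames, 18 OpenPose joints per frame, five shelf actions) but not the network internals. Below is a minimal PyTorch sketch of a GRU classifier with that interface; the hidden size, single-layer topology, and all other hyperparameters are assumptions for illustration, not the dissertation's actual configuration.

```python
# Sketch of a GRU-based shopping-action classifier (assumed hyperparameters).
import torch
import torch.nn as nn

# The five shelf actions named in the abstract.
ACTIONS = ["reach to shelf", "retract from shelf", "hand in shelf",
           "inspect product", "inspect shelf"]

class ShoppingActionGRU(nn.Module):
    def __init__(self, n_joints=18, hidden_size=128, n_classes=len(ACTIONS)):
        super().__init__()
        # Each frame is flattened to (x, y) per joint -> 36 input features.
        self.gru = nn.GRU(input_size=n_joints * 2, hidden_size=hidden_size,
                          batch_first=True)
        self.fc = nn.Linear(hidden_size, n_classes)

    def forward(self, x):          # x: (batch, 32, 36) skeleton windows
        _, h = self.gru(x)         # h: (1, batch, hidden_size) final state
        return self.fc(h[-1])      # class logits: (batch, n_classes)

model = ShoppingActionGRU()
window = torch.randn(1, 32, 36)    # one 32-frame window of 2D joints
print(ACTIONS[model(window).argmax(dim=1).item()])
```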
Using cross-validation tests, our model achieves an overall accuracy of 82%, which is a significant result. Finally, combining the customer's head orientation, body orientation, and shopping behavior recognition, we develop a complete system for our shopping support robot.

To operate our robot in a practical environment, we must satisfy three requirements: speed, accuracy, and cost. The OpenPose-based model does not fulfill these requirements. For this reason, we replace the OpenPose model with a Kinect V2 depth camera. The Kinect camera can detect a 3D skeleton at approximately 30 frames/s, whereas the OpenPose model detects a skeleton at approximately 5 frames/s. The accuracy of 3D-skeleton-based shopping action recognition is high: 95% using our GRU network. Using the Kinect camera, we can also measure the distance from the robot to the tracked person, so no extra LiDAR sensor is needed. The Kinect-based model is therefore cost-effective and does not require extra processing.

Finally, we develop a person-following shopping support robot using a Kinect camera that can recognize the customer's shopping actions. Our robot follows within a certain distance behind the customer. Whenever the robot detects the customer performing a "hand in shelf" action in front of a shelf, it positions itself beside the customer with a shopping basket so that the customer can easily put his or her product in the basket. Afterwards, the robot again follows the customer from shelf to shelf until he or she is done shopping. We conducted our experiments in a real supermarket to evaluate the system's effectiveness.
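The abstract's claim that the Kinect removes the need for a separate LiDAR comes down to the tracked skeleton already being metric 3D. Below is a minimal sketch of the follow-at-a-distance idea built on that; the joint convention, target gap, and controller gains are illustrative assumptions, not values from the dissertation.

```python
# Sketch of a proportional person-following command from a Kinect 3D joint
# (illustrative gains and target distance; not the dissertation's controller).
import math

FOLLOW_DISTANCE = 1.2   # assumed target gap behind the customer, in metres
K_LINEAR = 0.8          # assumed proportional gain on range error
K_ANGULAR = 1.5         # assumed proportional gain on bearing error

def follow_command(spine_base):
    """spine_base: (x, y, z) of the person's spine-base joint in the Kinect
    camera frame (x right, y up, z forward), in metres."""
    x, _, z = spine_base
    distance = math.hypot(x, z)      # ground-plane range, no LiDAR needed
    bearing = math.atan2(x, z)       # 0 rad when the person is dead ahead
    linear = K_LINEAR * (distance - FOLLOW_DISTANCE)   # close or open the gap
    angular = -K_ANGULAR * bearing                     # steer to re-centre them
    return linear, angular

# Person 2 m ahead and 0.3 m to the right: drive forward while turning right.
print(follow_command((0.3, 0.0, 2.0)))
```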
Table of contents:

- 1 Introduction
  - 1.1 Motivation
  - 1.2 Objectives
  - 1.3 Research Contributions
  - 1.4 Organization of Sections
- 2 Background and Literature Review
  - 2.1 Definition of Human Support Robots
  - 2.2 Potential Application of Human Support Robot
    - 2.2.1 Human Support Robot in Shopping Mall
    - 2.2.2 Human Support Robot in Other Disciplines
  - 2.3 Customer Shopping Behavior Recognition
    - 2.3.1 Shopping Behavior Understanding
    - 2.3.2 Shopping Actions
  - 2.4 Chapter Summary
- 3 A Person-Following Shopping Support Robot Based on Human Pose Skeleton Data and LiDAR Sensor
  - 3.1 Introduction
  - 3.2 Design Approach
  - 3.3 Person Tracking
  - 3.4 Positional Angle Value of Tracked Person
  - 3.5 Calculate the Rotation and Translation Value
  - 3.6 Body Orientation Angle of Target Person
  - 3.7 Head Orientation Angle of Target Person
  - 3.8 Experimental Results
  - 3.9 Chapter Summary
- 4 An Intelligent Shopping Support Robot: Understanding Shopping Behavior from 2D Skeleton Data Using GRU Network
  - 4.1 Introduction
  - 4.2 Definition of Customer Behavior Model
  - 4.3 Framework of Customer Behavior Classification
  - 4.4 Gated Recurrent Neural Network (GRU)
    - 4.4.1 Dataset Construction
    - 4.4.2 Experiments Description
  - 4.5 Architecture of the Shopping Support Robot Based on the User's Behavior Recognition
  - 4.6 Results
    - 4.6.1 Evaluation of Behavior Recognition
      - 4.6.1.1 Performance Metrics
    - 4.6.2 Evaluation of Shopping Support Robot
  - 4.7 Chapter Summary
- 5 Person-Following Shopping Support Robot using Kinect Depth Camera based on 3D Skeleton Tracking
  - 5.1 Introduction
  - 5.2 Design Approach
    - 5.2.1 Person's Skeleton Tracking
    - 5.2.2 Person Following Procedure of Our Robot
    - 5.2.3 Shopping Behavior Action Recognition
      - 5.2.3.1 Dataset Construction
      - 5.2.3.2 Training the GRU Network
  - 5.3 Experiments
    - 5.3.1 Experimental Conditions
    - 5.3.2 Experimental Results
  - 5.4 Chapter Summary
- 6 Conclusions and Future Work
  - 6.1 Conclusions
  - 6.2 Future Work

Supervisor: 小林貴訓
- DOI
- 10.24561/00019355
- National Diet Library persistent identifier
- info:ndljp/pid/11865183
- Collection (common)
- Collection (materials for persons with disabilities: Level 1)
- Collection (individual)
- National Diet Library Digital Collections > Digitized materials > Doctoral dissertations
- Basis for collection
- Doctoral dissertations (automatic collection)
- Date accepted (W3CDTF)
- 2021-11-08T14:10:24+09:00
- Date created (W3CDTF)
- 2021-09-16
- Format (IMT)
- application/pdf
- Online access scope
- Available only within the National Diet Library
- Digitized material transmission
- Not eligible for transmission to libraries or individuals
- Remote photoduplication (NDL)
- Available
- Partner institutions/databases
- National Diet Library : National Diet Library Digital Collections