
dc.contributor.author: Venås, Gizem Ates
dc.date.accessioned: 2023-11-17T08:45:30Z
dc.date.available: 2023-11-17T08:45:30Z
dc.date.created: 2023-11-07T10:50:31Z
dc.date.issued: 2023
dc.identifier.citation: Venås, G. A. (2023). Human motion estimation for human-robot cooperation [Doctoral dissertation, Western Norway University of Applied Sciences]. HVL Open. [en_US]
dc.identifier.isbn: 978-82-8461-036-8
dc.identifier.uri: https://hdl.handle.net/11250/3103117
dc.description.abstract: Human-Robot Cooperation (HRC) refers to the cooperation between humans and robots to achieve a common goal or perform a task together. In HRC, the robot is designed to work alongside a human operator, providing assistance and support as needed. This collaboration can take many forms, ranging from a robot that provides physical assistance to a person with a disability, to a robot that helps workers in a factory assemble products more efficiently. HRC has many potential applications in areas such as manufacturing, healthcare, logistics, and service industries. For example, in manufacturing, robots can work alongside humans to perform repetitive or dangerous tasks, while in healthcare, robots can assist with patient care, rehabilitation, and surgery. HRC can involve the use of advanced robotics technologies such as artificial intelligence, machine learning, natural language processing, and human motion processing to enable robots to understand human intentions and communicate with people in a more natural way. Through this collaboration, humans can leverage the strength, speed, and precision of robots to perform tasks that are difficult, dangerous, or impossible for humans to do alone.

The main focus of this study is to develop a reliable Human Motion Estimation (HME) method to be used in industrial HRC applications. HME is the process of tracking and analysing human body movements with different sensory devices to determine the pose, posture, motion and/or gesture of the human. The estimated feature set is used as human input to create an action on the robot side. This thesis investigates how to choose the best motion capture system for industrial applications, how to obtain sufficiently accurate human motions and gestures, how to generate intuitive human input, and how to translate that input into a specific robot output.

HRC is a widely researched topic in robotics, spanning dynamics, control, motion planning, robot learning, teleoperation, and machine vision, among other branches. All of these branches aim to optimize the advanced interactions between robots and humans. Today, examples of HRC in industry are limited to turn-based and low-level applications, such as simple pick-and-place tasks and interacting with buttons to start or stop a process. The integration of more complex applications has not been sufficiently successful. An analysis of the literature revealed two key points causing this problem. The first is related to the accuracy, reliability, applicability, and convenience of HME for industrial applications. Some highly accurate HME methods are presented in the literature, mainly using visual-based motion tracking devices. However, they often fail due to occlusion, loss of line of sight, intolerance to lighting changes, and lack of mobility, and they often come with high financial and computational costs. The second reason for the gap between HRC research and implementation in industrial applications is related to user education and the neglect of training system development. Despite the substantial amount of research in the literature, only a few studies address user training, and no studies provide a methodological approach.

Therefore, the primary motion capture system for this study is the Inertial Measurement Unit (IMU), which does not suffer from the aforementioned issues. A biomechanical model of the human body is constructed from real-time IMU measurements, and a corresponding human input is generated. The usability of such an estimation is tested in a cooperative lifting scenario, which is a fundamental task in many HRC applications such as manufacturing, assembly, and medical rehabilitation. Additionally, this study investigates effective user training methods and proposes a gamified approach for HRC training.

The qualitative and quantitative results in this study show that HME-based HRC is promising. IMUs are portable, affordable, and reliable tools for this purpose, which makes them convenient for industry. The applicability of the proposed gamified training methodology is validated with multi-user experiments. In the user test, carried out at the Western Norway University of Applied Sciences Campus Førde with 32 healthy adults in the age range of 20-54 years, all users showed satisfactory progression. They achieved successful cooperation with the robot after a relatively short training process regardless of age, gender, job category, gaming background, or familiarity with robots. While some background factors affected the learning criteria in terms of how quickly, consistently, and optimally the user reached a sufficient learned state, no background factor was found to be significantly advantageous or disadvantageous for overall learning achievement. In conclusion, the developed system shows strong promise for implementation in industrial applications. [en_US]
dc.language.iso: eng [en_US]
dc.publisher: Høgskulen på Vestlandet [en_US]
dc.relation.haspart: Gizem Ateş & Erik Kyrkjebø. Human-Robot Cooperative Lifting using IMUs and Human Gestures. Published in Proceedings of the Annual Conference Towards Autonomous Robotic Systems, part of the Lecture Notes in Computer Science book series (LNAI, volume 13054, pages 88-99). Springer, September 8-10, 2021. DOI: 10.1007/978-3-030-89177-0_9 [en_US]
dc.relation.haspart: Gizem Ateş, Martin Fodstad Stølen & Erik Kyrkjebø. Force and Gesture-based Motion Control of Human-Robot Cooperative Lifting Using IMUs. Published in HRI'22: Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction (pages 688–692). IEEE, March 7-10, 2022. DOI: 10.5555/3523760.3523856 [en_US]
dc.relation.haspart: Gizem Ateş. Work in Progress: Learning Fundamental Robotics Concepts Through Games at Bachelor Level. Published in Proceedings of the 2022 IEEE Global Engineering Education Conference (EDUCON). IEEE, March 28-31, 2022. DOI: 10.1109/EDUCON52537.2022.9766499 [en_US]
dc.relation.haspart: Gizem Ateş, Martin Fodstad Stølen & Erik Kyrkjebø. A Framework for Human Motion Estimation using IMUs in Human-Robot Interaction. Published in Proceedings of the 23rd IEEE International Conference on Industrial Technology. IEEE, August 22-25, 2022. DOI: 10.1109/ICIT48603.2022.10002746 [en_US]
dc.relation.haspart: Gizem Ateş & Erik Kyrkjebø. Design of a Gamified Training System for Human-Robot Cooperation. Published in Proceedings of the 2nd International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME). IEEE, November 16-18, 2022. DOI: 10.1109/ICECCME55909.2022.9988661 [en_US]
dc.relation.haspart: Gizem Ateş Venås, Martin Fodstad Stølen & Erik Kyrkjebø. Exploring Human-Robot Cooperation with Gamified User Training: A User Study on Cooperative Lifting. Submitted to Frontiers in Robotics and AI (Frontiers). (Submitted) [en_US]
dc.title: Human motion estimation for human-robot cooperation [en_US]
dc.type: Doctoral thesis [en_US]
dc.description.version: acceptedVersion [en_US]
dc.rights.holder: © Gizem Ateş Venås, 2023 [en_US]
dc.source.pagenumber: 317 [en_US]
dc.identifier.cristin: 2193100
cristin.ispublished: true
cristin.fulltext: postprint

