CIVIL ENGINEERING 365 ALL ABOUT CIVIL ENGINEERING



Abstract

Robotic teleoperation has shown great potential in various construction applications. With advances in virtual telepresence and motion capture technologies, bilateral teleoperation has been tested in precision construction operations, where a human operator drives the motion of a remote robot with their natural body motions. A significant challenge is that, because of the mismatch between the robot's mechanical design and the human body (for example, a robotic arm and a human arm having different numbers of joints), the robot motions recovered from human hand motions may not be as intended, leading to unintended consequences including collisions. This study presents a proactive collision avoidance system based on real-time prediction of human hand motions. The proposed method, Feature-based Human-Motion Prediction (FHMP), stores streaming motion data in a data pool, quantifies the spatiotemporal relationship between gaze focus and hand movement trajectories, and segments and clusters the streaming data into pattern groups based on motion pattern similarity. A separate machine learning (ML) model is trained for each pattern group. During real-time prediction, whenever a pattern change is detected, the system transitions to the ML model that matches the new pattern. A data-buffering approach reuses the old data and the old ML model for a period until the new ML model is sufficiently trained, ensuring uninterrupted real-time prediction of human hand motions. Gaze and hand motion data from a human subject experiment (n = 120) on pipe skid maintenance were used to test the system in a virtual reality (VR) environment. The results show that FHMP can support anticipatory collision avoidance in bilateral teleoperation with better prediction performance. Future research could test the method on real robots for more conclusive results.
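The pattern-switching and data-buffering idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the class name, the warm-up threshold, and the stand-in "model" (simple linear extrapolation over recent samples of a 1-D hand coordinate) are all assumptions made only to show the switching logic, in which the previous pattern's data keeps serving predictions until the newly detected pattern has accumulated enough samples.

```python
from collections import defaultdict, deque

class PatternSwitchingPredictor:
    """Hypothetical sketch of FHMP-style model switching with buffering.

    One simple predictor per motion-pattern group; on a pattern change,
    the previous pattern's data is reused until the new pattern's model
    has warmed up, so real-time prediction is never interrupted.
    """

    def __init__(self, warmup=5, window=20):
        self.warmup = warmup                 # samples needed before trusting a new pattern
        self.history = defaultdict(deque)    # per-pattern buffer of recent samples
        self.window = window                 # max samples kept per pattern
        self.active = None                   # currently detected pattern id
        self.previous = None                 # fallback pattern during warm-up

    def observe(self, pattern, x):
        """Ingest one streaming sample labeled with its detected pattern."""
        if pattern != self.active:           # pattern change detected: switch models
            self.previous, self.active = self.active, pattern
        buf = self.history[pattern]
        buf.append(x)
        if len(buf) > self.window:
            buf.popleft()

    def predict(self):
        """Predict the next sample, falling back to the old pattern's data."""
        buf = self.history[self.active]
        if len(buf) < self.warmup and self.previous is not None:
            buf = self.history[self.previous]   # buffered fallback while warming up
        if len(buf) < 2:
            return buf[-1] if buf else 0.0
        # Stand-in "model": linear extrapolation from the last two samples.
        return buf[-1] + (buf[-1] - buf[-2])
```

For example, after observing samples 0, 1, 2, 3 under pattern "A", the sketch predicts 4; immediately after switching to pattern "B" it still answers from pattern "A"'s buffer until "B" has `warmup` samples. A real system would replace the extrapolation with the per-group trained ML models and drive switching from the gaze/hand pattern detector.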



