
User–Robot Interaction for Safe Navigation of a Quadrotor

Published online by Cambridge University Press: 29 January 2020

L. F. Sanchez
Affiliation:
Sorbonne Universités, Université de Technologie de Compiègne, CNRS, UMR 7253, Heudiasyc, 57 Avenue de Landshut, CS 60319, 60203 Compiègne cedex, France; Instituto Tecnológico y de Estudios Superiores de Monterrey, Mexico. E-mail: luisanch@etu.utc.fr
H. Abaunza
Affiliation:
Sorbonne Universités, Université de Technologie de Compiègne, CNRS, UMR 7253, Heudiasyc, 57 Avenue de Landshut, CS 60319, 60203 Compiègne cedex, France E-mail: habaunza@hds.utc.fr
P. Castillo*
Affiliation:
Sorbonne Universités, Université de Technologie de Compiègne, CNRS, UMR 7253, Heudiasyc, 57 Avenue de Landshut, CS 60319, 60203 Compiègne cedex, France
*Corresponding author. E-mail: castillo@hds.utc.fr

Summary

This paper introduces an intuitive and safe command approach for a quadrotor, in which inertial and muscular gestures are used for semi-autonomous flight. A bracelet equipped with gyroscopes, accelerometers, and electromyographic sensors detects the user's gestures, and an algorithm is proposed to interpret these signals as flight commands. The main goal is to provide a wearable, easy-to-handle human–machine interface that allows users, even inexperienced operators, to safely command this kind of vehicle. Safety measures are incorporated into the scheme to further enhance the user's experience. Experimental tests are performed to validate the proposal.
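As a rough illustration of the kind of gesture-to-command mapping described in the summary, the following minimal Python sketch converts a hypothetical armband reading (arm roll and pitch plus a binary muscle gesture) into bounded velocity references, using a dead zone and saturation as simple stand-ins for the safety measures. All names, thresholds, and gains are assumptions made for illustration and do not reproduce the authors' algorithm.

```python
# Hypothetical illustration only: map armband orientation and a muscle-gesture
# flag to bounded quadrotor velocity commands. Names and thresholds are
# assumptions, not the interface or algorithm proposed in the paper.
from dataclasses import dataclass


@dataclass
class ArmState:
    roll: float        # rad, arm roll estimated from the bracelet's gyroscopes/accelerometers
    pitch: float       # rad, arm pitch
    fist_closed: bool  # example EMG gesture, used here as a "command enable" switch


V_MAX = 0.5       # m/s, velocity saturation kept low as a safety measure for inexperienced operators
DEAD_ZONE = 0.15  # rad, small arm tilts are ignored to filter unintentional motion
GAIN = 1.0        # (m/s)/rad, tilt-to-velocity gain


def saturate(value: float, limit: float) -> float:
    """Clamp a command to the symmetric interval [-limit, limit]."""
    return max(-limit, min(limit, value))


def arm_to_velocity(state: ArmState) -> tuple:
    """Map the arm state to (vx, vy) reference velocities for the quadrotor."""
    if not state.fist_closed:
        return (0.0, 0.0)  # gesture released: command hover, a simple safety fallback

    def shape(angle: float) -> float:
        if abs(angle) < DEAD_ZONE:
            return 0.0
        return saturate(GAIN * angle, V_MAX)

    # Arm pitch drives forward/backward motion, arm roll drives lateral motion.
    return (shape(state.pitch), shape(state.roll))


if __name__ == "__main__":
    # Example: arm rolled to the side with the fist closed -> bounded lateral velocity.
    print(arm_to_velocity(ArmState(roll=0.4, pitch=-0.1, fist_closed=True)))
```

Tying all motion to an explicit "enable" gesture means that relaxing the arm always returns the vehicle to hover, a conservative default consistent with the paper's goal of safe commanding by inexperienced operators.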

Type: Articles
Copyright: © Cambridge University Press 2020


Supplementary material

Sanchez et al. supplementary video (4.3 MB)