Mousa Moradi

Ph.D. Candidate

Integrating Human Hand Gestures with Vision Based Feedback Controller to Navigate a Virtual Robotic Arm


Conference paper


Mousa Moradi, Sandra Dang, Zayed Alsalem, J. Desai, Alejandro Palacios
2020 23rd International Symposium on Measurement and Control in Robotics (ISMCR), 2020

Cite

APA
Moradi, M., Dang, S., Alsalem, Z., Desai, J., & Palacios, A. (2020). Integrating Human Hand Gestures with Vision Based Feedback Controller to Navigate a Virtual Robotic Arm. 2020 23rd International Symposium on Measurement and Control in Robotics (ISMCR).


Chicago/Turabian
Moradi, Mousa, Sandra Dang, Zayed Alsalem, J. Desai, and Alejandro Palacios. “Integrating Human Hand Gestures with Vision Based Feedback Controller to Navigate a Virtual Robotic Arm.” 2020 23rd International Symposium on Measurement and Control in Robotics (ISMCR) (2020).


MLA
Moradi, Mousa, et al. “Integrating Human Hand Gestures with Vision Based Feedback Controller to Navigate a Virtual Robotic Arm.” 2020 23rd International Symposium on Measurement and Control in Robotics (ISMCR), 2020.


BibTeX

@inproceedings{moradi2020integrating,
  title = {Integrating Human Hand Gestures with Vision Based Feedback Controller to Navigate a Virtual Robotic Arm},
  year = {2020},
  booktitle = {2020 23rd International Symposium on Measurement and Control in Robotics (ISMCR)},
  author = {Moradi, Mousa and Dang, Sandra and Alsalem, Zayed and Desai, J. and Palacios, Alejandro}
}

Abstract

This paper reports the design and development of a real-time, IMU-vision-based hybrid control algorithm for interacting with a 6-DOF Kinova virtual robotic arm. The Human-Robot Interaction (HRI) control scheme proposed in this paper uses the embedded gyroscope of a Myo Gesture Control Armband's inertial measurement unit together with an 800×600-pixel video stream from a Microsoft HD camera. The algorithm applies a numerical discrete-time integrator and a mean filter to the raw angular velocity data from the gyroscope. The processed data yields the angular displacements applied to the end-effector of the robotic arm as the user performs clockwise or counterclockwise actions about the x, y, and z axes. The end-effector (gripper) is controlled simultaneously through the roll action, using a threshold comparison in the algorithm. To make the system more reliable and to regulate the distance of the end-effector while reaching for desired objects, a vision-based feedback system was designed using a computer vision toolbox and a blob analysis technique. The results demonstrated effective control of the 6-DOF virtual robotic arm from the gyroscopic data and user inputs. As expected, the virtual robotic arm stopped moving once it reached 320 mm from the desired object. Across three different objects, the maximum error between the real and the measured distance was 15.3 cm, observed for the cylindrical object. Owing to its smooth, gesture-based control, this technology has the potential to help people with physical impairments or neurological disorders perform activities of daily living with an assistive robotic arm in the near future.
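To make the gyroscope-processing step in the abstract concrete, the following is a minimal Python sketch: a mean filter smooths the raw angular velocity, a discrete-time (rectangular) integrator accumulates it into angular displacement about the x, y, and z axes, and a threshold comparison on the roll rate drives the gripper. The sampling rate, window length, threshold value, and all names here are illustrative assumptions, not the paper's implementation.

from collections import deque
import numpy as np

GYRO_RATE_HZ = 50.0      # assumed Myo IMU streaming rate
DT = 1.0 / GYRO_RATE_HZ
WINDOW = 5               # mean-filter window length (assumption)
ROLL_THRESHOLD = 40.0    # hypothetical roll-rate threshold, deg/s

class GyroIntegrator:
    """Mean-filter the raw angular velocity (deg/s) and integrate it
    into angular displacement (deg) about the x, y, and z axes."""
    def __init__(self):
        self.window = deque(maxlen=WINDOW)
        self.angle_deg = np.zeros(3)

    def update(self, omega_deg_s):
        self.window.append(np.asarray(omega_deg_s, dtype=float))
        filtered = np.mean(self.window, axis=0)  # mean filter over the window
        self.angle_deg += filtered * DT          # discrete-time integration
        return self.angle_deg

def gripper_command(roll_rate_deg_s):
    """Threshold comparison on the roll action: close, open, or hold."""
    if roll_rate_deg_s > ROLL_THRESHOLD:
        return "close"
    if roll_rate_deg_s < -ROLL_THRESHOLD:
        return "open"
    return "hold"

# Example: feed one gyro sample (deg/s) and read back the accumulated angles.
integrator = GyroIntegrator()
angles = integrator.update([0.0, 12.5, -3.0])

A rectangular integrator of this kind accumulates gyroscope drift over time, which is one motivation for the vision-based feedback loop sketched next.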
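The paper describes the vision feedback only at the level of a computer vision toolbox and blob analysis; the sketch below approximates that idea with OpenCV as a stand-in, segmenting the target by color, taking the largest blob, and stopping the end-effector once a pinhole-model distance estimate falls to the reported 320 mm. The color range, focal length, and object width are hypothetical calibration values.

import cv2
import numpy as np

STOP_DISTANCE_MM = 320.0   # stopping distance reported in the paper
FOCAL_LENGTH_PX = 700.0    # hypothetical calibrated focal length
OBJECT_WIDTH_MM = 60.0     # hypothetical known width of the target object

def detect_largest_blob(frame_bgr):
    """Segment the target by color threshold and return the bounding box
    of the largest blob (a simple stand-in for blob analysis)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (35, 80, 80), (85, 255, 255))  # assumed green target
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)  # (x, y, w, h) in pixels

def estimate_distance_mm(blob_width_px):
    """Pinhole-camera estimate: distance = f * real_width / pixel_width."""
    return FOCAL_LENGTH_PX * OBJECT_WIDTH_MM / blob_width_px

def should_stop(frame_bgr):
    """Stop the end-effector once the estimated distance reaches 320 mm."""
    box = detect_largest_blob(frame_bgr)
    if box is None:
        return False
    _, _, w, _ = box
    return estimate_distance_mm(w) <= STOP_DISTANCE_MM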

