
Controlling by Showing: i-Mimic: A Video-Based Method to Control Robotic Arms

Chakraborty, Debarati B.; Sharma, Mukesh; Vijay, Bhaskar


Abstract

A novel concept of vision-based intelligent control of robotic arms is developed in this work, which enables a robotic arm's motion to be controlled with visual input alone, that is, by showing it videos of the correct movements. The work is broadly divided into two segments: an unsupervised vision-based method to control robotic arms in the 2-D plane, and a deep convolutional neural network (CNN) applied to the same task in the 3-D plane. The first method is unsupervised; its aim is to have a manipulator mimic human arm motion in real time. We developed a network, namely the vision-to-motion optical network (DON). Given a video stream containing human hand movements as input, the DON outputs the velocity and torque information of the movements shown in the video. This output is then fed to the robotic arm, enabling it to generate motion that follows the real hand videos. The method has been tested on live-stream video feeds as well as on recorded video obtained from a monocular camera, and it intelligently predicts the trajectory of the human hand even when it becomes occluded. The mimicry of the arm thus incorporates some intelligence, making it an intelligent mimic (i-mimic). Furthermore, to enhance the performance of the DON and make it applicable to mimicking multi-joint movements with an n-link manipulator, a deep CNN has been used along with a refiner network as the predecessor of the DON. The refiner network overcomes the limitation of inadequate labelled data. Both proposed methods are validated with off-line as well as on-line video datasets in real time. The entire methodology is validated with a real 1-link manipulator and simulated n-link manipulators (arms with n different joints), along with suitable comparisons.
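The abstract describes a pipeline that maps per-frame hand positions from a video to joint velocity and torque commands for a manipulator. As a minimal, purely illustrative sketch (not the authors' DON), the following assumes a 1-link planar arm, recovers the joint angle of each frame's tracked hand position by simple inverse kinematics, and derives velocity and torque by finite differences with the rigid-body relation tau = I * alpha; all function names and the inertia parameter are hypothetical.

```python
import math

def one_link_ik(x, y):
    """Joint angle (rad) of a 1-link planar arm whose tip is at (x, y)."""
    return math.atan2(y, x)

def velocities_and_torques(tip_positions, dt, inertia=1.0):
    """Finite-difference joint velocities and torques (tau = I * alpha)
    from per-frame hand/tip positions; a stand-in for the network's output.

    tip_positions: list of (x, y) tuples, one per video frame
    dt: time between frames (s); inertia: link moment of inertia (kg*m^2)
    """
    angles = [one_link_ik(x, y) for x, y in tip_positions]
    # Angular velocity between consecutive frames.
    vel = [(angles[i + 1] - angles[i]) / dt for i in range(len(angles) - 1)]
    # Angular acceleration, then torque via tau = I * alpha.
    acc = [(vel[i + 1] - vel[i]) / dt for i in range(len(vel) - 1)]
    tau = [inertia * a for a in acc]
    return vel, tau

# Usage: a hand tip sweeping the unit circle at a constant 1 rad/s,
# sampled every 0.1 s, yields unit velocities and near-zero torques.
positions = [(math.cos(t), math.sin(t)) for t in (0.0, 0.1, 0.2, 0.3)]
vel, tau = velocities_and_torques(positions, dt=0.1)
```

In the paper's actual method these quantities are predicted by the learned network directly from pixels (with occlusion-aware trajectory prediction), rather than computed analytically as above.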

Citation

Chakraborty, D. B., Sharma, M., & Vijay, B. (2022). Controlling by Showing: i-Mimic: A Video-Based Method to Control Robotic Arms. SN Computer Science, 3(2), Article 124. https://doi.org/10.1007/s42979-022-01014-2

Journal Article Type Article
Acceptance Date Jan 2, 2022
Online Publication Date Jan 10, 2022
Publication Date 2022-03
Deposit Date Mar 13, 2024
Publicly Available Date Jul 25, 2025
Journal SN Computer Science
Electronic ISSN 2661-8907
Publisher Springer
Peer Reviewed Peer Reviewed
Volume 3
Issue 2
Article Number 124
DOI https://doi.org/10.1007/s42979-022-01014-2
Keywords Vision-based robot control; Video processing; Deep network; Convolutional neural network; Robotic arm control
Public URL https://hull-repository.worktribe.com/output/4588820

Files

Accepted manuscript (3.6 Mb)
PDF

Copyright Statement
Copyright © 2022 Springer-Verlag. This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature’s AM terms of use, but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record is available online at: http://dx.doi.org/10.1007/s42979-022-01014-2
