Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet

The integration of augmented reality (AR) techniques in user interface design has enhanced interactive experiences in the teleoperation of robots, hands-on learning in classrooms, laboratories, and special education, and user training in an array of fields, e.g., aerospace, automotive, construction, manufacturing, and medicine. However, AR-based user interfaces that command machines and tools have not been fully explored for their potential to enhance interactive learning of engineering concepts in the laboratory. This paper outlines the development of a mobile application executing on a tablet device, which renders an immersive AR-based graphical user interface enabling users to monitor, interact with, and control a four-link underactuated planar robot. Computer vision routines extract real-time, vision-based measurements of the robot’s joint angles and end-effector location from the live video captured by the rear-facing camera on the tablet. The obtained measurements are used to render AR content that provides users with additional visual feedback. Touch gesture recognition is implemented to allow users to command the robot naturally and intuitively by tapping and dragging their fingers at desired locations on the tablet screen. Experimental results demonstrate the performance and efficacy of the proposed system as it is operated in two distinct modes: one in which the user directly controls the angles of the actuated links of the robot, and one in which the user directly controls the end-effector location.
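The abstract does not reproduce the underlying computations, so the following is a minimal Python sketch (not taken from the paper) of the two kinematic ingredients it describes: forward kinematics of a planar serial-link arm, which underlies the joint-angle control mode, and recovery of joint angles from tracked joint positions via atan2, a common approach to the kind of vision-based measurement the paper mentions. The link lengths, angle values, and function names below are illustrative assumptions.

import numpy as np

def forward_kinematics(joint_angles, link_lengths):
    """Compute joint positions of a planar serial-link robot.

    joint_angles: relative joint angles in radians, one per link.
    link_lengths: length of each link (same units as the output).
    Returns an (n+1, 2) array of joint positions; the last row is
    the end-effector location.
    """
    positions = [np.zeros(2)]
    theta = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle  # absolute orientation of this link
        step = length * np.array([np.cos(theta), np.sin(theta)])
        positions.append(positions[-1] + step)
    return np.array(positions)

def joint_angles_from_markers(marker_xy):
    """Estimate relative joint angles from tracked joint positions,
    e.g., centroids of markers detected at each joint in the camera
    image, ordered from base to end effector."""
    marker_xy = np.asarray(marker_xy, dtype=float)
    vectors = np.diff(marker_xy, axis=0)            # one vector per link
    absolute = np.arctan2(vectors[:, 1], vectors[:, 0])
    return np.diff(absolute, prepend=0.0)           # relative joint angles

if __name__ == "__main__":
    lengths = [0.20, 0.15, 0.15, 0.10]              # hypothetical link lengths (m)
    angles = np.radians([30, -15, 20, 10])          # hypothetical joint angles
    pts = forward_kinematics(angles, lengths)
    print("end-effector location:", pts[-1])
    # Round-trip check: recover the commanded angles from joint positions.
    print("recovered angles (deg):", np.degrees(joint_angles_from_markers(pts)))

The end-effector control mode described in the abstract would additionally require an inverse kinematics solver (e.g., a numerical Jacobian-based method), which is omitted here for brevity.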

Frank, J. A., & Kapila, V. (2016, January). Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet. In Proceedings of the Indian Control Conference (pp. 385-392). IEEE.