UR10e Teleoperation
The goal of this project was to develop an industrial-grade teleoperation system enabling human operators to control a Universal Robots UR10e cobot and Sarcomere Dynamics’ five-finger dexterous hand using VR tracking and haptic feedback. Collaborating with Eaton, HaptX, Sarcomere Dynamics, and TouchLab, we engineered a solution to automate complex assembly tasks, such as screw driving and component placement, while replicating human dexterity in Eaton’s electrical panel production.
Company Partners
Eaton was our main sponsor for this project; without them, none of this would have been possible. They provided the primary funding and covered the cost of the HaptX and Sarcomere systems.
HaptX provided their HaptX Gloves G1 and continuously helped us with the use and application of their HaptX SDK.
Sarcomere Dynamics provided their ARTUS Mark 9 robotic hand. We were lucky enough to get our hands on one of the first prototypes of their new version, and because their product is still in development, we had direct input into the design features being implemented.
TouchLab was introduced to Sarcomere Dynamics when we brought the HaptX system into the project. To take advantage of HaptX’s tactile feedback, the ARTUS Mark 9 was custom-designed to incorporate TouchLab’s fingertip sensors, so that real tactile feedback could be relayed to the gloves while the operator controlled the arm.
3D Position and Orientation Tracking with Vive Trackers:
The teleoperation system leverages HTC Vive trackers and base stations for high-precision 3D spatial tracking. Vive trackers attached to the user’s forearm capture real-time positional and rotational data at a refresh rate of 325 Hz, enabling sub-millimeter accuracy. The system uses OpenVR to interface with the trackers, extracting pose data as quaternions and Cartesian coordinates.
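As a concrete sketch of this extraction step: OpenVR reports each tracked device’s pose as a 3×4 row-major matrix (`mDeviceToAbsoluteTracking`), from which the Cartesian position and a unit quaternion can be recovered. The conversion below is the standard rotation-matrix-to-quaternion routine, shown for illustration; it is not the project’s actual code.

```python
import math

def pose_matrix_to_position_quaternion(m):
    """Convert a 3x4 row-major tracking matrix (like OpenVR's
    mDeviceToAbsoluteTracking) to a position and a unit quaternion (w, x, y, z)."""
    # Translation is the last column of the 3x4 matrix.
    position = (m[0][3], m[1][3], m[2][3])

    # Standard Shepperd-style branch on the largest diagonal term
    # for numerical stability.
    trace = m[0][0] + m[1][1] + m[2][2]
    if trace > 0:
        s = 2.0 * math.sqrt(trace + 1.0)
        w = 0.25 * s
        x = (m[2][1] - m[1][2]) / s
        y = (m[0][2] - m[2][0]) / s
        z = (m[1][0] - m[0][1]) / s
    elif m[0][0] > m[1][1] and m[0][0] > m[2][2]:
        s = 2.0 * math.sqrt(1.0 + m[0][0] - m[1][1] - m[2][2])
        w = (m[2][1] - m[1][2]) / s
        x = 0.25 * s
        y = (m[0][1] + m[1][0]) / s
        z = (m[0][2] + m[2][0]) / s
    elif m[1][1] > m[2][2]:
        s = 2.0 * math.sqrt(1.0 + m[1][1] - m[0][0] - m[2][2])
        w = (m[0][2] - m[2][0]) / s
        x = (m[0][1] + m[1][0]) / s
        y = 0.25 * s
        z = (m[1][2] + m[2][1]) / s
    else:
        s = 2.0 * math.sqrt(1.0 + m[2][2] - m[0][0] - m[1][1])
        w = (m[1][0] - m[0][1]) / s
        x = (m[0][2] + m[2][0]) / s
        y = (m[1][2] + m[2][1]) / s
        z = 0.25 * s
    return position, (w, x, y, z)
```

In the live system this conversion runs once per tracker sample before the pose is remapped into the robot’s frame.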
3D Tracking Architecture:
The system’s precision stems from four integrated mechanisms:
1. Coordinate Alignment maps the VR tracker’s spatial data to the robot’s workspace via calibration and a predefined transformation matrix, resolving frame-of-reference mismatches.
2. Quaternion Operations process rotational data (normalizing, multiplying, and interpolating orientations via SLERP) to maintain smooth, gimbal-lock-free transitions.
3. Continuous Rotation Tracking detects and corrects angle wrapping (e.g., ±180° discontinuities) to ensure seamless multi-revolution movements.
4. Motion Scaling proportionally reduces the tracker’s physical motion range to match the robot’s operational envelope, enabling millimeter-level precision in confined workspaces.
Together, these elements bridge human motion and robotic execution with sub-millimeter accuracy and sub-degree rotational fidelity.
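The SLERP and angle-unwrapping mechanisms can be sketched as follows. These are the standard textbook formulations (shortest-arc SLERP with a linear fallback for nearly identical orientations, and a shortest signed angle delta), not the project’s exact implementation.

```python
import math

def quat_normalize(q):
    """Rescale a quaternion to unit length."""
    w, x, y, z = q
    n = math.sqrt(w * w + x * x + y * y + z * z)
    return (w / n, x / n, y / n, z / n)

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    q0, q1 = quat_normalize(q0), quat_normalize(q1)
    dot = sum(a * b for a, b in zip(q0, q1))
    # q and -q are the same rotation; flip to take the shorter arc.
    if dot < 0.0:
        q1 = tuple(-c for c in q1)
        dot = -dot
    # Nearly identical orientations: fall back to normalized lerp.
    if dot > 0.9995:
        return quat_normalize(tuple(a + t * (b - a) for a, b in zip(q0, q1)))
    theta = math.acos(dot)
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def shortest_delta(prev_deg, new_deg):
    """Signed change between two wrapped angles, corrected for +/-180 deg
    jumps; accumulating these deltas yields a continuous multi-revolution
    angle."""
    return (new_deg - prev_deg + 180.0) % 360.0 - 180.0
```

For example, a wrist rotation that crosses from 179° to −179° is treated as a +2° step rather than a −358° jump, which is what keeps multi-revolution screw-driving motions continuous.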
Universal Tracker Alignment: Enabling Teleoperation from Any Orientation
To enable teleoperation from any initial tracker orientation, the system employs a two-step calibration protocol. First, the UR10e arm is programmatically moved to a known home pose (predefined in joint or Cartesian space). Once stationary, the operator physically aligns the Vive tracker with the robot’s end-effector and triggers calibration via a console prompt. During this phase:
1. Pose Capture: The tracker’s current position and orientation are recorded, establishing a spatial reference frame.
2. Offset Calculation: The difference between the tracker’s live pose and the robot’s home pose is computed, including rotational offsets via quaternion inversion and multiplication.
3. Dynamic Re-mapping: A transformation matrix and scaling factor adapt the tracker’s arbitrary starting orientation to the robot’s coordinate system.
This process decouples the tracker’s initial orientation from the robot’s frame, allowing operators to begin teleoperation without rigid alignment. The system continuously resolves orientation deltas during runtime using quaternion-based relative transforms, ensuring seamless control regardless of starting angles.
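Steps 2 and 3 above can be sketched with plain quaternion algebra. The function names, the offset convention (offset = robot_home × tracker_home⁻¹), and the example scale factor of 0.5 are illustrative assumptions rather than the project’s actual code.

```python
def quat_conjugate(q):
    """Conjugate of a unit quaternion, which is its inverse."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_multiply(a, b):
    """Hamilton product a * b, composing rotation b followed by a."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def compute_calibration(tracker_home_q, robot_home_q):
    """Offset calculation: rotation mapping the tracker's home orientation
    onto the robot's home orientation (offset * tracker_home = robot_home)."""
    return quat_multiply(robot_home_q, quat_conjugate(tracker_home_q))

def remap_orientation(offset_q, tracker_q):
    """Dynamic re-mapping: apply the stored offset to each live tracker
    sample to get the commanded robot orientation."""
    return quat_multiply(offset_q, tracker_q)

def remap_position(tracker_p, tracker_home_p, robot_home_p, scale=0.5):
    """Scale the tracker's displacement from its home pose into the robot
    frame (scale=0.5 is a hypothetical motion-scaling factor)."""
    return tuple(rh + scale * (t - th)
                 for t, th, rh in zip(tracker_p, tracker_home_p, robot_home_p))
```

Because the offset is computed from whatever orientation the tracker happens to hold at calibration time, the operator never needs to align the tracker with a fixed world axis before starting.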
The Result:
This project successfully demonstrated the feasibility of a VR-driven teleoperation system for dexterous robotic assembly, integrating the UR10e cobot, Sarcomere Dynamics’ ARTUS Mark 9 hand, HaptX haptic gloves, and Vive trackers into a unified framework. By resolving critical challenges in spatial mapping, orientation synchronization, and human-robot coordination, the system achieved sub-millimeter positional accuracy and seamless quaternion-based rotational control, validated through live demonstrations of assembly tasks. While full industrial deployment requires further refinement in multi-axis synchronization and error handling, the proof-of-concept underscores the viability of five-finger end-effectors in automating complex, human-centric tasks. The project’s modular architecture will provide a foundation for future teams to expand functionality, such as dual-arm coordination or AI-assisted path planning. By bridging human dexterity with robotic precision, this work advances Eaton’s vision of safer, more adaptable manufacturing systems while exemplifying the transformative potential of collaborative robotics in industrial automation.