● Introduction
Features
●High Dexterity & Fine Operation
20 DoFs (10 active + 10 passive) enable accurate reproduction of human hand movements and fine manipulation, supporting adaptive object grasping and complex tasks.
●Multi-Modal Perception & Intelligent Interaction
Configurable with cameras, electronic skin, and other sensors to build a full-range "vision + touch" perception model, enhancing environment understanding and interaction capabilities in unstructured scenarios.
●Edge-Cloud Integration & No-Code Deployment
Leverages edge-cloud architecture for one-click skill deployment from cloud libraries, reducing usage barriers and improving development efficiency.
●High-Reliability Structure & Data Support
A self-developed motor and linkage system resists impact and damage, suiting high-intensity scenarios such as embodied-intelligence training. It also supports efficient data collection for data-farm construction and algorithm optimization.
Robot Hand Interfaces
●Supported robotic arms: UR, Franka, XArm, RealMan, Songling
●Supported data acquisition methods: teleoperation gloves, exoskeleton gloves, liquid metal gloves, vision, VR (Meta Quest 3)
●Supported simulators: PyBullet, Isaac, MuJoCo
●Supported interfaces: CAN, RS485
●Example usages: ROS1, ROS2, Python, C++
Specification
Sheet 1
Communication Methods
●CAN Interface
Uses a proprietary protocol at a 1 Mbps baud rate; default device IDs are 0x28 (left hand) and 0x27 (right hand); broadcast ID 0xFF is supported for addressing, identification, and debugging.
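On Linux, the addressing described above can be exercised over SocketCAN. The sketch below packs a raw classic CAN frame for the left hand's default ID 0x28; the payload bytes are placeholders, since the actual frame contents follow the vendor's proprietary protocol, and the channel name `can0` is an assumption.

```python
import socket
import struct

# Linux SocketCAN classic frame layout: 4-byte CAN ID, 1-byte DLC,
# 3 padding bytes, 8 data bytes (16 bytes total).
CAN_FRAME_FMT = "<IB3x8s"

LEFT_HAND_ID = 0x28   # default left-hand device ID
RIGHT_HAND_ID = 0x27  # default right-hand device ID
BROADCAST_ID = 0xFF   # addressing, identification, and debugging


def build_can_frame(can_id: int, data: bytes) -> bytes:
    """Pack a classic CAN frame; payload may be 0-8 bytes."""
    if len(data) > 8:
        raise ValueError("classic CAN payload is at most 8 bytes")
    return struct.pack(CAN_FRAME_FMT, can_id, len(data), data.ljust(8, b"\x00"))


def send_frame(channel: str, frame: bytes) -> None:
    """Send one raw frame on a SocketCAN interface (e.g. 'can0').

    The interface must already be up at 1 Mbps, e.g.:
        ip link set can0 up type can bitrate 1000000
    """
    with socket.socket(socket.AF_CAN, socket.SOCK_RAW, socket.CAN_RAW) as s:
        s.bind((channel,))
        s.send(frame)


# Hypothetical payload -- real contents are defined by the proprietary protocol.
frame = build_can_frame(LEFT_HAND_ID, b"\x01\x02")
```

Higher-level wrappers such as python-can offer the same functionality with interchangeable backends, but the raw-socket form above needs only the standard library.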
●RS485 Interface
Adopts the Modbus protocol at a 115200 bps baud rate; default device IDs are 0x28 (left hand) and 0x27 (right hand); supported function codes: 03/04/06/16; UART settings are fixed at 8 data bits, 1 stop bit, no parity.
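Since the hand speaks Modbus over RS485, a standard Modbus RTU request frame applies: device ID, function code, register address and count, then a CRC-16 appended low byte first. A minimal sketch, assuming function code 03 (read holding registers) and a hypothetical register address, since the hand's register map is not given here:

```python
import struct


def crc16_modbus(frame: bytes) -> int:
    """CRC-16/Modbus: init 0xFFFF, reflected polynomial 0xA001."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc


def build_read_request(device_id: int, start_reg: int, count: int) -> bytes:
    """Build a Modbus RTU request for function code 03 (read holding registers)."""
    body = struct.pack(">BBHH", device_id, 0x03, start_reg, count)
    return body + struct.pack("<H", crc16_modbus(body))  # CRC low byte first


# Left hand (default ID 0x28), reading one register at a hypothetical address 0x0000.
request = build_read_request(0x28, 0x0000, 1)
```

The frame would then be written to a serial port configured at 115200 bps, 8N1 (e.g. with pyserial); function codes 04, 06, and 16 follow the same device-ID + function + payload + CRC layout.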