EFI-Robotics aims to create dexterous and safe physical interaction between robots and unstructured environments or humans by combining control methods and machine learning techniques.
For tendon-driven multi-fingered robotic hands, ensuring grasp adaptability while minimizing the number of actuators needed for human-like functionality is a challenging problem. Inspired by the Pisa/IIT SoftHand, this paper introduces a 3D-printed, highly underactuated, five-finger robotic hand named the Tactile SoftHand-A, which features only two actuators. The dual-tendon design allows active control of specific (distal or proximal interphalangeal) joints to adjust the hand's grasp gesture. We have also developed a new fully 3D-printed tactile sensor that requires no manual assembly and is printed directly as part of the robotic finger. This sensor is integrated into the fingertips and combined with the antagonistic tendon mechanism to develop a human-hand-guided tactile-feedback grasping system. The system can actively mirror human hand gestures, adaptively stabilize its grasp upon contact, and adjust its grasp to prevent object movement after detecting slip. Finally, we designed four experiments to evaluate the novel fingers coupled with the antagonistic mechanism for controlling the hand's gestures, adaptive grasping ability, and human-hand-guided tactile-feedback grasping capability. The experimental results demonstrate that the Tactile SoftHand-A can adaptively grasp objects of a wide range of shapes and automatically adjust its grip upon detecting contact and slip. Overall, this study points the way towards a class of low-cost, accessible, 3D-printable, underactuated human-like robotic hands, and we openly release the designs so that others can build upon this work.
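The slip-triggered grasp adjustment described above can be illustrated with a minimal control-loop sketch. This is a hypothetical outline, not the released code: the sensor and motor interfaces (`read_tactile`, `tighten`) and the image-difference slip proxy are assumptions for illustration only.

```python
# Hypothetical sketch of a slip-triggered grasp-adjustment loop.
# Tactile frames are modeled as flat lists of pixel intensities;
# slip is flagged by a simple frame-difference proxy.

def detect_slip(prev_frame, frame, threshold=5.0):
    """Flag slip when the mean absolute change between successive
    tactile frames exceeds a threshold."""
    diff = sum(abs(a - b) for a, b in zip(prev_frame, frame)) / len(frame)
    return diff > threshold

def grasp_loop(read_tactile, tighten, steps=100):
    """Poll the tactile sensor; on detected slip, pull the
    antagonistic tendon slightly to firm up the grasp."""
    prev = read_tactile()
    for _ in range(steps):
        frame = read_tactile()
        if detect_slip(prev, frame):
            tighten()  # illustrative actuator command
        prev = frame
```

In the real system the slip signal would come from marker or shear-field analysis of the tactile images rather than a raw frame difference, but the reactive structure is the same.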
CoRL
AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch
Max Yang, Chenghua Lu, Alex Church, and 6 more authors
Human hands are capable of in-hand manipulation in the presence of different hand motions. For a robot hand, harnessing rich tactile information to achieve this level of dexterity remains a significant challenge. In this paper, we present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch. We tackle this problem by training a dense tactile policy in simulation and present a sim-to-real method for rich tactile sensing to achieve zero-shot policy transfer. Our formulation allows the training of a unified policy to rotate unseen objects about arbitrary rotation axes in any hand direction. In our experiments, we highlight the benefit of capturing detailed contact information when handling objects with varying properties. Interestingly, despite not having explicit slip detection, we found that rich multi-fingered tactile sensing can implicitly detect object movement within the grasp and provide a reactive behavior that improves the robustness of the policy. The project website can be found at this https URL.
RA-L
BioTacTip: A Soft Biomimetic Optical Tactile Sensor for Efficient 3D Contact Localization and 3D Force Estimation
Haoran Li, Saekwang Nam, Zhenyu Lu, and 3 more authors
In this letter, we introduce a new soft biomimetic optical tactile sensor based on mimicking the interlocking structure of the epidermal-dermal boundary: the BioTacTip. The primary sensing unit comprises a sharp white tip surrounded by four black cover tips that, when subjected to an external force, emphasize the applied direction and contact location for high-resolution imaging by an internal camera. The sensor design means that we can utilize the tactile images directly as the model input (without requiring marker detection) for computationally efficient reconstruction of 3D external forces, contact geometry, localization and depth, using an analytic tactile model based on dynamic friction and internal pressure. Indentation and press-and-shear tests confirmed this mechanism, with sub-mm localization and indentation errors, and normal and shear force time series that match the measured quantities. The sensor design opens up a new way to instantiate biomimicry in optical tactile sensors that utilizes mechanical processing in the skin.
ICRA
Tactile-Driven Gentle Grasping for Human-Robot Collaborative Tasks
Christopher J. Ford, Haoran Li, John Lloyd, and 4 more authors
In IEEE International Conference on Robotics and Automation 2023
This paper presents a control scheme for force-sensitive, gentle grasping with a Pisa/IIT anthropomorphic SoftHand equipped with a miniaturised version of the TacTip optical tactile sensor on all five fingertips. The tactile sensors provide high-resolution information about a grasp and how the fingers interact with held objects. We first describe a series of hardware developments for performing asynchronous sensor data acquisition and processing, resulting in a control loop fast enough for real-time grasp control. We then develop a novel grasp controller that uses tactile feedback from all five fingertip sensors simultaneously to gently and stably grasp 43 objects of varying geometry and stiffness, and apply it to a human-to-robot handover task. These developments open the door to more advanced manipulation with underactuated hands via fast reflexive control using high-resolution tactile sensing.
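The gentle-grasp idea above can be sketched as a simple per-finger rule: close each finger until its tactile contact signal reaches a light target, then hold. This is an illustrative sketch in the spirit of the controller, not the paper's implementation; the function name, target, and gain are assumptions.

```python
# Illustrative per-finger gentle-grasp rule: a proportional command
# drives each fingertip's contact intensity toward a light setpoint.

def gentle_grasp_step(contact, target=0.2, gain=0.5):
    """Return per-finger velocity commands from normalized contact
    intensities in [0, 1]: positive closes the finger while below the
    target, negative backs off if pressing too hard, zero at the target."""
    return [gain * (target - c) for c in contact]
```

Running this rule asynchronously for all five fingertips, at the loop rates the paper's hardware developments enable, yields the reflexive behavior described: fingers stop closing as soon as light contact is sensed.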
RA-L | IROS
A Robust Controller for Stable 3D Pinching Using Tactile Sensing
Efi Psomopoulou, Nicholas Pestell, Fotios Papadopoulos, and 3 more authors
Robotics & Automation Letters, IEEE/RSJ International Conference on Intelligent Robots and Systems 2021
This paper proposes a controller for stable grasping of unknown-shaped objects by two robotic fingers with tactile fingertips. The grasp is stabilised by rolling the fingertips on the contact surface and applying a desired grasping force to reach an equilibrium state. Validation is performed both in simulation and on a fully actuated robot hand (the Shadow Modular Grasper) fitted with custom-built optical tactile sensors (based on the BRL TacTip). The controller requires the orientations of the contact surfaces, which are estimated by regressing a deep convolutional neural network over the tactile images. Overall, the grasp system is demonstrated to achieve stable equilibrium poses on a range of objects varying in shape and softness, with the system being robust to perturbations and measurement errors. This approach also has promise to extend beyond grasping to stable in-hand object manipulation with multiple fingers.
TCST
Prescribed Performance Tracking of a Variable Stiffness Actuated Robot
Efi Psomopoulou, Achilles Theodorakopoulos, Zoe Doulgeri, and 1 more author
IEEE Transactions on Control Systems Technology Sep 2015
This paper is concerned with the design of a state feedback control scheme for variable stiffness actuated (VSA) robots, which guarantees prescribed performance of the tracking errors despite the low range of mechanical stiffness. The controller does not assume knowledge of the actual system dynamics, nor does it utilize approximating structures (e.g., neural networks and fuzzy systems) to acquire such knowledge, leading to a low-complexity design. Simulation studies, incorporating a model validated on data from an actual variable stiffness actuator on a multi-degree-of-freedom robot, are performed. Comparison with a gain scheduling solution reveals the superiority of the proposed scheme with respect to performance and robustness.
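The prescribed-performance idea behind this controller can be shown with a minimal numerical sketch. This uses a standard formulation of the technique (an exponentially shrinking performance bound and a logarithmic error transformation); the paper's exact performance functions and gains may differ.

```python
import math

# Standard prescribed-performance construction: the tracking error e(t)
# is confined to a shrinking funnel rho(t), enforced by a transformed
# error that grows without bound as |e| approaches the funnel boundary.

def rho(t, rho0=1.0, rho_inf=0.05, decay=2.0):
    """Exponentially shrinking performance bound:
    rho(t) = (rho0 - rho_inf) * exp(-decay * t) + rho_inf."""
    return (rho0 - rho_inf) * math.exp(-decay * t) + rho_inf

def transformed_error(e, t):
    """Map e in (-rho(t), rho(t)) to an unbounded variable via
    0.5 * ln((1 + x) / (1 - x)) with x = e / rho(t); keeping this
    quantity bounded keeps e inside the prescribed funnel."""
    x = e / rho(t)
    assert -1.0 < x < 1.0, "error left the prescribed funnel"
    return 0.5 * math.log((1.0 + x) / (1.0 - x))
```

A controller that stabilises the transformed error with ordinary gains automatically guarantees that the original tracking error obeys the prescribed transient and steady-state bounds, which is why no model knowledge or approximating structure is needed.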