A Camera-based Tactile Sensor for Sorting Tasks

Autonomous robots require sophisticated perception systems to manipulate objects. In this context, cameras provide a rich source of information while being precise and inexpensive. Our novel sensor approach uses cameras together with robust image processing algorithms to measure forces and other haptic data.

The sensor system is based on flexible rubber foam without any internal electronics, which is attached to the fingers of a gripper. This flexible material shows a characteristic deformation during contact. A camera observes the foam, and image processing algorithms detect the foam’s deformation in the camera images. Finally, our sensor software calculates haptic contact data during the grasp procedure.
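As a purely illustrative sketch of this idea (not the actual sensor software), the foam deformation can be estimated by comparing a reference image of the undeformed foam with the current camera frame, for example via dense optical flow in OpenCV. The file names and the use of Farnebäck optical flow are assumptions for this example.

```python
import cv2
import numpy as np

def foam_deformation(reference_gray, current_gray):
    """Return a per-pixel displacement magnitude (in pixels) of the foam surface."""
    # Dense optical flow between the undeformed reference frame and the current frame
    flow = cv2.calcOpticalFlowFarneback(
        reference_gray, current_gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    # Displacement magnitude at every pixel
    return np.linalg.norm(flow, axis=2)

if __name__ == "__main__":
    ref = cv2.imread("foam_reference.png", cv2.IMREAD_GRAYSCALE)  # assumed file name
    cur = cv2.imread("foam_contact.png", cv2.IMREAD_GRAYSCALE)    # assumed file name
    displacement = foam_deformation(ref, cur)
    # A simple scalar indentation estimate: mean displacement over the image
    print("mean foam displacement [px]:", displacement.mean())
```

In a real system, the displacement field would then be mapped to contact forces using a calibrated model of the foam's material behavior.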

This video shows the capabilities of this so-called visuo-haptic sensor in a sorting task. A linear robot with a two-finger gripper sorts plastic bottles based on their compliance. During the grasp procedure, the visuo-haptic sensor measures the contact forces, the pressure distribution along the gripper finger, the finger position, and the object deformation, and it also estimates the object compliance as well as object properties such as shape and size. The novelty lies in the fact that all this data is extracted from camera images, i.e., the robot is “feeling by seeing”.
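To illustrate how such a compliance-based sorting decision could look (again only a hedged sketch, not the method used in the video), compliance can be approximated as the slope of object deformation over contact force recorded while the gripper closes. All numerical values and the sorting threshold below are invented for illustration.

```python
import numpy as np

def estimate_compliance(forces_n, deformations_mm):
    """Compliance ~ slope of deformation over force (mm/N), via a least-squares fit."""
    slope, _intercept = np.polyfit(forces_n, deformations_mm, deg=1)
    return slope

# Example samples recorded during one grasp (made-up numbers)
forces = np.array([0.5, 1.0, 1.5, 2.0, 2.5])        # contact forces [N]
deformations = np.array([0.4, 0.9, 1.3, 1.8, 2.2])  # object deformation [mm]

compliance = estimate_compliance(forces, deformations)
COMPLIANCE_THRESHOLD = 0.5  # mm/N, assumed sorting criterion
bin_label = "soft" if compliance > COMPLIANCE_THRESHOLD else "stiff"
print(f"compliance = {compliance:.2f} mm/N -> {bin_label} bin")
```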

More information: www.lmt.ei.tum.de/rovi/