DEVELOPMENT AND INTEGRATION OF TACTILE SENSING SYSTEM

Diploma

ABSTRACT

To grasp and manipulate complex objects, robots require information about the interaction between the end effector and the object. This work describes the integration of a low-cost 3-axis tactile sensing system into two different robotic systems and the measurement of some of these complex interactions. The sensor itself is small, lightweight, and compliant so that it can be integrated within a variety of end effectors and locations on those end effectors (e.g. wrapped around a finger).

To improve usability and data collection, a custom interface board and ROS (Robot Operating System) package were developed to read the sensor data and interface with the robots and grippers. Sensor data has been collected from four different tasks: (1) pick and place of non-conductive and conductive objects, (2) wrist-based manipulation, (3) peeling tape, and (4) human interaction with a grasped object. In the last task, a closed-loop controller is used to adjust the grip force on the grasped object while the human interacts with it.

Introduction

Motivation

Robotic manipulators have been well established in industry for decades and are exceptionally good at performing complex grasping and manipulation tasks in carefully engineered environments. On production lines, manipulators perform tasks ranging from pick and place to automobile assembly with great accuracy. This is only possible because of the closely controlled environments in which these manipulators operate; even a slight change in the environment can cause them to fail.

With the introduction of robots to more dynamic environments such as the living room or kitchen, manipulators will have to perform complex human-like maneuvers such as grasping everyday objects or retrieving drinks from the refrigerator. The manipulators will be operating in constrained and unknown workspaces with limited information about the objects. To overcome this lack of information, significant work has been done on integrating force sensors and remote sensors such as cameras, sonars, and laser scanners with manipulators. Remote sensors tend to require substantial processing power to estimate an object's pose, and they cannot confirm whether an object has actually been gripped. Moreover, they are unable to provide essential object properties such as weight or slippage. Most force sensors integrated with manipulators provide an accurate measure of the normal component of the grasping force, but they generally require large forces to be applied to the object and are incapable of measuring shear forces.

Humans are far more adaptable to their environment: they can grasp and manipulate new objects in completely unknown surroundings. Bin picking, one of the most common industrial tasks, can be performed by humans without even looking at the bin, whereas robots tend to fail even with vision and normal-force sensing.

Humans rely heavily on shear information to perform dexterous manipulation tasks. These shear forces are crucial for grasping and manipulating objects with different attributes. To utilize tactile sensing, humans first make contact with the object; once in contact, the resulting reaction forces are used to gauge and improve the grasp. The measured shear forces are influenced by the weight and shape of the object, especially if tilt is involved. More than the global shape of the object, it is the local shape (the shape of the object around the contact area) that affects the reaction force and the grasp. The mechanoreceptors in the skin of the human fingertips are highly sensitive to forces tangential to the skin and provide a clear indication of the grasp. The work in shows how tactile perception from these mechanoreceptors allows humans to perform complex maneuvers.

Due to the potential advantages for grasping and manipulation, researchers have developed numerous sensors capable of detecting these tactile forces in works like. Many of these sensors are small, flexible, and relatively low cost. Industrial sensors such as the 3D40 by ME-Meßsysteme, 3-axis load cells by Interface, the 3-axis Force Sensor by Tec Gihan Co., the BioTac by SynTouch, Inc., and the OMD by Optoforce provide force data in three axes. Although tactile sensors are not new, the key challenge in properly utilizing these sensors lies in their integration. While researchers have presented a number of unique tactile sensing technologies, most of them are still not mature enough to be employed on robotic systems. Industry, on the other hand, does have solutions in place, but they typically cost thousands of dollars and are relatively large and bulky. In addition, most industrial solutions require significant changes to the existing setup, thus restricting the environment once again.

Challenges

For the development of tactile systems, it is essential to understand the challenges involved in integrating and using them with robotic systems. Many good tactile sensors have been developed in the past, but most of them lack a focus on real-world application. Past work has concentrated largely on performance parameters such as sensor sensitivity, dynamic resolution, and noise levels. Although all of these parameters are essential, this work is focused on overcoming the challenges of applying the sensors in real-world scenarios. The following characteristics were considered essential; the importance of each will be covered in subsequent chapters:

1.    Modularity: The sensors used in this work were easy to design and fabricate, allowing fast (within 8 hours) fabrication of sensors with different sensitivities and measurement ranges. A modular system in which sensors can simply be swapped out facilitates shear force measurement for a range of applications.

2.    Compact Design: A minimalistic design allows easy mounting and helps avoid any modification to the existing setup.

3.    Object material and geometry independence: Responsiveness to objects of different materials and shapes allows measurement of interactions with a range of objects.

4.    Software compatibility: Compatibility with existing software frameworks avoids the additional step of establishing communication with the existing system.

5.    External environment independence: Responsiveness in different environments and across a range of tasks ensures operation in unknown environments.

Hardware Design

The sensor hardware was based on an Arduino and a custom PCB designed around an Analog Devices AD7746 capacitance-to-digital converter (CDC) chip. The connections between the circuit and the sensor were kept minimal and modular by using FCI clinchers, allowing rapid integration of different sensor configurations. The I2C protocol was used for communication between the AD7746 and the Arduino. The main system and the Arduino communicated over USB using the rosserial protocol.
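As an illustration of this data path, the following minimal Arduino sketch reads one capacitance channel from the AD7746 over I2C and publishes the raw 24-bit value to ROS through rosserial. It is a sketch of the approach only: the 7-bit I2C address (0x48) and the capacitance data registers (0x01-0x03) follow the AD7746 datasheet convention, while the topic name, message type, and polling rate are placeholders rather than the exact firmware used in this work.

    // Minimal illustrative Arduino sketch: read one AD7746 capacitance channel
    // over I2C and publish the raw 24-bit value to ROS through rosserial.
    // The I2C address, register address, topic name, and polling rate are
    // placeholders; the actual firmware also configures excitation and CAPDAC.
    #include <Wire.h>
    #include <ros.h>
    #include <std_msgs/UInt32.h>

    const uint8_t AD7746_ADDR    = 0x48;  // assumed 7-bit I2C address of the AD7746
    const uint8_t REG_CAP_DATA_H = 0x01;  // assumed first capacitance data register

    ros::NodeHandle nh;
    std_msgs::UInt32 cap_msg;
    ros::Publisher cap_pub("tactile/raw_capacitance", &cap_msg);  // placeholder topic

    // Read three consecutive data bytes and pack them into a 24-bit value.
    uint32_t readCapacitance() {
      Wire.beginTransmission(AD7746_ADDR);
      Wire.write(REG_CAP_DATA_H);        // set register pointer to the data registers
      Wire.endTransmission(false);       // repeated start, keep the bus
      Wire.requestFrom(AD7746_ADDR, (uint8_t)3);
      uint32_t value = 0;
      while (Wire.available()) {
        value = (value << 8) | Wire.read();
      }
      return value;                      // 0x000000 .. 0xFFFFFF
    }

    void setup() {
      Wire.begin();
      nh.initNode();                     // start rosserial over the USB serial link
      nh.advertise(cap_pub);
      // Chip configuration (excitation source, conversion time, CAPDAC offset)
      // is omitted here; it depends on the nominal capacitance of the sensor.
    }

    void loop() {
      cap_msg.data = readCapacitance();
      cap_pub.publish(&cap_msg);
      nh.spinOnce();
      delay(10);                         // fixed-rate polling; a full driver would
                                         // check the ready flag in the status register
    }

On the host side, the Arduino is bridged into the ROS graph with the standard rosserial client, e.g. rosrun rosserial_python serial_node.py _port:=/dev/ttyACM0 (the port name depends on the setup).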

Software Design

For the software, it was essential to ensure the compatibility of the sensors with existing robotic systems. ROS is one of the most widely used software frameworks for robots and was therefore chosen for the sensor's software architecture. Writing a custom ROS package ensures that the sensor remains functional across ROS versions, allowing communication between the sensor and both old and new robot systems.
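As a minimal sketch of what such a package can look like on the host side (the topic names, message types, and calibration constants below are illustrative assumptions, not the actual interface of the package developed in this work), a ROS node can subscribe to the raw values coming from the Arduino over rosserial and republish them in physical units:

    // Minimal host-side ROS (C++) node: subscribe to the raw capacitance values
    // arriving over rosserial and republish an estimated force. Topic names,
    // message types, and the linear calibration are placeholders for illustration.
    #include <ros/ros.h>
    #include <std_msgs/UInt32.h>
    #include <std_msgs/Float32.h>

    class TactileNode {
    public:
      explicit TactileNode(ros::NodeHandle& nh) {
        sub_ = nh.subscribe("tactile/raw_capacitance", 10,
                            &TactileNode::rawCallback, this);
        pub_ = nh.advertise<std_msgs::Float32>("tactile/force", 10);
      }

    private:
      void rawCallback(const std_msgs::UInt32::ConstPtr& msg) {
        // Placeholder linear calibration; real offset and scale values would
        // come from a per-sensor calibration procedure.
        const double offset = 8388608.0;  // mid-scale of the 24-bit CDC output
        const double scale  = 1.0e-5;     // hypothetical counts-to-newtons factor
        std_msgs::Float32 force;
        force.data = static_cast<float>(
            (static_cast<double>(msg->data) - offset) * scale);
        pub_.publish(force);
      }

      ros::Subscriber sub_;
      ros::Publisher pub_;
    };

    int main(int argc, char** argv) {
      ros::init(argc, argv, "tactile_sensor_node");
      ros::NodeHandle nh;
      TactileNode node(nh);
      ros::spin();  // process incoming sensor messages until shutdown
      return 0;
    }

Because such a node depends only on standard ROS message types, the same structure works with both older and newer ROS distributions, which is the compatibility argument made above.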