EU H2020 THING
The subTerranean Haptic INvestiGator (THING) project is an EU H2020-funded project which brings together leading researchers to focus on haptic navigation for walking robots. The project will develop new terrain sensing, estimation, and control technology for the ANYmal quadruped. In addition to ETH Zurich, the partners include Oxford University, Poznan University of Technology, and the Universities of Edinburgh and Pisa, as well as industrial partners such as ANYbotics, KGHM Cuprum (a major Polish mining company), and the City of Zurich.
Compared with wheels, legs can provide superior mobility and agility: legged locomotion enables scaling obstacles, leveraging the environment (e.g. for support or for pushing off), adapting to terrain conditions, and executing dynamic manoeuvres such as jumping or bounding. However, such advanced mobility requires advanced perception. To plan an aggressive manoeuvre, for example, the robot should know the geometry of the environment (i.e. where it can place its feet), but, critically, it also needs awareness of the environment's physical properties (e.g. the friction of a wet slope, the compliance of soft dirt, the stability of a rock formation) in order to decide where it ought to place its feet.
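To make the foothold-selection reasoning concrete, here is a minimal sketch, not taken from the project itself: under a standard Coulomb friction assumption, a candidate foothold is only usable if the planned contact force stays inside the friction cone, i.e. the tangential component does not exceed μ times the normal component. The function name and the numbers below are purely illustrative.

```python
# Illustrative sketch, not from the THING project: a foothold feasibility
# check under Coulomb friction. A contact force is admissible only if its
# tangential (shear) component satisfies |f_t| <= mu * f_n.
import numpy as np

def foothold_is_feasible(force: np.ndarray, normal: np.ndarray, mu: float) -> bool:
    """True if the contact (ground reaction) force lies inside the friction cone."""
    n = normal / np.linalg.norm(normal)
    f_n = force @ n                        # normal component (must press out of the surface)
    f_t = np.linalg.norm(force - f_n * n)  # tangential (shear) component
    return f_n > 0.0 and f_t <= mu * f_n

# The same planned contact force passes on dry rock but fails on a wet slope (lower mu).
reaction = np.array([40.0, 0.0, 100.0])   # N: 100 N normal load, 40 N shear
up = np.array([0.0, 0.0, 1.0])            # flat ground, normal pointing up
print(foothold_is_feasible(reaction, up, mu=0.6))  # True:  40 <= 0.6 * 100
print(foothold_is_feasible(reaction, up, mu=0.2))  # False: 40 >  0.2 * 100
```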
While legged robots have traditionally relied on vision, e.g. 3D cameras or LiDAR, to survey geometry via point clouds, this modality is often limiting. Sensors may be ill-positioned (i.e. far from the ground or feet), may have insufficient resolution, and may degrade in perceptually challenging settings with smoke, dust, poor lighting, or standing water. Fundamentally, vision is perception at a distance, and it is difficult to ascertain the physics of an environment without contact.
Thus, the aim of THING is to advance the perception ability of legged robots through the haptic modality. The inspiration for this project is the ‘walking hand’ [1]: a robot with a high degree of mobility that can exploit its legs and feet for haptic perception, sensing the shape, compliance, and friction of the environment as it interacts with it. Like the fingers of a hand, legs can probe, stroke, and feel the terrain precisely where it matters most. Unlike wheels or tracks, legs can apply controlled forces to the environment and sense their consequences. We believe that such perception through physical interaction is an essential step towards autonomy: it enables robots to develop a physical construct of the world and reason about the possible outcomes of their actions.
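As an illustration of what sensing compliance through interaction can mean computationally, consider the following minimal sketch, again not from the project: assuming the leg's kinematics and a foot-mounted force sensor yield sinkage and normal-force samples during a probing motion, fitting a linear spring model f = k·d by least squares gives a terrain stiffness estimate that distinguishes soft dirt from rock.

```python
# Illustrative sketch, not from the THING project: estimating terrain
# stiffness from a probing motion, assuming foot sinkage d [m] and normal
# force f [N] samples are available from leg kinematics and a force sensor.
import numpy as np

def estimate_stiffness(sinkage_m: np.ndarray, force_n: np.ndarray) -> float:
    """Least-squares fit of f = k * d through the origin; returns k in N/m."""
    d = np.asarray(sinkage_m, dtype=float)
    f = np.asarray(force_n, dtype=float)
    return float(d @ f / (d @ d))  # closed-form slope for a line through the origin

# Synthetic probes: soft dirt (low stiffness) vs. rock (high stiffness), with noise.
rng = np.random.default_rng(0)
d = np.linspace(0.001, 0.02, 50)                   # 1-20 mm of foot sinkage
for name, k_true in [("soft dirt", 2e3), ("rock", 5e5)]:
    f = k_true * d + rng.normal(0.0, 0.5, d.size)  # noisy normal-force readings
    print(f"{name}: estimated stiffness ~ {estimate_stiffness(d, f):.3g} N/m")
```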
This project has received funding from the European Union’s Horizon 2020 Research and Innovation programme (call: H2020-ICT-2017-1, RIA) under grant agreement No 780883.