Open Lab 2024

Friday, 6 December 2024, 17:00–19:15. The yearly event is open to students to learn about the ongoing projects.

Once a year we open our lab doors and invite students and the public to visit the ongoing research projects. Professor Roland Siegwart and Professor Marco Hutter will be on site. All members of both groups will be pleased to answer your questions and to demonstrate their projects.

Schedule:

16:30–19:30: Check-in at LEE, main entrance at Leonhardstrasse 21. Pick up your wristband, which grants access to the H and J floors.

Group A
17:00–18:00: visit LEE H and J floors

Group B
18:15–19:15: visit LEE H and J floors

After 19:15 the floors will be reserved for the lab family and invited guests.

Space on each floor is limited for security reasons, so everyone needs to be registered: students from 17:00–19:15, guests from 19:15 onwards.

Registration is closed!

If you registered and are no longer able to come, please deregister to give your space to another student. Send an email to:

Meet the researchers of the ongoing projects

- RoBoa: The robotic revolution to save disaster victims and inspect confined spaces! RoBoa goes where no other robots, drones, or humans can go!
- ARC: We develop technology to create a positive environmental impact and to train the next generation of leaders for sustainable development.
- Legged robot exploration: Scientific exploration of challenging planetary analog environments with a team of legged robots.
- RobotX: RobotX offers multiple opportunities for students as well as a CAS.
- ANYmal: ANYmal reaching for new heights. (Photo: Joonho Lee)
- Transformer robot: The transformer robot can do it all: quadrupedal locomotion, bipedal locomotion, and dual-arm manipulation.
- Magnecko: A highly versatile motion platform for climbing applications in industrial environments.
- HEAP: HEAP (hydraulic excavator for an autonomous purpose) is a customized Menzi Muck M545 developed for autonomous use cases as well as advanced teleoperation.
- AITHON: Access worksites like never before!
- Tytan: A powerhouse of a robot! It's got state-of-the-art actuators, can hit speeds of 20 km/h, and can even carry up to 100 kg without breaking a sweat. You might even catch it lifting more than you at the gym!
- LEVA: Meet the future of autonomous payload transportation! Our robot takes the weight off your shoulders, literally.
- UpCircle: UpCircle's AI technology helps collectors, sorters, and recyclers manage baled waste. Don't waste your waste, work with us!
- Hardware booth: Curious about the materials and mechanisms that enable robots to interact with the world and do incredible things? Come see us at the hardware booth! We will have numerous robot components on display, including actuators, PCBs, robot arms and legs, novel shape-changing mechanisms, and more: a fascinating assortment of artifacts from past and present research projects at the RSL. We hope to see you there!
- ALMA: Accurate object throwing poses a challenge for robots due to gripper release uncertainties and the precision demands of robotic control. At RSL, we explored training robots to perform effective whole-body maneuvers using reinforcement learning. In this demo, our legged manipulator robot, ALMA, will showcase its ability to throw balls toward a target with its best effort.
- Gravis Robotics: Gravis Robotics is a startup that turns heavy construction machines into intelligent and autonomous robots. Our unique combination of learning-based automation and augmented remote control lets one operator safely conduct a fleet of machines in a gamified environment. Our team has over a decade of academic experience honing the cutting edge of large-scale robotics and is rapidly growing to bring that expertise into a trillion-dollar industry through active deployments with market leaders.
- Swiss-mile: Swiss-mile collects insights and eases labor by connecting AI with the physical world using autonomous machines.
- Aerial Robotics: The Aerial Robotics team at the Autonomous Systems Lab develops novel flying robots that are able to physically interact with the environment. In contrast to regular drones, these robots can perform physical work where humans can't, for example drilling, screwing, assembly, or general mobile manipulation. Our team develops and researches the complete stack, as high-accuracy physical work in the air comes with novel and unique challenges. We develop our own hardware and electronics with a custom software stack, research novel control and estimation methods, and investigate data-driven and RL approaches for control, modular navigation, and perception.
- Perception and Navigation: The Perception and Navigation Team empowers robots to autonomously explore and map unknown environments, understand complex scenes, and perform precise reconstruction tasks with cutting-edge intelligence.
- Duatic AG: Duatic AG builds human-sized robot workers for intralogistics operations. These workers navigate and handle objects like humans, resulting in seamless integration with existing infrastructure for fast deployment and minimized upfront investment risk. The core technology is evolutionary actuators and a world-first, state-of-the-art robotic arm. This lets them create mobile manipulation solutions for industries beyond warehouses, including manufacturing and retail.
- MoMa: MoMa is the Mobile Manipulation team at the Autonomous Systems Lab. Our research focuses on how robots can perceive objects in order to manipulate them, starting from standard household objects and toys and moving on to granular materials. We also work on discovering affordances and articulations in objects.
- Walking assistance robot: At the Cybathlon 2024 competition we presented the first walking assistance robot for people with severe motor impairments. Come meet the team and try your hand at piloting a race task yourself!
- Space robotics: Our space robotics team works on multiple aspects of planetary robotics, such as hardware design, locomotion control, navigation, and multi-robot coordination. By bringing advanced robotic solutions to space, we aim to enable science and exploration in challenging areas such as craters, caves, and boulder fields on the Moon, on Mars, and on asteroids.
- VR-based teleoperation: Demonstrating VR-teleoperated manipulation in which a user interacts with a virtual object and the robot arm mimics this action to pick up the actual object. The user's hand motion is retargeted onto an Ability hand, and the wrist motions are mapped to a Dynaarm for smooth and accurate control. The VR headset streams the scene from the robot's viewpoint to provide an immersive experience for the operator.
- Fixed-wing group: The fixed-wing group researches modeling, control, and planning methods for (hybrid) fixed-wing UAVs in aerial applications, including active mapping, environmental monitoring, and cargo transportation.
- The AI Institute: The AI Institute aims to solve the most important and fundamental problems in robotics and AI. Headquartered in Cambridge, Massachusetts, the research-driven organization brings together top talent in robotics, AI, machine learning, computer science, and engineering, with the goal of developing future generations of intelligent machines.