Science

From super-observers to bionic legs

The robot factory of the UG

Wall-E, the Terminator, Data: according to Hollywood, robots will be inescapable in the future. The Zernike Campus, too, is crawling with different kinds of robots. Six researchers demonstrate theirs.
September 16 at 15:06.
Last modified on September 18, 2024 at 16:27.
By Marit Bonne

Photo by Reyer Boxem

Ming Cao’s super-observer

The room is large and white. Builders are walking in and out. Cables and chains hang from the ceiling, protected by plastic bags. The place smells like a mix of metal and paint. 

‘This is where we’ll be building the aquarium for our robotic fish’, says professor of networks and robotics Ming Cao. His enthusiasm about the new lab at Nijenborgh is catching. ‘Ooh, and this is where we’ll be testing our walking and driving robots.’ 

Cao moves on to a different lab, where we find Clearpath: a squat, yellow robot with large, wide tires. As his PhD student Bangguo Yu powers up the computer, Cao says: ‘This robot uses both a camera and 3D LiDAR, a laser sensor, to perceive the world around it. Tesla only uses cameras.’

Colours and contrast 

The Clearpath lets out a beep. ‘Look, this is us’, says Yu. He uses the robot for his research, employing large language models such as ChatGPT to control it: for navigation, for example, or to find certain objects in an unfamiliar environment.

The computer screen shows a clear view of the room. ‘This is an RGB-D camera, which detects the colours red, green, and blue, plus depth’, says Cao. ‘It creates both a colour picture and a picture of how far away everything is.’

A robot can use ChatGPT to make connections

Clearpath’s 3D lasers allow it to perceive objects up to a hundred metres away. Its LiDAR – Light Detection And Ranging – rapidly emits laser pulses and captures their reflections to map out its environment. This creates a cloud of tiny dots that depicts the world around the sensor. That way, the robot knows the location of every object around it.
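The measurement behind those pulses is simple to sketch: light travels to the object and back, so the distance is half the round trip times the speed of light, and the beam’s direction turns that range into one dot in the cloud. A minimal illustration (the timing and angles below are invented, not real sensor data):

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def pulse_to_distance(round_trip_time_s: float) -> float:
    """Distance to the object: half the round-trip time times the speed of light."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2

def to_point(distance_m: float, azimuth_rad: float, elevation_rad: float):
    """Turn a range measurement plus the beam's direction into an (x, y, z) dot."""
    x = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)

# A pulse that returns after roughly 667 nanoseconds hit something about 100 m away.
d = pulse_to_distance(667e-9)
point = to_point(d, azimuth_rad=0.5, elevation_rad=0.1)
```

Millions of such dots per second add up to the point cloud the robot navigates by.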

Yu is using ChatGPT’s enormous knowledge base for his research. ‘If I ask you to find a book on maths in this room, you immediately know to look for it on the bookshelf’, Cao explains. ‘But a robot can use ChatGPT to make these connections.’  

The AI programme is also really helpful to people who don’t have a background in engineering, since it uses normal, everyday language. ‘You don’t have to be a programmer to control a robot with ChatGPT; you can simply use the chat interface to tell it what to do’, says Yu.

Ethics

But these developments come with a lot of responsibility. ‘Everyone being able to control these robots can have unexpected effects’, says Cao.

If you want to use a moving robot in a care home, for instance, you have to make sure it doesn’t move too fast and put the elderly patients at risk. ‘You have to be prepared for the ethical and legal aspects that come with that’, says Cao. He’s working together with experts to investigate these questions before the robots can be used.

Cao, also director at the Jantina Tammes School of Digital Society, Technology and AI, feels these collaborations are particularly valuable. ‘The robot discussion goes beyond just engineering.’ 

Photo by Reyer Boxem

Bahar Haghighat’s smart robot swarms

They’re not much bigger than a sugar lump, just three centimetres. But assistant professor Bahar Haghighat’s motto is that bigger isn’t always better. Her tiny robots work together in large numbers, towards a common goal. 

The exact goal depends on how the robots are constructed and programmed. ‘You can design robot swarms that fly, swim, and walk’, she says. ‘We want them to move around and gather measurement data.’

Bee colonies

Haghighat’s work is inspired by existing swarms in nature, like bee and ant colonies. ‘Social insects are particularly good at working together in extremely large numbers towards a common goal’, she explains. ‘Somehow they understand how they have to coordinate their actions. How they do that is still largely a mystery for biologists.’ 

Social insects are good at working together in extremely large numbers

Haghighat is attempting to translate this behaviour to her robots so they can work together and know exactly what they have to do. She says the movement is particularly important. ‘Engineers have put static sensors on infrastructure for decades’, she says. ‘But with autonomous moving sensors that collaborate, you can get information of a much higher resolution about the state of a building, for example.’

Some of her miniature robots have magnetic wheels, which means they can drive on vertical surfaces. At the same time, they can sense the structure’s vibrations; if these deviate from the norm, the robots send out a signal that something’s wrong. Because they’re mobile and in communication with each other, they can map out very precisely where the damage is.
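The detection idea can be sketched in a few lines: each robot compares its local reading against an expected baseline and flags its own position when the deviation is too large. All numbers below are invented for illustration; this is not the group’s actual code:

```python
# Toy sketch of swarm-based damage detection: each robot compares its local
# vibration reading against a baseline and flags deviations beyond a tolerance.
BASELINE_HZ = 50.0   # expected vibration frequency of a healthy structure
TOLERANCE_HZ = 5.0   # how far a reading may deviate before it counts as damage

def flag_damage(readings: dict[tuple[float, float], float]) -> list[tuple[float, float]]:
    """Return the (x, y) positions where the measured vibration deviates from the norm."""
    return [pos for pos, freq in readings.items()
            if abs(freq - BASELINE_HZ) > TOLERANCE_HZ]

# Four robots at different spots on a wall report their measurements;
# the one near (2.0, 3.5) sees an anomalous vibration.
swarm_readings = {
    (0.0, 1.0): 50.2,
    (1.0, 2.0): 49.8,
    (2.0, 3.5): 62.1,  # deviates from the baseline: likely damage here
    (3.0, 1.5): 50.5,
}
damaged_spots = flag_damage(swarm_readings)
```

Because the robots know where they are, the flagged positions directly form a map of the suspect spots.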

Air quality

But the tiny sensors can also work well together in smaller numbers, even if they’re using someone else’s mobility. ‘We are working on a project that applies custom-designed sensor boxes on cars and buses that continuously measure air quality’, says Haghighat. As the vehicles drive around, the sensors track the air quality in an entire region or city. 

‘Only two decades ago, my work would have been science fiction’, Haghighat says. ‘But a few decades from now, I think we will be seeing swarms everywhere, for environmental sensing and to monitor infrastructure, for example. They really are the future.’

Photo by Reyer Boxem

David Lentink’s biologically inspired flying robots

If David Lentink could have any superpower, it would be flying. ‘Flying is the greatest thing there is’, the professor of biomimetics says. Birds, planes, flying robots: he’s fascinated by anything that can fly. 

For his research, Lentink tries to build robots that don’t exist yet; they have a different structure and different principles than most common robots. To do this, he combines the best of both worlds: science and technology. He takes knowledge of biology and applies it to his robots. ‘But sometimes, my robots actually help to answer fundamental biological questions.’

Birds

One example is the way birds’ wings change shape. ‘Is it a bird? Is it a plane? The answer is very simple’, says Lentink: ‘An airplane wing doesn’t change shape during flight, while a bird’s does.’ 

However, it’s extremely difficult to build an airplane with a shapeshifting wing. Engineers would have to put motors everywhere to make everything move, and they’d also have to install sensors and a computer to measure and control movement across the entire wing. ‘Engineers have already figured out that if you have to do that for every single motor in the wing, the whole thing would become much too heavy and wouldn’t be able to fly.’

When you’re stuck, nature can help you make immense leaps in technology

He combines his engineering skills with his background in biology. ‘The first thing I looked at is how birds’ wings work.’ When he asked biologists for advice, they showed him a video of how a bird’s wing folds in and out when moved by hand. 

‘But they couldn’t tell me exactly how that mechanism worked’, says Lentink. So he and his students went to find out for themselves. ‘The answer knocked us all for a loop.’

Elastics

He found out that the largest feathers are attached to an elastic ligament and controlled by a slow-twitch muscle. ‘When a bird stretches its wings, this ligament ensures the feathers are distributed equally across the wing, regardless of its shape’, he says.

‘I realised we could engineer that.’ So he built a robot on the same principle, which meant he didn’t have to control each feather individually. Lentink connected the feathers to each other with carefully calibrated elastics in an effort to mimic the ligament. ‘And that worked.’
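The principle lends itself to a toy calculation: when identical elastics link neighbouring feathers, the forces only balance when all gaps are equal, so the feathers space themselves evenly over whatever span the wing currently has. A minimal sketch (the spans and feather count below are made up, not measured from the robot):

```python
# Toy model of the elastic-ligament principle: feathers joined by identical
# elastics settle at equal spacing, whatever the current wing span.
def feather_positions(span_m: float, n_feathers: int) -> list[float]:
    """Equilibrium positions of feathers linked by identical elastics.

    With identical springs between neighbours, the net force on each feather
    is zero only when all gaps are equal, so equilibrium is uniform spacing.
    """
    gap = span_m / (n_feathers - 1)
    return [i * gap for i in range(n_feathers)]

# The same ten feathers redistribute as the wing folds from 0.9 m to 0.3 m,
# without any feather being controlled individually.
extended = feather_positions(0.9, 10)  # gaps of 0.1 m
folded = feather_positions(0.3, 10)    # gaps of roughly 0.033 m
```

No per-feather motors or sensors are needed; the spacing falls out of the mechanics, which is exactly what made the approach light enough to fly.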

Then, he encountered a new problem: sometimes, the elastics would stretch too far, causing holes to appear between the feathers. ‘Real bird wings didn’t do this, but we couldn’t figure out why.’

Velcro

It turned out that in birds, there is a kind of microscopic ridge of one-way velcro that latches on as soon as a hole between feathers starts to form. ‘That’s why birds don’t get holes in their wings when they’re flying’, Lentink explains. ‘That was even more reason for us to mimic real bird feathers in our latest robot wings.’

Lentink hopes his research can help in developing a more sustainable and efficient way of flying using shape-shifting wings. He also hopes it will lead to more fundamental biological knowledge, like his discovery of the feather velcro. ‘I believe in technology, but when you’re stuck, nature can help you make immense leaps in technology.’

Photo by Reyer Boxem

Paul Vogt’s social robot 

In the Bernoulliborg’s third-floor robotics lab, a white, cuddly robot is blinking its eyes. Yawning loudly, it looks around with mild surprise. Its slightly protruding belly and happy laugh make it endearing, almost human.

‘This is our QTrobot’, says assistant professor of social robotics Paul Vogt. The robot can’t walk, but he can move his arms. He doesn’t always look happy, though: sometimes he looks angry, and he can even cry. People can also talk to QT, for instance as a way to improve their Dutch. It’s all in how you programme him.

Work stress

‘QT can be really useful to healthcare providers’, says Vogt. Robots like him can be of use in relieving some of the work stress in elder care, for instance, to help remind patients when it’s time for lunch. Patients can also have a chat with the robot.

This makes the elderly patients feel less lonely

‘It gives them a sense of autonomy, and helps them feel less lonely’, says Vogt. It’s also a good way for them to keep training their memory. ‘A colleague of mine at the Hanze University of Applied Sciences was doing a pilot project at a care home, and someone said to her: “The robot is going to tell me to lift my legs”’. Because of the robot, the elderly patients are better at remembering when it is time for them to get some exercise.

But there’s a long road ahead. A simple daily reminder is easy to programme, says Vogt. The challenge lies in knowing whether the message was properly received. 

Machine learning

‘Human communication is both verbal and non-verbal’, he says. The verbal part, language, can be easily programmed using language models, he explains. But the signals we give off when we fail to understand something, such as a questioning look, are harder to ‘teach’ to robots.

‘Some of these social rules can be programmed, but machine learning plays an important role as well’, says Vogt. In other words: interactions can teach the robot social norms and how to anticipate them. ‘But we’re not there yet.’

He has a passion for behaviour and social interaction, using his robots to decipher non-verbal communication. ‘By imitating human behaviour, you can learn how that works at its most basic level.’

Photo by Reyer Boxem

Kailai Li’s robotic rescue dog

There’s a grey lump on the floor. As it slowly rises on its four legs, the spherical laser scanner below its chin spins rapidly, detecting a room of approximately four by four metres, two chairs, two cabinets, a desk, and two people.

Kailai Li, brand-new assistant professor of computer science, doesn’t have a very large lab, but it’s sufficient to demonstrate how his robot dog Tudor works. ‘Paw’, Li tells the dog. Lie down, roll over, jump, a handstand, and even yoga poses: Tudor can do it all. 

Li and his new research group ASIG – Agile Sensing and Intelligence Group – develop multi-sensory modules and algorithms that the robot dog uses to autonomously move through unfamiliar environments. ‘For instance, after an earthquake’, Li explains. ‘You can then tell the robot to explore the area as quickly as possible.’

In order to move on to the next point, the robot first needs to know where it is

That’s not an easy thing to do, given the rough terrain, debris, and buildings about to collapse. ‘In order to move on to the next point, the robot first needs to know where it is’, says Li. It therefore requires various high-quality sensors that perceive the environment in real time. 

Self-aware

Apart from the environment, the robot also needs to know where he himself is in order to move around. ‘The robot not only needs to know what’s happening around him, but he also needs to be self-aware: how fast is he going, at what angle, and at what height?’ says Li.

The robot uses the information passed on by his sensors to adapt his movement, like taking a higher step with his left front leg, or avoiding a pothole right in front of him. Ultimately, the robot will also have to recognise moving objects, such as people, and anticipate them. 

Li is testing the algorithms and sensor suites he develops in his lab, with the help of Tudor. While he hopes to get a bigger lab soon, Li is grateful for the support he gets from the faculty. ‘Students and colleagues share my passion and want to help me wherever they can’, he says. ‘I really appreciate that.’

Photo by Reyer Boxem

Raffaella Carloni’s robotic leg

Are prostheses even robots? They definitely are, says associate professor of robotics Raffaella Carloni. ‘When people think of robot companions, they think of robots in their household’, she says. ‘But I think prostheses are the same; they are simply robots that are supporting and physically interacting with humans.’

Constructing a bionic leg isn’t like building a regular robot. ‘A human is also in the loop, interacting physically, which is a bit more complex’, says Carloni. ‘You have to think about the system as a whole.’

As an engineer, you cannot do everything yourself

That’s why she’s put together a large, interdisciplinary team. ‘As an engineer, you cannot do everything yourself.’ One example of this is the European Horizon 2020 project ‘MyLeg’. ‘For this, I established connections with hospitals, patients, and companies to really try and push these results towards society’, says Carloni. She was awarded the Ben Feringa Impact Award for her work on the project.

Tiny motors

What’s special about her robotic leg is the set of very precise little motors in the knee, known as actuators, which support the patient and make it easier for them to move the leg. ‘The vision was to have actuators that functionally resemble human joints and muscles’, she says. This allows the patient more mobility and increases their quality of life.

She’s currently working with the UMCG to improve her robot even more. At the hospital, she has more patients to test the bionic leg on. Carloni is motivated and inspired by the direct contact she has with people who actually use the prosthesis. ‘Patients are eager to collaborate and their feedback is really valuable.’

