World Ocean Day: Autonomous robot system picks up litter from ocean floor

News - 07 June 2021 - Webredactie 3mE

TU Delft is working with seven other partners on an autonomous system for cleaning the ocean floor. The system consists of a surface vehicle and two underwater robots that identify and collect litter from the seabed. Now that the crucial gripper component has been completed, SeaClear is almost ready for field testing.

Waste problem

‘Coastal waters all over the world are polluted with litter, such as pieces of plastic, bottles or tyres. At the moment, divers clean up this waste from the seabed, especially in tourist areas. However, this is an expensive solution and can pose dangers for the divers,’ says Professor Bart De Schutter from TU Delft’s Center for Systems and Control. ‘That’s why we’ve joined forces with seven other partners in the SeaClear project to develop an autonomous system that uses underwater robots to remove waste from the seabed. Importantly, we are careful not to disturb life on the seabed.’

Two underwater robots

‘The SeaClear system works as follows,’ continues project leader De Schutter. ‘We have a surface vessel on the water and two underwater robots. The somewhat smaller robot is the observation robot. It scans the seabed with a camera and sonar, mapping out where the litter is located and what kind of waste it is. Once the observation robot has identified litter, it sends this information to the other underwater robot, which is equipped with a gripper. This robot goes to the litter, picks it up with the gripper and deposits it into a large basket. The gripper has an open frame structure, so any fish picked up by accident can easily escape. The robot can also distinguish between litter and aquatic life, such as fish and seaweed. We use advanced algorithms to make this distinction.’
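The article does not describe the project’s actual software, but the detect-classify-collect hand-off above can be sketched in a few lines of Python. All names here (`Detection`, `select_targets`, the class labels and threshold) are hypothetical illustrations, not SeaClear code: the observation robot produces classified detections, and only confident litter detections are passed on as targets for the gripper robot.

```python
from dataclasses import dataclass

# Hypothetical detection record from the observation robot's camera/sonar scan:
# a seabed position in a shared map frame plus a predicted class.
@dataclass
class Detection:
    x: float
    y: float
    label: str        # e.g. "bottle", "tyre", "fish", "seaweed"
    confidence: float

# Classes the collection robot must leave alone (aquatic life).
PROTECTED = {"fish", "seaweed"}

def select_targets(detections, min_confidence=0.8):
    """Keep only confidently classified litter; skip aquatic life and
    uncertain detections, which can be re-scanned instead."""
    return [d for d in detections
            if d.label not in PROTECTED and d.confidence >= min_confidence]

scan = [
    Detection(2.0, 3.5, "bottle", 0.92),
    Detection(2.4, 3.1, "fish", 0.97),   # aquatic life: never a target
    Detection(5.1, 0.8, "tyre", 0.55),   # too uncertain: leave for a re-scan
]
targets = select_targets(scan)
print([d.label for d in targets])  # ['bottle']
```

In a real system the confidence threshold would trade off missed litter against the risk of grabbing wildlife; the open-frame gripper mentioned above acts as a second safeguard when the classifier is wrong.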

Delft

TU Delft is the project coordinator of SeaClear, taking care of the overall management and coordination of the different work packages. In addition, TU Delft is making a significant scientific contribution to the project. This contribution is threefold. First, it concerns image recognition and the classification of the waste. Second, it concerns sensor fusion: the waste can be observed by means of sonar or video. Sonar has to be used when visibility is poor, such as in the Port of Hamburg, one of the two test sites. ‘We can combine the data from the video images with the sonar images. Video data is already labelled, but sonar data is not. That’s why we want to transfer the labels from the video images to the sonar data.’ Finally, TU Delft is focusing on the movements of the various underwater vehicles, which are connected to the base vessel by cables for the transfer of data and power. It is important that these cables do not get tangled up. ‘This will definitely be a priority once we start to scale up the system,’ De Schutter explains.
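One simple way to picture the label-transfer idea De Schutter describes: if video and sonar observe the same seabed patch in a shared map frame, a labelled video detection can lend its class to the nearest sonar return. The sketch below is a minimal nearest-neighbour illustration of that idea under those assumptions; the positions, labels and `transfer_labels` helper are hypothetical, and the project’s actual fusion method is not described in the article.

```python
import math

# Hypothetical co-registered observations of one seabed patch:
# labelled video detections and unlabelled sonar returns, each with
# an (x, y) position in a shared map frame.
video_detections = [((1.0, 2.0), "bottle"), ((4.0, 4.5), "tyre")]
sonar_returns = [(1.1, 1.9), (3.9, 4.6), (9.0, 9.0)]

def transfer_labels(video, sonar, max_dist=0.5):
    """Give each sonar return the class of the nearest video detection,
    if one lies within max_dist; otherwise leave it unlabelled (None)."""
    labelled = []
    for s in sonar:
        pos, label = min(video, key=lambda v: math.dist(v[0], s))
        labelled.append((s, label if math.dist(pos, s) <= max_dist else None))
    return labelled

print(transfer_labels(video_detections, sonar_returns))
# The far-away return at (9.0, 9.0) stays unlabelled.
```

Sonar returns labelled this way could then serve as training data for a sonar-only classifier, which is what matters in murky water where the camera is blind.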

First tests in Dubrovnik

‘The gripper was the trickiest hardware component to develop, but that’s ready now too. Researchers at TU Munich have constructed a prototype, so now we can really start testing the whole system,’ De Schutter says. ‘We have two test sites: one in Dubrovnik and another in the Port of Hamburg. In September we’ll carry out the first tests in Dubrovnik. We’ll test the different components of the system and see whether we can automatically recognise and pick up litter that we have placed on the seabed ourselves.’

Bart De Schutter: ‘We would like to expand SeaClear significantly in the future. This means we want to scale the system up to a large number of underwater robots that can be used anywhere. That would enable us to scan a huge area and really clean up the ocean.’

Prof.dr.ir. B.H.K. De Schutter

More information

Drs. F.J. Bosman MA
