Drones can be used to achieve captivating camera angles, but shots of this kind are expensive and the creative possibilities are often limited. ETH Zurich and TU Delft have developed an algorithm that allows drones to execute the desired shot compositions autonomously. Earlier this week, the researchers presented their findings at the SIGGRAPH Conference in Los Angeles.

Skyfall

The film Skyfall has its viewers spellbound as James Bond attempts to neutralise his adversary on the roof of a train as it races through the desert. This was an extremely expensive scene to film in terms of personnel, materials and technology. Several camera operators were deployed for hours on end at a number of different locations. And a camera crane even had to be mounted on the train’s roof for the spectacular close-up shots. 

Tobias Nägeli, a doctoral student in the Advanced Interactive Technologies Lab of ETH Zurich, is convinced that these scenes can be filmed with fewer resources. Together with researchers from Delft University of Technology, notably Dr. Javier Alonso-Mora (Autonomous Multi-Robots Lab), and ETH spin-off Embotech, he has developed an algorithm that enables drones to film dynamic scenes independently in the way that directors and cinematographers intend.

Maintaining control of the shooting angle

Drones have been used in filming for a number of years, but good camera shots typically require two experienced experts – one to pilot the drone and one to control the camera angle. This is not only laborious but also expensive. Commercial camera drones that can follow a predefined person autonomously do already exist, but with these the director loses control over the shooting angle, as well as the option to keep several people in the frame at once. That is why the researchers developed an intuitive control system.

To explain how this works, one can draw an analogy with robotic vacuum cleaners: the researchers don’t specify the exact path that the robot should take. They simply define the objective: that the room should be clean at the end of the process. If they apply this analogy to film, it means that the director is not concerned with exactly where the drone is at a specific point in time. The most important thing is that the final shot meets their expectations.

Translation

This translation between cinematographer and drone is the job of the algorithm. Parameters such as the shooting angle, the person to follow, or tracking movements of crane and camera can be defined before the flight. For safety purposes, these parameters are combined with spatial boundaries within which the drone can move freely. The precise path, including the timing of changes of direction, is recalculated by the drone 50 times per second, with GPS sensors providing the necessary data.
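The replanning loop described above can be sketched in Python. This is only an illustrative toy, not the researchers' actual planner (which solves a constrained optimization problem): the pursuit logic, all function names, and all numeric values except the 50 Hz replanning rate are assumptions. It shows the core idea that the director specifies a framing objective (angle and distance to the subject) plus a safety box, and the drone repeatedly re-solves for its next position.

```python
import math

RATE_HZ = 50          # replanning frequency mentioned in the article
DT = 1.0 / RATE_HZ

# Spatial boundaries within which the drone may move freely (assumed values)
BOX_MIN = (-20.0, -20.0, 1.0)
BOX_MAX = (20.0, 20.0, 15.0)

def clamp_to_box(p):
    """Keep a planned position inside the predefined safety volume."""
    return tuple(min(max(c, lo), hi) for c, lo, hi in zip(p, BOX_MIN, BOX_MAX))

def desired_camera_position(subject, angle_deg, distance, height):
    """Place the camera at the chosen angle and distance around the subject."""
    a = math.radians(angle_deg)
    return (subject[0] + distance * math.cos(a),
            subject[1] + distance * math.sin(a),
            subject[2] + height)

def replan_step(drone_pos, subject, angle_deg=135.0, distance=5.0,
                height=2.0, max_speed=4.0):
    """One 50 Hz replanning step: move toward the framing objective,
    limited by a maximum speed and the safety box."""
    goal = clamp_to_box(
        desired_camera_position(subject, angle_deg, distance, height))
    delta = tuple(g - p for g, p in zip(goal, drone_pos))
    dist = math.sqrt(sum(d * d for d in delta))
    if dist < 1e-9:
        return drone_pos
    step = min(max_speed * DT, dist)
    return tuple(p + d / dist * step for p, d in zip(drone_pos, delta))

# Usage: the subject's position (e.g. from GPS) changes each tick;
# the drone continually re-plans to hold the desired composition.
drone = (0.0, 0.0, 2.0)
for t in range(200):                   # 4 seconds of flight
    subject = (0.05 * t, 0.0, 0.0)     # subject walking along x
    drone = replan_step(drone, subject)
```

The director-facing inputs here are only the objective (`angle_deg`, `distance`, `height`) and the safety box; where the drone flies at any instant, and when it turns, falls out of the per-tick recalculation.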

Use in sports broadcasts and inspections

The algorithms could see their first application not in a film studio but in television sports broadcasting; for example, ski races. This is an area with a huge demand for dynamic shots, but manually piloted film drones can present a hazard for the athletes, as past drone crashes have shown.

The algorithms could also be used for inspection of industrial facilities; for example, in the case of wind turbines that are examined for defects using drones. Or for transport purposes: it would be possible to define air corridors that could be used to transport blood or donor organs safely in an emergency. The drone could identify the fastest and safest flight path independently within this corridor.

Video: "Filming with drones: spectacular images thanks to a new algorithm"

More information

Nägeli T, Alonso-Mora J, Domahidi A, Rus D, Hilliges O: Real-Time Motion Planning for Aerial Videography With Dynamic Obstacle Avoidance and Viewpoint Optimization. IEEE Robotics and Automation Letters, 2017, 2(3): 1696-1703, doi: 10.1109/LRA.2017.2665693

Nägeli T, Meier L, Domahidi A, Alonso-Mora J, Hilliges O: Real-time Planning for Automated Multi-View Drone Cinematography. ACM Transactions on Graphics, 2017, 36(4): 132, doi: 10.1145/3072959.3073712