Engineering the evolution of the Internet
Many of us don’t realise that the internet is under great pressure. This network of networks is becoming increasingly complex, almost to the point of being unworkable. Human configuration errors are causing more and more major disruptions – far more than those caused by hackers. Added to that is the pressure of climate change, and the extreme weather that can damage networks. The answer lies in making the internet more flexible and robust. Fernando Kuipers, Professor of Internet Science, sees part of the solution coming from a self-regulating internet, in which artificial intelligence (AI) determines the network’s configuration and evolution.
An elusive network of networks
“Nobody knows exactly what the internet looks like,” said Kuipers. “First of all, the internet consists of many independent networks, all with different owners who often don’t reveal their network details. There is also physical complexity: cables in the ground, and in houses and office buildings, all connected by nodes. Such nodes can be gigantic ‘internet exchanges’, like the one in Amsterdam, which connect networks of providers such as KPN and Google, or simply the router in your home. This creates a tangle of connections and networks.” The resulting network of networks is a bit like a country’s road network, and just as a driver might need a satnav device to navigate, internet traffic must also be guided. This is called ‘routing’ and is made possible by an interplay of all kinds of protocols.
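The routing described above works much like a satnav computing the cheapest route over a road network. A minimal sketch in Python, using Dijkstra’s algorithm over a toy topology (the node names and link costs are purely illustrative):

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra's algorithm: find the lowest-cost route between two nodes."""
    # Priority queue of (cost so far, current node, path taken)
    queue = [(0, src, [src])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, link_cost in graph.get(node, {}).items():
            if neighbour not in visited:
                heapq.heappush(queue, (cost + link_cost, neighbour, path + [neighbour]))
    return None  # no route exists

# Toy topology: link costs between hypothetical networks
toy_internet = {
    "home-router": {"isp": 1},
    "isp": {"home-router": 1, "ams-ix": 2},
    "ams-ix": {"isp": 2, "google": 1},
    "google": {"ams-ix": 1},
}

print(shortest_path(toy_internet, "home-router", "google"))
# → (4, ['home-router', 'isp', 'ams-ix', 'google'])
```

Real routing protocols such as OSPF compute shortest paths in essentially this way, though over constantly updated link-state databases rather than a fixed map – and the interplay Kuipers mentions arises because many such protocols run side by side across independently owned networks.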
The complex internet tangle is continually developing and growing, and this growth is accelerated by the development of the ‘Internet of Things’ or ‘IoT’. Kuipers said that as well as our smartphones and laptops, all kinds of peripheral devices are now connected to the internet: “A common example is a smart fridge that automatically orders groceries when they are running out. But IoT devices can also be very small devices or sensors, with little energy or computing power. That creates a lot of additional issues in terms of network configuration.” Small devices like these are also not known for their strong security. Production margins are often small, so too little effort goes into secure and robust software. Kuipers said that an important issue arises: “We need to establish how we can ensure that these devices, and the networks they use, are configured more efficiently and more securely.”
‘I woke up in the middle of the night and just knew: we would build this platform’
Among other things, Kuipers has initiated the ‘Delft on Internet of Things’ (Do IoT) fieldlab at TU Delft. Within this collaboration, scientists, entrepreneurs and governmental services innovate together in the field of the IoT. Kuipers: “We offer companies a safe and advanced environment where they can test and improve their ideas and prototypes. At the moment, for example, we are working on 5G applications in autonomous transport and healthcare. In the longer term our ambition is to develop an open-source self-regulating 6G network – the successor to 5G – in which AI will play a major role.”
Walking the internet maintenance tightrope
All of these innovations create major challenges. How, for example, are we going to ensure that we do not end up with a tangle of protocols, making the composition of the internet even more unfathomable? Kuipers: “You can look at this challenge on a number of levels. First, science can develop tools that help operators find optimal configurations and parameters. This is not only efficient – it can also lead to centralised knowledge of error characteristics, which could be a huge help when a mistake happens. For example, the complete failure of Facebook on 4 October 2021 was caused by a configuration error. We want to prevent that kind of error, but we also want to go further: the software itself also deserves greater attention.”
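A configuration-checking tool of the kind Kuipers describes could, in its simplest form, lint a router configuration against known error patterns before it goes live. A minimal sketch, in which the configuration fields and checks are entirely hypothetical:

```python
def lint_config(config):
    """Flag common misconfigurations before they reach a live router.

    The fields and rules here are illustrative, not a real router schema.
    """
    issues = []
    # IPv4 requires hosts to accept datagrams of at least 576 bytes
    if config.get("mtu", 1500) < 576:
        issues.append("MTU below IPv4 minimum of 576 bytes")
    # A border router with no peers would be cut off from the internet
    if not config.get("bgp_peers"):
        issues.append("no BGP peers configured: router would be isolated")
    for peer in config.get("bgp_peers", []):
        if peer.get("asn") is None:
            issues.append(f"peer {peer.get('ip', '?')} is missing an AS number")
    return issues

bad = {"mtu": 500, "bgp_peers": [{"ip": "192.0.2.1"}]}
for issue in lint_config(bad):
    print("WARNING:", issue)
```

The same idea scales up: by collecting the warnings such tools generate across many operators, the centralised knowledge of error characteristics that Kuipers mentions could be built.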
Operators may want to configure software optimally, but in practice often no more than half of its features are really needed for a properly functioning router. The rest are superfluous, and can even pose a risk. Kuipers: “Flexibility in the software is extremely important. We must no longer be bound by large, unwieldy programs, but rather we should move towards an internet where each router runs exactly the software it needs. With the recent emergence of programmable networks, this is now possible.”
One problem stands out: the internet already appears to be in a precarious state of balance. Research shows that only a small percentage of internet failures are caused by hackers. By far the largest share is caused by configuration errors and hardware failures. Is it therefore wise to operate ‘on the live patient’? And if so, how can unnecessary risks be avoided? Kuipers: “These are difficult questions, yet we have no choice but to move in that direction. That’s why we’re working on tools and standardised processes that will allow us to adapt the internet safely.” A quick look at the figures shows that preventing internet outages is a worthwhile ambition. It is estimated that an outage in the Dutch network can easily cost 15 million euros per hour. And that Facebook outage? It cost the company billions of dollars in stock market value.
Genetic programming and the power of AI
One of the most promising technologies to make the internet more robust and resilient is AI. In fact, it is Kuipers’ ambition to develop a completely self-regulating network – a network that controls itself and makes adjustments, even in its software, based on changing usage, new technologies or sudden congestion. Kuipers: “We now have the first ‘proof of concept’ of such a self-regulating network, and we make use of ‘genetic programming’. In this approach, just as in a biological process, small variations in the programming of the network occur continuously. If the algorithm notices that these have a favourable effect, it selects or combines them. The result is a network that does exactly what you ask.”
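The variation-and-selection loop Kuipers describes can be illustrated with a toy evolutionary algorithm: random variations are applied to a network configuration, and only the variations that score better on some fitness measure survive into the next generation. This sketch evolves two hypothetical numeric parameters rather than program code (so it simplifies genetic programming to parameter evolution), and the fitness function is invented purely for illustration:

```python
import random

random.seed(42)  # make the toy run reproducible

def fitness(params):
    """Toy objective: a hypothetical 'network score' peaking at specific settings."""
    buffer_size, retry_limit = params
    return -abs(buffer_size - 64) - abs(retry_limit - 3)

def mutate(params):
    """Introduce a small random variation, as in biological evolution."""
    buffer_size, retry_limit = params
    return (buffer_size + random.randint(-8, 8),
            max(0, retry_limit + random.randint(-1, 1)))

# Start from arbitrary configurations and evolve them
population = [(random.randint(1, 128), random.randint(0, 10)) for _ in range(20)]
for generation in range(50):
    # Selection: keep the best half, refill with mutated copies of survivors
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

best = max(population, key=fitness)
print("best configuration found:", best)
```

In a real self-regulating network, the fitness measure would come from live observations such as latency, loss and congestion, and the mutations would act on the network’s software and configuration rather than two toy numbers.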
Kuipers said that care is needed: “Before going any further, it must be stressed that these kinds of techniques are not designed to produce a ‘black box’. An integral part of the self-regulating network is a self-reporting component: operators must be able to request information about the status and progress of the network configuration and software at any time. That’s how you keep control.” Kuipers is now in the process of creating a ‘user interface’ so that the self-regulating program becomes usable in a realistic setting. Here, too, the thinking is very progressive: the operator would regulate the network through a dialogue with the AI, instead of an old-fashioned configuration screen. “What we want is something like Alexa or Siri, so you can really have a conversation. Tuning a network is so precise that the AI’s feedback on all of your choices is of great value.” To give an example, an operator might make a suggestion for further network development which turns out not to be specific enough. The AI would then ask, “Hmm…operator, what exactly do you mean when you say X or Y?” The language might not be that colloquial, but the message would be as understandable as possible.
A transparent internet
Kuipers is aiming for a clear point on the horizon: a safe and transparent internet – one where users can always see how their traffic flows, and where they keep control over it. This contrasts with the current position, where end-users have hardly any means of finding out about things like whether or not their internet traffic runs through countries or providers that do not conform to their values and standards. Kuipers: “From a privacy perspective, there is a lot to be said for this kind of transparency. One of the great advantages of self-regulating, self-programming and self-reporting networks is that you get a clear picture of what is happening behind the scenes – and you can also adjust as needed immediately.” Kuipers said that in the future, we may all be able to set up our own internet profile, where we indicate preferences about how we want to use the internet: “For example, you might demand that all your traffic goes through operators who meet certain security requirements. Or perhaps even operators that meet certain sustainability requirements. In short: more transparency and control over the processing of what is ultimately your own data.”
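Conceptually, such a user profile would act as a filter over candidate routes: only paths whose every operator meets the stated requirements are allowed. A sketch under the (currently unrealistic) assumption that operators publish these attributes; all names and attributes below are hypothetical:

```python
# Hypothetical attributes per operator; today's networks don't expose these
operators = {
    "isp-a": {"security_audited": True,  "renewable_energy": False},
    "isp-b": {"security_audited": True,  "renewable_energy": True},
    "isp-c": {"security_audited": False, "renewable_energy": True},
}

def acceptable_paths(paths, profile):
    """Keep only routes where every operator satisfies every profile requirement."""
    return [
        path for path in paths
        if all(all(operators[op][req] for req in profile) for op in path)
    ]

candidate_paths = [["isp-a", "isp-b"], ["isp-a", "isp-c"], ["isp-b"]]

print(acceptable_paths(candidate_paths, ["security_audited"]))
# → [['isp-a', 'isp-b'], ['isp-b']]
print(acceptable_paths(candidate_paths, ["security_audited", "renewable_energy"]))
# → [['isp-b']]
```

Note the trade-off this makes visible: the stricter the profile, the fewer routes remain, so a self-reporting network would also need to tell the user when their requirements leave no acceptable path at all.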
Ultimately, Kuipers acknowledges that the internet remains a black box for the average user: “Even in the self-regulating internet, reality remains incredibly complex. But the advantages are clear: far fewer malfunctions and much more control and transparency. The average user will also benefit from this, whether they notice it or not.”