The quiet revolution of the intelligent machine


Artificial intelligence (AI) is not a new concept. Scientists were already exploring neural networks some eighty years ago. For decades, the field was limited to ideas, proofs of concept and the occasional success story. But the last few years have seen a revolution. Brute processing power, huge amounts of raw data and new algorithmic insights have triggered an acceleration. Suddenly, AI is everywhere.

Article from Objective 27, 2017

Most of us do not even realize it, but when you write a text message or an e-mail on your mobile, AI helps predict the next word you will type. When Netflix shows you recommendations for movies you might find interesting, this is done by collaborative filtering based on a self-learning algorithm. Decorating your friends with bunny ears or a cat's nose in Snapchat? Ditto. These are just a few random examples. AI, nowadays usually in the form of machine learning, has quietly worked its way into all aspects of daily life. The power of the machine: superfast pattern recognition and prediction based on historical data.
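The idea behind collaborative filtering can be sketched in a few lines: predict how much a user will like an unseen movie from the ratings of similar users. The ratings, user names and movie titles below are made up for illustration; a production recommender such as Netflix's is of course far more sophisticated.

```python
from math import sqrt

# Hypothetical user-movie ratings on a 1-5 scale (illustrative data only).
ratings = {
    "alice": {"Heat": 5, "Drive": 4, "Up": 1},
    "bob":   {"Heat": 4, "Drive": 5, "Up": 2},
    "carol": {"Heat": 1, "Drive": 2, "Up": 5, "Brave": 4},
}

def cosine(u, v):
    """Cosine similarity over the movies both users have rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[m] * v[m] for m in shared)
    norm_u = sqrt(sum(u[m] ** 2 for m in shared))
    norm_v = sqrt(sum(v[m] ** 2 for m in shared))
    return dot / (norm_u * norm_v)

def recommend(user):
    """Score unseen movies by the similarity-weighted ratings of other users."""
    scores, weights = {}, {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], their)
        for movie, r in their.items():
            if movie not in ratings[user]:
                scores[movie] = scores.get(movie, 0.0) + sim * r
                weights[movie] = weights.get(movie, 0.0) + sim
    # Highest predicted rating first.
    return sorted(((s / weights[m], m) for m, s in scores.items()), reverse=True)

print(recommend("alice"))  # movies Alice has not yet seen, best first
```

Alice's taste resembles Bob's more than Carol's, but only Carol has rated "Brave", so its predicted score is weighted by Alice-Carol similarity.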

Processing power and hunger for data

Self-learning systems require data and processing power. The growth of the Internet has had a great influence on the success of machine learning. More than three and a half billion people are online. The Internet of Things is expanding at an astonishing rate. The amounts of data being produced are enormous. Social media, photos, videos, sensor data: all of it can be used as input for self-learning systems. As for processing power: Moore's law may be running up against physical limits, but the processing power and speed available today are staggering, thanks to techniques such as multicore and multithreaded processors and the use of GPUs (graphics processing units) for processing large amounts of data.

The possibilities of machine learning are practically infinite. And its successes are very real. For instance: Google Mail can predict whether an e-mail is important to you and suggest possible replies. Data centers save up to 40% in energy consumption thanks to control by intelligent systems. Speech recognition is making progress that experts would not have dared to dream of ten years ago. A professional player of the board game Go was recently beaten by the self-learning program AlphaGo, an achievement that was considered unattainable until now.

The most spectacular developments (such as AlphaGo) are achieved with deep learning. This requires a deep neural network trained on large sets of data. In this way a computer, with or without the support of pre-labeled outcomes, can analyze and decipher the often complex relationships between input and output. The result: the machine can make reliable predictions about the output belonging to new input.
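The training principle can be illustrated with a toy example: a tiny two-layer neural network that learns the XOR function from four pre-labeled examples by gradient descent. Real deep networks stack many more layers and train on vastly larger data sets, but the mechanics are the same: adjust the weights until predicted output matches the known output.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # pre-labeled outputs

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # hidden layer (8 neurons)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10_000):                    # gradient-descent training loop
    h = sigmoid(X @ W1 + b1)               # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)    # backward pass (chain rule)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(axis=0)

print(np.round(out.ravel()))  # predictions for [0,0], [0,1], [1,0], [1,1]
```

After training, the network reproduces the XOR labels without ever having been given an explicit rule, only examples of input and output.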

Machine learning in traffic management

Traffic management is one of the fields where machine learning can really prove its worth. The monitoring of traffic provides a constant flow of data. Measuring loops in the road record the intensity and speed of traffic. Video cameras provide insight into the movement of traffic at intersections and roundabouts. Navigation systems generate accurate data on the movement of cars (Floating Car Data, or FCD). At the same time, all sorts of environmental factors are recorded, such as weather conditions, road closures and traffic incidents. This combination of data holds a treasure trove of information.

MobiMaestro, Technolution's traffic management platform, offers interfaces to all manner of traffic-related inputs and outputs, such as measuring loops, cameras and route-information panels. It is ready to process FCD. A complete set of software modules gives traffic managers the tools to coordinate traffic with the support of predefined scenarios. These scenarios are based on carefully considered traffic models.

Traffic managers have excellent knowledge of the traffic in their area. They know: “If we keep this traffic light red a little longer, we will have a traffic jam right here in three minutes' time.” Or: “If the rain gets heavier right now, there will soon be a traffic jam five kilometers long.” It is not easy for a software system to make such predictions. Subtle changes in reality are hard to capture in scenario models. That is why we are researching how MobiMaestro can learn from the traffic data we have acquired over the past years. The system already recognizes deviations from normal traffic patterns. This is a valuable asset for a traffic manager, who assesses the deviation and activates a scenario if necessary. In addition, we are working on automated tooling that can predict upcoming traffic situations. The next step is to bring both together, so the system learns which scenarios yield the best results. Our aim is for MobiMaestro to eventually be able to make reliable scenario suggestions, even in difficult situations.
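Recognizing a deviation from the normal traffic pattern can be sketched, for example, as a simple statistical test against historical measurements for the same time slot. The loop data, slot naming and threshold below are hypothetical illustrations, not MobiMaestro's actual method.

```python
from statistics import mean, stdev

# Hypothetical measuring-loop counts (vehicles/hour) seen on past
# Mondays at 08:00; real systems keep such profiles per location.
history = {
    "monday-08:00": [2100, 2230, 2180, 2050, 2290, 2150, 2200],
}

def is_deviation(slot, value, threshold=3.0):
    """Flag a live measurement whose z-score against the historical
    distribution for the same time slot exceeds the threshold."""
    samples = history[slot]
    mu, sigma = mean(samples), stdev(samples)
    z = (value - mu) / sigma
    return abs(z) > threshold

print(is_deviation("monday-08:00", 2190))  # normal Monday-morning intensity
print(is_deviation("monday-08:00", 900))   # far below normal, e.g. a blockage upstream
```

A flagged deviation would then be presented to the traffic manager, who decides whether a scenario should be activated.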

Changing paradigms

Machine learning may appear to be child's play: the connections between input and output are established using simple rules. Yet even after a training phase, it is difficult to establish exactly how the system arrives at its predictions. Working with intelligent systems also means getting used to changing paradigms and daring to let go of handcrafted models and simulations. We need to trust that the data will not lie, and that the system will draw the right conclusions from the combination of data sets. Practice will offer the best proof: if the traffic manager can make quicker decisions thanks to the software's correct prognoses, trust in the reliability of the system will grow steadily.

The cat is out of the bag

We are convinced that machine learning will soon be a crucial part of traffic management. The available big data and the repeating patterns that are typical of traffic offer the perfect opportunity for applying intelligence. Machine learning will also become more important in other domains. One thing is certain: whatever we do for our clients, security, reliability and functionality always come first.

Artificial intelligence, or AI, is the general term for all techniques that mimic human intelligence, via logic, what-if rules, decision trees and machine learning.
Machine learning is a form of AI in which computers are trained by statistical analysis of experiential and/or historical data.
Artificial neural networks consist of software units, or ‘neurons’, that mimic the function of biological neurons.
Deep learning is a subset of machine learning in which multi-layered (deep) neural networks are exposed to large sets of data. The software is able to recognize patterns in data on its own and perform tasks such as speech and image recognition.



Related items

Analyzing your energy use with one measuring tool
Safe autonomous systems due to enabling technologies
EDSN and Technolution starting long-term collaboration
Technolution joins Flexible Power Alliance