Pong! The computer tennis game that Atari brought out in 1972 was the first global video-game blockbuster and is considered the forefather of the genre. Pong players simply tried to keep a ball in play as it moved back and forth. Only modest computing power was needed. The more realistic computer games became, the more programming was needed to support them, and consequently the more lines of code. But today the traditional “hard” coding of algorithms is pushing against its limits. Consider autonomous driving in urban areas. It is simply impossible to program a machine to prepare it for every possible scenario and accident in city traffic. It takes special software to resolve rule conflicts based on experience. The algorithms needed for this are known as artificial neural networks, and they are trained with machine-learning methods.
AI’s use for highly automated driving starts with the exact identification of what the car’s sensors “see.” What is easy for small children is a laborious exercise for machines. But computer-supported image recognition has made great progress in recent years. Deeply layered neural networks are the key. They are based on a multilayered system of extremely small computing units, so-called neurons. Each neuron passes its result on to the neurons in the next layer, while the rules governing calculation and transmission are continually adjusted during training. Neural networks have to be trained before they can really do anything. Only after they have seen images of several breeds of dog can they differentiate between a dog and a cat. Still, the process can be largely automated by feeding the machine images and the associated descriptions, from photo databases, for example. The more layers a neural network has, the more complex the learning processes it can support. This is where the frequently used expression “deep learning” comes from.
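As an illustrative sketch (not the networks actually used in vehicles), a tiny feedforward pass in plain Python shows how each neuron combines the previous layer’s outputs and hands its result to the next layer. The layer sizes, random weights, and sigmoid activation are all assumptions for the example:

```python
import math
import random

def sigmoid(x):
    # Squashing activation: maps any weighted sum into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def layer_forward(inputs, weights, biases):
    # Each neuron computes a weighted sum of the previous layer's
    # outputs, adds its bias, and applies the activation function.
    return [sigmoid(sum(w * x for w, x in zip(neuron_w, inputs)) + b)
            for neuron_w, b in zip(weights, biases)]

random.seed(0)
# A tiny network: 4 inputs -> 3 hidden neurons -> 2 outputs.
sizes = [4, 3, 2]
weights = [[[random.uniform(-1, 1) for _ in range(n_in)]
            for _ in range(n_out)]
           for n_in, n_out in zip(sizes, sizes[1:])]
biases = [[0.0] * n_out for n_out in sizes[1:]]

activations = [0.5, -0.2, 0.1, 0.9]   # stand-in for sensor features
for w, b in zip(weights, biases):
    activations = layer_forward(activations, w, b)

print(len(activations))  # 2 output neurons
```

Training would then adjust the weights and biases from labeled examples; only the forward pass is shown here.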
Once the environment is clearly recognized, the system has to make the right decisions – that is, whether to brake or accelerate, steer to the right or the left, or simply do nothing. In making those decisions, the machine can adapt to the people sitting in the cockpit. The Proreta 4 research project – carried out jointly by the Technical University of Darmstadt and Continental – has demonstrated this. A model employing learning algorithms was developed for an assistance system that aids drivers during left turns. Using data from real-life driving, it classified every driver into one of three driving styles based on defined driving maneuvers. It also specified the optimal timeframe for turning in 200-millisecond intervals. The classification is not fixed – it adjusts on an ongoing basis, for example, when another person takes the wheel or a driver’s fitness changes. Knut Ehm, Head of Advanced Development at Continental’s Interior Division, praised the results. “These kinds of adaptive assistant systems that take drivers’ individual preferences into account are the future,” he said. Ehm no longer likes to talk about human-machine interfaces. “That’s a term from the past,” he said. Today the job is to make a digital companion available to the human.
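The Proreta 4 classifier itself is not public; a minimal stand-in for the idea of sorting drivers into one of three styles might use nearest-centroid matching on features extracted from driving maneuvers. The feature choices (mean lateral acceleration, mean time gap) and the prototype values below are hypothetical:

```python
def classify_driver(features, centroids):
    # Nearest-centroid assignment: pick the style whose prototype
    # feature vector is closest (squared Euclidean distance).
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda style: dist(features, centroids[style]))

# Hypothetical prototypes: (mean lateral accel in m/s^2, mean time gap in s)
centroids = {
    "defensive": (1.5, 2.5),
    "average":   (2.5, 1.8),
    "sporty":    (3.5, 1.0),
}

observed = (3.2, 1.1)  # features aggregated from recent maneuvers
print(classify_driver(observed, centroids))  # prints: sporty
```

Re-running the classification as new maneuvers arrive mirrors the article’s point that the assignment is not fixed, for instance when a different person takes the wheel.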
That companion could soon take over control temporarily, even if only on certain stretches of road at first – the so-called “operational design domains” (ODDs). An ODD is a specific type of road in a clearly delineated geographic region with a limited range of speeds. For example, in 2021, BMW plans to begin testing its own fleet in an urban environment at speeds of up to 70 km/h. Its future partner Daimler has announced similar tests. At that point, at the latest, AI computers will be on board vehicles for the first time. These computers differ fundamentally from today’s control devices, which are implemented as embedded systems. The AI systems rely on high-performance chips from the world of computer games. Their graphics processors can carry out a great many computing operations in parallel – which makes them ideal for neural networks, where the computing process likewise involves numerous small steps. A ZF hardware solution for future robo-taxis, presented at the Consumer Electronics Show in Las Vegas, runs 600 billion computing operations per second on its main computer.
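Why graphics-style parallelism suits neural networks can be sketched in a few lines: each neuron in a layer depends only on the previous layer’s outputs, so all neurons of a layer can be evaluated at the same time. The thread pool here merely illustrates that independence; real GPUs run thousands of such operations simultaneously in hardware, and the weights and inputs are invented for the example:

```python
from concurrent.futures import ThreadPoolExecutor

def neuron(args):
    # One neuron's work: a small, independent weighted sum.
    weights, inputs = args
    return sum(w * x for w, x in zip(weights, inputs))

inputs = [0.2, 0.4, 0.6, 0.8]
layer_weights = [
    [0.1, 0.2, 0.3, 0.4],
    [0.5, 0.5, 0.5, 0.5],
    [1.0, 0.0, -1.0, 0.0],
]

# Because no neuron in this layer depends on another neuron in the
# same layer, all of them can be computed concurrently -- the property
# that GPU hardware exploits on a massive scale.
with ThreadPoolExecutor() as pool:
    outputs = list(pool.map(neuron, ((w, inputs) for w in layer_weights)))
print(outputs)
```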
Artificial intelligence has long been used in many other fields in the auto industry. Patrick van der Smagt, who supervises a research group at the Volkswagen Data:Lab in Munich, puts it this way: “Artificial intelligence is not just appealing for autonomous driving but for many other facets of our company as well – for instance, in manufacturing or replacement parts distribution.” As an example, he cites an in-house software project for an electric racing vehicle: it predicts when the battery will be completely discharged. “This is important so the energy stored in the battery is completely used up at the end of the race, without having to stop prematurely,” van der Smagt said. “That can’t be solved with classic control technology because you have to predict the extremely complex relationship between driving behavior, changing boundary conditions and the battery charge over a fairly long period of time.”
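Van der Smagt stresses that the real relationship is far too complex for simple models. Purely to illustrate what “predicting the moment of complete discharge” means, here is a deliberately simplified sketch that extrapolates a least-squares line through recent state-of-charge samples; the sample values are invented, and a real system would use a learned model instead:

```python
def predict_empty_time(soc_history):
    # soc_history: list of (time in s, state of charge in %) samples.
    # Fit a least-squares line soc = a*t + b and solve for soc == 0.
    n = len(soc_history)
    mean_t = sum(t for t, _ in soc_history) / n
    mean_s = sum(s for _, s in soc_history) / n
    cov = sum((t - mean_t) * (s - mean_s) for t, s in soc_history)
    var = sum((t - mean_t) ** 2 for t, _ in soc_history)
    a = cov / var              # discharge rate in % per second
    b = mean_s - a * mean_t    # intercept (charge at t = 0)
    return -b / a              # time at which the line crosses 0 %

# Invented telemetry: the battery loses 4 % every 60 seconds.
samples = [(0, 100.0), (60, 96.0), (120, 92.0), (180, 88.0)]
print(predict_empty_time(samples))  # prints: 1500.0
```

A linear fit like this fails exactly where van der Smagt says classic approaches fail: when driving behavior and boundary conditions make the discharge curve strongly nonlinear.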
“Every technical system has its own acoustic fingerprint.”
The Porsche Data Lab in Berlin is working on another application. Claudio Weck is examining how the sound of a system often reveals more about its technical condition than its external appearance. If an AI system devoted to pattern recognition is trained on typical sound signatures, it can recognize deviations and trigger an alarm. “Every technical system has its own acoustic fingerprint,” Weck said. “Deviations are almost always indications of a distinct change in the behavior of the system.”
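A minimal sketch of the acoustic-fingerprint idea: average the frequency spectra of known-good recordings into a baseline, then raise an alarm for any new recording whose spectrum deviates from that baseline beyond a threshold. The spectra, the number of frequency bins, and the threshold below are all assumptions for illustration, not Porsche’s method:

```python
def fingerprint(spectra):
    # "Train" the fingerprint: average the magnitude spectra of
    # healthy reference recordings, bin by bin.
    n = len(spectra)
    return [sum(s[i] for s in spectra) / n for i in range(len(spectra[0]))]

def deviation(spectrum, baseline):
    # Mean absolute deviation per frequency bin from the fingerprint.
    return sum(abs(a - b) for a, b in zip(spectrum, baseline)) / len(baseline)

# Invented 4-bin spectra of a system known to be healthy.
healthy = [
    [0.90, 0.50, 0.10, 0.05],
    [1.00, 0.40, 0.12, 0.04],
    [0.95, 0.45, 0.11, 0.06],
]
baseline = fingerprint(healthy)

new_reading = [0.90, 0.50, 0.60, 0.50]  # energy shifted to high frequencies
THRESHOLD = 0.1                          # tuned on reference data (assumption)
if deviation(new_reading, baseline) > THRESHOLD:
    print("alarm: acoustic signature deviates from fingerprint")
```

In practice the comparison would run on real spectrograms and a trained model rather than a fixed threshold, but the pattern-matching principle is the same.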
Whatever the field – from predictive maintenance and production control to automated translation of training materials or the management of marketing campaigns – there hardly seems to be an area of the auto industry where artificial intelligence would not be useful. In production processes alone, according to a McKinsey study, up to $61 billion could be saved industrywide. Automated quality controls are just one example of a valuable AI application. Many manufacturers are reallocating their IT budgets accordingly. The trend is fueling employees’ fears, but so far there has been no known case of anyone losing a job due to the introduction of an AI solution. After all, there are still many obstacles on the road to autonomous systems.
In the May edition of our customer magazine ESSENTIAL, we describe how digitalization is changing the auto industry. Read more here.