What technical challenges is Uber facing?

Autonomous driving: what are the technical challenges?

Parking into and out of spaces can already be handled automatically by a number of vehicles. Parking assistants work either at the push of a button or via a smartphone app, letting owners steer their car out of a tight spot as if with a remote control.

Flowing traffic is a different matter. No environment is harder for robotic vehicles to handle than public road traffic in cities. Besides cars, trucks, vans and motorcycles, they often share the road with bicycles and scooters, and every now and then a child unexpectedly runs onto the street. An autonomous vehicle has to cope with all of these situations. On top of radar and ultrasonic sensors that cover a wide area in the vehicle's immediate surroundings, this demands very fast control loops and low speeds; autonomous vehicles currently in use therefore travel at only around 30 km/h.
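To give a sense of the arithmetic behind that speed limit, the sketch below estimates stopping distance as reaction distance plus the braking distance v²/(2a); the latency and deceleration values are illustrative assumptions, not measured figures.

```python
def stopping_distance_m(speed_kmh: float,
                        reaction_time_s: float = 0.5,   # assumed sensing/processing latency
                        deceleration_mps2: float = 7.0  # assumed braking on dry asphalt
                        ) -> float:
    """Rough stopping distance: distance covered during the reaction time
    plus the braking distance v^2 / (2a)."""
    v = speed_kmh / 3.6                      # convert km/h to m/s
    return v * reaction_time_s + v * v / (2 * deceleration_mps2)

for kmh in (30, 50):
    print(f"{kmh} km/h -> ~{stopping_distance_m(kmh):.1f} m to stop")
```

Under these assumptions a vehicle needs roughly 9 m to stop from 30 km/h but more than 20 m from 50 km/h, which is one reason current systems keep speeds low.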

Current systems recognize cross-traffic and intersection traffic as well as pedestrians and small animals on the road, but often too late. For autonomous vehicles to drive safely in the city, road users would have to be networked and communicate with one another. That, however, will only happen once every car carries the necessary technology on board, which will take decades.

Computer calculates probabilities of what could happen based on movement and speed

To make sense of their surroundings, the systems of autonomous vehicles sort what they perceive into classes, for example cars, houses, traffic signs and people. This matters because each class comes with different expected patterns of movement: a house will rarely jump onto the road, but a person might.
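A minimal sketch of that idea, with class names and motion assumptions chosen purely for illustration:

```python
from dataclasses import dataclass
from enum import Enum, auto

class ObjectClass(Enum):
    CAR = auto()
    HOUSE = auto()
    TRAFFIC_SIGN = auto()
    PEDESTRIAN = auto()

# Illustrative priors only: typical top speed (m/s) and whether the
# object might suddenly enter the roadway.
MOTION_PRIOR = {
    ObjectClass.CAR: {"max_speed": 14.0, "may_enter_road": True},
    ObjectClass.HOUSE: {"max_speed": 0.0, "may_enter_road": False},
    ObjectClass.TRAFFIC_SIGN: {"max_speed": 0.0, "may_enter_road": False},
    ObjectClass.PEDESTRIAN: {"max_speed": 2.5, "may_enter_road": True},
}

@dataclass
class DetectedObject:
    cls: ObjectClass
    distance_m: float

def needs_attention(obj: DetectedObject) -> bool:
    """Static classes can be skipped in motion prediction;
    anything that might step or drive into the road cannot."""
    return MOTION_PRIOR[obj.cls]["may_enter_road"]

print(needs_attention(DetectedObject(ObjectClass.PEDESTRIAN, 12.0)))  # True
print(needs_attention(DetectedObject(ObjectClass.HOUSE, 12.0)))       # False
```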

Based on the movement and speed of a scanned object, the computer calculates probabilities of what might happen next. That requires fast and intelligent data processing. Through deep learning, a form of artificial intelligence (AI), the systems are expected to encounter more and more situations over time and store the appropriate response for each. When it comes to reading a person's facial expression, however, the computer still reaches its limits: it does not yet understand the friendly nod with which a driver signals to a pedestrian that they may cross the street.
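As an illustration of the kind of calculation meant here, the sketch below projects an object forward under a constant-velocity assumption and attaches a crude score for how likely it is to end up in the vehicle's lane; the model and all parameters are simplifying assumptions, not anything from a production system.

```python
import math

def predict_position(x: float, y: float, vx: float, vy: float, t: float):
    """Project position t seconds ahead, assuming constant velocity."""
    return x + vx * t, y + vy * t

def crossing_score(y_pred: float, lane_center: float = 0.0,
                   spread: float = 1.5) -> float:
    """Gaussian-shaped score for how likely the predicted position lies
    in the vehicle's lane (spread is an assumed uncertainty in metres)."""
    return math.exp(-((y_pred - lane_center) ** 2) / (2 * spread ** 2))

# A pedestrian 3 m to the side of the lane, walking toward it at 1.2 m/s:
x, y = 10.0, 3.0
vx, vy = 0.0, -1.2
for t in (0.5, 1.0, 2.0):
    _, y_pred = predict_position(x, y, vx, vy, t)
    print(f"t={t:.1f}s  lateral offset {y_pred:+.1f} m  "
          f"crossing score {crossing_score(y_pred):.2f}")
```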

If all road users and systems were fully networked in the future, they could coordinate with one another and issue warnings. With Car-to-X communication over radio or WLAN, a car would know even before the next intersection that a cyclist is waiting around the corner. It will take a few more years before this works in practice, though: experts do not expect market penetration above 70 percent before 2050.
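As a rough sketch of what such a warning might look like in software, the message fields and values below are purely hypothetical and are not taken from any real Car-to-X standard:

```python
import json
import time

def build_hazard_warning(sender_id: str, hazard: str,
                         lat: float, lon: float) -> bytes:
    """Serialize a hypothetical hazard-warning message that a vehicle or
    roadside unit could broadcast to nearby cars before an intersection."""
    message = {
        "type": "HAZARD_WARNING",
        "sender": sender_id,
        "hazard": hazard,                    # e.g. "cyclist_approaching"
        "position": {"lat": lat, "lon": lon},
        "timestamp": time.time(),
    }
    return json.dumps(message).encode("utf-8")

# A roadside unit warning approaching cars about a cyclist at the crossing:
payload = build_hazard_warning("rsu-042", "cyclist_approaching", 48.137, 11.575)
print(payload.decode("utf-8"))
```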