3 steps to automation: HERE VP for Connected Driving, Ogi Redzic
Everyone is talking about autonomous cars, and they raise plenty of questions and concerns. For example, the way they drive is often described as ‘robotic’. We disagree and think they can do a lot of good, and we believe that adding a human element could make some of those concerns go away and help people get used to them.
Some people wonder whether autonomous cars will have pedals or steering wheels; others argue that they will increase traffic; and recently, some have pointed out that they could even be used as a weapon.
We strongly believe that autonomous cars will actually do the world good. They will bring less traffic, less pollution and more safety, and they will reduce the space needed for parking. We’re not the only ones to think so: TED speaker Bran Ferren told us last May that autonomous cars are the first automotive revolution in 100 years and that they will profoundly transform the cities we live in.
How, then, can autonomous cars become a reality? What happens under the hood to bring you comfortably to your destination? Ogi Redzic, VP Connected Driving at HERE, explained this at last week’s Automated Vehicle Symposium in San Francisco.
Ogi says that an autonomous car needs three location ingredients to work, and we’re using these ingredients to develop solutions for autonomous cars such as the HERE-powered Mercedes S500.
Where exactly am I?
First of all, an autonomous car needs to position itself precisely along the road, and it needs to know its lateral position within the lane, too.
In other words, traditional maps aren’t enough anymore. That’s why we’re mapping with 10-20 cm accuracy: with our mapping cars we’re collecting billions of 3D points to model the surface of the road. We also map lanes and their boundaries, marking the lane centre and where cars are supposed to drive. As you can see in the picture, the difference between traditional maps (green lines) and highly precise maps (red and blue lines) is stunning.
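To make the idea concrete, here is a toy sketch in Python of lane-level map data and the lateral-position question it lets a car answer. The `Lane` structure and `lateral_offset` function are hypothetical illustrations, not the actual HERE map format:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Lane:
    center: list   # polyline of (x, y) points along the lane centre, in metres
    width: float   # lane width in metres

def lateral_offset(lane, x, y):
    """Distance from the car at (x, y) to the nearest lane-centre segment.

    A highly precise map lets the car answer 'where am I within the lane?'
    rather than just 'which road am I on?'.
    """
    best = float("inf")
    for (x1, y1), (x2, y2) in zip(lane.center, lane.center[1:]):
        dx, dy = x2 - x1, y2 - y1
        seg_len2 = dx * dx + dy * dy
        # Project the point onto the segment, clamped to the segment's ends.
        t = 0.0 if seg_len2 == 0 else max(
            0.0, min(1.0, ((x - x1) * dx + (y - y1) * dy) / seg_len2))
        px, py = x1 + t * dx, y1 + t * dy
        best = min(best, hypot(x - px, y - py))
    return best

lane = Lane(center=[(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)], width=3.5)
offset = lateral_offset(lane, 5.0, 0.12)   # car is 12 cm off the lane centre
```

With 10-20 cm map accuracy, an offset like this is meaningful; with a traditional road-level map, it wouldn’t even be computable.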
What lies ahead?
Highly precise maps give autonomous cars the power to get to their destination. The car’s sensors then do another part of the job: making sure the vehicle gets there safely by recognizing its position and its environment, most notably avoiding collisions and reacting to unforeseen events.
However, sensor sight is limited, for instance by curves in the road or by bigger vehicles ahead. A much smoother drive is possible if the vehicle is connected and can ‘see around corners’, planning manoeuvres beyond sensor visibility.
We call this ‘live road’ and that’s where our cloud services come into play. We are already working on this solution with Continental to bring autonomous cars to reality more quickly. When cars are connected with each other and to the cloud, it’s much easier for them to know what lies ahead, whether it’s wet roads, obstacles or incidents, making it easier for sensors to promptly react.
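As a rough sketch of the idea, a ‘live road’ service can be pictured as a shared store that connected cars report hazards to and query for what lies beyond their sensors. All names here (`LiveRoadService`, `Hazard`) are hypothetical and not the actual HERE or Continental API:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Hazard:
    road_id: str
    position_m: float      # distance along the road, in metres
    kind: str              # e.g. "wet_road", "obstacle", "incident"
    reported_at: float = field(default_factory=time.time)

class LiveRoadService:
    """Toy cloud service: connected cars report hazards, others query ahead."""

    def __init__(self, ttl_s=600.0):
        self.ttl_s = ttl_s     # drop stale reports after this many seconds
        self.hazards = []

    def report(self, hazard):
        self.hazards.append(hazard)

    def hazards_ahead(self, road_id, position_m, horizon_m, now=None):
        """Fresh hazards on this road within horizon_m ahead of the car."""
        now = time.time() if now is None else now
        return [h for h in self.hazards
                if h.road_id == road_id
                and position_m < h.position_m <= position_m + horizon_m
                and now - h.reported_at < self.ttl_s]

svc = LiveRoadService()
svc.report(Hazard("A9", 1500.0, "wet_road"))
svc.report(Hazard("A9", 9000.0, "incident"))
# A car at km 1.2 with a 2 km horizon sees only the wet road ahead,
# even though its own sensors could never reach that far.
ahead = svc.hazards_ahead("A9", 1200.0, 2000.0)
print([h.kind for h in ahead])  # ['wet_road']
```

The point of the sketch is the extended horizon: the query answers a question the car’s sensors physically cannot, which is exactly what lets a connected vehicle react early and smoothly.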
How can I get there comfortably?
Even if the technology is ready, the big question is whether and how humans will accept these cars. A self-driving car that works mainly with sensors makes for a robotic and hair-raising ride.
In fact, we believe that ‘humanised driving’ will be crucial to getting people to adopt them. Personalising the driving behavior of autonomous cars means that they will drive the way their owners expect them to.
Our studies show that everything from the curvature of the road to the perception of open fields, trees and houses can affect driver behavior. By incorporating this perception into how autonomous cars recreate human behavior, we help people feel more comfortable inside them.
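One simple illustration of how road curvature can feed a personalised driving style: a comfortable cornering speed follows from the curve radius and a per-passenger lateral-acceleration limit, via v = sqrt(a_lat * r). This is a generic physics sketch, not HERE’s actual humanised-driving model:

```python
from math import sqrt

def comfortable_speed_kmh(radius_m, max_lateral_g=0.2):
    """Speed that keeps cornering below a passenger's comfort limit.

    v = sqrt(a_lat * r): a tighter curve (smaller radius) or a more
    cautious passenger (lower max_lateral_g) yields a lower speed.
    """
    a_lat = max_lateral_g * 9.81          # comfort limit in m/s^2
    return sqrt(a_lat * radius_m) * 3.6   # convert m/s to km/h

# The same 200 m curve, taken by a relaxed profile vs. a cautious one.
relaxed = comfortable_speed_kmh(200, max_lateral_g=0.3)
cautious = comfortable_speed_kmh(200, max_lateral_g=0.15)
```

Tuning a single parameter like `max_lateral_g` per owner is one plausible way a car could ‘drive like its owner expects it to’ rather than like a generic robot.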