When Will Autonomous Cars be Mainstream?

Daniel Faggella

Daniel Faggella is Head of Research at Emerj. Called upon by the United Nations, World Bank, INTERPOL, and leading enterprises, Daniel is a globally sought-after expert on the competitive strategy implications of AI for business and government leaders.

Episode Summary: This week we speak with CEO and Founder of Nexar Inc., Eran Shir, whose company has created a dashboard app that allows drivers to mount a smartphone, which then collects visual information and other data, such as speed and acceleration from the phone's sensors, in order to help detect and prevent accidents.

The app also serves as a way to reconstruct what happens in a collision – a unique solution in a big and untapped market. In this episode, Shir gives his vision of a world where the roads are filled with cyborgs rather than autonomous robots, i.e. people augmented with new sensory information that triggers notifications, warnings, or prompts for safer driving behavior, amongst a network of cloud-connected cars. He also touches on what the transition might look like in response to the question: when will autonomous cars be mainstream?

Expertise: Internet of Things, machine learning and big data, complex networks

Brief Recognition: Eran Shir is a serial entrepreneur and investor and founder of Nexar Inc. Previously, he was Entrepreneur in Residence at Aleph, a VC out of Tel Aviv, where he focused on fields like the Internet of Things, big data, cryptocurrencies, marketplaces, and networks, and how these technologies disrupt important markets. Before Aleph, Shir ran the global Creative Innovation Center for Yahoo!, where he was responsible for Yahoo!'s ad creative technologies strategy and portfolio, as well as the development of its next-generation ad creative personalization platform.

Shir joined Yahoo! in October 2010 when the search giant acquired Dapper, a dynamic ads and semantic web startup Shir co-founded in 2007. Shir holds a Master’s in Physics from Technion – Israel Institute of Technology and a PhD in Electrical Engineering and Computer Science from Tel Aviv University; he is also a graduate of Singularity University.

Current Affiliations: CEO and Founder of Nexar Inc; Board Member for Israel Center for Excellence through Education

Interview Highlights:

The following is a condensed version of the full audio interview, which is available in the above links on Emerj’s SoundCloud and iTunes stations.

1:50 – I want to speak first about where AI is playing a role at Nexar…where does AI play its role in all those various sensors and all that technology?

Eran Shir: “We use AI in various ways for various use cases. Our main focus as an app is warning you about potential collisions; for that, we're tracking all the vehicles around you in real time and detecting and trying to prevent potential accidents by warning you about dangerous situations as they occur. We use the vision sensor to track vehicles and basically deploy virtual sensors on other vehicles around you, and we also leverage a vehicle-to-vehicle network to share that information across longer distances, so we can warn you about a dangerous situation that's happening five cars ahead of you immediately…”
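
Shir does not describe Nexar's internal algorithms, but a common way to frame a forward-collision warning of this kind is to estimate time-to-collision (TTC) from the tracked distance and closing speed of the vehicle ahead. The sketch below is a minimal illustration of that idea only; the function names and the 2.5-second threshold are hypothetical, not Nexar's values.

```python
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Estimate seconds until impact with the vehicle ahead.

    gap_m: current distance to the lead vehicle (metres), e.g. from
           vision-based tracking of the car in front.
    closing_speed_mps: how fast the gap is shrinking (m/s); positive
                       means we are approaching the lead vehicle.
    """
    if closing_speed_mps <= 0:
        return float("inf")  # gap is stable or growing: no collision course
    return gap_m / closing_speed_mps


def should_warn(gap_m: float, closing_speed_mps: float,
                warn_threshold_s: float = 2.5) -> bool:
    """Trigger a forward-collision warning when TTC drops below a threshold.

    The 2.5 s threshold is illustrative; real systems tune it to speed,
    road conditions, and assumptions about driver reaction time.
    """
    return time_to_collision(gap_m, closing_speed_mps) < warn_threshold_s


# Example: a 20 m gap closing at 10 m/s gives 2.0 s to impact, so warn.
print(should_warn(20.0, 10.0))  # True
```

In the vehicle-to-vehicle case Shir mentions, the same kind of alert could be raised from a hazard report relayed by cars further ahead rather than from the local camera alone.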

5:20 – Is it possible that, say, there’s a particularly dangerous interaction during rush hour…is there a red light that will come on as we’re approaching it, or will we only (be alerted) if a car in front of us just had an accident 10 minutes ago and they’re still in the middle of the road?

ES: “A lot of accidents, when you're driving in rush hour, are just someone ahead of you pressing the brakes and you not noticing because you're distracted, and I'm trying to warn you…that use case is called a forward-collision warning, and that's really the first use case we have implemented, but there are many others…one of the most dangerous things you can do is drive much faster or much slower than the swarm around you…that has a great correlation to risk, and I want to detect the car's swarm velocity; this is an example of another use case…”
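
The "swarm velocity" idea can be pictured as a simple statistical check: compare your own speed against the speeds reported by nearby connected cars and flag a large deviation in either direction. The sketch below is a hypothetical illustration of that check, not Nexar's implementation; the neighbour count and sigma threshold are assumptions.

```python
from statistics import mean, pstdev

def speed_deviates_from_swarm(own_speed_kph: float,
                              nearby_speeds_kph: list[float],
                              max_sigma: float = 2.0) -> bool:
    """Flag driving much faster or slower than the surrounding traffic.

    nearby_speeds_kph: speeds reported by other connected vehicles around
    you (e.g. shared over the vehicle-to-vehicle network Shir mentions).
    max_sigma: how many standard deviations from the swarm mean counts
    as risky; the value here is illustrative.
    """
    if len(nearby_speeds_kph) < 3:
        return False  # not enough neighbours to estimate the swarm
    swarm_mean = mean(nearby_speeds_kph)
    swarm_std = pstdev(nearby_speeds_kph) or 1.0  # avoid divide-by-zero
    return abs(own_speed_kph - swarm_mean) / swarm_std > max_sigma


# Example: traffic around you is doing ~60 km/h and you are doing 95 km/h.
print(speed_deviates_from_swarm(95.0, [58.0, 61.0, 63.0, 59.0, 60.0]))  # True
```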

8:00 – What’s the interaction with the user to prompt them ‘Hey, you better start moving’ or ‘Hey, you better pump the brakes’?

ES: “Obviously there’s an audio prompt…but there's also a visual UI, and the visual UI is very simple: it tells you seconds to impact. In the case of the vehicle-to-vehicle use case, where you don't actually see the car that is now getting into an accident, it tells you the distance and actually tracks the distance as you close it, and it tells you in a very colorful way the level of danger, so something that is very basic and very primary from our perspective. You can see it from the corner of your eye, and that will be enough, or you can listen to the audio prompt and react to that.”
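
The color-coded display Shir describes can be thought of as a mapping from the seconds-to-impact estimate to a coarse, glanceable severity level. The thresholds and colors below are illustrative assumptions for the sake of the example, not the app's actual values.

```python
def danger_color(seconds_to_impact: float) -> str:
    """Map a time-to-impact estimate to a simple color level.

    The thresholds are illustrative; the point is a signal simple enough
    to register in peripheral vision, as Shir describes.
    """
    if seconds_to_impact < 1.5:
        return "red"     # imminent: brake or steer now
    if seconds_to_impact < 3.0:
        return "orange"  # closing fast: prepare to react
    if seconds_to_impact < 5.0:
        return "yellow"  # keep an eye on the gap
    return "green"       # no immediate threat


print(danger_color(2.0))  # "orange"
```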

11:47 – For sensor fusion, talk the audience through what that means to you…

ES: “We have a smartphone set up on your dashboard basically looking ahead, so if someone hits you from behind, obviously the camera won't see it, but even in that situation we are able to reconstruct exactly what happened: whether someone hit you at an angle of 175 degrees or 192 degrees, at a force of 3.5G or 5.3G, and whether there's a 10 percent or a 30 percent chance you got whiplash. All of these things can be deduced from just a single phone sitting in a single place in the car, and it really doesn't matter in that sense where you put it, just doing a lot of sensor fusion, a lot of physics, a lot of calibration and machine learning, to reconstruct the scene.”
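
Shir does not detail the reconstruction math, but the two quantities he names, impact angle and peak force in Gs, can be derived from the phone's accelerometer trace once it has been calibrated into the car's frame of reference. The sketch below is a simplified, hypothetical illustration of that single step; the angle convention is an assumption, and real reconstruction also involves the calibration and machine learning Shir refers to.

```python
import math

G = 9.81  # standard gravity, m/s^2

def impact_summary(accel_samples):
    """Summarize a crash pulse from calibrated accelerometer samples.

    accel_samples: list of (ax, ay) readings in m/s^2, already rotated
    into the vehicle frame (x = forward, y = left). Returns the peak
    magnitude in Gs and the direction the impact came from, in degrees
    (0 = head-on from the front, 180 = from behind); this convention is
    an assumption for the example.
    """
    peak = max(accel_samples, key=lambda a: math.hypot(a[0], a[1]))
    peak_g = math.hypot(peak[0], peak[1]) / G
    # A frontal impact decelerates the car (negative ax), so the impact
    # direction is opposite the measured acceleration on the x axis.
    angle_deg = math.degrees(math.atan2(peak[1], -peak[0])) % 360
    return peak_g, angle_deg


# Example: a rear impact shoves the car forward at roughly 3.5 g.
samples = [(0.5, 0.1), (34.3, 1.2), (12.0, 0.4)]
print(impact_summary(samples))  # approx (3.5, ~178 degrees)
```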

13:31 – It sounds like when you got started you probably needed a good amount of crash test data…where does all that get pulled from?

ES: “Thankfully that part is relatively solved; there is about 50 years of medical research on the impact of collisions in various shapes and forms, across thousands, tens of thousands, and hundreds of thousands of collisions…from that perspective, we didn't develop new data sets, but what's really cool is we have data to enter into those models (now)…”

17:43 – What does the world start to look like when more and more vehicles have all of these sensors and are connected?

ES: “It should enable – eventually – an accident-free world; I'll be a bit of a contrarian here and say we really don't need the deployment of autonomous vehicles at massive scale on the road to prevent car accidents. There are a lot of great reasons why I can't wait for autonomous vehicles…but safety is actually not one of them; if we deploy enough sensors, if we make sense of sensors in real time and warn drivers in a smart way and connect those vehicles in a network, we can eventually reduce car collisions by at least tens of percent…

…on the societal impact…when you sense everything, then everything is accountable. You can start moving to personalized insurance at scale, you can start warning people about bad drivers next to them, you can start doing a bunch of things that will have the impact of changing people's behavior, not just reacting in real time but actually coaching people on how to become better drivers and giving them the incentives to become better drivers, and I think we'll see a lot of that in the next decade…”
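
One way to picture the personalized-insurance and coaching idea is as a running score folded together from the events the sensor network can already detect, such as hard braking or speeding relative to the swarm. The event names, weights, and scoring scheme below are purely illustrative assumptions, not actuarial figures or anything attributed to Nexar.

```python
# Illustrative event weights: negative = risky, positive = protective.
# These values are made up for the example.
EVENT_WEIGHTS = {
    "hard_brake": -3.0,
    "tailgating_minute": -1.0,
    "above_swarm_speed": -2.0,
    "smooth_trip": +1.0,
}

def driver_score(event_counts: dict[str, int], base: float = 100.0) -> float:
    """Fold sensed driving events into a single coaching/insurance score."""
    score = base
    for event, count in event_counts.items():
        score += EVENT_WEIGHTS.get(event, 0.0) * count
    return max(0.0, min(100.0, score))


# Example week of driving: a few hard brakes and one speeding episode.
print(driver_score({"hard_brake": 3, "above_swarm_speed": 1, "smooth_trip": 5}))  # 94.0
```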

22:34 – We’ve written before about self-driving car timelines; what do you think the autonomous vehicle market will look like in a few decades?

ES: “In the next 20 or 30 years, we will live in a hybrid world; initially there will be a few autonomous vehicles in specific places and lots of human-driven cars, and then over time it will gradually change. That's the story: this is not going to be a big bang, if only because you don't have enough factories to replace all the vehicles on the road…during that time, we're going to be in a very peculiar state in which you'll have human drivers and autonomous vehicles sharing the road…

…today we're accustomed to emergent management: the road is managing itself because each person is interacting with other vehicles on the road in various ways, and you can rely on people to take care of the situation in some respect. When you have lots of autonomous vehicles, you're actually going to need to add a management layer…one that understands what every car does in the atomic sense, one that can provide guidelines, hints, and sometimes orders to the vehicles on what to do…”

Big Ideas:

1 – A driving issue in Nexar's app is not liability but the severity of collisions, a major problem for both the insured and insurers; reconstructing the scene from collected sensor data helps solve a decades-old problem and is one of many robotics trends that society will witness over the next decade.

2 – AI and machine learning technology has a potentially huge role to play in changing people's behavior on the road, autonomous vehicles aside. Over time, this type of guidance also serves as an overarching management layer that will allow autonomous vehicles to expand at scale.