Machine Vision for Self-Driving Cars – Current Applications

Ayn de Jesus

Ayn serves as AI Analyst at Emerj - covering artificial intelligence use-cases and trends across industries. She previously held various roles at Accenture.


McKinsey estimated that by 2030, up to 15% of cars sold will be autonomous vehicles. We detailed our own timeline for self-driving cars, pooling quotes and insights from executives at the top 11 global automakers.

Additionally, in an article about the future of AI and self-driving cars, Stanford professor, mechanical engineer, and founder of the Center for Automotive Research at Stanford, Chris Gerdes, wrote that he is confident cars could eventually drive as well as the most skilled human drivers, and perhaps even better.

With all the current hype about autonomous vehicles, we researched the sector to see where AI comes into play by way of machine vision and to answer the following questions:

  • How far along is the sector and what kinds of autonomous vehicles are currently available if any?
  • What are some trends that spread across companies in terms of the way autonomous vehicles are developed and offered?
  • How could these trends affect the future of autonomous vehicles and the auto industry in general?

This report covers technology vendor companies offering transportation-related machine vision technologies across two major application areas:

  • Ride-hailing Services
  • Fleet Management

This article intends to provide business leaders in the automotive industry with an idea of what they can currently expect from autonomous vehicle technology. We hope that this article allows business leaders in the auto industry to garner insights they can confidently relay to their executive teams so they can make informed decisions when thinking about where and when to invest in autonomous vehicles.

At the very least, this article intends to act as a method of reducing the time business leaders in the auto industry spend researching machine vision and automotive companies with whom they may (or may not) be interested in working.

Ride-hailing Services

Zoox
Zoox is developing fully autonomous vehicles with machine vision, with the aim of launching its own ride-hailing service by 2020.

Zoox's human drivers train the cars along approved urban routes or within test premises, allowing the machine learning behind the car to build a 3D map of the environment. If necessary, the company also manually creates maps of the streets, noting unique road markings, sections, or obstacles to ensure that the algorithms capture that data for safety. This is shown in the video below:

Former CEO Tim Kentley-Klay explains that because the car is designed to be fully autonomous, the person in the front seat commands the vehicle, and the car will drive itself. He did not clarify if the person in the front seat of the car used their voice to command the car to drive or another mode of communication.

The vehicle consists of four quadrants, each of which has its own independent function, such as processing. It also has batteries similar to those of an aircraft to ensure that the car continues to operate if one quadrant fails. The quadrant that predicts collisions is designed to deploy an external airbag in the event of a crash.

The company’s vehicle is not yet fully built. At the moment, urban street testing is being done using a standard Toyota SUV fitted with the sensors.

Because its car is not yet available, the company does not list any case studies, but it has attracted $790 million in funding from lead investors DFJ, Lux Capital, Blackbird Ventures, Grok Ventures, and Thomas Tull.

Jesse Levinson is co-founder and CTO of Zoox. He has worked on self-driving technology for about a decade, having been a lead in the Stanford University Autonomous Driving Team, where he also earned his PhD in computer science.

Pony.AI
Pony.AI launched an autonomous ride-sharing fleet in China consisting of four Lincoln MKZs and two Guangzhou Automobile Group Chuanqis.

This initial fleet uses a LiDAR-based (light detection and ranging) sensor fusion system, a remote sensing method that uses light in the form of a pulsed laser to measure distances. These pulses are combined with other data to create 3D maps of the real world.
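The distance measurement behind LiDAR is time-of-flight: a laser pulse travels to an object and back, and the distance is half the round trip multiplied by the speed of light. A minimal sketch of that principle (not Pony.AI's actual system) looks like this:

```python
# Time-of-flight ranging, the principle behind LiDAR distance measurement.
# A pulse travels to an object and back, so distance is half the round trip.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Return the distance in metres to the object that reflected the pulse."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse returning after 200 nanoseconds indicates an object ~30 m away.
print(round(lidar_distance(200e-9), 1))  # 30.0
```

A real sensor fires hundreds of thousands of such pulses per second at varying angles, producing the point clouds that are fused with camera and radar data into 3D maps.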

In autonomous vehicles, this technology is capable of identifying objects in the environment, such as traffic lights, pedestrians, and other vehicles. The company also claims the vehicle can control its trajectory using feedback from the sensors.

As seen in the video below, the autonomous vehicle is able to drive in rainy weather, reduce speed, or stop as it recognizes pedestrians and cyclists in its path, road obstacles, and traffic light signals:

The company has not released any case studies.

Pony.AI raised $214 million in funding and established an AI Institute in Guangzhou, China, where Erran Li leads as Chief Scientist. Prior to that, Dr. Li served on the machine learning team at Uber and spent 14 years at Bell Labs studying foundational technology in artificial intelligence and machine learning. He holds a PhD in computer science from Cornell University.

James Peng is the co-founder and CEO of Pony.AI. He holds a PhD in engineering from Stanford University. Earlier in his career, he worked as a research associate at Stanford, as a software engineer at Google, and as Chief Architect for Autonomous Driving at Baidu.

AImotive
Hungary-based AImotive has developed the aiDrive software, which it claims is a scalable self-driving technology that uses computer vision. Once fully developed, the company aims to launch the technology as a ride-hailing service.  

According to the company, the software allows vehicles to drive through any environment, climate, and driving culture. The technology has been tested in simulated and real-world environments, but not yet in urban areas.

The company claims that its vehicles are able to manage parking, entries, exits, and merging lanes. Its AI system can predict the actions of moving objects around the vehicle, the company reports.

The company’s vehicles are equipped with six cameras to capture images of the world and calculate distances. The cameras color code the objects they capture; for example, the system colors cars orange and roads green. This color coding is intended to control the path of the car: the machine learning behind the car is trained to stay on the green. We could not find a video demonstrating this process.
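The "stay on the green" idea corresponds to semantic segmentation, where each pixel is labeled with a class and the car steers toward regions labeled as road. The sketch below is an illustration of that concept only; the class labels and decision rule are assumptions, as AImotive's actual pipeline is not public:

```python
# Illustrative sketch of driving decisions from a color-coded segmentation
# mask. Class colors and the "stay on the drivable class" threshold are
# assumptions for illustration, not AImotive's actual implementation.

ROAD, CAR, OTHER = "green", "orange", "gray"  # hypothetical class colors

def drivable_ratio(mask_row: list) -> float:
    """Fraction of pixels in a row ahead of the car labeled as road."""
    return mask_row.count(ROAD) / len(mask_row)

def should_continue(mask_row: list, threshold: float = 0.5) -> bool:
    """Continue only if most of the path ahead is segmented as road."""
    return drivable_ratio(mask_row) >= threshold

row = [ROAD, ROAD, ROAD, CAR, ROAD, OTHER]
print(should_continue(row))  # 4/6 of the row is road, so True
```

In practice, the segmentation would come from a neural network processing full camera frames, with steering derived from the geometry of the road region rather than a single ratio.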

The company has raised $47.5 million in venture funding. It has not made any case study available, but claims that it has worked with Samsung, Volvo, Kyocera, VeriSilicon, and GlobalFoundries. It also lists Nvidia and PSA Group as partners.

David Kiss is the CTO of AImotive. Prior to AImotive, he was the Lead Scientist at AdasWorks. He also founded two online game development companies: SongArc and Turtle Games. He earned his Master’s degree in Computer Science from Eötvös Loránd University.

Cruise Automation

Cruise Automation seems to be training the AI of its fleet of autonomous vehicles with contractors who sit in the vehicles as the cars drive. Its vehicles are currently driving the streets of San Francisco and New York, among other cities.

The company does not describe the AI in detail but reports that each car is equipped with 10 cameras, which take pictures at a rate of 10 frames per second. The company also explains that the sensors capture data such as road maintenance and environmental factors as the car moves through traffic, and that its autonomous vehicles have the capability to adapt to real-life road situations, such as variable traffic and weather.

The 1-minute video below shows Cruise Automation’s employees testing a ride-hailing app that requests the company’s driverless car. A human still sits in the driver’s seat, but only as a safeguard in case the car’s technology fails:

Cruise Automation does not make available any case studies reporting success with its software, but it was acquired by General Motors for $1 billion in 2016.

Kyle Vogt is CEO of Cruise. Prior to Cruise, he co-founded Twitch (acquired by Amazon) and Socialcam (acquired by Autodesk). He earned his Bachelor’s degree in computer science from the Massachusetts Institute of Technology.

Fleet Management

Nauto
Nauto is a camera and sensor system that collects in-car and external data to help drivers in commercial fleets focus better on driving, as well as to train software for autonomous vehicles.

The company states that the system uses machine vision to collect data in the form of images and video footage about the driver, the vehicle, and the real world. This data can include GPS location, data about the vehicle’s surroundings, weather data, collision history, and driver behavior.

The system also uses facial recognition software to build a profile of the driver, tracking posture, movements, action or inaction, drowsiness, inattention, driving time, tampering with the Nauto device, and drunk driving.

In fleet management, this data is used to detect and coach distracted drivers in real time. The coaching opportunity comes in the form of a fleet management leaderboard called the Visually Enhanced Risk Assessment (VERA) Score, which scores people on their safe driving habits or lack thereof. This process is explained in the video below from Amazon Web Services:

The company claims that the leaderboard and the rewards that come with it could potentially help drivers become more attentive while driving, keep vehicles active, build a trusted and safety-focused brand, and reduce losses from accidents, traffic violations, and claims.
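A leaderboard of this kind can be pictured as a per-driver safety score with deductions for logged risk events. The formula and weights below are invented for illustration and are not Nauto's actual VERA scoring method:

```python
# Hypothetical sketch of a fleet safety leaderboard in the spirit of a
# VERA-style score; event types and penalty weights are assumptions.

def safety_score(events: dict) -> int:
    """Score out of 100, deducting points per logged risk event."""
    penalties = {"distraction": 5, "tailgating": 3, "hard_brake": 2}
    deduction = sum(penalties[kind] * count for kind, count in events.items())
    return max(0, 100 - deduction)

fleet = {
    "driver_a": {"distraction": 1, "tailgating": 0, "hard_brake": 2},
    "driver_b": {"distraction": 3, "tailgating": 2, "hard_brake": 1},
}
leaderboard = sorted(fleet, key=lambda d: safety_score(fleet[d]), reverse=True)
print(leaderboard)  # driver_a (score 91) ranks above driver_b (score 77)
```

Ranking drivers this way makes safe habits visible and comparable across a fleet, which is the behavioral lever the company describes.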

The computer vision-equipped camera captures footage of the driver and feeds it into the machine learning system. If the system recognizes signs of distracted driving, it reminds the driver to focus on the road with an audio message played in the vehicle. The intensity of this message depends on the duration and severity of the driver’s distraction.
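The escalation logic implied here can be sketched as a mapping from distraction duration to alert intensity. The thresholds and alert levels below are assumptions for illustration, not Nauto's published behavior:

```python
# Sketch of escalating in-cab audio alerts based on how long a driver has
# been distracted; thresholds and alert names are illustrative assumptions.

def alert_level(distraction_seconds: float) -> str:
    """Map a continuous distraction duration to an escalating alert tier."""
    if distraction_seconds < 1.0:
        return "none"                # brief glance away, no alert
    if distraction_seconds < 3.0:
        return "soft chime"          # gentle reminder to refocus
    return "urgent voice alert"      # sustained distraction

print(alert_level(0.5))  # none
print(alert_level(2.0))  # soft chime
print(alert_level(4.5))  # urgent voice alert
```

A production system would also weigh severity (e.g. eyes fully off the road versus a mirror check) alongside duration, as the article notes.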

The company is building other safety features into the system related to tailgating and other road risks. For instance, when the system detects that the driver is too close to the vehicle in front of them, it could remind drivers to keep a safe distance with an audio message. The system could also warn drivers if safety hazards are present on the road ahead.
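One common way to formalize "too close" is a time gap rather than a fixed distance, such as the widely taught two-second rule. The check below uses that rule as an assumption; it is not a description of Nauto's actual method:

```python
# Sketch of a tailgating check using the common "two-second rule": the gap
# to the vehicle ahead, expressed in seconds of travel time, should be at
# least two seconds. The rule choice and threshold are assumptions.

def following_time(gap_m: float, speed_m_s: float) -> float:
    """Seconds it would take to cover the gap to the vehicle ahead."""
    return gap_m / speed_m_s

def tailgating(gap_m: float, speed_m_s: float, min_seconds: float = 2.0) -> bool:
    """True when the time gap falls below the safe-following threshold."""
    return following_time(gap_m, speed_m_s) < min_seconds

# At 25 m/s (90 km/h) with only a 30 m gap, the time gap is 1.2 s: too close.
print(tailgating(30, 25))  # True
```

A time-based threshold scales naturally with speed, which is why it suits a system that must warn drivers in both city and highway traffic.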

The company does not have any case studies, but Gary Tournier, owner of Transportation Management Corporation relays that, “with the Nauto system, the constant evaluation that we have enables us to target areas which we need to retrain the drivers.”

The company lists Delivery Authority, Allegiant Solutions, Edge, and Lucky 5 Logistics as among its clients. It also claims partnerships with Toyota, Allianz, BMW, GM, Draper Nexus, the NAFA Fleet Management Association, and SoftBank. It has received almost $174 million in venture funding.

Stefan Heck is Nauto’s CEO. Prior to Nauto, he served as a consulting professor at Stanford University, teaching energy and transport innovation. He has also served at McKinsey, Apple, and PARC. Stefan received his PhD in Cognitive Science from the University of California, San Diego.

Takeaways for Leaders in the Automotive Industry

AI may very well disrupt transportation spaces. With regard to city travel, it may push companies to emphasize ride-sharing over private ownership. Companies in the self-driving car space seem focused on eventually offering ride-hailing services to compete with Uber and Lyft. Most of the companies covered in this report are in the development or testing phase and have no case studies. However, these companies are already test-driving their autonomous vehicles on urban streets. Zoox, for its part, is building its own vehicles and autonomous driving system from scratch; at the moment, however, testing is done using a standard SUV.

Compared to funding for startups in other industries, capital for autonomous vehicles is quite large, running up to hundreds of millions of dollars. Some of the startups featured in this report, such as Cruise Automation, have been acquired by leading automotive companies.

These companies are also moving to lure top talent from premier technology universities, such as Stanford and MIT, many of whom hold PhDs in computer science or mechanical engineering.

Additionally, they claim their autonomous driving systems have been trained and tested to recognize hazards, such as moving objects, traffic violations, vehicle speed, bad driving behavior, and extreme weather conditions. Some companies have built their vehicles to be capable of full autonomy but only allow them to serve within a limited area. Other cars are nearly fully autonomous, although they are still undergoing tests.

It is not clear if the development of driverless cars could serve persons who are unable to drive themselves, such as the elderly or disabled, or if these will be available to private individuals or just company fleets. It is also not clear if the cars will run on electricity or gas, how much a unit will cost, how they will be maintained, or how the software will be updated.

While developments in autonomous vehicle technology have leaped forward, many remain skeptical about using the technology. A March 2016 poll by the American Automobile Association revealed that 75% of respondents were not ready to ride in autonomous vehicles.

More recently, a study by IPSOS showed that Americans are almost evenly divided between those who want to use autonomous vehicles (22%) and those who claim they will never use them (24%). The rest of the survey participants (54%) are adopting a wait-and-see position.


Header Image Credit: Fortune
