AI for Self-Driving Car Safety – Current Applications

Ayn de Jesus

Ayn serves as AI Analyst at Emerj - covering artificial intelligence use-cases and trends across industries. She previously held various roles at Accenture.

Allied Market Research estimated the value of the global autonomous vehicle (AV) industry at $54.23 billion in 2019, projecting it to reach $556.67 billion by 2026, a compound annual growth rate of 39.47% over that period. Unsurprisingly, AI is finding its way into the autonomous vehicle world. We detailed our own timeline for self-driving cars, pooling quotes and insights from executives at the top 11 global automakers.

According to the National Highway Traffic Safety Administration, autonomous vehicles and driver-assisting technologies have the potential to reduce crashes, prevent injuries, and save lives. However, the American Automobile Association’s multi-year tracking study shows that consumer distrust of driverless vehicles is growing.

This year’s study showed that 73% of American drivers would hesitate to ride in a fully autonomous vehicle, a significant increase from 63% in late 2017. The survey also reports that 63% of American adults feel less safe sharing the road with an autonomous vehicle while walking or riding a bicycle.

We researched the space to better understand where AI comes into play in the safety of autonomous vehicles and to answer the following questions:

  • What types of AI applications are currently in use in autonomous vehicle safety?
  • What tangible results has AI driven in autonomous vehicle safety?
  • Are there any common trends among these innovation efforts? How could these trends affect the future of autonomous vehicles?

This report covers companies using and offering software to ensure self-driving cars are safe across two technologies:

  • Computer Vision
  • Virtual Simulations

This article intends to provide business leaders in the automotive industry with an idea of what they can currently expect from AI in their industry. We hope it allows them to garner insights they can confidently relay to their executive teams so they can make informed decisions when thinking about AI adoption.

At the very least, this article intends to reduce the time business leaders in the automotive industry spend researching AI companies with whom they may (or may not) be interested in working.

Computer Vision

Waymo

Waymo is a US-based company that offers its namesake autonomous vehicles, which it claims can help auto manufacturers and ride-hailing businesses make the roads safer for both pedestrians and motorists using a combination of computer vision, audio recognition, and machine learning technologies.

Waymo claims riders get a safer driving experience thanks to the AV’s vision system, which is capable of object and event detection and response. The vision system’s cameras maintain a 360-degree view of the vehicle’s surroundings, constantly scanning the road for moving and static objects such as pedestrians, cyclists, other vehicles, traffic lights, construction cones, and other road features.

The cameras capture images of these objects, and machine learning algorithms compare them against a database to determine what the system is seeing. To train the system to recognize a class of object, it is fed millions of example images; for children, that means children of different ages, weights, and heights, with different hair and skin coloring, wearing a variety of clothes, photographed from different angles, and so on. If a child crosses the street, the collision detection and avoidance system recognizes the figure as a child and determines that the car has to slow down or stop to allow the child to cross safely before proceeding.
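The report does not disclose Waymo’s models, but the general pattern it describes, classifying what the camera sees and then deciding whether to yield, can be illustrated with an off-the-shelf detector. Below is a minimal, hypothetical Python sketch using a COCO-pretrained Faster R-CNN from torchvision as a stand-in for Waymo’s proprietary perception stack; the frame file name and confidence threshold are illustrative assumptions, not details from Waymo’s system.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Off-the-shelf COCO-pretrained detector (torchvision >= 0.13), used here only as a
# stand-in for Waymo's proprietary perception models.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

PERSON_LABEL = 1  # "person" class index in the COCO label map


def pedestrian_detected(frame_path: str, min_score: float = 0.8) -> bool:
    """Return True if the frame contains a person detected with high confidence."""
    image = to_tensor(Image.open(frame_path).convert("RGB"))
    with torch.no_grad():
        detections = model([image])[0]
    return any(
        label.item() == PERSON_LABEL and score.item() >= min_score
        for label, score in zip(detections["labels"], detections["scores"])
    )


# Hypothetical camera frame; a real pipeline would stream frames from the car's cameras.
if pedestrian_detected("camera_frame.jpg"):
    print("Pedestrian detected: slow down or stop until the path is clear.")
```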

In its Safety Report, Waymo also claims that its AVs feature audio recognition technology capable of hearing and recognizing the sirens of fire trucks, ambulances, police cars, and police motorcycles from hundreds of feet away so that the vehicle can make way for them. To build this capability, Waymo’s sensors collected samples of emergency vehicles’ sirens at various speeds, distances, and directions, enabling the self-driving cars to respond safely to emergency vehicles on the road.
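Waymo does not explain how its siren recognition works internally, but a common approach to this kind of audio classification is to extract spectral features from labeled clips and train a classifier on them. The sketch below is a hypothetical illustration of that approach using librosa and scikit-learn; the file names, labels, and model choice are invented for the example and are not Waymo’s data or method.

```python
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def mfcc_features(path: str, sr: int = 16000) -> np.ndarray:
    """Summarize an audio clip as the mean of its MFCC frames."""
    audio, _ = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)

# Hypothetical labeled clips: 1 = siren, 0 = ordinary traffic noise.
train_paths = ["siren_01.wav", "siren_02.wav", "traffic_01.wav", "traffic_02.wav"]
train_labels = [1, 1, 0, 0]

features = np.stack([mfcc_features(p) for p in train_paths])
classifier = RandomForestClassifier(n_estimators=100).fit(features, train_labels)

# Classify a new clip captured by the vehicle's microphones.
if classifier.predict([mfcc_features("incoming_audio.wav")])[0] == 1:
    print("Siren detected: yield to the emergency vehicle.")
```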

The company did not elaborate on its audio recognition technology, but our research revealed that, according to an MIT Technology Review report, some machine learning-based audio sensors could also enable a vehicle to diagnose its own mechanical problems. Sensors can be trained to recognize the different sounds coming from engines or brakes and to alert the car owner through the dashboard display about which part needs to be checked or repaired.

The technology could also serve as a safeguard for human drivers and passengers of AVs, for example by listening for sounds that indicate the quality of the road surface, such as snow or gravel cover. It was not clear, however, whether Waymo’s autonomous vehicles have this self-diagnosing capability.

Below is a short 3-minute video demonstrating how the company’s software works:

Waymo has not made any case studies available but claims to be helping Jaguar create an electric self-driving vehicle, the Jaguar I-PACE. Waymo began testing the cars this year, and they are set to join its AV fleet starting in 2020. A total of 20,000 I-PACE AVs will be produced. In the future, both companies will consider extending the partnership to other Jaguar Land Rover models.

Waymo also lists partnerships with Walmart, AutoNation, and Avis, among others, under which its vehicles pick up customers and drive them to these businesses in the Phoenix area. AutoNation will offer its customers a Waymo vehicle while their own car is being serviced, and Avis customers at two locations in Chandler, Arizona will be able to ride a Waymo when dropping off or picking up their rental car.

Walmart and Waymo are also piloting a program in which members of Waymo’s early rider program receive grocery savings when they shop on Walmart.com. The riders can take a Waymo car to a Walmart store for grocery pickup.

Dmitri Dolgov is CTO & VP Engineering at Waymo. He holds a PhD in Computer Science from the University of Michigan. Previously, Dolgov served as Head of Software at Google’s self-driving car project.

General Motors

By 2019, General Motors aims to offer ride-sharing services built on perception software, which it claims can operate self-driving cars safely in busy urban environments using a multisensor vision system.

General Motors claims the system can safely navigate city streets with a 360-degree view of the world. The system is fitted with five light detection and ranging (LiDAR) sensors, 16 cameras, and 21 radars.

Using laser light, the LiDAR sensors measure the distance of both fixed and moving objects from the vehicle. The radars complement LiDAR in that they can perceive solid objects in low-light conditions. Long-range sensors track fast-moving objects, such as oncoming vehicles, while short-range sensors provide detail about moving objects near the AV, such as pedestrians and bicycles.

The cameras, aside from measuring the intensity of light coming from objects, also help classify and track pedestrians, vehicle types, and road details such as lane lines, construction zones, and signage. This is done through machine vision algorithms that search the company’s database for similar objects to determine what the system is “seeing.” Once objects are recognized, the algorithms trigger specific decisions and actions.

For example, if a dog suddenly crosses the street, the system will search its database to determine that the object in front of the vehicle is, in fact, a dog. The system will then trigger the car to stop.
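General Motors does not publish this decision logic, but the recognize-then-act loop described above can be sketched in a few lines. The following hypothetical Python snippet maps recognized object classes and estimated distances to a driving decision; the class list, distances, and threshold are illustrative assumptions, not Cruise’s actual policy.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # classifier output, e.g. "dog", "pedestrian", "vehicle"
    distance_m: float   # estimated distance from the fused LiDAR/radar/camera data

# Illustrative policy: object classes the vehicle should always stop for when close.
STOP_FOR = {"pedestrian", "cyclist", "dog"}

def plan_action(detections: list[Detection], stop_distance_m: float = 20.0) -> str:
    """Map recognized objects to a simple driving decision."""
    for det in detections:
        if det.label in STOP_FOR and det.distance_m < stop_distance_m:
            return "STOP"
    return "PROCEED"

# A dog classified 8.5 meters ahead of the vehicle triggers a stop.
print(plan_action([Detection("dog", 8.5)]))  # -> STOP
```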

Below is a short 1-minute video demonstrating how the vision system works:

General Motors does not have case studies available. The vision system was developed by Cruise Automation, the autonomous vehicle subsidiary that General Motors acquired in 2016.

Kyle Vogt is founder and CEO at Cruise Automation. According to his profile, he graduated from MIT, but he does not make his educational background available on LinkedIn. Previously, Vogt co-founded Twitch (acquired by Amazon), Socialcam (acquired by Autodesk), and Justin.tv.

Quanergy Systems

Quanergy Systems is a Silicon Valley-based company with about 200 employees. The company offers a software called Qortex for Transportation, which they claim can help autonomous vehicle manufacturers increase car safety using computer vision technology.

Quanergy claims that the application detects objects based on imaging data captured by the system’s LiDAR sensors. This data could include 3D images of people on the streets, other vehicles, buildings, animals, trees, and signs on the road. The underlying algorithms then measure the distance of the objects from the vehicle, as well as identify and classify them. Once an object is classified, the system triggers an action based on the real-time scenario.

For instance, if the system’s camera perceives a cyclist crossing the street, the camera will capture an image of the cyclist, and the machine learning algorithms will search the company’s database for similar images in order to determine that the object crossing the street is in fact a cyclist. Once the image is classified, the algorithms will inform the system and command the car to stop until the cyclist fully passes.
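Quanergy does not expose Qortex’s internals, but the distance-measurement step it describes can be illustrated with a toy computation over a segmented LiDAR point cluster. The sketch below assumes the point cloud has already been segmented and classified; the coordinates and stopping threshold are invented for the example.

```python
import numpy as np

def nearest_point_distance(cluster: np.ndarray) -> float:
    """Distance in meters from the vehicle (at the origin) to the closest point
    of an object cluster segmented from the LiDAR point cloud, shape (N, 3)."""
    return float(np.linalg.norm(cluster[:, :2], axis=1).min())

# Hypothetical cluster of LiDAR returns already segmented and classified as a cyclist.
cyclist_cluster = np.array([
    [4.2, 1.1, 0.5],
    [4.3, 1.0, 0.9],
    [4.4, 1.2, 0.7],
])

if nearest_point_distance(cyclist_cluster) < 10.0:
    print("Cyclist within stopping range: hold until the path is clear.")
```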

Below is a short 2-minute video demonstrating how Qortex works:

Quanergy has not made any case studies available but claims in a press release to have integrated its application into Cisco’s Smart City Connected Roadway solutions. Under this project, both companies are working to develop intelligent transportation systems.

Quanergy also lists OptoSafe as one of its past clients.

Louay Eldada is CEO and Co-founder at Quanergy. He holds a PhD in Optical Engineering from Columbia University. Previously, Eldada served as CSO at SunEdison, and CTO of DuPont Photonic Technologies. He was also a research scientist at IBM and Columbia University.

Virtual Simulations

Microsoft

Microsoft offers software called AirSim, which it claims can help autonomous vehicle manufacturers test vehicle safety and train their machine learning algorithms using deep learning, computer vision, and reinforcement learning.

The company states that, in offering this open-source application for testing algorithms, it aims to make the development of self-driving cars available to more companies. Initially developed as a tool for game development, the new version of AirSim includes car simulations, new environments, and a programming interface that allows developers to run their algorithms.

Microsoft claims that users can download the application from GitHub and run it on their own computers. A programming interface allows the user to interact with the application to upload training images, control the vehicle, and perform other commands; the arrow keys can also be used to drive the virtual car manually.
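As an illustration of that programming interface, here is a minimal sketch using AirSim’s public Python client (the `airsim` package); the simulator must already be running for the script to connect, and exact method names can vary between AirSim versions.

```python
import airsim

# Connect to a running AirSim car simulation.
client = airsim.CarClient()
client.confirmConnection()
client.enableApiControl(True)

# Drive forward gently, then read back the simulated vehicle's state.
controls = airsim.CarControls()
controls.throttle = 0.5
controls.steering = 0.0
client.setCarControls(controls)

state = client.getCarState()
print("speed (m/s):", state.speed)
```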

The application enables simulations in a 3D urban environment that includes downtown, semi-urban, vegetated, and industrial areas, as well as 12 kilometers of drivable roads encompassing more than 20 city blocks. Other environments, sensors, and vehicles are available from an online selection.

According to the company, AirSim can generate training data in two ways. The first is pressing the record button, which labels each captured image frame; users are free to modify the labeling. The second is using the programming interface, with the developer writing code to collect the data and test the trained model.
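The second, code-driven approach can be sketched with the same Python client: request camera frames from the simulated car and save them as training images. The camera id and file name below are illustrative, and camera naming conventions differ between AirSim versions.

```python
import airsim

client = airsim.CarClient()
client.confirmConnection()

# Request a PNG-compressed scene image from camera "0".
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene, False, True)
])
frame = responses[0]

# The compressed payload is PNG bytes that can be written straight to disk.
with open("training_frame_0001.png", "wb") as f:
    f.write(frame.image_data_uint8)
```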

Below is a short 1-minute video demonstrating how AirSim works:

Microsoft does not make available any case studies for AirSim, but it lists the NASA Ames Research Center’s Systems Analysis Office, Stanford University’s Department of Aeronautics and Astronautics, Carnegie Mellon University’s Robotics Institute, and Texas A&M as some of its users.

Ashish Kapoor is a principal researcher and research manager at Microsoft Corporation. He holds a PhD in Machine Learning and Affective Computing from the MIT Media Lab. He has served at Microsoft for 13 years; his profile does not list other roles or companies where he has worked.

Takeaways for Business Leaders in the Automotive Industry

Based on our research, AI-based safety technology for autonomous cars mostly revolves around computer vision. Our research uncovered safety reports by Waymo and General Motors. While these reports describe the failsafe features of driverless cars, it was not always clear whether those features are AI-driven; only the vision systems were explicitly said to use machine learning technologies.

Three of the four companies covered in this report are global leaders in either the technology or automotive industries, with billions of dollars in annual revenues. Waymo and General Motors are applying the technology to their own driverless cars. While Microsoft does not have its own autonomous vehicle project, it has made available an open-source application that can help developers test their AI algorithms for safety.

Only Quanergy, a startup, relies on venture capital to fund its work, though it recently announced a partnership with Cisco.

Waymo seems to have the strongest traction in the autonomous vehicle space, with its driverless cars already serving parts of the Phoenix metropolitan area. It has also struck up partnerships with businesses such as Avis and Walmart.

The company reports that its year-old Early Rider Program has driven more than 400 participants around the Phoenix area for free. In return, these riders provide feedback that helps the company understand how self-driving cars respond to passengers’ daily needs. The program contributes to the 24,000 miles Waymo’s cars drive daily across the United States.

Waymo is ahead of the pack among the autonomous vehicle projects covered in this report in that it has secured approval to drive people around in a limited area, as well as business partnerships that could help promote its autonomous vehicle services in the future. Even so, the autonomous vehicle is still very much a work in progress.

Autonomous vehicles are being developed to keep motorists and pedestrians safe on the road. However, it may take a few more years before driverless cars traverse streets worldwide. At the moment, operations are allowed only in highly select areas of the US. Most development is happening in the US, with some efforts in China and parts of Europe.

 

Header Image Credit: Science News
