Artificial Intelligence & Autonomous Vehicles
Artificial Intelligence (AI) is a growing reality in the automotive industry. However, AI has been around since the late 1940s, when neurophysiologist Grey Walter created autonomous robotic turtles. These mechanical turtles were among the first machines to exhibit AI steering (*1). Computer scientist John McCarthy coined the phrase “artificial intelligence” back in 1955 and defined it as “the ability of a computer program or machine to think, learn, and make decisions. A machine that thinks and reacts as a human.” (*2)
Steering is the second of three AI components, with Action Selection and Locomotion being the first and third, respectively. Action Selection covers the strategies, goals, and planning behind the use of AI. Steering involves path determination by way of path-finding algorithms. Lastly, Locomotion is the animation and articulation behind AI. Due to the depth of information surrounding all three components of AI usage in autonomous vehicles (AV), this article will focus on Steering and two of its related elements.
Let us fast-forward to 2019, when we have the technological means to collect and process enormous amounts of data. In order for AV to communicate and operate accurately under the highest safety standards, vehicles will run path-finding algorithms using a robust system of sensors, radar, and cameras not only on board the vehicle, but also in the surrounding environment (think roads, signage, and the ability to communicate with other AV).
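To give a flavor of what "path-finding algorithms" means in practice, here is a minimal sketch of A* search on a simple grid of open and blocked cells. This is purely illustrative: the grid, uniform step costs, and Manhattan-distance heuristic are simplifying assumptions for the example, not how any automaker's actual routing software works.

```python
import heapq

def a_star(grid, start, goal):
    """Find a shortest path on a 2D grid (0 = open, 1 = blocked)
    using A* search with a Manhattan-distance heuristic."""
    rows, cols = len(grid), len(grid[0])

    def heuristic(cell):
        # Straight-line (city-block) estimate of remaining distance.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    # Heap entries: (estimated total cost, cost so far, cell, path taken).
    open_heap = [(heuristic(start), 0, start, [start])]
    best_cost = {start: 0}
    while open_heap:
        _, cost, cell, path = heapq.heappop(open_heap)
        if cell == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = cell[0] + dr, cell[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                new_cost = cost + 1
                if new_cost < best_cost.get((r, c), float("inf")):
                    best_cost[(r, c)] = new_cost
                    heapq.heappush(open_heap,
                                   (new_cost + heuristic((r, c)),
                                    new_cost, (r, c), path + [(r, c)]))
    return None  # no route exists
```

For example, on a three-by-three grid whose middle row is mostly blocked, `a_star(grid, (0, 0), (2, 0))` detours around the obstacle rather than driving through it, which is the same basic trade-off a real route planner makes at far larger scale.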
Two elements of Steering are the Infotainment Human-Machine Interface (IHMI) and Advanced Driver Assistance Systems (ADAS) (*2). IHMI involves complicated systems that handle speech and gesture recognition, eye tracking, and driver monitoring. IHMI systems also include virtual assistants and natural language interfaces. ADAS focuses on camera-based machine vision systems, radar-based detection units, driver condition evaluation, and sensor fusion engine control units (*2).
Sounds complicated, right? It is. There are many layers to this onion of AI and autonomous vehicles, and engineers and the auto industry have more work to do before the technology is accurate and safe. Let’s take a look at the infrastructure it will take to support AI in AV.
Read the following scenario, then close your eyes and visualize it:
You approach an intersection at the same time as another driver. You think you might have arrived at the intersection just seconds before the other driver, but you are not certain. The other driver is thinking the same about his timing. So you both hesitate until the other driver gestures to you with his hand as if to say: “you go first”. You proceed through the intersection. The other driver proceeds after you safely pass through. All is good!
The human perspective of driving relies on sensory functions such as seeing and hearing, and cognitive functions such as memory, logical thinking, decision making, and learning by experience. AV will be equipped with AI capable of learning these same sensory and cognitive functions (*2).
Digital Sensorium (DS) is an umbrella term for all of the sensors, radars, and cameras used to gather the massive amounts of data that AV will generate and have to sort through in real time (*2). DS allows the vehicle to “feel” the road, “see” other vehicles, and analyze road infrastructure and every other object on or around the road.

The Autonomous Driving Platform, or Cloud, will use AI algorithms to make logical decisions in real time. The DS systems installed in the AV will communicate with the Cloud, which will store prior driving experiences and help the AI update to the most accurate, real-time driving decisions. Updates from the Cloud to the AV will occur several times each second (yes, second!). The more connectivity there is between vehicles and infrastructure, and the more driving experiences are recorded, the more accurate an AV’s driving decisions will become. All of this technology will require another layer of infrastructure: 5G cellular network technology, which will enable real-time (very fast), 24/7 connectivity. (*2)
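The sense-fuse-decide cycle described above can be sketched in a few lines of Python. Everything here is invented for illustration: the three sensor readings, the conservative min-based fusion rule, and the two-second reaction margin are simplifying assumptions, not how any production AV stack actually fuses data or makes decisions.

```python
def fuse_readings(camera_m, radar_m, lidar_m):
    """Combine three independent distance estimates (in meters) to the
    nearest obstacle. As a toy fusion rule, trust the most conservative
    (smallest) estimate; real systems weight sensors far more carefully."""
    return min(camera_m, radar_m, lidar_m)

def decide(distance_m, speed_mps, reaction_margin_s=2.0):
    """Toy decision rule: brake if the obstacle is closer than the
    distance the vehicle covers in `reaction_margin_s` at current speed."""
    if distance_m < speed_mps * reaction_margin_s:
        return "brake"
    return "maintain"
```

In a real AV this cycle, along with uploads to and updates from the Cloud, would repeat several times per second, which is why low-latency 5G connectivity matters so much.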
When considering artificial intelligence in autonomous vehicles and how it will look, we must think about our current road infrastructure and how this might change in the future. For example, street lights and signage may likely be replaced with radio transmitters that will communicate with vehicles and The Cloud.
Even with all of this infrastructure in place, structured environments are easier to automate than unstructured ones. For example, a fallen tree limb, an emergency response vehicle, and a recent accident are just a few of the countless scenarios that will need to be programmed into the AI. Companies such as Ford and Google (yes, that Google) are working hard to capture rare scenarios in their AI, and they are confident that by the time autonomous vehicles roll out, the technology will be robust, accurate, and safe. But can ALL scenarios be captured? I think we all know the answer to that question. That said, AI learning will take time, and we can expect unfavorable outcomes during the initial transition period. This brings us to our last point, the question: “Who is liable when accidents happen?”
Currently, road infrastructure and driving rules and regulations vary across cities, states, and countries. Likewise, these components are overseen by various entities such as city and state officials, lawmakers, and governments. The question is: which entity will have the power and reach to regulate and standardize the autonomous vehicle driving environments of the future?
We can also consider limiting the use of AV to certain areas or zones, but everything previously mentioned in this article would still need to be in place for the vehicles to operate safely and accurately.
The short answer (for now) of who is liable in the instance of an AV involved accident can be broken down as follows:
Currently, all AV are still in test mode and require a human driver to be alert, attentive, and ready to act to avoid potential hazards and prevent a collision. If an accident occurs in this situation, the driver of the vehicle is responsible for any incurred damages.
Once AV are standardized and all infrastructure and policies are in place, if the AV replaces a human but still has a steering wheel as an option for human interaction and a crash occurs, the AI software company for that vehicle is legally responsible. However, the owner may share some legal responsibility.
If the AV replaces a human and has NO steering wheel as an option for human interaction and a crash occurs, the automaker and the software company can be held legally responsible and NOT the passenger(s). (*3)
The Eshelman Legal Group
The attorneys at the Eshelman Legal Group understand that no matter how cautious you are, others may not be so careful, and accidents do happen. So we hope you don’t need to, but if you are in a situation where you need the advice of a personal injury attorney, the Eshelman Legal Group is here to help you. For over 40 years we have been assisting accident victims, and we are here to assist you too... because “We’ll make things right.”
Ask yourself this question… who does the adjuster work for? The adjuster works for the insurance company; they do not work for you.