Who Is Responsible When Self-Driving EVs Get Into Accidents? A Digital Technology Expert's Perspective

As electric vehicles (EVs) with self-driving capabilities become increasingly common on our roads, questions about liability in the event of accidents have come to the forefront. Recent high-profile cases, such as the Rafaela Vasquez trial, have highlighted the complexities of assigning responsibility when a self-driving EV is involved in a collision. In this post, we'll take a deep dive into this issue, exploring the current state of self-driving technology, the legal and regulatory landscape, and the factors that can impact liability in these cases.

The Current State of Self-Driving Technology in EVs

Self-driving technology in EVs is still in its early stages, but it is advancing rapidly. The Society of Automotive Engineers (SAE) defines six levels of driving automation, ranging from Level 0 (no automation) to Level 5 (full automation). Most driver-assistance systems in EVs on the road today are Level 2, combining features such as lane keeping assistance and adaptive cruise control. These systems require the driver to remain alert and ready to take control at any moment.
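
For readers who think in code, the SAE taxonomy boils down to a small lookup. The sketch below is a paraphrase for illustration, not an official SAE artifact; the level names and one-line descriptions are simplified.

```python
# Illustrative summary of the SAE J3016 automation levels (paraphrased).
# Not an official SAE artifact; descriptions are simplified for this post.
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # human does all driving
    DRIVER_ASSISTANCE = 1       # one assist feature at a time
    PARTIAL_AUTOMATION = 2      # combined assists; driver must supervise constantly
    CONDITIONAL_AUTOMATION = 3  # system drives in limited conditions; driver takes over on request
    HIGH_AUTOMATION = 4         # no takeover needed within a defined operating domain
    FULL_AUTOMATION = 5         # system drives everywhere, in all conditions

def driver_must_supervise(level: SAELevel) -> bool:
    """At Level 2 and below, the human remains responsible for monitoring the road."""
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_must_supervise(SAELevel.PARTIAL_AUTOMATION))  # True: Level 2 still needs the driver
```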

However, some manufacturers, such as Tesla and GM, are pushing the boundaries of what's possible with self-driving technology. Tesla's Autopilot system, for example, can navigate highways and change lanes with minimal driver input, while GM's Super Cruise system allows for hands-free driving on mapped highways. Despite these advances, fully autonomous vehicles (Level 5) are still a long way off, and drivers remain ultimately responsible for the safe operation of their vehicles.

According to a report by the National Highway Traffic Safety Administration (NHTSA), there were 392 crashes involving vehicles equipped with Level 2 advanced driver assistance systems (ADAS) between July 1, 2021 and May 15, 2022. Of these crashes, 273 involved Tesla vehicles, 90 involved Honda vehicles, 10 involved Subaru vehicles, and the remainder involved a mix of other manufacturers. While these numbers may seem high, it's important to note that they represent a small fraction of the total number of crashes that occur on U.S. roads each year.

Manufacturer    Number of Crashes
Tesla           273
Honda            90
Subaru           10
Other            19

Source: NHTSA, "Standing General Order on Crash Reporting for Level 2 Advanced Driver Assistance Systems (ADAS) and Level 3-5 Automated Driving Systems (ADS)," June 2022.

As the table above shows, Tesla vehicles were involved in the majority of reported crashes involving Level 2 ADAS during the time period studied. However, it's important to note that this data does not necessarily indicate that Tesla's Autopilot system is less safe than other Level 2 systems on the market. Factors such as the number of vehicles equipped with the technology, the types of roads and environments in which the vehicles are used, and the behavior of drivers using the systems can all impact crash rates.
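
To make the caveat about raw counts concrete, here is a quick calculation of each manufacturer's share of the 392 reported crashes. Without fleet sizes or miles driven, these shares are not safety rates.

```python
# Shares of the 392 Level 2 ADAS crashes reported to NHTSA (July 2021 - May 2022).
# Raw counts only: without fleet size or miles driven, these are not safety rates.
crashes = {"Tesla": 273, "Honda": 90, "Subaru": 10, "Other": 19}
total = sum(crashes.values())  # 392

for maker, n in crashes.items():
    print(f"{maker:>6}: {n:3d} crashes ({n / total:.1%} of reports)")
```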

The Rafaela Vasquez Case: A Turning Point in Self-Driving Liability

The tragic death of Elaine Herzberg, who was struck and killed by a self-driving Uber vehicle in Tempe, Arizona in 2018, brought the issue of self-driving liability into the spotlight. The vehicle's safety driver, Rafaela Vasquez, was charged with negligent homicide in the case, marking the first time a human operator of a self-driving vehicle had been charged with a crime.

According to investigators, Vasquez was watching a video on her phone at the time of the accident and failed to take control of the vehicle in time to avoid the collision. The case raised questions about the role and responsibilities of safety drivers in self-driving vehicles, as well as the liability of the companies developing and deploying these technologies.

In July 2023, Vasquez pleaded guilty in Maricopa County court to a reduced charge of endangerment and was sentenced to three years of supervised probation. The outcome set a significant precedent for the assignment of liability in accidents involving self-driving vehicles: even when a vehicle is operating in autonomous mode, the human operator can still be held criminally responsible for failing to properly monitor the vehicle and intervene when necessary.

However, the Vasquez case is just one example of the complex web of liability issues surrounding self-driving EVs. In other cases, the self-driving system itself has been found to be a contributing factor. For example, in 2016, a Tesla Model S operating in Autopilot mode crashed into a tractor-trailer in Florida, killing the driver. Federal investigators found that Autopilot was not designed to detect crossing traffic in that scenario, and that the driver had kept his hands off the wheel despite repeated warnings from the system.

In another case, a Tesla Model X operating in Autopilot mode crashed into a highway barrier in Mountain View, California in 2018, killing the driver. The National Transportation Safety Board (NTSB) investigation into the crash found that the Autopilot system had steered the vehicle into the barrier due to a combination of system limitations and driver inattention.

These cases illustrate the complex interplay of factors that can contribute to accidents involving self-driving EVs, and the challenges involved in assigning liability when things go wrong.

Factors That Can Impact Liability in Self-Driving EV Accidents

As the previous examples demonstrate, there are a variety of factors that can impact liability in accidents involving self-driving EVs. Some of the key factors include:

Driver Inattention

As the Vasquez case demonstrated, driver inattention can be a significant factor in self-driving EV accidents. Even when a vehicle is operating in autonomous mode, drivers are still responsible for monitoring the vehicle and being ready to take control if necessary. If a driver is distracted or fails to properly monitor the vehicle, they may be held liable in the event of an accident.
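
As a rough sketch of how a Level 2 system enforces this responsibility, consider the escalating "nag" behavior that systems like Autopilot and Super Cruise are publicly described as using. The thresholds and actions below are invented for illustration; real systems rely on camera-based gaze tracking, steering torque sensing, and manufacturer-specific timings.

```python
# Simplified sketch of a Level 2 attention-escalation policy.
# Thresholds and actions are hypothetical; real systems differ by manufacturer.
def escalation_action(seconds_inattentive: float) -> str:
    if seconds_inattentive < 10:
        return "none"
    if seconds_inattentive < 20:
        return "visual_warning"  # dashboard alert
    if seconds_inattentive < 30:
        return "audible_alarm"   # chime or seat vibration
    return "safe_stop"           # slow the vehicle and disengage the feature

for t in (5, 15, 25, 40):
    print(f"{t:2d}s inattentive -> {escalation_action(t)}")
```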

System Malfunctions

Self-driving systems are not infallible, and malfunctions can occur. In some cases, a system malfunction may be the primary cause of an accident, which could shift liability away from the driver and onto the manufacturer or developer of the system. However, proving that a malfunction occurred and that it was the cause of the accident can be challenging, particularly if the system's data and logs are not available or are inconclusive.
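
Part of what makes malfunction claims hard to prove is the state of the vehicle's own records. As a hypothetical illustration, a minimal event-data-recorder entry might capture the handful of fields an investigator would need; real recorders log far more channels, at higher rates, in proprietary formats.

```python
# Hypothetical minimal event-data-recorder entry for post-crash analysis.
# Real recorders log far more channels, at higher rates, in proprietary formats.
from dataclasses import dataclass, asdict
import json

@dataclass
class EventRecord:
    timestamp_utc: str      # when the sample was taken
    autonomy_engaged: bool  # was the self-driving feature active?
    speed_mph: float
    steering_torque: float  # driver steering input, if any
    brake_applied: bool
    takeover_request: bool  # did the system ask the driver to intervene?

record = EventRecord("2025-01-01T12:00:00Z", True, 38.0, 0.0, False, True)
print(json.dumps(asdict(record), indent=2))
```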

Environmental Conditions

Self-driving systems rely on a variety of sensors and algorithms to navigate the road and avoid obstacles. However, these systems can be impacted by environmental conditions such as poor weather, low visibility, or road damage. If an accident occurs in challenging environmental conditions, it may be more difficult to determine whether the driver or the self-driving system was at fault.
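
As a toy illustration of why conditions matter, consider a fusion step that weights each sensor's distance estimate by an environment-dependent confidence. The numbers here are invented; production stacks use far more sophisticated probabilistic models.

```python
# Toy sensor-fusion sketch: each sensor's distance estimate is weighted by a
# confidence that degrades with conditions. All numbers are invented.
def fused_estimate(readings: dict, confidence: dict) -> float:
    total_weight = sum(confidence.values())
    return sum(readings[s] * confidence[s] for s in readings) / total_weight

readings = {"camera": 42.0, "radar": 40.5, "lidar": 41.2}  # meters to obstacle

clear      = {"camera": 0.9, "radar": 0.8, "lidar": 0.9}
heavy_rain = {"camera": 0.3, "radar": 0.8, "lidar": 0.4}   # camera and lidar degrade

print(f"clear sky:  {fused_estimate(readings, clear):.1f} m")
print(f"heavy rain: {fused_estimate(readings, heavy_rain):.1f} m")
```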

Cyber Attacks

As self-driving EVs become more connected and reliant on software and data, the risk of cyber attacks increases. A hacker could potentially take control of a self-driving system and cause an accident, raising questions about who would be liable in such a scenario. While there have been no confirmed cases of a self-driving EV being maliciously hacked on public roads, security researchers have demonstrated remote vehicle exploits in controlled settings, and the potential for such attacks is a growing concern among experts in the field.
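
One widely discussed defense is authenticating commands before the vehicle acts on them, so that a spoofed message is simply rejected. Below is a minimal sketch using an HMAC over a pre-shared key; this is the basic idea only, and real automotive security stacks add hardware key storage, key rotation, and replay protection.

```python
# Minimal sketch: authenticate a control message with an HMAC before acting on it.
# Assumes a pre-shared key; real automotive stacks use hardware security modules,
# key rotation, and replay protection on top of this basic idea.
import hashlib
import hmac

KEY = b"pre-shared-secret"  # hypothetical; never hard-code keys in practice

def sign(message: bytes) -> bytes:
    return hmac.new(KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign(message), tag)

command = b"set_speed=65"
tag = sign(command)

print(verify(command, tag))           # True: legitimate command is accepted
print(verify(b"set_speed=120", tag))  # False: tampered command is rejected
```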

The Legal and Regulatory Landscape for Self-Driving EVs

As self-driving EVs become more common, lawmakers and regulators are grappling with how to adapt existing laws and regulations to this new technology. In the United States, NHTSA has released guidelines for the testing and deployment of self-driving vehicles, but there is no comprehensive federal framework for self-driving liability.

At the state level, the legal landscape is even more complex. Some states, such as Arizona and Nevada, have passed laws that specifically address the liability of self-driving vehicles and their operators. Other states have yet to take action, leaving questions of liability to be determined on a case-by-case basis.

One key issue that lawmakers and regulators are grappling with is the question of "control" in self-driving vehicles. Under traditional liability frameworks, the driver of a vehicle is generally held responsible for accidents because they are in control of the vehicle. But with self-driving vehicles, the question of who is in control becomes more complicated. Is it the human operator, who is responsible for monitoring the vehicle and taking control if necessary? Is it the manufacturer or developer of the self-driving system? Or is it some combination of both?

Another challenge is the lack of standardization in the self-driving industry. Different manufacturers and developers use different approaches to self-driving technology, and there is no consensus on what constitutes a "safe" or "reliable" system. This can make it difficult for regulators to establish clear guidelines and standards for the industry.

Despite these challenges, some experts believe that the development of self-driving EVs could ultimately lead to a simpler and more efficient liability framework. For example, if self-driving systems become advanced enough to operate without human intervention, liability could potentially shift entirely to the manufacturer or developer of the system. This could simplify the claims process and reduce the need for complex legal battles over liability.

The Role of Insurance Companies in Self-Driving EV Accidents

Insurance companies are another key player in the self-driving liability landscape. As self-driving EVs become more common, insurers will need to adapt their policies and practices to account for this new technology.

One potential approach is the development of specialized insurance products for self-driving EVs. These products could take into account factors such as the level of autonomy of the vehicle, the driver's experience and training with self-driving systems, and the vehicle's safety record. Some insurers are already experimenting along these lines; Tesla, for example, sells its own insurance in several U.S. states and uses driving data collected from the vehicle to price policies based on observed behavior.
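
A toy premium model makes the idea concrete: start from a base rate and adjust for the factors named above. Every factor and multiplier here is invented for illustration; actual underwriting models are proprietary and far more granular.

```python
# Toy premium model for a self-driving EV policy. All multipliers are invented;
# real underwriting models are proprietary and use many more factors.
def annual_premium(base: float, sae_level: int,
                   training_completed: bool, crashes_last_3y: int) -> float:
    rate = base
    rate *= 1.0 - 0.05 * min(sae_level, 3)  # modest discount as automation rises
    if training_completed:
        rate *= 0.95                         # discount for documented training
    rate *= 1.0 + 0.25 * crashes_last_3y     # surcharge per recent crash
    return round(rate, 2)

print(annual_premium(base=1800.0, sae_level=2,
                     training_completed=True, crashes_last_3y=0))  # 1539.0
```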

Another approach is for insurers to work more closely with manufacturers and developers of self-driving systems to better understand the technology and its risks. This could involve data sharing agreements, joint research projects, or even partnerships to develop new insurance products and pricing models.

However, the insurance industry faces its own set of challenges when it comes to self-driving EVs. For one, the lack of standardization in the industry makes it difficult for insurers to assess the risks associated with different self-driving systems. Additionally, the potential for cyber attacks and other new types of risks associated with self-driving EVs may require insurers to develop new underwriting models and risk assessment tools.

Recommendations for Safe and Responsible Deployment of Self-Driving EVs

As self-driving EVs become more common, it is important for all stakeholders – drivers, manufacturers, policymakers, and insurers – to work together to ensure the safe and responsible deployment of this technology. Some recommendations include:

Clear and Consistent Regulations

Policymakers should work to develop clear and consistent regulations for the testing and deployment of self-driving EVs, including standards for safety, liability, and data sharing. These regulations should be based on a thorough understanding of the technology and its risks, and should be regularly updated as the technology evolves.

Driver Education and Training

Drivers of self-driving EVs should receive comprehensive education and training on the capabilities and limitations of the technology, as well as their responsibilities as operators. This could include hands-on training, simulations, and regular refresher courses.

Transparency and Accountability

Manufacturers and developers of self-driving systems should be transparent about the capabilities and limitations of their technology, and should be held accountable for any safety defects or malfunctions. This could involve regular safety audits, data sharing requirements, and liability frameworks that incentivize the development of safe and reliable systems.

Collaboration and Data Sharing

All stakeholders should work together to share data and insights about self-driving EVs and their performance on the road. This could include the creation of industry-wide safety standards, the development of shared databases and research projects, and regular dialogue and collaboration between policymakers, manufacturers, insurers, and consumer advocates.
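
The NHTSA Standing General Order cited earlier is one existing template for this kind of sharing: manufacturers must already report qualifying ADAS crashes. A hypothetical minimal record for an industry-wide database might look like the following; all field names are invented for illustration.

```python
# Hypothetical minimal schema for an industry-wide ADAS crash report.
# Field names are invented; NHTSA's Standing General Order defines the actual
# reporting requirements referenced earlier in this post.
from dataclasses import dataclass

@dataclass
class CrashReport:
    manufacturer: str
    sae_level: int            # automation level engaged at the time
    system_engaged: bool
    road_type: str            # e.g., "highway", "urban"
    weather: str              # e.g., "clear", "rain", "fog"
    injuries: int
    data_log_available: bool  # whether telemetry was preserved for investigators

report = CrashReport("ExampleMotors", 2, True, "highway", "clear", 0, True)
print(report)
```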

Ethical Considerations

As self-driving EVs become more advanced and capable of making decisions that impact human life, it is important to consider the ethical implications of this technology. Manufacturers and developers should work to ensure that their systems are designed to prioritize safety and minimize harm, and should be transparent about the values and priorities that are embedded in their algorithms.

The Future of Self-Driving EVs and Liability

As self-driving technology continues to advance, the liability landscape is likely to evolve with it. As noted above, some experts believe that the arrival of fully autonomous (Level 5) vehicles could shift liability entirely to the manufacturer or developer of the system, simplifying the claims process and reducing the need for complex legal battles.

However, others caution that even with fully autonomous vehicles, there will still be a need for human oversight and responsibility. For example, if a self-driving EV is not properly maintained or updated, the owner or operator of the vehicle could potentially be held liable for any resulting accidents or injuries.

Ultimately, the future of self-driving EVs and liability will depend on a complex interplay of technological, legal, and societal factors. As the technology continues to evolve, it will be important for all stakeholders to remain engaged and work together to ensure that self-driving EVs are developed and deployed in a safe, responsible, and ethical manner.

Conclusion

The question of who is responsible when self-driving EVs get into accidents is a complex and evolving one. As the technology continues to advance and become more common on our roads, it will be important for all stakeholders to work together to ensure the safe and responsible deployment of these vehicles.

The Rafaela Vasquez case serves as a stark reminder of the high stakes involved in this issue, and the need for clear and consistent regulations, education and training for drivers, transparency and accountability from manufacturers, and collaboration and data sharing among all stakeholders.

The overarching goal should be to harness the potential of self-driving EVs to make our roads safer and more efficient, while also ensuring that the technology is developed and deployed in a responsible and ethical manner. By working together and staying focused on this goal, we can help ensure a future in which self-driving EVs are a safe and valuable part of our transportation system.

As a Digital Technology Expert, I believe that the development of self-driving EVs represents one of the most exciting and transformative opportunities of our time. However, I also recognize that this technology raises complex questions and challenges that will require ongoing collaboration, research, and dialogue to address.

By staying engaged and working together, we can help to shape the future of self-driving EVs in a way that maximizes their benefits and minimizes their risks. This will require a commitment to transparency, accountability, and ethical considerations from all stakeholders, as well as a willingness to adapt and evolve as the technology continues to advance.

The success of self-driving EVs will ultimately depend on our ability to build trust and confidence in the technology among drivers, passengers, and the broader public. By prioritizing safety, responsibility, and collaboration at every step of the development and deployment process, we can help to ensure that self-driving EVs live up to their promise and become a valued and trusted part of our transportation system for years to come.