Increases in the use of autonomous car technologies (e.g., advanced driver-assistance systems) are causing incremental shifts in the responsibility for driving.
Existing liability laws may need to evolve in order to fairly identify the parties responsible for damage and injury, and to address the potential for conflicts of interest among human occupants, system operators, insurers, and the public purse.[3]
When some degree of shared control is possible (Level 3 or 4), there is a concern that the vehicle might hand control back to the driver in the final seconds before a collision, passing responsibility and liability back as well. In those circumstances the human has no better prospect of avoiding the crash than the vehicle: the driver has not necessarily been paying close attention, and a situation too difficult for the automated system may well be too difficult for a person.
In the case of autonomous cars, negligence would most likely fall on the manufacturer, because it would be hard to establish a breach of the duty of care by a user who is not in control of the vehicle. Under the Nevada Supreme Court's ruling in Ford v. Trejo, the plaintiff needs to prove that the manufacturer failed the consumer-expectation test.[10] Meanwhile, research from the Insurance Institute for Highway Safety (IIHS) shows that advanced driver-assistance systems, seen as stepping stones toward Level 3 and 4 autonomy, have helped reduce collisions by employing forward-collision warnings and automatic braking.
The consumer-expectation test provides: "A product is defective in design or formulation when it is more dangerous than an ordinary consumer would expect when used in an intended or reasonably foreseeable manner."[14] In defense against such liability, autonomous vehicle manufacturers could raise the arguments of comparative negligence, product misuse, and state of the art.
For manufacturers and developers of autonomous technology, liability exposures arise from the collection and storage of data and personal information in the vehicle and the cloud.[26] When Mercedes launched its Drive Pilot in Germany in mid-2021, it was expected that Daimler would have to assume insurance liability, depending on the jurisdiction.
In the report “Autonomous Vehicle Technology” by the RAND Corporation, the authors recommend that policymakers consider approaches such as tort preemption, a federal insurance backstop, and long-term cost-benefit analysis of the legal standard for reasonableness.
In September 2016, the National Highway Traffic Safety Administration released a policy report to accelerate the adoption of autonomous car technology (highly automated vehicles, or HAVs) and provide guidelines for an initial regulatory framework.[33][34] More broadly, any software with access to the real world, including autonomous vehicles and robots, can cause property damage, injury, and death.
In 2018, University of Brighton researcher John Kingston analyzed three legal theories of criminal liability that could apply to an entity controlled by artificial intelligence.[35][36] Possible defenses include unexpected malfunction or infection with malware; the latter defense has been used successfully in the United Kingdom in a case involving a denial-of-service attack.
Tesla's Autopilot, for example, gives cars only limited autonomy, and human drivers are expected to maintain situational awareness and take over as needed.[38] Volvo has announced that it will pay for any injuries or damage caused by its fully autonomous car, which it expected to start selling in 2020.[38] Starting in 2012, a number of U.S. states have passed laws or regulations specifically addressing autonomous car testing, certification, and sales, with some issuing special driver's licenses; this remains an active area of lawmaking.
University of South Carolina law professor Bryant Walker Smith points out that with automated systems, considerably more data will typically be available than with human-driver crashes, allowing more reliable and detailed assessment of liability.
Arizona Governor Doug Ducey's new rules, implemented March 1, 2018, lay out a specific list of licensing and registration requirements for autonomous car operators.
This reflects the view that personal liability will decline as responsibility for driving shifts to the vehicle and as mobility on demand takes greater hold, and that the overall pool of losses covered by liability policies will shrink as autonomous cars cause fewer collisions.[19]
"[19] In an IEEE article, the senior technical leader for safety and driver support technologies at Volvo echoed a similar sentiment saying, “if we made a mistake in designing the brakes or writing the software, it is not reasonable to put the liability on the customer...we say to the customer, you can spend time on something else, we take responsibility.”[44] Starting in September 2023 in the United States, Mercedes-Benz takes liability for the Level 3 Drive Pilot as long as the "user operates Drive Pilot as designed,"[45] but "the driver must be ready to take control of the vehicle at all times when prompted to intervene by the vehicle.
"[46] In August 2023 a General Motors Cruise self-driving car drove into wet concrete in a road construction zone in San Francisco, California, and got stuck.