–by Aya Hoffman
Abstract: Just a day after the National Highway Traffic Safety Administration (“NHTSA”) closed its investigation into Tesla Motors’ autopilot technology, Chinese media reported another fatal collision involving a Model S sedan. Currently, victims face significant challenges in holding automakers liable for accidents involving assistive technology, but the market may be moving toward a strict liability standard as true self-driving cars become more widely available.
Although the NHTSA ended its investigation into Tesla Motors’ autopilot system without requiring a recall or fine, questions about the safety of such systems remain. Tesla, based in Palo Alto, California, manufactures an all-electric line of vehicles, including the Tesla Roadster, Model S, Model X, and Model 3. When it was introduced in 2015, Tesla’s autopilot system was the first of its kind in consumer vehicles.
The NHTSA investigation was initiated after a fatal crash in May 2016 involving a Tesla Model S sedan. Joshua Brown was killed after the autopilot system installed in his vehicle failed to recognize a turning tractor-trailer. On January 19, 2017, the NHTSA reported that it found no defect in the system at the time of the crash. The agency noted that Tesla’s autopilot system was designed to prevent rear-end collisions and still required a driver’s full attention while in operation. Brown’s family hired a law firm with experience in product defect litigation to conduct its own investigation into the crash.
However, Tesla came under renewed scrutiny the following day, when another fatal collision involving a Model S was reported in China. Gao Yaning was killed when his vehicle struck the rear of a road-sweeping truck. In-car video suggests that the brakes were not applied before impact, but it is unclear whether the autopilot system was activated.
If the families of Brown and Gao file suit against Tesla, they will face significant challenges. Despite the modern technology at issue, the available legal theories for product liability and accident compensation claims are traditional: strict liability, negligence, design defect, failure to warn, and breach of warranty. However, Tesla requires buyers to consent to contract terms that require drivers to keep their hands on the steering wheel at all times, including when the autopilot system is engaged. And technically, Tesla’s current autopilot technology is not “self-driving.” Although it can steer a car in traffic and make passing maneuvers, it is not connected to a navigation system and requires an alert and responsible human driver.
Interestingly, as truly autonomous vehicles enter the mainstream, it may become easier for drivers to hold car manufacturers liable for accidents involving self-driving technology. Currently, carmakers are not liable for most accidents, which are attributed to driver behavior. While fully autonomous vehicles are still in development, some carmakers, including Volvo, Google, and Mercedes-Benz, have already pledged to accept strict liability for resulting accidents. Although it may seem counterintuitive, these companies are betting that advanced safety programming will significantly decrease the rate of accidents. Of course, the costs of liability will be passed on to consumers by way of increased car prices. However, some legal scholars suggest this increase may be offset by a decrease in the cost of insurance premiums for self-driving vehicles.
As is common with emerging technologies, early adopters face the most risk. It may be more difficult to hold carmakers liable for accidents during this transitional period than after fully self-driving cars are established in the market. The problem is compounded by the fact that car dealers may be uninformed about the technologies inside the vehicles they sell. In the spring of 2016, researchers from the Massachusetts Institute of Technology’s AgeLab conducted interviews at car dealerships in the Boston area. The researchers went to the dealerships undercover and asked salespeople questions about common automated driver assistance features, including adaptive cruise control, blind spot monitoring, and collision avoidance. Of the eighteen salespeople interviewed, only six provided “thorough” explanations of the technologies. According to the researchers, four salespeople gave “poor” explanations and two provided incorrect information that was potentially dangerous. Although this was a small study, it reinforces the need for consumers to educate themselves on the operating requirements and limitations of the technologies installed in their vehicles.
Even in cars equipped with advanced technology, old-school methods still offer drivers the best protection against potentially fatal accidents: education and constant vigilance.