Blame-The-Driver Defense Of Tesla Will Be Tested In The Next Autopilot Trial

Six weeks before the first fatal U.S. crash involving Tesla’s Autopilot in 2016, the company’s then-president, Jon McNeill, tried out the system in a Model X. He emailed his feedback to Sterling Anderson, the head of automated driving, copying Elon Musk.

According to McNeill, the system operated flawlessly and with the fluidity of a human driver.

“I got so comfortable under Autopilot, that I ended up blowing by exits because I was immersed in emails or calls (I know, I know, not a recommended use),” he wrote in the email dated March 25 that year.

McNeill’s email, which has not previously been made public, is now being used in a fresh legal attack on Tesla over Autopilot.

According to previously unreported deposition transcripts seen by the media, plaintiffs’ attorneys in a California wrongful-death lawsuit cited the message while questioning a Tesla witness about whether the company knew that drivers would stop watching the road when using its driver-assistance system.

On public roads, Autopilot can steer, accelerate and brake on its own, but it cannot fully replace a human driver, particularly in urban areas. Tesla’s own description of the technology states that the system is not autonomous and still requires a “fully attentive driver” who is ready to “take over at any moment”.

The lawsuit, scheduled for trial in San Jose the week of March 18, concerns a fatal crash that occurred in March 2018. It follows two earlier Autopilot trials in California that Tesla won by arguing that the drivers involved had ignored the system’s warnings to stay attentive.

This time, plaintiffs’ lawyers in the San Jose case are drawing on deposition testimony from Tesla witnesses indicating that the automaker never studied how quickly and effectively drivers could take control if Autopilot unintentionally steered towards an obstacle.

One witness testified that Tesla had considered a camera-based system to monitor drivers’ attentiveness for roughly three years, but did not adopt one until 2021.

Such a system tracks a driver’s movements and warns them if they take their eyes off the road ahead.

The lawsuit concerns the death of Apple engineer Walter Huang in a crash near San Francisco. Tesla contends that Huang misused the system because he was playing a video game just before the accident.

Attorneys for Huang’s family are pressing Tesla on whether the automaker understood that drivers, including its own then-president, McNeill, probably would not or could not use the system as instructed, and what precautions the company took to protect them.

Experts in autonomous-vehicle law say the case presents the stiffest test yet of Tesla’s insistence that Autopilot is safe, provided drivers do their part.

According to Matthew Wansley, an associate professor at Cardozo Law School who has worked in the automated vehicle sector, Tesla’s understanding of typical driver behaviour may be crucial from a legal standpoint.

“If it was reasonably foreseeable to Tesla that someone would misuse the system, Tesla had an obligation to design the system in a way that prevented foreseeable misuse,” he stated.

Professor Richard Cupp of Pepperdine Law School suggested that Tesla may potentially undercut the plaintiffs’ plan by claiming that Huang purposefully abused Autopilot.

A victory for the plaintiffs, however, could provide a template for future Autopilot lawsuits. With at least a dozen such suits pending, eight of them involving fatalities, Tesla could face substantial financial awards.

Musk, Tesla and its attorneys, McNeill and Anderson did not comment. McNeill and Anderson have both left Tesla; McNeill sits on the boards of General Motors and its autonomous-vehicle unit, Cruise, while Anderson is a co-founder of the autonomous-technology company Aurora.

Huang’s crash is one of hundreds in the United States where reports to vehicle safety officials suggested Autopilot may have played a role.

The U.S. National Highway Traffic Safety Administration (NHTSA) has examined at least 956 crashes in which Autopilot was initially reported to have been in use. Separately, the agency has opened more than 40 investigations into accidents involving Tesla automated-driving systems in which 23 people died.

Amid NHTSA scrutiny, Tesla in December recalled more than 2 million vehicles equipped with Autopilot to add driver alerts. The fix was delivered through a remote software update.

Huang’s family alleges that Autopilot steered his 2017 Model X into a highway barrier.

Tesla argues that Huang failed to stay alert enough to take control of the vehicle. “There is no dispute that, had he been paying attention to the road he would have had the opportunity to avoid this crash,” Tesla said in a court filing.

A Santa Clara Superior Court judge has yet to decide what evidence jurors will hear.

Tesla also faces a federal criminal investigation, first reported by Reuters in 2022, into its self-driving claims. The company disclosed in October that it had received subpoenas related to its driver-assistance systems.

Despite marketing features named Autopilot and Full Self-Driving, Tesla has yet to fulfil Elon Musk’s ambition of producing self-driving cars that require no human assistance.

According to Tesla, Autopilot can keep a car within its highway lane and match the speed of surrounding traffic. The $6,000 step-up “enhanced” Autopilot adds automated lane changes, highway ramp navigation and self-parking. The $12,000 Full Self-Driving option adds stop-light recognition and other automated functions for city streets.

Citing the McNeill email, plaintiffs’ attorneys in the Huang lawsuit are challenging Tesla’s contention that drivers can make a quick transition back to driving if Autopilot makes a mistake.

According to Bryant Walker Smith, a professor at the University of South Carolina who specialises in the law of autonomous vehicles, the email illustrates how drivers can grow complacent while using the technology and stop watching the road. The message from Tesla’s former president, he said, “corroborates that Tesla recognises that irresponsible driving behaviour and inattentive driving is even more tempting in its vehicles”.

According to a transcript, Huang family lawyer Andrew McDevitt read portions of the email aloud during a deposition. Reuters was unable to obtain the full contents of McNeill’s note.

In examining what Tesla knew about driver behaviour, the plaintiffs’ lawyers also pointed to Musk’s public comments. At a press conference following a fatal 2016 crash, Musk said drivers struggle to stay attentive once they have used the system extensively.

“Autopilot accidents are far more likely for expert users,” he said. “It is not the neophytes.”

A 2017 Tesla safety analysis, a company document entered into evidence in an earlier case, made clear that the system relies on prompt driver responses. According to the document, which plaintiffs cited in one of the trials Tesla won, Autopilot may make an “unexpected steering input” at high speed, potentially causing the car to make a dangerous manoeuvre. Averting such an error assumes that the driver “is ready to take over control and can quickly apply the brake”.

In depositions, neither a Tesla employee nor an expert witness the company engaged could recall any prior studies the automaker had done on drivers’ abilities to take control in the event that Autopilot failed.

“I’m not aware of any research specifically,” stated the employee, who Tesla had selected as the most appropriate witness to discuss Autopilot.

The automaker said the employee’s name was legally protected information and redacted it from the depositions.

McDevitt asked Christopher Monk, Tesla’s expert witness, whether he could name any specialists in human interaction with automated systems whom Tesla had consulted while developing Autopilot.

“I cannot,” said Monk, who studies driver distraction and previously worked for the NHTSA, the depositions show.

Monk did not comment.

It was not possible to independently determine whether Tesla has, since March 2018, researched how quickly drivers can regain control, or how effective the camera-based monitoring it installed in 2021 has been.

The National Transportation Safety Board (NTSB), which has investigated five Autopilot-related crashes, has since 2017 repeatedly advised Tesla to improve the driver-monitoring systems in its vehicles, without specifying how.

In its report on the Huang accident, the agency—which carries out safety investigations and studies but is not authorised to force recalls—came to the following conclusion: “The Tesla vehicle’s ineffective monitoring of driver engagement, which facilitated the driver’s complacency and inattentiveness, contributed to the crash.”

Musk said in 2016 that some drivers were ignoring as many as ten alerts per hour reminding them to keep their hands on the wheel.

The Tesla employee testified that, before Huang’s fatal crash, the company had considered using cameras to monitor drivers’ attentiveness but did not introduce such a system until May 2021.

In public remarks, Musk has consistently resisted calls for more sophisticated driver monitoring, arguing that his cars will soon be fully autonomous and safer than human-driven vehicles.

“The system is improving so much, so fast, that this is going to be a moot point very soon,” he said in 2019 on a podcast with artificial-intelligence researcher Lex Fridman. “I’d be shocked if it’s not by next year, at the latest … that having a human intervene will decrease safety.”

Tesla has now acknowledged that its vehicles need stronger safeguards. In announcing the December recall of cars equipped with Autopilot, it cited concerns that its driver-monitoring systems may be insufficient and said it would add alerts to help drivers “adhere to their continuous driving responsibility”.

Kelly Funkhouser, associate director of automotive technology at Consumer Reports, a U.S. product-testing organisation, said the recall did not fully resolve the problem. The group’s road tests of two Tesla vehicles after the software update found that the system still failed in numerous ways to address the safety concerns that prompted the recall.

“Autopilot usually does a good job,” Funkhouser said. “It rarely fails, but it does fail.”

(Adapted from Reuters.com)



Categories: Economy & Finance, Entrepreneurship, Regulations & Legal, Strategy, Uncategorized

Leave a comment

This site uses Akismet to reduce spam. Learn how your comment data is processed.