
Tesla Faces Lawsuit over Fatal Model X Autopilot Crash Blamed on 'Dozing Driver'


The surviving family members of a 44-year-old Japanese man who was struck and killed by one of the company's electric cars in April 2018 are suing Tesla.

Documents filed in a federal court in San Jose on Tuesday described the case of Yoshihiro Umeda, 44, as the first "Tesla Autopilot-related death involving a pedestrian" and alleged that the incident demonstrated a "patent defect" in Tesla's technology.

Tesla, led by Elon Musk, has its headquarters in Palo Alto, California, and sells electric cars equipped with automated driver-assistance systems. Bloomberg first reported the court filing, lodged by the victim's wife, Tomomi Umeda, and daughter, Miyu Umeda.

Yoshihiro Umeda was killed on 29 April 2018, after being hit by a Tesla Model X that "suddenly accelerated" when a car switched lanes in front of it, the filing says.

It adds that the Tesla vehicle crashed into a van, motorcycles and pedestrians that had stopped after an earlier accident at the side of an expressway near Tokyo.

The documents alleged that the Tesla car, which had its Traffic-Aware Cruise Control (TACC) feature engaged, sped up after the car ahead changed lanes, "rapidly accelerating from around 15 km/h to about 38 km/h" before hitting the motorcycles and Umeda.

The driver was "found to be dozing shortly before the crash," the filings note, adding that the plaintiffs expect the company to "lay all the blame" on that individual.

"If Tesla's past behavior of blaming the drivers of its vehicles is any example, Tesla will probably portray this accident as the sole result of a drowsy, inattentive driver in order to distract from the obvious shortcomings of its automated ... technology," the filing states.

"Any such effort would be groundless. Mr. Umeda's tragic death would have been avoided, but for the substantial flaws in Tesla's Autopilot system and technology suite." According to the documents, the incident showed a "patent flaw": because the driver's hands were on the wheel, the car did not alert him despite his drowsiness. The plaintiffs also alleged that the system failed to recognize the pedestrians and engage the braking system.

"Tesla failed to develop reasonable measures and safeguards against the dangers presented to drivers by these types of scenarios," the motion asserts. It goes on to describe the Tesla system as "fatally flawed," "half-baked," and "non-market-ready." "Tesla should be held liable for its behavior and acts committed in marketing its vehicles with reckless disregard for motorists and the general public," it states.

Multiple crashes involving Tesla electric cars, some of them fatal, have been investigated by U.S. authorities in recent years, and footage has continued to surface of drivers sleeping inside semi-autonomous cars while speeding along busy U.S. roads. In February, the National Transportation Safety Board (NTSB) found that the limitations of Tesla's Autopilot system, combined with driver distraction, had contributed to a fatal crash.

Tesla says its vehicles are designed to be the "safest cars in the world" and stresses on its website that Autopilot, despite the name, is not a self-driving system.

The features are designed to be updated and improved over time.

"Autopilot is meant for use only with a fully attentive driver who has their hands on the wheel and is ready to take over," the firm explains. "It doesn't turn a Tesla into an autonomous vehicle, and it doesn't allow the driver to relinquish responsibility." The Umeda filing sues Tesla for defective design, failure to warn, negligence and wrongful death. The company was contacted for comment on the case.

An Uber vehicle operating in self-driving mode reportedly killed a woman in Arizona in 2018. Despite a human being behind the wheel, ABC15 reported that it marked the first known fatal crash involving an autonomous vehicle and a pedestrian in the U.S.
