Widow of the first known "Full Self-Driving" fatality: We were sold a false sense of security
By bellecarter // 2024-02-18
 
A Tesla employee and devoted fan of Big Tech mogul Elon Musk may have been the first person killed by the company's driver-assistance software, "Full Self-Driving" (FSD), after his Tesla Model 3 barreled into a tree and exploded in flames. In 2022, Hans von Ohain was on his way to play golf with his friend Erik Rossiter when the electric car swerved off Upper Bear Creek Road just west of Denver. According to Rossiter, the self-driving mode had struggled to navigate the mountain curves on the drive out, forcing von Ohain to repeatedly yank it back on course. On the way home, the car "just ran straight off the road," Rossiter told emergency responders, according to the 911 dispatch recording.

The two had been drinking, and an autopsy found that von Ohain died with a blood alcohol level of 0.26, more than three times the legal limit and a level of intoxication that, experts said, would have hampered his ability to maintain control of the car. Still, an investigation by the Colorado State Patrol went beyond drunken driving, seeking to understand what role the Tesla software may have played in the crash.

Rossiter, who was found to have a similar blood alcohol level, recalled jumping out of the car and trying to pull his friend free, but the driver's door was blocked by a fallen tree. As he yelled for help on the deserted mountain road, he could hear his friend screaming inside the burning car. Colorado State Patrol Sgt. Robert Madden, who oversaw the investigation, said it was one of "the most intense" vehicle fires he had ever seen, fueled by thousands of lithium-ion battery cells in the car's undercarriage. Von Ohain's cause of death was listed as "smoke inhalation and thermal injuries"; Madden said he probably would have survived the impact alone. At the scene of the crash, Madden also found "rolling tire marks," meaning the motor continued to feed power to the wheels after impact.
There were no skid marks, he added, indicating that the driver apparently never hit the brakes.

Von Ohain's widow, Nora Bass, said she has been unable to find a lawyer willing to take the case to court because her husband was legally intoxicated. Nonetheless, she said, Tesla should take at least some responsibility for his death. "Regardless of how drunk Hans was, Musk has claimed that this car can drive itself and is essentially better than a human," she said. "We were sold a false sense of security."

Her husband used FSD nearly every time he drove, Bass added, placing him among legions of Tesla boosters heeding Musk's call to generate data and build the technology's mastery, even though the cars are far from actually being able to drive themselves. While Bass refused to use the feature herself, von Ohain was so confident in all it promised that he even used it with their baby in the car. "It was jerky, but we were like, that comes with the territory of new technology," Bass said. "We knew the technology had to learn, and we were willing to be part of that."

Tesla's software showing erratic behavior

Tesla owners have long complained of erratic behavior by the software, including sudden braking, missed road markings and crashes with parked emergency vehicles. Since federal regulators began requiring automakers to report crashes involving driver-assistance systems in 2021, they have logged more than 900 incidents, including at least 40 that resulted in serious or fatal injuries. (Related: U.S. government to probe Tesla after receiving 2,400 complaints of drivers LOSING STEERING CONTROL.)

The electric car manufacturer has released FSD to about 400,000 customers and has acknowledged that the software is in "beta" mode, meaning it is still in development. But the automaker argues that its public release is an essential step toward reducing America's 40,000 annual road deaths. "The more automation technology offered to support the driver, the safer the driver and other road users," Tesla claimed in a short statement on X, formerly Twitter. Tesla user manuals cite a long list of conditions under which FSD may not function properly, including narrow roads with oncoming cars and curvy roads. According to the EV maker, drivers must remain in control of their cars, and Tesla is not liable for distracted or drunken driving.

Meanwhile, multiple lawsuits have begun challenging the view that drivers are solely responsible when Tesla's software allegedly causes crashes or fails to prevent them, but Tesla has so far prevailed. Just last fall, a California jury found Tesla not liable for a 2019 "Autopilot" crash in which survivors said the car suddenly veered off the road. At least nine more cases are expected to go to trial this year.

For years, Musk has preached the benefits of pursuing autonomous driving. In 2019, he predicted that it would one day be so reliable that drivers "could go to sleep," although Tesla's user agreement requires the driver to stay engaged and ready to take over from FSD at all times.
"We test as much as possible in simulation and with [quality assurance] drivers, but reality is vastly more complex," Musk tweeted last spring about a new version of the software. Tesla employees would get it first, he said, with wider release to come "as confidence grows."

Visit RoboCars.news for related stories about self-driving vehicles.

More related stories:

Tesla to expand battery plant in Nevada using equipment and tech from CCP-linked company.

Car rental company SIXT drops Tesla EVs from its fleet due to poor resale value, high repair costs.

Tesla factory robot reportedly ATTACKS worker in violent malfunction that left "trail of blood."

Sources include:

SFGate.com

Futurism.com

Brighteon.com