Tesla Whistleblower Says ‘Autopilot’ System Is Not Safe Enough To Be Used On Public Roads


Image: Tesla

A former Tesla employee in Norway has come forward as skeptical of the company’s ethically questionable practices with regard to its driver assistance software packages. In an interview with the BBC, Lukasz Krupski said he was concerned about the readiness of both the company’s software and its hardware for the task of assisted driving. The reams of leaked data Krupski took with him, including customer complaints about Tesla’s braking and so-called Full Self Driving package, seem to support his skepticism.

Krupski handed the data (100 gigabytes’ worth) off to the German business newspaper Handelsblatt in May, saying his attempts to raise his concerns with higher-ups internally had been roundly ignored. He claims he found evidence in the company’s internal data suggesting that Tesla had not followed safety protocols for its driver assist technology. He pointed to several documented instances of the well-known Tesla ‘phantom braking’ phenomenon, which I have experienced myself. It’s very unnerving and downright dangerous.

Tesla claims that its driver assistance software, which it misleadingly calls “Autopilot,” averaged one airbag deployment crash for every five million miles driven in 2022. That compares to an airbag deployment for every 1.5 million miles driven in Teslas without “Autopilot” engaged in the same period. The overall U.S. driver average is an airbag deployment for every 600,000 miles driven. Tesla’s quoted figures have not been independently verified.

According to an interview with the New York Times, Krupski was reprimanded and ultimately fired for taking photos of unsafe practices in his workplace, including a rolling table rated for a maximum load of 500 kilograms being used to hold a battery pack, which weighs much more, as it was removed from a car. His bosses claimed that by taking the photographs inside a Tesla facility he had violated company policy.

The data leaked by Krupski included lists of Tesla employees, often featuring their social security numbers, in addition to thousands of accident reports, and internal Tesla communications. Handelsblatt and others have used these internal memos and emails as the basis for stories on the dangers of Autopilot and the reasons for the three-year delay in Cybertruck deliveries. From NYT:

Mr. Krupski said he had gotten access to sensitive data simply by entering search terms in an internal company website, raising questions about how Tesla protected the privacy of thousands of employees and its own secrets.

Mr. Krupski has informed Tesla that he intends to sue for compensation, but he’s dead broke and can’t afford to bring the suit. He’s currently working with a lawyer in Norway who is offering his services free of charge while they try to raise the funds.


