Tesla is under investigation following several accidents in which its cars crashed into stopped emergency vehicles. The probe focuses on the “Autopilot” feature included in Tesla vehicles, which was allegedly enabled in each of the accidents covered by the investigation.
To date, there are seven accidents being investigated. These accidents account for 17 injuries and one death, according to the National Highway Traffic Safety Administration (NHTSA).
The accidents in question span nine states and more than three years, and the NHTSA reports that Tesla’s Autopilot was turned on in each case. Autopilot has been a potentially lethal and widely misunderstood aspect of Teslas before: a different agency found Tesla partially responsible for a fatal 2018 crash in Florida involving the feature.
The problem with Autopilot, and even with Tesla’s cruise control (which can sense and adjust for obstacles), is that emergency vehicles and the warning indicators that accompany them are not easily recognized by the software, so those features may fail to trigger braking or swerving where a human operator would.
The solution is simple: even while using Autopilot, Tesla drivers (and, ideally, every other driver on the road) need to keep their eyes on the road and their hands on the wheel.
Tesla has made this clear as well, advising drivers that “current Autopilot features require active driver supervision and do not make the vehicle autonomous.” NHTSA released a similar statement explaining that “Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles.”
Practicing attentive driving seems like the kind of thing that should go without saying, but the language Tesla uses to release itself from liability does raise some questions. Analyst Gordon Johnson cites Tesla’s disclaimer for Autopilot users, “Tesla drivers accept Autopilot’s risks,” and argues that it is problematic because other people on the road never accepted those risks.
For now, the investigation is ongoing. It’s worth noting that Tesla’s stock took a bit of a hit when the investigation was announced, but the company is used to controversy at this point.
Jack Lloyd has a BA in Creative Writing from Forest Grove's Pacific University; he spends his writing days using his degree to pursue semicolons, freelance writing and editing, Oxford commas, and enough coffee to kill a bear. His infatuation with rain is matched only by his dry sense of humor.