We’ve been following the spate of Tesla Full Self Driving (FSD) and Autopilot crashes with some interest and more than a little ire.
We believe that driver-assistance systems make driving safer and less fatiguing, but it’s clear that FSD and Autopilot are being represented to the public, and taken by the public, as far more capable than they actually are, and the sheer number of crashes involving Teslas using (and drivers misusing) these systems bears that out.
We’ve decided to consolidate all of these stories into one place, because, speaking frankly, if we gave every FSD- or Autopilot-related incident its own post, we’d have to rename our publication from MotorTrend to TeslaFSDAutopilotCrashTrend.
Tesla fans, before you send angry emails, keep this in mind: In June ’21, NHTSA ordered automakers to report crashes involving Level 2 advanced driver assistance systems (ADAS) like FSD. (Yep, that’s right: of the six SAE-defined levels, ranging from 0, no automation, to 5, full automation, Tesla’s so-called "Full Self Driving" is only Level 2. It therefore requires constant driver monitoring, and its capabilities are largely limited to acceleration, braking, and steering on certain marked roads.)