https://www.theverge.com/2018/3/30/1...ilot-statement
https://www.tesla.com/blog/update-last-week’s-accident
I get that Tesla has a creative marketing team that can make shit sound better than it is, but when it comes to this, I really wish they’d just stick to the truth without trying to twist the narrative. Just 2 sentences with vague details, and 6 paragraphs on how safe they think AP is.
So, reading through all that, we know the following about the accident.
- Autopilot was on and drove straight into the barrier.
- At some point prior to the accident (no reference to when), the driver got hands-off warnings. If you’ve ever driven on AP, you’ll know the hand detection randomly goes off even with hands on the wheel, because it detects hands by turning force (torque) on the wheel, not by touch.
- The car detected hands off the wheel 6 seconds prior to the accident. That left roughly 5 seconds to see AP barreling towards the barrier, and the driver did nothing.
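To see why torque-based hand detection nags you even when your hands are on the wheel, here’s a minimal sketch of the scheme described above. The threshold and sample values are made up for illustration, not Tesla’s actual numbers:

```python
# Hypothetical torque-threshold hand detection. A driver holding the wheel
# perfectly still applies almost no torque, so this scheme reads as
# "hands off" even with both hands on the wheel.

def hands_detected(torque_samples_nm, threshold_nm=0.1):
    """Return True if any steering-torque sample exceeds the threshold."""
    return any(abs(t) > threshold_nm for t in torque_samples_nm)

# Relaxed grip, no steering corrections: reads as hands-off
print(hands_detected([0.02, -0.01, 0.03]))   # False
# A small tug on the wheel registers as hands-on
print(hands_detected([0.02, 0.15, -0.01]))   # True
```

That’s why the usual advice is to keep light resistance on the wheel rather than just resting your hands on it.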
So, most likely the driver wasn’t paying attention, and having used these AP systems, I can tell you I’ve been guilty of that too. They give you a false sense of security. My theory that the car followed a line into the false lane and straight into the barrier is probably correct. It could’ve been just weird lighting that made AP “see” the lines differently and behave differently.

Finally, this goes to show why we actually need LIDAR. Emergency braking did nothing, and this is by design on radar-based systems, whether it’s Tesla, Mercedes, Volvo, or any other manufacturer. Radar can’t reliably pick out stationary objects without a ton of false positives, rendering it useless for them. Optical recognition needs mega GPU power, and we’re not even close to being there yet, hence LIDAR as a stopgap. Because of this limitation and Tesla’s reluctance to use LIDAR, there are a ton of Teslas in the news running into stopped objects.
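The stationary-object problem comes down to the fact that, to a radar, a concrete barrier has the same closing-speed signature as every overpass, sign gantry, and roadside object. A hedged sketch of that filtering logic (the names and the 1 m/s tolerance are illustrative, not any manufacturer’s actual code):

```python
# Illustrative radar threat filter. range_rate is the target's radial
# velocity relative to the car (negative = range shrinking, i.e. closing).

def is_threat(ego_speed_mps, target_range_rate_mps):
    """Classify a radar return as a braking-relevant threat.

    A stationary object closes at exactly ego speed -- identical to the
    clutter from bridges, signs, and parked cars -- so returns near that
    value get discarded to avoid constant phantom braking.
    """
    closing = -target_range_rate_mps          # positive = getting closer
    stationary_like = abs(closing - ego_speed_mps) < 1.0
    return closing > 0 and not stationary_like

ego = 30.0  # ~108 km/h
# Slower car ahead, closing at 10 m/s: a moving target, brake for it
print(is_threat(ego, -10.0))  # True
# Barrier dead ahead, closing at exactly ego speed: filtered as clutter
print(is_threat(ego, -30.0))  # False
```

The second call is the crash scenario in a nutshell: the one return that mattered looks exactly like the thousands that don’t.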