- cross-posted to:
- [email protected]
Tesla knew Autopilot caused death, but didn’t fix it::Software’s alleged inability to handle cross traffic central to court battle after two road deaths
Calling it Autopilot was always a marketing decision. It’s a driver assistance feature, nothing more. When used “as intended”, it works great. I drove for 14 hours during a road trip using AP and arrived not dead tired and still alert. That’s awesome, and would never have happened in a conventional car.
I have the “FSD” beta right now. It has potential, but I still always keep a hand on the wheel and am in control of my car.
At the end of the day, if the car makes a poor choice because of the automation, I’m still responsible as the driver, and I don’t want an accident, injury, or death on my conscience.
Tesla has been producing advertising videos (as early as 2016) saying “our cars drive themselves, you don’t need a driver”. What you said is worthless when Tesla itself is marketing its cars like that.
Please link that. I’d like to see it.
https://www.theguardian.com/technology/2023/jan/17/tesla-self-driving-video-staged-testimony-senior-engineer
this is the advertisement: https://www.tesla.com/videos/full-self-driving-hardware-all-tesla-cars
It literally says: “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”
And the best part is that the video was faked.
Cool, ty. FSD is still in beta for now. Maybe when it’s final. Fingers crossed.