An amusing anecdote: an engineer is out with the family of a man she's dating, the father tries to turn on the Full Self-Driving option of his Tesla, and she's practically clawing her way out of the car.
But on the way back his dad started asking me “you work on self driving cars, yeah?” (I do, I’m a systems engineer and have job-hopped between a handful of autonomy companies.)
He started asking me how I liked his Tesla and I joked “just fine as long as you’re the one driving it!” And he asked me what I thought about FSD which he’d just bought. He asked if he should turn it on. I said “not with me in the car” and he then laughed and asked how I was still so scared when I work with this stuff everyday.
I was like “Uhh it’s because I…” But stopped when he pulled over and literally started turning it on. I was like “I’m not kidding, let me out of the car if you’re gonna do this” and my boyfriend’s dad and brother started laughing at me, and my boyfriend still wasn’t saying anything.
His dad was like “It’ll be fine” and I reached over my boyfriend’s little brother and tried the door handle, which was locked. I was getting mad, probably more so because I was tipsy, and I yelled at him “Let me the fuck out.”
She’s a systems engineer who works on these self-driving cars, and she wants nothing to do with it? Does she know something the rest of us don’t?
Apparently, she does. Tesla has been faking demos of its self-driving cars, which I guess shouldn’t be a surprise to anyone following Elon Musk’s hype parade.
A 2016 video that Tesla (TSLA.O) used to promote its self-driving technology was staged to show capabilities like stopping at a red light and accelerating at a green light that the system did not have, according to testimony by a senior engineer.
The video, which remains archived on Tesla’s website, was released in October 2016 and promoted on Twitter by Chief Executive Elon Musk as evidence that “Tesla drives itself.”
But the Model X was not driving itself with technology Tesla had deployed, Ashok Elluswamy, director of Autopilot software at Tesla, said in the transcript of a July deposition taken as evidence in a lawsuit against Tesla for a 2018 fatal crash involving a former Apple (AAPL.O) engineer.
It’s OK, though, because they were trying to show what was possible, rather than what the car could actually do, even if Musk was claiming the car was driving itself.
“The intent of the video was not to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system,” Elluswamy said, according to a transcript of his testimony seen by Reuters.
Like, the idea of cars driving themselves and bypassing the fallibility of human drivers sounds nice, but it’s clear that the car’s software can be even more stupid and flawed than people. I wouldn’t want to share the road with these things, let alone be in a car controlled by some engineering gadget.
You know what I think would be far more useful? Software that detected when the driver was significantly impaired. You’re weaving all over the road, or you’re exceeding the speed limit, or it senses that you’re nodding off, and it fires off alarms to let you know you’re not safe, and if you exceed a certain frequency of warnings, it transmits alerts to the police. That would be a smart car, making sure that the driving software in the human’s head was operating adequately.
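The escalation logic described above could be sketched out in a few lines. This is a hypothetical toy, not any real automotive system: the warning signals, the ten-minute look-back window, and the three-strikes escalation threshold are all illustrative assumptions.

```python
from collections import deque

class ImpairmentMonitor:
    """Toy sketch of a driver-impairment warning escalator.

    Thresholds and signal names are assumptions for illustration only.
    """

    def __init__(self, window_s=600.0, escalation_limit=3):
        self.window_s = window_s                  # look-back window, seconds
        self.escalation_limit = escalation_limit  # warnings before escalating
        self.warnings = deque()                   # timestamps of recent warnings

    def record(self, t, weaving=False, speeding=False, drowsy=False):
        """Check impairment signals at time t.

        Returns "ok", "warn" (fire in-car alarms), or "escalate"
        (notify authorities) once warnings pile up within the window.
        """
        if not (weaving or speeding or drowsy):
            return "ok"
        self.warnings.append(t)
        # Forget warnings older than the look-back window.
        while self.warnings and t - self.warnings[0] > self.window_s:
            self.warnings.popleft()
        if len(self.warnings) >= self.escalation_limit:
            return "escalate"
        return "warn"
```

So a single wobble gets you an alarm, but three warnings inside ten minutes and the car decides your onboard wetware is no longer fit to drive.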
Knowing humans, though, there’d be a huge aftermarket in mechanics ripping out the safety measures.