Tesla owner says car's self-driving mode didn't see train before crash
Category: News & Politics
Via: perrie-halpern • 6 months ago • 27 comments • By: Ben Goggin
A Tesla vehicle in Full Self-Driving mode appeared to fail to detect a moving train and stop on its own, leading to a chaotic accident depicted in a video that has been viewed millions of times on social media.
The car's owner and driver, Craig Doty II, told NBC News that he takes responsibility for the accident, but he said he also believes that Tesla's Self-Driving technology, or at least as it existed in his vehicle, is a defective product.
"I was the only one in the car. I was the only car in the accident. So yes, it was my fault, it had to be," Doty said. "But I feel it was more that the damn car didn't recognize the train."
"You do get complacent that it knows what it's doing," he said of the Tesla technology. "And usually it's more cautious than I would be as a driver."
The accident occurred on the morning of May 8. Doty, a certified general appraiser in Ohio, was driving at around 60 mph, according to a Tesla crash report. The speed limit on the road was 55 mph, according to Doty and a police report associated with the accident. Drivers can request crash reports from Tesla, which are generated using data individual cars send to Tesla servers. Doty requested a report from the incident and provided it to NBC News, along with video of the crash recorded by the car.
Craig Doty II's damaged Tesla. (Courtesy Craig Doty II)
In the video, the car speeds toward a railroad crossing with a moving train before the car veers suddenly to the right, slamming into a railroad crossing arm and skidding off the road.
According to video from the car and the police report, conditions were foggy, but dashcam video of the crash shows that moving boxcars and the telltale flashing red lights of an active train crossing signal could be seen at least five seconds before the accident.
The accident caused significant damage to the front right side of the car. Pictures taken by Doty show the car's mangled body and the front right wheel twisted up at a sharp angle.
Doty said that his Tesla failed to slow down as it approached the train and that he slammed on the brakes and took over the car's steering manually, directing it off the road to avoid the train.
"I was like there's no way it doesn't see the train," he said. "There's no way it doesn't see the flashing lights. Yes, it was foggy, but you can still see the lights."
Full Self-Driving mode, often shortened to FSD, is Tesla's premium driver assistance option. CEO Elon Musk has promoted it as a crucial part of the company's future. Tesla says parts of the technology are in beta mode, such as "autosteer," meaning they are still being tested. Tesla sells the product to drivers for $8,000 upfront or $99 per month.
FSD requires drivers to keep their hands on the wheel of the vehicle while it steers for them; Tesla refers to the application of the technology as "supervised." On its website, Tesla says, "Your vehicle will be able to drive itself almost anywhere with minimal driver intervention and will continuously improve."
The website says farther down on the same page: "The currently enabled Autopilot and Full Self-Driving features require active driver supervision and do not make the vehicle autonomous. Full autonomy will be dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions."
Tesla did not respond to a request for comment.
FSD's capabilities and its marketing have been the subjects of scrutiny, both in terms of what it is able to do and whether drivers are fully aware that it still requires them to be attentive. In February, Tesla issued a software update to FSD over concerns about issues related to stopping at intersections. Autopilot, which is Tesla's standard driver assistance option, has also been the subject of probes by regulators.
A spokesperson for the National Highway Traffic Safety Administration said the regulator "is aware of this incident and is gathering more information from the manufacturer."
Doty said his hands were on the wheel for the duration of the incident. The Tesla-generated crash report he provided to NBC News also shows that the system recognized that his hands were on the wheel just ahead of the accident.
The Tesla report shows that the car maintained a speed of around 60 mph while in Full Self-Driving mode before Doty slammed on the brakes and turned the wheel.
Doty said he had owned the Tesla since last year and had come to rely on its Full Self-Driving mode, which he said took some of the burden out of his long commutes, estimating that he had driven 20,000 miles with the feature activated.
"It's convenient once you get past the awkwardness of it doing everything for you, doing it most times better than you would or more cautiously," he said.
But he had at least one similar experience in which, he said, FSD appeared to fail.
Doty said the car nearly hit a moving train in November after it approached some tracks after a sharp turn.
He said that the Tesla did not slow down but that he was able to stop, still hitting the crossbar and damaging his windshield. He said he chalked it up to the intersection's coming after a turn. Doty provided documentation of his exchanges with a Tesla insurance claims adjuster at the time that included a detailed description of the incident.
After the second train incident, a police officer took a report. The police report said that the car was in a fully autonomous mode at the time of the crash, though Tesla's Full Self-Driving mode is not rated as fully autonomous. It is currently rated as partially autonomous.
The officer also gave Doty a citation for "failure to control" a vehicle, which comes with a $175 fine. In a hearing Thursday, Doty pleaded no contest to the citation and asked for leniency, given that the car was in Full Self-Driving mode. Doty said the judge agreed to strike the citation if Doty proved by July that the damage to the rail crossing would be fixed and paid for by Doty or his insurance.
"I think Elon Musk can show up and pay the fine," he said. "I understand that I am in control of the vehicle, but I don't go around causing mayhem and getting in wrecks and driving outlandishly out of control."
After the incident, Doty posted the video of the crash on an online Tesla forum, seeking more information about similar incidents. Someone took the video and posted it on X, and video of the crash has since been viewed millions of times.
Doty said a Tesla Collision Center told him Wednesday that the car was totaled. He said the company has not reached out about the video or the incident.
There have been some indications and previous incidents that have suggested shortcomings of Tesla's Full Self-Driving mode, many of which have been chronicled online.
In November 2022, a Tesla in Full Self-Driving mode abruptly stopped on a major freeway, according to The Intercept, causing an eight-car pileup. An NHTSA report published last month found dozens of crashes involving Full Self-Driving and noted the company's December Autopilot-related recall.
"These insufficient controls can lead to foreseeable driver disengagement while driving and avoidable crashes," the NHTSA wrote in its report.
Gate down and lights flashing and Tesla could not detect a train?
meh, bfd. I always wrecked my electric cars into my trains when I was a kid...
That happened a few times, operator error, I was running both
who was running that train? Elwood Blues?
Or his bro Jake.
Maybe caution should be the order of the day.
Too much reliance on automation has led to some serious aviation accidents. The same thing will happen with automobiles, where conditions change more suddenly and reaction times must be much quicker.
I do not care how much hype Tesla and other manufacturers offering the same technology put out. The technology is still not mature enough, and will not be for quite a while, and the consumers using it will pay the price in monetary loss and ultimately injury or death.
And this is irony...
Just because a car can "drive itself" does not remove the responsibility of the person in the driver seat to ensure it is operating properly.
That he got too complacent is his fault and so is this accident.
If you read the article, the driver claims he was at fault and takes responsibility. That being said, the sensors on the vehicle should have at least seen the train and warned the driver in advance.
Thanks, I had read that, but it doesn't change my statement.
So I posted this story since my 90-year-old dad just bought himself one of these, and we keep trying to tell him that he has to be a competent co-pilot. I have to say that seeing this scared me to death. Oh, and I shared it with him today.
Although I'm fully aware that such automation is bound to be the future, the only vehicle anyone will ever succeed in getting me in voluntarily is one with a human driver.
After driving school, I started driving with a 1950 Ford Custom Coach that was entirely and completely manual, absolutely nothing electronic, nothing automatic, nothing digital, nothing even assisted, i.e. TOTALLY BASIC. Never got into trouble with it. The trouble started when the cars I drove were automatic and had advanced bells and whistles. I think there would be a lot less bad things happening on the road if drivers were forced to go back to cars that were totally basic - no automation, no autopilot, no electronics, nothing assisted, nothing digital, just absolutely manual in every regard. It would actually FORCE drivers to focus on their driving and the road and not on their cell phones or the passing scenery. This could all be part of my wanting to be beamed back to the early 1950s.
I hear you. I learned to drive on a 1967 Ford tow truck with a four-speed manual transmission and a Holmes 440 towing rig on the back that was owned by my then brother-in-law's father, who owned a towing business in Los Angeles. Learned to drive on the LA freeways. As you said above, it was all manual with zero electronics or automation. I would like to think that ultimately made me a pretty decent driver.
Pretty much all new modern cars come with highway driving assist, lane keep assist, accident avoidance, blind spot warning, pedestrian recognition and emergency braking. The system in my SUV has engaged multiple times, but there are situations like a fast-moving vehicle or pedestrians rushing into traffic that remain wildcards.
wtf? I took my mom's car keys away when she turned 88. she was a menace on the highway, even driving among all the other bluehairs in texas, and usually I'm against any safety measures that obstruct natural selection...
That assumes you can take away the keys. My dad is all there in the head and still runs a business. So I can't march into his house and take away the keys.
that's amazing, good for your dad! my mom was slipping a bit mentally...
I let her drive her little honda to the post office and grocery store in her tiny little town, but I put the fear of god in her on freeway trips to all her dr appts.
My use for a self-driving car would be to drive me safely home from the Bar
That is, unless the car could not pass a built-in breathalyzer, which will probably become mandatory in later years.
I think I recall reading a while ago that a breathalyzer system was possible to connect to the starter, so one blows into it and if the limit is exceeded the car won't start.