Asking our readers for help always pays off. We recently saw an example of that with the Russian Cybertruck, and this case is no different. We had a video of an entirely avoidable crash and no idea of what, when, or where it happened. We did not even know if it really involved a Tesla Model S, as the video title claimed, but we had to mention that since it was part of the original title.
That was not enough to avoid bashing or rude messages trying to defend Tesla from a supposed attack. Some said it was clearly not a Model S. Others said it was evident that Autopilot was not engaged. Those were the polite remarks; the rest are in the original article's comments. Unfortunately, we had to read all of them in search of relevant tips. Feel free to do so yourself.
It did not matter that we said we were not sure which car caused the accident despite the video title. We made it clear we wanted to confirm it was a Model S, and to know where and when the whole thing happened, whether anyone was injured, and whether the driver who caused the damage paid for it. Some of those questions remain open, but we have managed to answer most of them.
On December 22, the same day the article was published, the user Sendit told us in the comments where the accident took place: in San Diego, at the corner of Mira Mesa Boulevard and Lusk Boulevard. In fact, the car ended up closer to Oberlin Drive, as the map below shows.
After Sendit, we received the same precise location from Michael Reid and Alfonso Cuchi. That allowed us to ask the San Diego Police Department about what occurred there. So far, the SDPD has not replied to our questions.
A reply would have helped us confirm whether a Model S was involved in the crash, but we now count on the SDPD for other information. Yaro Shcherbanyuk, the video owner, also got in touch and provided the most crucial details of the story.
Shcherbanyuk works at Calimotive.com. He is a dismantler specializing in Tesla vehicles. The 2018 Model S 75D involved in the accident was a salvage unit sold by Copart.com. It had only 12,089 mi on the clock.
“I came across this car on the auction. To my surprise, it still had the USB drive plugged in. That was the first out of 50 cars. The accident was in early October.”
The video owner has no idea whether Autopilot was engaged, but he also believes misuse of the system is a plausible explanation for the accident.
“My take is that Autopilot was enabled and the driver was not paying attention and only noticed he ran the red light just as he was about to hit the first car and attempted to swerve.”
Regarding situations like the one in the video above, Tesla offers a very clear disclaimer:
“Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic. Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present. Never depend on Autosteer to determine an appropriate driving path. Always be prepared to take immediate action. Failure to follow these instructions could cause serious property damage, injury or death.”
That said, raising the hypothesis that Autopilot was in use as a plausible explanation for this senseless triple accident is an argument against the driver, not against the beta software or Tesla. Especially not with this disclaimer: you have been warned, right?
In this case, the only criticism Tesla could receive is that it cannot count on its customers' common sense, precisely because it is not common. The company should instead ensure its beta software is being properly used as much as possible.
Using map data, Tesla could restrict Autosteer so that it only activates in the areas where it is meant to be used. We know there are penalties for failing to use the software as intended, but this accident suggests it may be a good idea to make them even harsher. Tesla owners can decide to destroy their vehicles if they want, but crashes normally affect people who have nothing to do with such decisions.
As you can see, it was indeed a Tesla Model S, despite the claims that it was not. An almost new one, by the way. Sendit probably witnessed what happened, and we are waiting to hear from him about that, as well as from the SDPD.
It would be nice to hear what the Tesla driver has to say about the circumstances, but we doubt he or she will ever show up. It could be worse: Autopilot could be brought into the conversation as a scapegoat, as it often is.
A candid report, such as the one Rich Benoit recently gave us on bad financial decisions, would benefit everyone. Something not to repeat. A chance to learn from other people's mistakes instead of making them yourself. Such as judging a story without adequately reading it only because you fear what it might tell you. If you are after the truth and not confirmation bias, that fear should not matter.