The picture is confusing; it seems like the truck is on the highway, until you realize that this is probably here:
https://goo.gl/maps/sK9CRov2YQijCiXU9
How the hell the Tesla managed to hit a truck at what looks like highway speeds there is hard to imagine. If this was an FSD or autopilot failure, it would be by far the most egregious I’ve seen.
Tesla Autopilot is simply not smart. Despite all the hype, it is functionally not much more than a scaled-up "white line following" feedback circuit.
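To make that concrete, a "white line following" feedback circuit is essentially a controller that steers against the car's measured offset from the lane line. Here's a toy PD-controller sketch in Python; it's purely illustrative, with made-up names and gains, and is in no way Tesla's actual code:

    # Toy lane-keeping loop: a PD controller on the car's lateral offset
    # from the detected lane line. Illustrative only.
    def steering_command(lane_offset_m: float, prev_offset_m: float,
                         dt: float, kp: float = 0.8, kd: float = 0.2) -> float:
        """Steer against the measured offset and its rate of change."""
        derivative = (lane_offset_m - prev_offset_m) / dt
        return -(kp * lane_offset_m + kd * derivative)

    # Drifting 0.3 m right of center and still drifting outward:
    print(steering_command(0.3, 0.25, dt=0.05))  # negative => steer left

The point is that nothing in a loop like this knows what a truck, a rest stop, or an exit ramp is; it just minimizes an offset.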
From the map, it likely followed the sweep of the feeder lane into the rest stop and identified the parked vehicles as traffic ahead on what it assumed was still a highway.
By my understanding, many Tesla models lack lidar for ranging and speed estimates and rely solely on cameras with a relatively limited point of view.
Tesla has never used LIDAR, and new Teslas don't have radar either, but pre-2021 Teslas use radar to measure the distance to moving objects. The radar is fairly wide-angle, so objects that are not moving are excluded from consideration, because they could be billboards or buildings beside the road. This limitation also applies to the radar other manufacturers use for emergency-braking sensors.
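To sketch why that exclusion happens (hypothetical data structures and thresholds; not any manufacturer's real code): a radar reports range and closing speed, and a target whose speed over ground is near zero is indistinguishable from roadside clutter like signs or overpasses, so it gets filtered out, parked trailers included.

    from dataclasses import dataclass

    @dataclass
    class RadarReturn:
        range_m: float
        closing_speed_mps: float  # relative to our own car

    def moving_targets(returns: list[RadarReturn], ego_speed_mps: float,
                       threshold_mps: float = 2.0) -> list[RadarReturn]:
        # Speed over ground = our speed minus closing speed; anything
        # near zero looks like a billboard, bridge, or parked vehicle.
        return [r for r in returns
                if abs(ego_speed_mps - r.closing_speed_mps) > threshold_mps]

    # At 30 m/s, a parked trailer closes at exactly 30 m/s and is dropped;
    # a car ahead doing 22 m/s closes at 8 m/s and is kept.
    returns = [RadarReturn(80.0, 30.0), RadarReturn(60.0, 8.0)]
    print(moving_targets(returns, ego_speed_mps=30.0))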
I am intensely curious what really does happen if you try to get a refund for FSD. (That is, regardless of whether this was actually an FSD event.)
Presumably, you signed some BS about arbitration, so you go to arbitration. What then? Maybe their tame arbitrator says "f** off", so then you sue. Does Tesla contest it, or pay? If not, and it goes to court, does Tesla default and get ordered to pay?
Has anybody gotten a judge to order treble damages for bad faith, and for making you go to court just to get back what you paid for something they utterly failed to deliver?
It seems like there could be a specialty in suing Tesla for refusing a refund. And, there must be a fair number of cases either settled or adjudicated already.
Hmm, aren't there low-hanging bars at bumper level on the side and rear of the trailer, so that the crumple zones come into play instead of the car being decapitated with no chance of energy absorption through deformation (which seems not to have happened here)?
I recall that in Europe these became mandatory decades ago after a similar crash killed one person, whose family fought a long battle to make the bars mandatory (which a couple of years later saved my father's life when he pulled out to overtake a truck that suddenly braked hard and turned left without notice at the last second).
Yes, underride guards, or "Mansfield bars" (after actress Jayne Mansfield, who was killed in a similar accident).
Those are only rated for impacts up to 35 mph. From the position of the Tesla in the image, it may have been travelling substantially faster.
The Tesla is also a deceptively massive automobile for its size, with much of that mass sitting low, in particular below the height of an underride bar, which might otherwise engage the engine block of a conventional vehicle. Effectively, the Tesla has a longer lever arm with which to apply force to the underride bar. Whether that is a factor in Tesla-trailer collisions isn't clear, but it might warrant investigation and testing.
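For a rough sense of scale (back-of-the-envelope only; the ~2,100 kg mass is an assumed round figure for a Model S), kinetic energy grows with the square of speed, so a 70 mph impact carries four times the energy of the 35 mph rating:

    def kinetic_energy_kj(mass_kg: float, speed_mph: float) -> float:
        speed_mps = speed_mph * 0.44704  # mph to m/s
        return 0.5 * mass_kg * speed_mps ** 2 / 1000.0

    print(kinetic_energy_kj(2100, 35))  # ~257 kJ, the rated case
    print(kinetic_energy_kj(2100, 70))  # ~1028 kJ, four times as much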
Just a reminder that nowhere in the article is FSD mentioned. For all we know it could have been human error.
If it was indeed an FSD failure then that'd be unacceptable. Regulations need to be tightened at the very least, Tesla fined heavily, and FSD banned, I suppose.
Sure. However, using the words "Tesla" and "crash" in the title of an article is a surefire way to get traffic. (Exhibit A: the conversation we're having.)
At least the article is pretty factual and does not have any innuendo about FSD.
I would like to know who is conducting the investigation, though (how much is done by the police, how much by an automotive regulatory body, how much by Tesla itself, etc.).
Thoughts go to the family of the people inside the car, and to the people investigating.
What worries me are the comments on here. There's clearly a very strong bias at play in the Hacker News community (regardless of what one ought to think of Tesla or Elon).
Even if it was on, failing to act would still be the problem, because you are supposed to pay attention while it is on. So it's by definition human error.
And with some drivers being a danger to themselves and others, you might legitimately wonder if FSD wouldn't be the better alternative.
People die every day on roads. Mostly because they are being people and get drunk, tired, distracted, reckless, etc. It's very likely the driver ticked one or more of these boxes. The simple explanations are usually the right ones.
It's an odd wreck, given that the car had to pull off the interstate, barrel through a rest stop area, and then crash into the back of a semi trailer parked in the separate trucks-only parking area. It's hard to do this even accidentally. And the fact that the wreck took place at 2 PM means it's unlikely the driver was tired.
> And the fact that the wreck took place at 2 PM means it's unlikely the driver was tired.
Less likely, sure, but not unlikely: you'd be surprised at the number of people driving for excessive durations, or in a chronically or occasionally sleep-deprived state.
Another commenter claims that the car is a 2015 model. If this is correct, that car doesn't have FSD. The 2015 Model S is Autopilot 1.5 by Mobileye. I have one and it is very good at what it does, although it cannot 'see' stationary objects very well.
But it cannot autonomously change lanes or turn off the highway into the rest area (Navigate on Autopilot only applies to later versions of Autopilot).
So even if Autosteer was engaged there was almost certainly a human error or miscalculation in the chain of events. Perhaps the driver had a heart attack and pressed the accelerator.
There is far too little information available for us on the outside to make confident pronouncements about what actually happened and who, if anyone, might be at fault.
Ahh ok, thanks for the info. In any case, with FSD or Autopilot, I don't think anyone should ever take their eyes off the road while in the driver's seat.
I think I would need a bit more detail than this. Most such accidents happen because of human error. There is no indication that there was a software or hardware issue, at this time.
Why would we make that assumption? It isn't mentioned anywhere in the article, and FSD Teslas, despite the hype and publicity, are a tiny fraction of the Model S's on the road.
At least bugs in regular control systems can be identified and fixed, with high confidence that the problem has been resolved.
If the system is heavily based on machine learning, there may be no such code bugs or algorithmic mistakes to find, no easy way to show exactly what went wrong and why a fix will prevent it from happening again.
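To illustrate the contrast with a deliberately invented example (nothing here resembles real emergency-braking code): a bug in a conventional control system can be pinned to a specific line and locked down with a regression test, while an end-to-end network offers no equivalent line to fix.

    def brake_command(distance_m: float, closing_speed_mps: float) -> bool:
        if closing_speed_mps <= 0:
            return False  # target not closing; no braking needed
        time_to_collision_s = distance_m / closing_speed_mps
        return time_to_collision_s < 2.0  # the fix: threshold was mistyped as 0.2

    def test_brakes_before_collision():
        # Regression test pinning the fixed behavior in place.
        assert brake_command(distance_m=30.0, closing_speed_mps=20.0)

    test_brakes_before_collision()

With a learned policy, the "fix" is typically new training data and a retrained model, and demonstrating that the same failure cannot recur is far harder.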
Tesla's computer vision is done with neural networks, and the stack has migrated toward making driving decisions with neural nets as well. The code is basically just an artifact used to create a digital mind. However, it's possible that digital mind is operating vehicles at the level of a chimpanzee.
How is Tesla stock still so high? Its market cap is $750B on a tiny number of cars sold, Elon is ruining his reputation on a month-over-month basis, he gives his laid-off workers one week's salary, and the stock price goes up with impunity. It's incredible!