Two people in Tesla killed after crashing into parked semi at Florida rest stop (cdllife.com)
24 points by jader201 on July 8, 2022 | hide | past | favorite | 54 comments


The picture is confusing, it seems like the truck is on the highway, until I realized that this is probably here: https://goo.gl/maps/sK9CRov2YQijCiXU9

It's hard to imagine how the hell the Tesla managed to hit a truck at what looks like highway speeds there. If this was an FSD or Autopilot failure, it would be by far the most egregious I've seen.


Tesla Autopilot is simply "not smart". Despite all the hype, it is functionally not much more than a scaled-up "white line following" feedback circuit.

From the map, it likely followed the sweep of the feeder lane into the rest stop and identified the parked vehicles as traffic ahead on what it assumed to be a highway.

To my understanding, many Tesla models lack lidar for ranging and speed estimates and operate solely on vision cameras with a relatively limited point of view.
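
Roughly speaking, a bare "white line following" feedback loop amounts to a proportional controller steering against the car's measured offset from the lane line. Here's a minimal sketch of that idea; the gain and the perception input are invented for illustration, not Tesla's actual code or architecture:

    # Hypothetical lane-centering feedback loop, for illustration only.
    # The gain and the "perceived offset" input are made up; this is
    # not Tesla's code or architecture.

    KP = 0.8  # proportional steering gain (made-up value)

    def steering_command(lateral_offset_m: float) -> float:
        """Steer back toward the lane center, proportional to offset.

        A loop like this faithfully tracks whatever painted line the
        camera hands it -- including a feeder lane that sweeps off the
        highway into a rest stop.
        """
        return -KP * lateral_offset_m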


Tesla has never used LIDAR. New Teslas don't have radar either, but pre-2021 Teslas use radar to measure the distance to moving objects. The radar is fairly wide-angle, and for that reason objects that are not moving are excluded from consideration, because they could be billboards or buildings beside the road. This limitation also applies to the radar used by other manufacturers for emergency-braking sensors.
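
To make the stationary-object limitation concrete: a radar return whose range-rate roughly cancels the ego speed is moving at near-zero ground speed, so it gets discarded as possible roadside clutter. A hedged sketch of that kind of filter (the threshold and data layout are invented, not any manufacturer's code):

    # Hypothetical stationary-target filter, illustrating the limitation
    # described above. Threshold and data layout are made up.

    STATIONARY_SPEED_MPS = 1.0  # below this ground speed, treat as clutter

    def moving_targets(radar_returns, ego_speed_mps):
        """Keep only radar returns that are moving over the ground.

        Each return's range-rate (Doppler) is relative to the car, so a
        stopped object closes at roughly -ego_speed. Dropping near-zero
        ground-speed returns rejects billboards and overpasses -- and,
        unavoidably, a parked trailer too.
        """
        moving = []
        for distance_m, range_rate_mps in radar_returns:
            ground_speed = range_rate_mps + ego_speed_mps
            if abs(ground_speed) > STATIONARY_SPEED_MPS:
                moving.append((distance_m, range_rate_mps))
        return moving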


I don't see anything in the article that suggests that any of that was relevant to this crash. Is there any other information out at this time?


There's a documentary about past Tesla crashes that highlights the recurring issue of crashing at speed into stationary objects.

IIRC: Elon Musk’s Crash Course (although there may be several; I'd have to check my records for the one(s) that I've viewed).


So in summary, there is no evidence that this has anything to do with that at this time?


It is so common it is not news anymore?

I am intensely curious what really does happen if you try to get a refund for FSD. (That holds whether or not this was actually an FSD event.)

Presumably, you signed some BS about arbitration, so you go to arbitration. What then? Maybe their tame arbitrator says "f** off", so then you sue. Does Tesla contest it, or pay? If they don't pay and it goes to court, does Tesla default and get ordered to pay?

Has anybody gotten a judge to order treble damages, for bad faith and for making you go to court just to get back what you paid for something they utterly failed to deliver?

It seems like there could be a specialty in suing Tesla for refusing a refund. And, there must be a fair number of cases either settled or adjudicated already.


Most likely just the threat of suing will get you a refund under NDA.


Here is the original image that is not zoomed in.

https://i.imgur.com/YKtf8Jj.jpg (4032 x 3024 pixels)


Hmm, aren't there supposed to be low-hanging bars at bumper level on the side and rear of the trailer, so that the crumple zones come into effect instead of the car being decapitated with no chance of energy absorption through deformation (which seems to be what happened here)?

I recall that in Europe these became mandatory decades ago, after a similar crash killed one person whose family fought a long battle to have these bars made mandatory (a rule which, a couple of years later, saved my father's life when he pulled out to overtake a truck that suddenly braked hard and turned left without notice at the last second).

edit: it seems that in 2015, 2% were killed this way in Germany (where the bars have been mandatory for decades) vs. 16% in the US: https://www.dekra-roadsafety.com/fr/protection-anti-encastre...


Yes, underride guards, or "Mansfield bars" (after actress Jayne Mansfield, who was killed in a similar accident).

Those are only rated for impacts up to 35 mph. From the position of the Tesla in the image, it may have been travelling substantially faster.

The Tesla is also a deceptively massive automobile for its size, with much of that mass sitting low, in particular below the position of an underride bar, which might otherwise engage with the engine of a conventional vehicle. Effectively, the Tesla has a longer lever arm with which to apply force to the underride bar. Whether or not that is a factor in Tesla-trailer collisions isn't clear, but it might be something that warrants investigation and testing.


> aren't there low hanging bars at bumper level on the side and end of the trailer

I don't know if all of them have it, or if it's mandated, but a lot of trailers do have those bars on the rear end.


This is exactly what I was thinking, and why I think it’s bigger news (vs. me just hearing about it a day later via non-HN sources).


This was a 2015 Model S by the looks of it, so no FSD. Likely the Mobileye AP system, if it was being used at the time of the crash.


Or maybe the human was driving it? There's no indication that I can see that the vehicle caused this.


> This was a 2015 Model S by the looks of it

Based on what?


The article says “2015 Tesla”. The picture is of a hatchback. What do you think it is?


I completely missed the year mentioned in the article, thanks.


Just a reminder that nowhere in the article is FSD mentioned. For all we know it could have been human error.

If it was indeed an FSD failure, then that would be unacceptable. At the very least, regulations need to be tightened, Tesla fined heavily, and FSD banned, I suppose.

But we are not there yet; let’s wait and see.


Sure. However, using the words "Tesla" and "crash" in the title of an article is a surefire way to get traffic. (Exhibit A: the conversation we're having.)

At least the article is pretty factual and does not have any innuendo about FSD. I would like to know who is conducting the investigation, though (how much is done by the police, how much by the automotive regulator, how much by Tesla itself, etc.).

Thoughts go to the family of the people inside the car, and to the people investigating.


The article seems very factual and well-written.

What worries me are the comments on here. There's clearly a very strong bias at play in the Hacker News community (regardless of what one ought to think of Tesla or Elon).


>However, using the words "Tesla" and "crash" in the title of an article is a surefire way to get traffic.

You reap what you sow.

Tesla created a hype environment and is now paying for the bullseye they painted on their own forehead.


Even if it was on, failing to act would be the problem, because you are supposed to pay attention while it is on. So it's by definition human error.

And with some drivers being a danger to themselves and others, you might legitimately wonder if FSD wouldn't be the better alternative.

People die every day on roads. Mostly because they are being people and get drunk, tired, distracted, reckless, etc. It's very likely the driver ticked one or more of these boxes. The simple explanations are usually the right ones.


It's an odd wreck, given that the car had to pull off the interstate, barrel through a rest stop area, then crash into the back of a semi trailer parked in the separate trucks-only parking area. It's hard to even do this accidentally. And the fact that the wreck took place at 2 PM means it's unlikely the driver was tired.


> And the fact that the wreck took place at 2 PM means it's unlikely the driver was tired.

Less likely, sure; unlikely, no. You'd be surprised at the number of people driving for excessive durations or in a chronically or occasionally sleep-deprived state.


Shouldn't every automated system be assumed to have malfunctioned when an accident happens, until proven otherwise?


Was an automated system even being used when the accident happened?


Wow that's quite a shocking picture of the crash. It's hard to imagine how FSD could screw up that badly.


Another commenter claims that the car is a 2015 model. If this is correct, that car doesn't have FSD; the 2015 Model S runs Autopilot 1.5 by Mobileye. I have one, and it is very good at what it does, although it cannot 'see' stationary objects very well.

But it cannot autonomously change lanes or turn off the highway into the rest area (Navigate on Autopilot only applies to later versions of Autopilot).

So even if Autosteer was engaged, there was almost certainly a human error or miscalculation in the chain of events. Perhaps the driver had a heart attack and pressed the accelerator.

There is far too little information available for us on the outside to make confident pronouncements about what actually happened and who, if anyone, might be at fault.


Ahh, OK, thanks for the info. In any case, with FSD or Autopilot, I don't think anyone should ever take their eyes off the road while in the driver's seat.


How would you feel as an engineer if this was your post mortem report?


I think I would need a bit more detail than this. Most such accidents happen because of human error, and there is no indication, at this time, of a software or hardware issue.


Just to nitpick: Human error during programming is still human error.

I realize this car didn't have "FSD", so I don't even know why it appears on HN.


I agree, but just to be clear, when I said "human error" I was referring to errors that may have been committed by the operator of the vehicle.

I think "operator error" would be everyone's default assumption for any other make of vehicle, or if the make had not been mentioned.


One of the few times that describing a software incident as a post mortem might be appropriate.


If we assume that this was FSD, why would the driver not take control if the car is exiting the interstate?


Why would we make that assumption? It isn't mentioned anywhere in the article, and FSD-equipped Teslas, despite the hype and publicity, are a tiny fraction of the Model S cars on the road.


Gives new meaning to the words emblazoned across the trailer's doors.

TRSLA


What a grisly photo.


The truck seems to be white again…


so much for FSD...


Level 3 next year!

(ignore that he's said this for like 10 years in a row now)


Where did you see any indication that FSD was used?


These TSLAQ idiots get more desperate every year. A 2015 Model S literally cannot have the current Autopilot or FSD systems.


It has to be a nerve-wracking job when a bug in your code or algorithm can cause deaths.


At least bugs in regular control systems can be identified and fixed, with high confidence that the problem has been resolved.

If the system is heavily based on machine learning, there may be no such code bugs or algorithmic mistakes to find, and no easy way to show exactly what went wrong or why a fix will prevent it from happening again.
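
To illustrate the contrast: in conventional control code the defect is a specific, testable line, while in a learned policy the behaviour lives in the weights. A hypothetical sketch (the braking threshold and test are invented):

    # Conventional control code: a defect is a concrete, fixable line.
    def should_brake(distance_m: float, closing_speed_mps: float) -> bool:
        # Suppose the bug was a wrong time-to-collision threshold here.
        # The fix is one line, and a regression test can lock it in.
        return closing_speed_mps > 0 and distance_m / closing_speed_mps < 2.0

    def test_brakes_for_imminent_collision():
        assert should_brake(distance_m=10.0, closing_speed_mps=20.0)

    # A learned policy has no equivalent line to point at: the behaviour
    # is distributed across millions of weights, so a "fix" is a retrain
    # plus statistical re-validation, not a diff that provably prevents
    # recurrence.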


Sounds very similar to what happens when a human makes a mistake. How do we know that wasn't the case here?


Tesla's computer vision is done using neural networks, and the stack has migrated to making driving decisions using neural nets as well. The code is basically just an artifact used to create a digital mind. However, it's possible that digital mind is operating vehicles at the level of a chimpanzee.


If FSD was active, this is it.


Don’t they turn it off a second before imminent impact, so that they can claim it was off when the crash happened?


They investigate crashes in which Autopilot was turned off less than 10 seconds before impact.



How would that have helped in this specific situation? I fail to see the relevance.


How is Tesla stock still so high? Its market cap is $750B with a tiny number of cars sold, Elon is ruining his reputation on a month-over-month basis, he gives his laid-off workers one week's salary, and his stock price goes up with impunity. It's incredible!



