People are reporting collisions with Tesla’s Smart Summon feature (thedrive.com)
159 points by Alupis on Sept 29, 2019 | 259 comments


If you're incredulous that Tesla would let people oversee a moving car at 15mph in a parking lot, wait until you hear what happens next after the car gets to them!

I think driving is a special case for safety, because we have put humans in charge of controlling 2000+lb cars at high speed in close proximity with other humans doing the same thing, and we've either designed or retrofitted this activity into almost every dense human habitation on the planet. Obviously we would never make such a decision now, but it gives a special context to today's safety decisions. It is a major ongoing source of death and injury as well as a major source of pollution, and bringing that era to a close is a special priority. In the United States roughly 100 people per day die in car accidents, and many more are seriously injured, so you could say that every day we can accelerate the end of human driving saves a hundred lives.


I agree that having humans drive cars is inherently unsafe and that self-driving cars will eventually be safer when the technology matures. However, that doesn't justify Tesla releasing features before they're ready. That doesn't move us towards a self-driving future; in fact, it sets us back.


100 people per day out of how many people driving per day? I'm pretty sure it's a very small percentage. How does that compare to the percentage of self-driving car accidents relative to the number of self-driving cars? My gut feeling is that if you replaced every car on the road with today's self-driving cars, the accident rate would go way up.

There's probably a danger-maximum somewhere before 100% replacement, though. Some mix of highly-predictable self-driving cars and poorly-predictable human-driven cars will cause the most accidents, and a lower number of either will cause fewer accidents.
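A toy model can illustrate that hypothetical maximum. Everything below is a made-up sketch, not real data: the pairwise collision rates are invented numbers chosen so that mixed human-autonomous encounters are the worst case; only the shape of the curve matters.

```python
# Toy model: expected collision rate in a fleet where a fraction p of
# cars are autonomous. Collisions are treated as pairwise, with
# invented rates for human-human (r_hh), human-autonomous (r_ha), and
# auto-auto (r_aa) encounters; r_ha is worst because each side
# mispredicts the other.
def fleet_collision_rate(p, r_hh=1.0, r_ha=1.4, r_aa=0.1):
    return (1 - p) ** 2 * r_hh + 2 * p * (1 - p) * r_ha + p ** 2 * r_aa

# Sweep the autonomous fraction from 0% to 100% in 10% steps.
curve = [(p / 10, fleet_collision_rate(p / 10)) for p in range(11)]
worst_p, worst_rate = max(curve, key=lambda point: point[1])

# With these numbers the danger peaks at an intermediate mix (around
# 20-30% autonomous), exceeding the all-human baseline, then falls
# sharply as auto-auto encounters come to dominate.
```

Whether real pairwise rates actually satisfy r_ha > r_hh is exactly the empirical question this thread is debating.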


> There's probably a danger-maximum somewhere before 100% replacement, though. Some mix of highly-predictable self-driving cars and poorly-predictable human-driven cars will cause the most accidents

That's an interesting point. Since the technology will be improving at the same time adoption is increasing, I don't know if we'll be able to observe that maximum, but I think you're right. One thing to consider is that once there are a significant number of self-driving cars on the road, they'll start communicating in a much faster and richer way than human drivers are able to communicate. If self-driving cars are able to achieve human collision rates with human drivers, they will be able to achieve much, much lower collision rates with each other. That effect might lower the percentage at which safety starts to improve.

Another variable will be the built environment. Over the years we have developed road systems that are highly enriched with features to assist human drivers, and of course we will begin to do the same for self-driving cars.

Nobody is talking about those improvements right now because they aren't relevant to getting the first cars on the road. The engineers working on self-driving cars have to tackle the hardest version of the problem that will ever exist first, and only then can they start making it easier.

Confession: for years I predicted to my friends that the first autonomous passenger vehicles to coexist with untrained members of the public would be in a theme park or a city center closed to other vehicles, with sensors and other assistance built into the environment, because I thought we'd have to start with an easier version of the problem.


Small clarification: it’s 5 mph for now, not 15.


I think you’ve sort of articulated the problem yourself. Cars are inherently incredibly dangerous, and also ubiquitous.

Therefore, even a subtle change to how cars work has the potential to cause literal carnage. And this isn’t a subtle change.


It also has the potential to prevent literal carnage. The risk might be high, but so is the reward.


If only there was a technology that would guarantee the transport would stay on a path. If it could carry a lot of people, that would be a bonus...


Shelter ourselves straight into impotence.


Faith that pre-programmed cars are going to outperform humans is equivalent to predicting general AI.


There are plenty of narrow AI algorithms that outperform humans.


Honest question: can somebody explain the use case for this feature to me? I don't get it.

I mean, if the summoned car came to me in a totally autonomous way, it would be useful in very large parking lots, like an airport, campus, etc... Or I could call it while still leaving my apartment/office, and the car would be there by the time I get to the door. But if I need to have the car in my line of sight, just standing still while watching it and holding the button, with me being aware of and responsible for the car not bumping into any obstacle, then it means I'm pretty close to the vehicle, so why not just walk the very short distance to it?


Minority rule: if a minority of disabled people can get around by summoning their car, then - all costs being equal - their families will always buy a Tesla. Apple's ecosystem also benefits from the iPhone having the best accessibility features, thus drawing in all developers working on accessibility software.

https://nassimtaleb.org/tag/minority-rule/


Why are people stating this? You cannot do basic things like zooming the page by a preset factor in Safari.


That's what you think is important.

Blind people for example don't care about zoom factors, but they care very much about the fact that they can completely control and use their iPhone using text-to-speech and certain gestures.


Exactly. I bought an Android partially for accessibility features. Apple's are better integrated, but much more limited.


I've had a user complain that the lack of a physical button on the new iPhones makes everything a guessing game for the seeing impaired now.


That happened with my Macbook Pro w/ touchbar.

There was an issue with the graphics, and it initially booted into an assistive mode for first-time setup, and I wondered how a visually-impaired person would use the touchbar.


Do you mean that this feature is marketed to disabled people, as a disability aid?


They won't have to. Disabled people and those caring for grandparents prone to falling will seek out a Tesla with summoning if it helps them get around or can drive them to the hospital.


My brother just had serious back surgery. We've been dropping him at the store entrance, and picking him up. He can sit fine, but walking is difficult. I can see this being a benefit for certain people with injuries or disabilities.


If it was major surgery, that's the only way they should ride in a car for now, as a passenger.


You still need to be in sight of the vehicle, so the car would still have to be located near the store entrance. And as this article shows, you have to have a good, unobstructed view of the car in order to not accidentally run into something.


And it doesn’t seem like you can leave your car at the entrance, which also means you are only saving yourself half of the round-trip.


You can tap a location on the map and have the car drive there... and it seems to generally respect the rules of driving in a parking lot (going up and down the lanes instead of cutting through parking spots) - I wonder if you could get it to park itself...

If not, that capability will probably be added in the near future.


Are there no dedicated parking spots for the disabled in the US?


If it's anything like the UK, plenty of non-disabled people use them regardless - because they drive a massive Q7 which doesn't fit into regular spaces, or just because they're closer to the store/school/whatever.

Pretty despicable if you ask me, but there does seem to be a large segment of the population that just doesn't give a shit.


There are, and they're usually the closest spots, but picking someone up and dropping them off right in front of the store is usually closer. (Oftentimes, doing pickups/dropoffs there is explicitly disallowed, because stopped cars obstruct the view of pedestrians for drivers of moving cars.)


Most people won't use it every day. However, maybe it's raining outside and you don't have an umbrella. Or you've got a shopping trolley full of stuff. Or the spot is so tight, it's easier to get into your car after it pulls out.


I'm replying to this comment only because there are numerous other comments talking about the use of this for people who are in some way physically impaired.

What's the state of designated parking spaces for such people? Just yesterday I crossed the parking lot at my local supermarket and passed a dozen extra-wide spaces right by the door, all marked with the generic symbol, painted in blue, and clearly marked for use by such patrons only. As it was, most of them were empty, but that's good; every shopper with that need, currently at that store, had been able to park right by the door.

A short way beyond those spaces are spaces marked for those shepherding children, which is less vital but still helpful.

So what's the state of this facility - extra wide designated spaces right by the door specifically intended for the people who need it - across the US and other countries? If it's being done, then this "smart summon" seems more of a gimmick. If it's not being done, it feels a little like an unnecessarily high-tech solution to a low-tech problem.


In the U.S., there are laws that determine how many such spaces must be designated in parking lots, based on the size of the parking lot and/or venue.

I have a friend who is physically handicapped and very much needs to park in these spaces. The only problem he has ever told me about is finding a spot to park at work. Apparently it's somewhat common where he lives for middle-aged women (his description) to game the system and obtain handicapped placards. He says he knows they are not really handicapped because he sees them walking briskly through long corridors at work without distress. Sometimes too young people will borrow an older relative's placard to park in those spaces.

Incidentally, he would never be able to afford a Tesla. He buys three- to five-year-old Hondas and drives them till they can't be fixed.


While I can't comment on the exact instances of people using handicapped placards, I would like to bring up the existence of invisible disabilities [1]. These are disabilities that are not immediately obvious to onlookers. For example, people with xeroderma pigmentosum[2] have a very severe sensitivity to sunlight (sunburn within a few minutes, vastly increased risk of skin cancer). A handicap space would be needed to minimize time spent in the sun, but no issue would be present walking inside a building.

[1] https://en.wikipedia.org/wiki/Invisible_disability

[2] https://en.wikipedia.org/wiki/Xeroderma_pigmentosum


Also, appropriate accommodation for a handicap helps keep one more functional. So when one gets appropriate accommodation, one can appear more or less "normal" for short periods, such as time spent in public spaces. Take away that accommodation and life can come apart at the seams.


> In the U.S., there are laws that determine how many such spaces must be designated in parking lots, based on the size of the parking lot and/or venue.

> I have a friend who is physically handicapped and very much needs to park in these spaces. The only problem he has ever told me about is finding a spot to park at work. Apparently it's somewhat common where he lives for middle-aged women (his description) to game the system and obtain handicapped placards. He says he knows they are not really handicapped because he sees them walking briskly through long corridors at work without distress. Sometimes too young people will borrow an older relative's placard to park in those spaces.

Firstly, it's very possible to be handicapped at any age, so this "too young" idea is pretty disgusting.

However, if people are using the placards of others, and if this is in the US, your friend should report these people to the police. The placard also comes with an identification card with the person's information on it, which can be used to verify their disabled status.


> so this "too young" idea

He left out a comma.


Yes, sorry, and I'm usually a pedant regarding commas.

My friend has been handicapped since he was a child, incidentally. I'm well aware that some people have an entire life filled with difficulty.


My apologies for misinterpreting!!


The primary use case right now is pulling your car in and out of your own garage at your own house.

As a Model 3 owner, I also went and tested out the capabilities in a few parking lots, just for fun.


https://mobile.twitter.com/abgoswami/status/1177773811497721...

It doesn't seem like it's even safe for that, though.


I use summon every night to pull my car out and it works flawlessly. Based on that damage it’s not clear how that could be caused by anything in a garage.


I think parking lots make more sense than a garage.

1. A garage is full of tight spots. If something "goes wrong", you have to lift your finger off the cell phone (a capacitive sensor, so it's innately laggy by a few dozen milliseconds), and that signal then has to be transmitted over the internet to the car, which then applies the brakes.

2. Or... you could be in the car itself... and directly connected to the accelerator / braking system. No delays if something goes wrong.

------

At least with a parking lot, you can argue that you didn't feel like walking (even if you need to keep the car within visual range). But in a garage + visual range, it means that you're giving up control for what amounts to a party trick. It's not like electric vehicles give off fumes, so there's basically no disadvantage to getting in the car inside the garage instead.
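The latency argument in point 1 can be put in rough numbers. This is a back-of-the-envelope sketch; every delay figure below is an assumption, not a measured value from Tesla's system:

```python
# Back-of-the-envelope check of the remote-stop latency concern.
# Assumes the 5 mph summon speed cap and a guessed chain of
# touch-sensor, app, network, and actuation delays.
MPH_TO_MPS = 0.44704

speed_mps = 5 * MPH_TO_MPS            # ~2.24 m/s at the 5 mph cap

# Hypothetical latency budget, in seconds:
touch_release = 0.05    # capacitive sensor debounce
app_and_radio = 0.10    # app loop + cellular uplink
server_hop    = 0.15    # round trip through intermediate servers (guess)
actuation     = 0.20    # car receives stop command and applies brakes

total_delay = touch_release + app_and_radio + server_hop + actuation
distance_travelled = speed_mps * total_delay
# ~0.5 s of assumed delay at 5 mph lets the car roll on for roughly
# 1.1 m before it even starts braking -- versus near-zero for a foot
# already resting on the brake pedal.
```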


Tesla is well aware that people will use the feature in the ways you describe. Its main concern, as with its other statements to the effect that drivers should pay as much attention to what is going on as if they were driving, is to avoid liability.


It clearly is just the beginning of what is possible. And a cool thing to show people.


My father is almost 80 and had hip surgery. Part of the danger of getting old & feeble is becoming isolated.

Any chance to enable mobility without compromising safety, theirs or others, is a huge win.


It's to show your friends and colleagues; it has no practical purpose given the conditions and requirements for the use of the feature.


Your life circumstances may not present practical uses... But I can assure you they exist.


In what circumstances will you have 100%, unobstructed line of sight to:

1. Your car.

2. Your car's avenue of travel. [1]

[1] This is the important part. If you can't clearly see the entire path of travel, you could run a child over. Tesla obviously doesn't give a shit about that, because their collision detection doesn't work, and because the feature does not require this kind of line-of-sight to work.

The former is hard, but they want to be first to market, so they don't care about building it right, and the latter would make this feature near-useless, so naturally, they don't ship this sort of fail-safe.


This is a critical part of the overall full self driving equation. Any vendor working on this problem will be spending inordinate amounts of resources on this as parking lots are one of the more challenging environments.

In certain locations (especially in winter) this will be a very nice feature. This is especially true for anyone with disabilities or extreme laziness. ;-)


This is a step in their progression toward self driving. No one should expect this to be flawless from the start, similar to Waymo's self driving work.

Tesla getting this out now should allow for the feature to begin improving at a quicker pace due to more data.

A lot of naysayers will certainly harp on the number of wonky videos to hit the web, but this will quickly decrease once the newness wears off and the feature improves.


It’s worth noting that many of us consider self driving to be essentially analogous to full AI in its complexity and difficulty, and thus believe that we are literally nowhere near anything that fits this description.

No amount of “data” is going to solve the problem. Driving a car requires making a full mental model of the immediate world you’re in and creating accurate predictions of what everything else in that world is about to do, many of those things being sentient beings with whom you’re communicating through your own actions.

Nothing that Tesla is doing is getting much closer to that. The next major landmark on this timeline is a team successfully passing the Turing test, not a car moving across a parking lot.

If you share that opinion, that makes the Tesla a lethal toy.


What’s a “full mental model of the immediate world”? I’d argue that nobody has anything approaching “full”. You have “enough to get by” and that’s all computers need as well. This isn’t a full sentient AI, it’s just sensing the world and navigating a path, at an industrial scale.


No, it’s not path finding. The DARPA races were pathfinding and getting from point A to point B hasn’t been the problem for some time.

Driving around other people requires theory of mind in order to make decisions based on what the people around us expect to happen

I’m teaching a friend to drive right now - it’s all about eye contact, being waved through, stopping and starting predictably and without surprises, you’re always looking around for pedestrians and checking if they see you and predicting, according to everything you’ve learned about humans so far, whether they’re about to step into the street

Deep learning on photographs is not going to cut it.


Sure, but that's why there are armies of programmers writing rules for self driving cars. The "theory of mind" is being developed in the form of control code. It's not easy, but it is accelerated by the data collected in iterative development just like Tesla is doing.


It’s just not. How does it tell when the other car is waiting for them to move? What does it do when two people go into an alley from opposite directions and one has to back out? What happens when there aren’t any markings in the lot, or when people ignored the markings and parked in a different pattern? And so on.

These aren’t “edge cases” that “data” will sort out. These are common everyday occurrences in parking lots that require something like consciousness to navigate and involve subtle communication with other participants.


You could also argue that humans create such models automatically, relying on confidence built from predictability and familiarity. Familiarity and predictability are a byproduct of how the human chooses to make certain actions and judgements routine. Given this, humans exhibit quite a lot of overconfidence when it comes to driving: see phone use. Over time, a human will make many mistakes that paying attention would have prevented.


If it isn’t fully functional it shouldn’t be in a consumer product which is capable of killing people when things go wrong.


Ever more want for Convenience.


My wife has recently had trouble with people parking so close to her that she couldn't get into the car. If her car could have reversed itself it would have been fantastic.

That, or jerks could not park people in (especially when they're heavily pregnant).

After that I just told her to either park in the disabled spots or the mother/child spots. She has more right to them than the lazy boomers and BMW drivers who consider them their private parking spots.


> After that I just told her to either park in the disabled spots or the mother/child spots. She has more right to them than the lazy boomers and BMW drivers who consider them their private parking spots.

I was with you till that statement. I know several people with disabilities who have been yelled at for parking in the handicap spot when they don't "appear" disabled.

People don't need to be wheelchair-bound to be disabled, and calling boomers who use the spots (presumably legally) "lazy" is in really poor taste imo.


Oh, absolutely, and those people have a big blue badge here in Europe. Like my grandparents, for example.

There are also plenty of people who should have a badge but don't and who never park incorrectly (like my father-in-law who can hardly walk now thanks to his terminal cancer).

However my complaint is legitimate; it's disproportionately people around the 60-70 age group or the suited twit in a big car who thinks she's better than anyone else. At least with the older group they have a small excuse - the bigger spots make it easier to get out of the car which can be an issue if you're older and unfit.


Your complaint is absolutely not legitimate - you're saying it's fine for your wife to park in disabled spaces just because some other people are arseholes too.


Can't speak for elsewhere, but in the UK there are disabled badges for (partially) this reason. And for parent and child spaces, you can tell by the lack of children with the 'adult'.


> And for parent and child spaces, you can tell by the lack of children with the 'adult'.

Can you? Is it not allowed in the UK for one parent traveling alone to use such a space for the purpose of picking up their spouse and children?


We can construct all sorts of hypothetical scenarios, but when a person arrives sans children, and with no booster seat (a legal requirement up until age 12, or a certain height), it's a fair bet. Plus, most of my experience with parent and child parking is at supermarkets, which tend to have drop-off points anyway.


> After that I just told her to either park in the disabled spots or the mother/child spots. She has more right to them than the lazy boomers and BMW drivers who consider them their private parking spots.

Come on, seriously? You really think it's OK to just park in disabled spaces just because some other people are arseholes too?

As someone with a disabled child, I frequently encounter this kind of self-entitled, selfish behaviour - and I just can't fathom it. How can you be that uncaring?


"You are still responsible for your car and must monitor it and its surroundings at all times within your line of sight because it may not detect all obstacles. Be especially careful around quick moving people, bicycles, and cars."

Is this a joke? I have huge respect for Tesla and SpaceX, but how do they expect to release a feature that will self-drive a car (even a tiny distance) and have someone oversee the moving car?

The software engineer in me just died.


>Is this a joke? I have huge respect for Tesla and SpaceX, but how do they expect to release a feature that will self-drive a car (even a tiny distance) and have someone oversee the moving car?

Think of it as a marketing/business idea (let's sell this "advanced technology to early adopter suckers and we can try and spin any accidents as operator fault") rather than an engineering idea...


Except with the added bonus that some people might get injured or killed. When will software engineers take this shit seriously?


While at the same time "brings your car to you while you are dealing with a fussy child". It's like marketing wrote one part, legal the other, and neither read what the other wrote...


Disclaimer: this product is completely useless.

I think Tesla may be setting themselves up for trouble here. These sort of “using the product as clearly intended voids the warranty” disclaimers often don’t hold up.


I agree. Why even bother using a feature that can't babysit (manage) itself? Yikes. I'd rather it not be enabled at all if it's going to risk my vehicle. Thankfully I am letting the masses QA those cars. I will stick to my dumb car in the meantime.


If you walk through parking lots, you unfortunately don't have a choice but to be part of the QA process.


This is very true. Sounds like a class action waiting to happen.


It's not unreasonable to assume someone will have a line of sight for where a vehicle is being summoned from, just like when you're in the vehicle itself, driving forward, you're paying attention to its surroundings as well.

I have seen some summon videos where just a quick horn honk would likely have prevented another driver, backing up, from bumping into a summoned Tesla with enough force to do some front bumper damage. The summoned car also moves slowly, and that could be a factor in surrounding drivers not gauging it or reacting the way they normally would.


I can have line of sight of my vehicle, but in a parking lot, there may be a small child, obscured by other vehicles, in the path of motion of my car.

What do you think is going to happen, when a Tesla runs one over?

Hint: They are going to blame the driver for using this feature in a parking lot. Their fanboys will defend the hell out of that statement. They will be oddly silent on the subject of 'If you're not going to use this feature in a parking lot, where the hell are you supposed to use it?'

A 'self-driving' car that is incapable of collision avoidance is not yet ready for the market.


Ever walk up to your car in the parking lot then realize... that's not your car? Now identify your car from across the parking lot. In California, where every other car is a ticky tacky Tesla (and they all look just the same). There are already anecdotes of people wondering why their car isn't summoning only to realize ... that's not their car.


Fair point - so certain scenarios likely would benefit from visual and perhaps audio signals: lights flashing, a lighter horn noise. Perhaps even listening for someone nearby saying "stop", so the vehicle would stop - or responding similarly to other horn noises.


New electric cars have a low speed noise maker in the US (required by law after sometime this year), but most Teslas on the road now were built before this was a requirement.


I think that's one of the stupidest additions of constant noise pollution - noise pollution being one reason people zone out and stop paying attention to their environment, rather than being aware that they're about to step onto a road.


As someone who grew up before smartphones, with plenty of noise pollution, I am sure it's not the noise. People cross the street staring at their hands.


Not everyone can see.


And those people are going to be much less likely to randomly, unexpectedly walk out in front of traffic where it's not a designated crosswalk. I bet their hearing is refined, heightened, to more easily notice quieter electric vehicles as well.


I think the requirement isn’t in effect until 2020:

https://www.theverge.com/platform/amp/2019/7/1/20676854/elec...


> In California, where every other car is a ticky tacky Tesla

I live in California, and if I see five Teslas in a day it's an unusual cluster.

Now, Silicon Valley probably has a greater proportion of Teslas, but Silicon Valley isn't typical of the state.


Really? Even just during my 10 minute walk to work in Minneapolis, MN I'll probably see five every day.


I’m in Orange County and I usually see 8-12 on a short drive. Easily 30-40 across a normal day with a 15 minute commute.


I live in SV, and on my commute I see an average of 30 Model 3s, about 10 Model Ss, and 5 Model Xs, mostly on 101. I also see about 5 delivery trucks packing about 5 cars each every day. That's a lot of cars.


Why do people say that highly populous areas are “not typical”?


Median household income in San Francisco-Oakland-Fremont metro area is over $100K, for the state as a whole it's about $70K.

That's one reason the former isn't typical of the latter.


>It's not unreasonable to assume someone will have a line of sight for where a vehicle is being summoned from

And do what? Magically jump in the car and take over the brakes when a kid suddenly appears on a bicycle going towards the car?

Or are they only supposed to summon the car in the Mojave, where there's fully open line of sight and few passers-by?


I think the summon feature requires the user to keep pressing a button on the phone to work. Lifting your finger from the button will stop the car, I believe.

I'm saying this based on watching the videos. Someone knowledgeable, do correct me if I'm wrong.


That doesn't stop the Tesla from driving into live traffic and stopping there. Look at the videos.


So instead of a foot on a pedal connected mostly directly to the brakes, it is a finger on the button of a phone app over the internet? That is madness; imagine if your car's brake pedal had to go over the public internet and instead of a pedal it was a button in a probably poorly coded phone app.


> So instead of a foot on a pedal connected mostly directly to the brakes, it is a finger on the button of a phone app over the internet? That is madness; imagine if your car's brake pedal had to go over the public internet and instead of a pedal it was a button in a probably poorly coded phone app.

it's a deadman's switch not a brake pedal. if you lose connection it stops the car. if you lift your finger it stops the car. if the car senses an object it should stop the car (but might not because it's not 100% yet)


What’s the latency tolerance for this? I have to imagine that, given mobile internet conditions, it’s in the range of seconds. It also raises the question - is it really a dead-man’s switch, or more like a heartbeat being sent every N milliseconds? How many heartbeat signals can be lost?

It’s not as simple as a deadman’s switch; it can’t be if it’s using a mobile phone and the internet.


This is a false problem; the engineers have coded the button the other way around: the car only moves while it is receiving the signal from the app. If that signal is lost for more than 50ms, just stop the vehicle.
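A fail-safe loop of the kind described above might be sketched like this. This is an illustration of the general "heartbeat deadman's switch" pattern, not Tesla's actual protocol; the class name and the 50 ms timeout are invented:

```python
# Sketch of a heartbeat-style deadman's switch: the car may move only
# while fresh "button still held" signals keep arriving, and must stop
# the moment they go stale. Lost connection, a lifted finger, or a
# crashed app all look the same -- the heartbeats stop.
import time

class DeadmansSwitch:
    def __init__(self, timeout_s=0.05, clock=time.monotonic):
        self.timeout_s = timeout_s      # assumed 50 ms staleness limit
        self.clock = clock              # injectable clock for testing
        self.last_heartbeat = None      # no signal received yet

    def heartbeat(self):
        """Called whenever a 'button still held' packet arrives."""
        self.last_heartbeat = self.clock()

    def may_move(self):
        """The car may move only while the last heartbeat is fresh."""
        if self.last_heartbeat is None:
            return False
        return (self.clock() - self.last_heartbeat) <= self.timeout_s
```

Note this pattern is fail-safe against dropped signals, but it says nothing about the other half of the problem: the car still has to detect obstacles on its own between heartbeats.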


It's not a solvable problem in this context. Humans are not computers, and cars have no self. The car would melt a hole in the ground and not even approach par with a cat. It's understandable that people are making this mistake in the context of lifetimes of sci-fi anthropomorphizing machines.


You are correct.


No, they can stop the car pretty much instantly. No magic required.

And for the case you mention, a kid on a bicycle, the car will stop itself if needed for safety.


But magic _is_ required. A reasonable field of view to assess the hazard, or the approaching kid on the bike, might quite easily be available from the car, but not from 20m away looking toward the car.

Case in point: the car in the video that turns into the car park looks like it comes from behind the guy with the phone. Someone in the car, or the self-driving software, should have seen it far earlier. Maybe even noticed an indicator, or deceleration.


And that scenario of a kid on a bicycle not being seen until they're almost in front of you exists with a driver in the vehicle too: collision detection will certainly be able to react faster than a human driver as well.


Fortunately the cars do have cameras facing in every direction. Clearly not sufficient to stop every fender bender but you could say the same about human drivers.


The car should stop itself for safety, but may not have the awareness to do this successfully. Bicyclists are explicitly listed in the warning about paying attention to the surroundings of your car while it is being summoned.


Paying attention to surroundings is good.


>And for the case you mention, a kid on a bicycle, the car will stop itself if needed for safety

That's what it's reported as NOT doing, tho (and also what Tesla asks you to keep an eye on yourself while summoning, in their small print)


Is there video of such an incident - did you see a video of such an incident? Not saying it's not possible, however we have to keep in mind what's "reported" as clickbait "journalism" vs. reality.


It wasn’t even reported... he/she’s just making stuff up.


First of all, that's an ad-hominem (and rude).

The title of TFA is about how "People Are Already Reporting Collisions With Tesla’s Driverless Smart Summon Feature".

So clearly it's not the case that "the car will stop itself if needed for safety"...

Your point is that if it was for a kid on a bicycle it would, but for the reported situations (not being kids on bicycles) it doesn't?

Because the more obvious conclusion is that it doesn't work properly, period, and that can bite people, whether the case involves an incoming bicycle or not. Especially since we have reported cases of Teslas hitting bicyclists in the non-summon mode:

https://www.youtube.com/watch?v=I3m1U2KiGgI


> That's what it's reported as NOT doing, tho

False. Nowhere is that reported.

Yes Tesla advises you to be careful.


If you think harder about it, no one really turns on Autopilot except on main roads or highways, hence almost no training data for parking lots. A parking lot is a big rectangular area with stuff moving all over. So with more training data they should have it down, but these incidents are going to lower their rate of getting training data.

It's a chicken-and-egg problem.


And if the pharma companies didn’t have to abide by ethical guidelines for clinical trials they could collect more data. There are good reasons to forbid that.


I thought Tesla collected training data everywhere all the time?


And they shadow test while hooman is driving.


Of course it's unreasonable, but that's a liability clause added by a lawyer.


How is it unreasonable if the things are already having collisions? This looks like a half-baked feature not even close to ready to be pushed out. Tesla is becoming a meme at this point and it has to be coming from the erratic man at the top. An engineer worth anything wouldn’t launch something like this.

Sorry if I’m fired up about this but he just reminds me of a bad boss I had that only cared about hype and how things looked and expected the laws of mathematics and physics to bow to his delusions.


It’s possible many of these collisions are caused by other drivers.

I mean even parked cars get hit by drivers every day.

There is a bit of news hysteria effect happening here.


There’s literally only two examples in the post and one is an example of the car stopping before a collision. The other example is another car backing into the Tesla.


In the first video from the article, the Tesla would be found to be at fault in most jurisdictions, as the driver backing out was well into the backing-out maneuver before the Tesla even rounded the corner. The Tesla without question did not have the right of way at that point.


In most jurisdictions, the vehicle traveling in reverse is always the give-way vehicle. Practically, I expect that the insurance companies would agree to each pay their own driver’s costs and count it as an at-fault accident on both sides.


A self-driving car's fender bender is more newsworthy than a fatal accident with a human driver.


As you should know as a software developer, development of major new capabilities is an incremental process, and for something like self driving cars it’s expected that there will be a transitional period before self driving. Summon in parking lots is a good transitional step.

Remember Tesla is not just training cars here; humans also need to learn to deal and not freak out. There will be fender benders along the way. Most of them probably caused by other drivers, but bugs are also a possibility.


>As you should know as a software developer, development of major new capabilities is an incremental process

Hopefully software developers with such an idea are not allowed to work on medical devices, aviation, missile guidance, the space industry, industrial robotics, or automobiles...


Developments in all the areas you mention start with baby steps and incrementally progress toward greater and greater capabilities. This should not be news to anyone here.


Development yes. Releasing them as products (which is the case we're discussing) when they have people-killing faults, no...


Plenty of people die in cars every day. Cars are released products, right? You are making no sense here. And the collisions reported with smart summon are fender benders not “killings”. No need to hype up the drama level.


>Plenty of people die in cars every day. Cars are released products, right?

Yes, but current car deaths are due to operator (driver) error, not a self-driving parking summoner killing people.

Companies don't (or aren't supposed to) release production cars with known people-killing faults. And when some are nonetheless released (which happens sometimes, e.g. a defect found in the brake system), companies get fined and/or people go to jail for it. It's not all A-OK just because "people die in cars every day"...

>You are making no sense here.

Oh, the irony.


Sure, but usually we don't let Joe Bloggs control it, or even get in it or near it, until there's a level of reliability and safety in its behaviour.


I think you’ll find that the behavior that is unreliable here is that of other drivers, not the technology. So what you’re saying doesn’t apply in this situation.

Even if it did apply, I believe the technology is safe and reliable, because from what I have seen Tesla does extremely extensive testing. So this meets your filter as ready to be released to the public.


Of course it applies!

Human drivers are reliable in the presence of other, unreliable drivers. We deal with this kind of uncertainty every time we get behind the wheel.

It's not a stretch to say that we should expect at least that much from automated driving.


To that point, I believe this technology was not released willy nilly. The poster seemed to be assuming it was. I agree that a good level of safety and reliability is important.


Nothing has ever been or will ever be perfect revision 1. That doesn’t mean it shouldn’t happen. The real world will always be where flaws get light shined on them.

To your point, no one is still using Model T’s, flying Wright Flyers, or still living with a Jarvik 7 heart implant (note the 7 implying incremental progress). Further, the entire history of space exploration is built on learning from failures and explosions.


If we replaced 'Tesla' with 'Boeing' and '737 Max', would you still feel the same?


The "humans are the bugs" excuse is pure denial on the pre-programmed proponent side.


Nobody called humans the bugs.

But sure, humans sometimes do make mistakes.

And when a Tesla is involved on either side, news hysteria effect seems to kick in.


As a motorcyclist, these videos are sickening. Especially the guy purposely using smart summon to cross LIVE traffic (not in a parking lot).

These kind of low speed collisions are not a big deal to car drivers. But for someone on a motorcycle, having a Tesla with no driver obliviously pulling out into live traffic in front of a motorcycle (even at low speeds) can cause significant injury.

Please, people... don't be stupid.


>Please, people... don't be stupid

It’s more than stupidity, it’s a sense of entitlement.


When I ride to work I take a completely different route that is significantly longer just to avoid two intersections and a couple of blind high-speed crests.

The long route also has a long twisty section, so that helps too :D


Yes or for cyclists...


>But for someone on a motorcycle, having a Tesla with no driver obliviously pulling out into live traffic in front of a motorcycle

I don't get it. Do motorcyclists use whether someone's in the driver's seat to determine whether the car is moving or not? Shouldn't they be looking at the brake/head lights?


The point is not to needlessly increase the risk for other people using the road for the sake of showing off your car’s cool party trick.

But to directly answer your question: in fact, most people on a motorcycle will try to look at the driver, when possible, to check if the driver is looking at them to gauge whether the driver sees them or not. In the case of a Tesla, it’s impossible to know if the car’s sensors know you’re approaching...

Also, I don’t know any human beings with a driving pattern similar to the Tesla’s while it’s being summoned. It seems to start and stop unpredictably, and does so in a very jerky manner. Most humans do not drive as slowly as in the videos (and yes, sometimes driving slowly is more dangerous than driving at regular speeds), and most humans also won’t drive on the wrong side of a lane, which the Tesla often does in videos I’ve watched. It’s just very unpredictable.


A common technique for motorcycle riders, which also works for pedestrians, bicyclists and even car drivers, is to make eye contact with other drivers before pulling in front of them. The vast majority of MC collisions happen because the other driver doesn’t see them. When the other operator isn’t even in the car, you lose a key safety tool.


The point is that a small "fender bender" between two big cars might be a fatality if a motorcycle is involved.

The "no driver" phrase relates to the likelihood.


Could be that you can't see a motorcycle coming because your car blocks the view.


The feature being a worldwide premiere, with no past experience to draw on, I honestly expected that Tesla would have added more sound and visual warnings while the Summon feature is running, such as flashing the car's headlights and emitting a regular beep (like a big truck doing a reverse manœuvre), to catch the attention of other drivers, or even warn blind people that the Tesla is near.

Then, once the feature is polished with more real world data, they can always remove the warnings later.

The video with the white car coming through especially is scary. From all the sensors and marketing materials I know from Tesla, I would expect the Tesla to have detected that white car way sooner and stopped automatically. There was really no ambiguity to what decision to take here.


Does the Model 3 have a concept of stop signs (not just recognizing them, but understanding arrival order and how long to wait) and right of way?

Or are they expecting to get away with just avoiding cars and pedestrians while breaking driving rules because it's a parking lot and blaming owners when it inevitably falls short.

Tesla's track record leads me to believe the latter, but I'd love to find out something to the contrary.

-

Actually it looks like this is still controlled by holding a button? So yeah, guessing it has no concept of those, and the almost-crash in video #2 would be expected behavior. Marvelous.


If four teslas arrive at an intersection at the same time, do they all crash? :)


You say this as a joke, but I suspect they have been programmed to drive fairly aggressively, though probably to come to a halt before any collision that it can predict. If so, we might soon see two or more Teslas deadlocked, their fenders inches apart.


So, the author managed to find 3 concrete examples.

One where someone else backed into the Tesla.

One where no collision occurred.

One where the tesla ran into a garage.

Only the last seems at all worrisome...

I vote for changing the headline to singular collision.


But this is HN, so the comment stating “It's insane people are putting these death machines on public roads.” is voted above yours.


Right, the first was an at fault incident by the non-Tesla driver for sure.

In the second, the vehicle was allowed to drive across a traffic lane. And in fact it attempted to do so, but stopped at the threshold when it saw a vehicle approaching (which also hit the brakes). I fail to see how this isn't exactly the behavior claimed and desired.


Hmm, Tesla definitely crossed into the road in that video.

If driving into live traffic and stopping in middle of it is acceptable, then okay, self driving cars are here!


Are we looking at the same video? https://twitter.com/eiddor/status/1177749574976462848

It pulls out as if it were going to continue across the traffic lane and very clearly stops right at the threshold. At no point is it in the traffic lane, nor in the path of the gray vehicle that stops.

It's true that it "looked like" a human driver who was going to continue straight out, so the other driver (correctly) hit the brakes.

But spinning this like the Tesla "stopped in the middle of traffic" is just plain wrong.


The Tesla stopped when / because the 'driver' released the button. Watching the vehicle until that point, it hadn't slowed or made any hint that it was about to stop or do anything but continue forward, oblivious, and that -would- have caused an at fault collision.


Sorry, what's the evidence for the reason why the vehicle stopped? I mean, it stopped. That's what it's supposed to do. You have some cite showing that it wasn't a sensor detection?

And of course it hadn't "slowed", at this speed (looks like about ~1 m/s) the time taken to stop (at about the .7G a car on its wheels can achieve) is about 4 frames of that 30 Hz video, and it happens over a distance of about 7cm, most of which is the vehicle just rocking forward on its suspension. Be real, we're talking about parking lot maneuvering here.


Evidence? The author of the video saying he took his hand from the 'button'?

Okay, now we need to do some math. The evidence is a 4,000lb vehicle coming to a stop, even from 5mph, in... 3 inches? "Most of which is the car rocking on its suspension". Removing distance from the equation, because g forces are related to time of deceleration, and using your numbers, https://rechneronline.de/g-acceleration/ says more like -2.3G, which is far from a gentle stop.

You can also plainly see the other car decelerate too, over far more than '4 frames'.
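For anyone who wants to check the arithmetic being thrown around in this subthread, the distance-based average deceleration follows from v² = 2ad. A quick sketch, plugging in the two sets of numbers quoted above (the speeds and distances are the commenters' estimates, not measurements):

```python
def avg_decel_g(speed_mps, stop_dist_m):
    """Average deceleration in g implied by v^2 = 2*a*d."""
    G = 9.81  # m/s^2
    return speed_mps ** 2 / (2 * stop_dist_m) / G

# ~1 m/s stopping in ~7 cm (the upthread estimate)
print(round(avg_decel_g(1.0, 0.07), 2))      # 0.73

# 5 mph (~2.235 m/s) stopping in 3 inches (~7.6 cm)
print(round(avg_decel_g(2.235, 0.0762), 2))  # 3.34
```

So the two estimates disagree mostly because they assume different speeds, not because of the distance-vs-time framing.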


In the twitter thread for the second incident the owner claims he commanded a stop (released the button) prior to the event but is uncertain as to whether the car stopped itself. So that makes at least three people who thought the summon feature was not going to yield to cross traffic.

I suppose that doesn't really fit the narrative you're trying to spin here.


All evidence so far suggests that self-driving cars are more dangerous than human drivers when similar conditions are taken into account.

The comparison numbers that Tesla parrots are between Teslas with modern safety features and in good weather conditions, and the entire rest of road fatalities, including motorcycles. Comparing similarly bodied cars in similar weather conditions, Tesla's self driving has more reported collisions per miles driven.


That’s not accurate. The original Tesla AP safety study was done on the Tesla fleet before vs. after AP was deployed. It’s about as close to a clean, controlled experiment as you can get. Same cars, same drivers, same safety equipment, roughly the same road conditions. They also didn’t even take into account whether AP was driving, only whether it was available on the car, so they aren’t looking at just the safest miles driven (highway miles).


http://www.safetyresearch.net/Library/NHTSA_Autosteer_Safety...

"As a consequence, the overall 40 per-cent reduction in the crash rates reported by NHTSA following the installation of Autosteer is an artifact of the Agency’s treatment of mileage information that is actually missing in the underlying dataset."


HN discussion here: https://news.ycombinator.com/item?id=19127613

Of course, the original publication of NHTSA's botched study is still the number one HN search result for both "NHTSA" and "Autopilot," and it still routinely gets referenced in comments. As the saying goes, the lie made it halfway around the world before the truth had put its pants on.


Do you have some sources on this? Every study I am finding even comparing like for like shows SDVs are about the same or safer. Also the facts I was able to find from Tesla do a lot of comparisons, not just the obscure ones you cite.


> Do you have some sources on this?

The fatality rate for human-driven cars is ~1.5 per 100 million miles driven, and that's across the entire range of driving conditions.

The autonomous car industry (level 3) totaled under 10 million miles driven as of the Uber kill, putting them way behind despite benefiting from only good conditions.

Tesla just reached 1 billion miles on Autopilot early this year and has 5 fatalities; that's a lower rate than the human-driven one, but (again) one can assume this is in good conditions pretty much exclusively.
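Back-of-the-envelope, using the figures quoted above (quoted as stated, not independently verified):

```python
def rate_per_100m_miles(fatalities, miles):
    """Fatalities per 100 million miles driven."""
    return fatalities * 100_000_000 / miles

human_rate = 1.5  # quoted baseline, per 100M miles, all conditions
autopilot_rate = rate_per_100m_miles(5, 1_000_000_000)
print(autopilot_rate)  # 0.5
```

Whether the remaining gap survives controlling for road type and weather is exactly the dispute in this thread.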


Autonomous cars have not yet reached level 3! They are level 2 at best.

>> A Level 2 autonomous vehicle can control both the steering and the speed at the same time, essentially offering partial automation. The human must remain ready to take full control in an emergency and is ultimately in charge of and responsible for whatever the vehicle does. GM’s Super Cruise and Tesla’s Autopilot are examples of Level 2 systems.

>> Level 3 automation is controversial. The car not only manages steering and speed, but is responsible for monitoring the environment around it and detecting challenges that require human intervention. In normal conditions, a human driver would not need to pay any attention at all, but if something went wrong with the system, the person would have to be ready to take over right away.

All current autonomous cars require human safety drivers, or the full attention of the human driver (like in Teslas).

From:

What are these ‘levels’ of autonomous vehicles?

https://theconversation.com/what-are-these-levels-of-autonom...


Only in good conditions? Have you seen the videos of the Uber kill? Come on... 95% of humans would have had the same outcome.


The dashcam footage released by Uber was misleading; their cameras and sensors were far from as good as the human eye. The Tempe police concluded 85% of humans could have stopped the car in time: [0]

> In simulating conditions of the crash and consulting their textbooks, detectives found that Herzberg could have been seen 143 feet down the road by 85 percent of motorists.

[0] https://www.phoenixnewtimes.com/news/self-driving-uber-crash...


And as every other level 3 car it was using LIDAR, meaning visible light conditions were irrelevant. Even if the environment had really been as dark as the misleading dashcam footage it would have had very little relevance to car sensors unless that had been extremely heavy fog or rain.


I drove the section of road. If I wasn’t distracted by my phone, I would have stopped properly 100% of the time.


Downvoted... I brought up the phone distraction because the Uber safety driver was distracted by watching video on her phone.


At least one major study has found that there is currently not enough information to decide with certainty whether self-driving cars are safer than human-driven cars:

  Key Findings

  • Autonomous vehicles would have to be driven hundreds of millions of miles
  and sometimes hundreds of billions of miles to demonstrate their reliability
  in terms of fatalities and injuries.

  • Under even aggressive testing assumptions, existing fleets would take tens
  and sometimes hundreds of years to drive these miles—an impossible
  proposition if the aim is to demonstrate their performance prior to
  releasing them on the roads for consumer use.

  • Therefore, at least for fatalities and injuries, test-driving alone cannot
  provide sufficient evidence for demonstrating autonomous vehicle safety.

  • Developers of this technology and third-party testers will need to develop
  innovative methods of demonstrating safety and reliability.

  • Even with these methods, it may not be possible to establish with
  certainty the safety of autonomous vehicles. Uncertainty will persist.
From:

RAND Corporation Driving to Safety. How Many Miles of Driving Would It Take to Demonstrate Autonomous Vehicle Reliability? Nidhi Kalra, Susan M. Paddock

https://www.rand.org/content/dam/rand/pubs/research_reports/...
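A rough way to see why so many miles are needed is the statistical "rule of three": after N event-free trials, the 95% upper confidence bound on the event rate is about 3/N. A sketch (the baseline is assumed to be the commonly cited ~1.5 fatalities per 100 million miles; the RAND report's full analysis is more involved):

```python
def miles_to_demonstrate(rate_per_mile, k=3.0):
    """Rule of three: after N fatality-free miles, the 95% upper
    confidence bound on the fatality rate is roughly 3/N, so
    showing the rate is below `rate_per_mile` needs N >= 3/rate."""
    return k / rate_per_mile

human_rate = 1.5 / 100_000_000  # ~1.5 fatalities per 100M miles
print(f"{miles_to_demonstrate(human_rate):,.0f}")  # 200,000,000
```

And that is the best case: a single fatality resets the clock toward even more miles, which is why the report argues test-driving alone can't settle the question.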


I agree; that's the kind of comparison I think we should be making. As I said in another comment, if self-driving cars replaced humans the day they became safer than human drivers, they would become our second most deadly consumer technology next to guns. Wringing our hands over fender benders isn't a good start to the conversation. I think instead of worrying that people aren't scared enough, we should think about how hard it will be for people to accept the idea of self-driving cars killing and mutilating people every day while at the same time being a safety upgrade over the status quo they're used to.


They are not going to be fooled by a false equivalence between self-killing hammers and self-killing cars.


>The comparison numbers that Tesla parrots are between Teslas with modern safety features and in good weather conditions, and the entire rest of road fatalities, including motorcycles. Comparing similarly bodied cars in similar weather conditions, Tesla's self driving has more reported collisions per miles driven.

I'm interested in learning more about this, because I'd originally taken Tesla's figures at face value. Are there any sources you'd recommend?


I co-wrote an article [1] that covers some of the flaws in Tesla’s comparisons a few years back. Not sure if their stats have gotten any better in recent years, but the errors back then were pretty basic.

1: https://www.thedailybeast.com/how-tesla-and-elon-musk-exagge...


One of the issues raised in the article was insufficient data.

I guess there’s a lot more data now!


The best data source I'm aware of is NHTSA's Traffic Safety Facts tables, but the data is a bit hard to filter by. https://cdan.nhtsa.gov/tsftables/tsfar.htm

The raw data is available in some other form available by one of the linked sites, but I cannot find it in the rat's nest of hyperlinks right now.


Nobody who has an inkling how hacky self-driving is would try this.


I cringe whenever my friend tells me his Tesla M3 will be fully self-driving and he'll be able to earn passive income in the future.


And still you have a team of really gifted engineers that worked on this, an executive or a bunch of executives that pushed for it, and a bunch of Tesla owners that will give it a try.


It is bold to imply that what Tesla does is anything but directly harmful to the goal of ending human driving.

Tesla and Uber do a terrific job of pushing for acceptance of self-driving, while risking postponing it with their recklessness.


Maybe; it's hard to say. I think the reputation of human driving is so low that it's hard to go lower, but opinion on that is pretty polarized. As a bike commuter in a quickly growing city, I'm aware that there are a lot of people who think driving is a fundamentally safe activity that is only made dangerous by unwelcome novelties such as cyclists, pedestrians, etc. But there's also a lot of awareness that it's really a shit show: humans are terrible drivers who kill people by the thousands every year. It's such an outlier that I don't think normal safety standards apply. I think the only standard that self-driving cars should need to meet is being comparable to human drivers.

If self-driving cars replaced humans the day they became safer than human drivers, they would become our second most deadly consumer technology next to guns. Crazy to think about, isn't it? As safety improves, self-driving cars can save thousands of lives per year and still be our second most deadly consumer technology. We should get used to that idea and look forward to cranking the death rate lower and lower as technology improves. Humans behind the wheel will always be a menace.


Well they are nowhere near human drivers. And yes, human drivers are terrible.

Thing is that human drivers vary a lot. So we have excellent drivers and we have awful drivers. And also we have tired drivers, bored drivers, angry drivers and we have sporadically inattentive drivers.

If we were to aim for "at least the same number of accidents as a human per mile" that would be a pretty disastrous AI. Because the AI would be much more uniform in its performance.

A human driver will send text messages and won't even slow down before hitting a stand-still truck right in front of them. A human driver will mix up the brake pedal and speed straight through an intersection.

An AI that does comparable damage (obviously wouldn't make the same mistakes) would be pretty darn scary to sit inside or be near.

There is also the very important factor that a self-driving car needs to behave like a human so as not to cause accidents just with its presence.

Then there are the PR disasters that await: when people die because a self-driving car did something that a conscious human would never do, people won't react rationally. Tesla is just begging for a well-deserved ban on self-driving cars, attaching a stigma to them, all while killing some people.

There is no silver lining to it.


It’s straight statistics. You’re completely ignoring the harm done by human drivers; quantify that vs self driving and you’ll reconsider which is more reckless.


Taking move fast and break things to the next level


Yup. The following is from a previous HN Tesla discussion:

   Tesla Will Have Full Autonomy in 2020, Musk Says

      Tesla has an advantage here in that they
      don't feel the need for their autonomy
      to be particularly safe.
These problems and accidents are a direct result of Tesla's laissez-faire attitude toward autonomous driving. They do whatever the fuck they think they can get away with, consequences be damned.

So far, they've been allowed to get away with quite a lot.


SV will be SV, the real question is what is the NHTSA doing. The US wants to deregulate itself into chaos apparently.


Please don’t use HN’s code formatting for quoting text, as it isn’t readable on mobile. https://i.imgur.com/wiBoZyb.png


Sorry. Yeah, that's ugly.

I've used code formatting before, and I've tried to keep the lines short. But obviously not nearly short enough for some mobile users.

Let me turn it around. How would you format those two sentences for mobile, while still maintaining my intent of showing that the second was a response to the first?


The same way emails do, approximately:

> Quoted paragraph here. Many words are in this paragraph.

> Second paragraph here.

Reply here. Note that you’ll need to leave a blank line between paragraphs if you’re doing extended multi-paragraph quotes, which is unwieldy - but so are multi-paragraph quotes. For those, either quote less (if you’re replying), quote interleaved with replies (if it’s a multi-segment reply), or reply first and quote second^.

^ if it’s external content you’re including from an external source that you’d like to reference from your comment, as opposed to a footnote.

Here’s the relevant segment from the HN faq, for the record:

https://news.ycombinator.com/formatdoc

Blank lines separate paragraphs. Text surrounded by asterisks is italicized, if the character after the first asterisk isn't whitespace.

Text after a blank line that is indented by two or more spaces is reproduced verbatim. (This is intended for code.)

Urls become links, except in the text field of a submission.


I think you really have to be oblivious to the current state of affairs on the road not to understand what he's getting at. As someone riding a bicycle on city streets, he is absolutely right. It's not safe out there right now. Even the current safety statistics are skewed by the scarcity of non-car road users in most places and the amount of advanced machinery in a car dedicated to keeping people (the ones inside the car, at least) alive in a serious accident.


For the sake of argument let’s say they’re not perfect but better than human drivers, is that good enough?

Technology is inherently dangerous, but if everything that could kill someone had been held back we’d still be workshopping fire.


> For the sake of argument let’s say they’re not perfect but better than human drivers, is that good enough?

When we'll get there we can discuss it.

We're not even close.


> We're not even close.

What makes you say that? I've certainly known quite a few people who've gotten into fender benders in parking lots.


I would say the final level was reached by Boeing by actually killing people.


Ok, so, from the article:

1. Lexus backed into Tesla

2. Nearly collided, on a busy lot, both cars stopped in time.

I’m really not seeing a problem yet.

Just an alarmist, fake news article.


Are you joking?

In the first case the Lexus was in the wrong, but a human would have intervened with that loud auditory device affixed to the steering wheel, or moved further out of the way.

The Tesla freezes like a deer in headlights after it turns into the path of the car backing up. I wonder if it even signaled before turning. The fact that the owner is worried the other driver thinks they were driving leads me to believe it probably didn't do something a human driver would have; not signaling makes sense.

-

In the second case it ignored right of way and almost caused a serious crash!

In another it ran into a wall!

If anything, this article is being incredibly charitable by trying to focus on the cases where it doesn't fail...


>> In the second case it ignored right of way

This is exactly the thing that people should _not_ expect from it in the very beginning. It's the owner's responsibility to _monitor surroundings_ and make the car move when it's safe to do so. The car will navigate, but it's not full self-driving obeying all the rules of city traffic.


The real world is not a train line. A car that moves on a track while you hold a button, then stops when you let go is not safe.

Even when it is safe to move from your point of view, it doesn't exclude you from having to signal to other drivers.

It doesn't exclude you from situations that were safe but are no longer and require the car to move out of the way in a maneuver more complex than a few feet forward or backwards.

-

Not to mention all the scenarios Tesla touts in the launch are also completely contradictory to monitoring surroundings.

You wouldn't drive with a hand full of groceries, or your baby crying in your lap, because you'd be driving distracted.

Why is Tesla bragging you can control the car while dealing with these distractions when they haven't taken the steps to make it safe to do so?


Call me old fashioned, but I expect empty cars to stay put.

If I collide with an empty car that’s moving, I’m suing the owner for forgetting to engage the handbrake.


Related - if you like spending time on reddit: please consider visiting r/realtesla. Sometimes it's a bit too negative, but mostly I find it a nice counterbalance to that creepy "everything is always awesome" subreddit.


I like the Tesla cars and definitely thinking of getting one but the self driving stuff, I'll let that age for a decade or two.


Own a TM3. Unless Tesla can make "smart" summon work, the idea that they will ever have full autonomous driving down is just a pipe dream. What the car can do on a long drive is nothing short of amazing compared to any other brand, but treat it like your kid with their new driver's license: you watch over it like a hawk.

At most I use it to slide out of a space where someone got too close, usually on purpose, or as a party trick. Let it drive out of my sight, oh, hell no.


Isn’t that the intended use case? You’re not supposed to let it drive out of your sight.


It looks like it is running the same software as my vacuum robot.


Tesla. I'm disappointed. You say your cars are the safest on the road. That your self-driving AI is safer than humans. You spar with regulating bodies along those lines-- arguing that you're protecting consumers. And then you roll out a fluff feature that causes accidents.


Then you shouldn't be trusting Tesla's marketing. Did you think Elon Musk had personally made breakthroughs that ML scientists at Google are struggling over?


As much as Google likes to brand themselves as an AI company, it is pretty much all marketing fluff. What Google has actually deployed is all super simple modeling because of the scale they have to run at. They can’t deploy expensive models because they have to run across a billion+ users. They have made some contributions on the NLP side, but mostly they just have the best consumer packaging of AI.


Maybe, if "road" means freeway. There is no way the "self-driving" AI is safer than humans in the context of busy city streets, construction areas, parking lots, you name it.


This feature was pushed out way before it was ready, all so Tesla can recognize some deferred FSD revenue in Q3, which ends on Monday.


Apparently the car doesn't try to read road markings. I stand by the comment I made 2 days ago in response to that:

"Isn't that below minimum viable product for something expected to be used with other people around?"

https://news.ycombinator.com/item?id=21089373


Honestly, what's interesting here: people are rushing to try out the v1.0 software of a feature from a car company, potentially risking their expensive vehicles, because of FOMO from all the other Tesla owners posting about Summon before them.


I love Musk but this is a bad feature because he hasn't looked at the actuarial case for not having this. TONS of car accidents happen in parking lots.


This seems like a huge opportunity for insurance fraud like you see often in Asia. Someone arranges to be struck by a car (ie flings themself in front of it), then splits the medical payout with a crooked doctor.


Most of these look like user stupidity rather than anything wrong with the car. Especially the video of the guy who tries to summon his car across a road.

I imagine summon is pretty dumb, but all Tesla's talk of autopilot and self-driving cars has made people believe it is a lot smarter than it actually is.


If it's a user-facing feature, and it fails catastrophically when some preconditions aren't met, and it doesn't have the ability to verify those preconditions, then allowing users to invoke it is probably not user stupidity. It's just poor design.

The same topic is being explored in the 737 MAX clusterfuck where Boeing failed to provide sufficient safeguards OR sufficient training for non-expert users to manage catastrophic failure. People are stupid, yes. But you can't blame people being stupid as an excuse for insufficient safety engineering, just like you can't blame drag force being proportional to the square of the velocity. These are just the constraints we have to engineer around.


True! I have the impression that Tesla is trying to "innovate" itself out of the corner they find themselves in. It's been said multiple times already, but there are valid reasons why the automotive sector works the way it does. Sure, as everywhere, a lot of stuff is just there because it was always there. Go ahead and break that. But first understand which rules can be ignored and which cannot.

Kind of worked for SpaceX, which ignored all the political bullshit in aerospace. I'm slowly wondering whether that was luck or intention, though.

Just how the latest Tesla features ever became road legal is beyond my understanding, though.


A lot of stuff is just there because it was always there. Go ahead and break that. But first understand which rules can be ignored and which cannot.

Chesterton’s Fence:

In the matter of reforming things, as distinct from deforming them, there is one plain and simple principle; a principle which will probably be called a paradox. There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.”

—G.K. Chesterton, “The Thing,” 1929


Tesla vehicles are the safest ever made - I'd say allow Tesla to innovate away from whatever stagnancy the status quo auto manufacturers have settled into. User education is part of good design, and perhaps driver tests will need to start including proper use of such systems. The features are road legal likely because the technology is new and laws haven't been discussed or written yet; it takes time for the regulatory process to work.


> Tesla vehicles are the safest ever made

Citation needed.


I mean, I guess I can clarify that they're the safest mass consumer vehicle?

I'll let you read through these links to decide where you stand:

https://www.greencarreports.com/news/1124439_nhtsa-to-tesla-...

https://www.cnn.com/2019/08/07/business/tesla-model-3-safety...

...

It sounds like the regulatory body doesn't like that Tesla is claiming a 5.4 rating out of 5, since they say they don't give out ratings higher than 5, and Tesla appears to be using public data points to calculate that 5.4. We'll have to wait and see, if this goes to court, whether Tesla's method for stating its 5.4/5 rating is unreasonable, or whether the guidelines are the unreasonable part for "capping" what rating a vehicle can reach. As Tesla states in one of those articles, ~40% of vehicles have a 5-star rating, so they argue it's important to be more specific to allow consumers to better differentiate.


Interesting comment and one that conflicts with what most automakers are saying in regards to where the industry is going.

This thinking is what may cause many automakers to fail.

https://www.automotiveworld.com/articles/is-the-auto-industr...


Let’s say you make an amazing new consumer gas central heating boiler, and shout about it on Twitter to your adoring followers. And if someone switches it on without opening the water valves, it explodes and kills them. You say “oh, the manual says not to do that”. How do you think that would fly?

Consumer products have to be designed to be safe against consumers, to a large extent. You can’t just add a feature, add a disclaimer that says, in effect, “don’t use this”, and blame the user if they use the feature.


I can't tell whether this argument is sarcasm or not. I don’t have a boiler, but I have a water heater, and I know that if I don’t manually relight the pilot light after having shut off the gas, I could die.

Technology has always required basic understanding to avoid harm.


How is it 'user stupidity' when customers are using a feature as it is advertised, especially since the distances traveled are absolutely trivial?


The car only moves forward as long as you hold a button on your phone - as soon as you let go it stops. You are supposed to observe the car moving at all times. So logically, if you see that the car is about to collide with something, you let go of the button and the car stops. What else is it if not user error then?
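The press-and-hold interaction described above amounts to a dead-man switch. A minimal sketch of that control logic (my own illustration with hypothetical names, based only on the behavior described in this thread, not Tesla's actual code):

```python
class SummonController:
    """Toy model of a dead-man-switch summon loop (hypothetical)."""

    def __init__(self):
        self.moving = False

    def tick(self, button_held: bool, obstacle_detected: bool) -> bool:
        # The car creeps only while the app button is held AND the sensors
        # report a clear path. Releasing the button, losing the connection,
        # or detecting an obstacle all read as "not held / not clear" and
        # stop the car on the next tick.
        self.moving = button_held and not obstacle_detected
        return self.moving


ctrl = SummonController()
print(ctrl.tick(button_held=True, obstacle_detected=False))   # True: moving
print(ctrl.tick(button_held=False, obstacle_detected=False))  # False: stopped
```

The point of this design is that the safe state is the default: the system must continuously re-affirm "held and clear" to keep moving, so any failure halts the car rather than letting it continue.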


> if you see that the car is about to collide with something

What about when the car is about to collide with something you cannot see from a remote point of view, or when the positions and/or velocities are distorted from the longer perspective? There are many common situations where hazards that would usually be visible from the driver's seat are obscured when looking at the car from a distance.

Also, people generally have to be trained and pass a test demonstrating basic competency at driving and pass a vision test. These tests don't include the slightly different skills/vision required to safely judge driving a car from a distance, so why would you assume a licensed driver is even capable of operating dangerous machinery to do something outside the scope of that license?


If you can't see enough to know whether your car is going to crash, maybe you shouldn't drive it [i.e. don't tell it that it's safe to continue]?!


How can you gauge approach angles properly from outside the cabin?

It’s BS.


If you can't then you shouldn't be using this feature, simple.


Your eyes are a couple of inches apart. There are geometric limits to our ability to perceive depth and therefore angles.

You can’t, I can’t, no one can perceive angles accurately from outside the car at the distances that keep the feature safe and make it useful (50 yards out).
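The geometric claim above can be checked with back-of-the-envelope stereoscopy arithmetic. This sketch (my own numbers, not from the thread) uses the standard disparity relation ΔZ ≈ Z²·δθ/b with a typical ~65 mm interpupillary baseline and ~10 arcseconds of stereo acuity:

```python
import math

def stereo_depth_resolution(distance_m, baseline_m=0.065, acuity_arcsec=10.0):
    """Smallest depth difference resolvable by binocular disparity alone,
    via dZ ~= Z^2 * d_theta / b. Baseline and acuity are typical textbook
    values, not measurements."""
    d_theta = math.radians(acuity_arcsec / 3600.0)  # arcsec -> radians
    return distance_m ** 2 * d_theta / baseline_m

# At ~50 yards (45.7 m), pure binocular depth resolution is already ~1.5 m:
print(round(stereo_depth_resolution(45.7), 2))  # 1.56
# From the driver's seat, a few meters from an obstacle, it's millimeters:
print(round(stereo_depth_resolution(2.0), 4))
```

Under these assumed values, binocular depth judgment at summon distances is coarser by three orders of magnitude than from inside the cabin, which supports the commenter's point (though people also use monocular cues like motion parallax and familiar size, so this is a lower bound on the difficulty, not a full model).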


Most parking involves one or the other end of the vehicle being close to something else. How do you stop in time if it starts out going the wrong way?

That sounds like the issue with the chap who tried it in his driveway.

(My feeling is that if the feature isn't more reliable than a 95th percentile human for the subset of functionality, it probably shouldn't be released.)


I can't believe that this is a real thing.

Are we so lazy that we can't walk to our cars now?


We waste vast amounts of space on parking spots. We destroy perfectly good land just in case someone might want to park there.

This could change the shape of shopping and make the world a better place.


And? This doesn't do anything about that if you still need to be line-of-sight.


That won't always be the case.


And so if a person accidentally drops their phone, the car eventually stops when it meets an immovable object? Is this correct?


No, if you release the button it'll stop (barring some kind of fault). It will also stop before hitting anything (again, barring some kind of fault).

It's not ready, though. The Tesla might not technically have been at fault in this clip, but driving is full of situations where one party makes a mistake, and another takes corrective action to prevent an accident.

A self driving system that is never "at fault" but crashes regularly into drivers that are at fault is not good enough. It needs to drive defensively.

The timing of this release is suspiciously close to the end of Q3. It concerns me that this was rushed out the door early to recognize some revenue.

https://twitter.com/DavidFe83802184/status/11777611732713922...


> No, if you release the button it'll stop

OIC, thanks for the info that I was too lazy to DuckDuck


Seems like you have to keep pushing a button, so it should stop as soon as the phone is let go.


AI taking over and teasing owners.

Or just a bad idea for fat and lazy people.


Pretty much - ppl are too lazy to walk to their bloody cats and drive their own cars because they want to keep staring at the car video they’re watching


It was a freaking typo guys, Jesus Christ. iPads suck to type on sometimes.


The carelessness and dangerous way this entire thing has been "rolled out" is alarming.

People are touting this as some sort of "amazing" feature, I think this guy Elon Musk is a crazy whackjob who puts people's lives at risk.

All the videos I saw from Tesla were in ideal, well-lit situations; well, of course! University graduate students have Legos and basic robotics that do that crap. Getting it to work in "normal" situations (read: edge cases) is the most difficult part. And they conveniently just ignored that part.

It's insane people are putting these death machines on public roads.

Before people claim "this is just a parking lot", imagine how many parents push strollers with babies in them through parking lots, or how many kids run around in them.


I think "death machines" is a bit hyperbolic, the speed is capped at 5mph, and I imagine that Tesla is better at avoiding hitting humans in parking lots than a normal driver since it has sensors all around it. Avoiding hitting people isn't the hard part, navigating in a predictable way and following the unwritten expectations of driving in a parking lot is, and that is something Tesla doesn't seem to do a great job of yet.


Yes, you have a couple thousand pounds of "self-driving" technology that does not do well in parking lots (that's like 95% of where I usually park my car). It's definitely not going to hit or kill anyone. Also, it's 100% the responsibility of pedestrians to look out for cars.

Give me a break. You can either handle what your feature advertises or you cannot. This is in the cannot bucket - the same way autopilot is in the cannot bucket.

The sad story with Tesla is that they should probably focus on the one thing that works beautifully on the car and that is the fact that it's electric. Focus on making a beautiful, insanely fast electric car. People love it already - just stop w/ the gimmicks.


There is a big difference between not doing well in parking lots and being dangerous in parking lots. From the videos I have seen, it doesn't do well in parking lots because it is too cautious - it is safe, but not very functional. Of course we have to wait longer before determining whether it is safe or not, but I would imagine that it is not hard for Tesla to avoid hitting people, and if it does hit someone, it is only going at walking speed anyway.


To me doing unpredictable things, in a parking lot, is dangerous. I would like to not be hit, period.

And good luck with the liability when accidents happen. You explain that it was the car, and the car manufacturer explains it was you. In the meanwhile, I'm in a wheelchair or worse for the rest of my life.

I am also not a big fan of experimenting with whether something is safe using other people's lives. I would think that if you're going to put something out there, you will be liable if it's malfunctioning.

You want to have really cool features that don't work in the real world? Play with them on your private estate - just don't force people to be guinea pigs.

Also as a side rant: we live in a world where a significant chunk of people doesn't have clean water or access to internet, but hey I can remotely summon my car! (but have to be sure it's in my line of sight...). How about we focus on giving everyone clean water and internet and stop with the gimmicks?


The feature is in beta, and it is enabled on cars lacking the latest hardware. I think it would be reasonable to expect issues especially for those cars.

I know it's very popular to cherry pick incidents and attack Tesla these days, but, nobody else is pushing the envelope. And nothing about this feature is putting anyone at significant risk. These are inconveniences that have occurred, and nothing else. In the meantime, the pursuit of self driving cars is pushed forward greatly through the collection of this data.

In the end, this will benefit everyone.

For now, we can sit back and enjoy the cringey videos this will produce.


> nobody else is pushing the envelope

Many other companies do. They just stop short of releasing obviously substandard and dangerous technologies into the wild.

That's what test tracks were built for, not public streets.


To add to this, there are a number of production cars today with level 2 automation equivalent to or better than Tesla's. None of them made a big fuss about it.

The only envelope Tesla is pushing with 'autopilot' is in misleading marketing that has been directly linked to drivers misunderstanding the capabilities/limitations of the technology.


It's bad enough to share the road with bad drivers now I have to share the road with poorly written software.


You think cars were running perfect software before Tesla?


No, but they didn't drive alone.


> The feature is in beta

I. Don't. Care.

The way I see it: if someone's Tesla crashes into my car in a parking lot, then as far as I am concerned it is 100% the fault of the owner, and the owner should compensate me 100%.

The fact that it wasn't technically their fault but instead it was bad software matters nil to me. To me the only person I interacted with is the Tesla owner, and that's who I will put blame on. I don't care if the owner is able to defer blame to someone else. They can do that, but I want "mine" from the owner.


> 100% the fault of the owner and the owner should compensate me 100%.

The owner's fault?? Nope--I feel it's Tesla's fault... entirely. Well, them and the DOT for allowing this experiment to be carried out on public roads. Meanwhile, Google still asks me what a truck on a hill looks like via captchas... oh yeah, this self-driving tech is ready to go!


What part or letter of the word "beta" don't you understand? Can't you just not use the functionality until the company says that it is complete? There is plenty of other functionality in a Tesla car, why not use that?


The Public road is not a beta testing ground.


The comment you’re replying to is talking about their non-Tesla being hit by a Tesla operated by somebody else. How do we force other drivers to not use this feature? What if it wasn't released until it was less likely to collide with other people's property?


consolasfont is talking from the POV of the non-Tesla owner who got hit by a Tesla.


So you are fine with beta testing safety relevant features in the wild which could harm, injure and even kill people? You know that cars are not some app?



