
The “failing hilariously” bit is critical for this road-tripping use case.

It’s only going to take one bad suggestion that leaves someone in a dangerous situation for people to lose faith in handing a whole day’s itinerary over to an LLM. Honestly, that can go very wrong very easily.

We are decades into the GPS navigation era and I still don’t trust the route my vehicle suggests. I have been burned so many times that we literally still compare routes from different providers for a new trip.




> I have been burned so many times that we literally still compare routes from different providers for a new trip.

I hear this often, but what has been the issue in practice? The worst that has happened to me is Google Maps suggesting I cross a bridge that was washed away by the last typhoon, but that's hardly Google's fault.

Only in very remote places has Google Maps failed me, at least for driving directions (for trails it's another story...)


> It’s only going to take one bad suggestion that leaves someone in a dangerous situation

I feel like if one bad suggestion can leave somebody in a dangerous situation, many other things must have failed beforehand, such as informing oneself about the general condition of the roads in that area and the current season, or having a fallback plan in case digital navigation fails or a road is unexpectedly closed.


You expect the human to do all the actual work of planning the trip, but hand only the interesting parts to the LLM?

If you are doing all that, why do you need an AI?


