
I see a lot of negativity, but this is actually one of the first things I tried to do with ChatGPT when it got Bing integration. I was in an unfamiliar town for a wedding and I had forgotten my tie. I asked ChatGPT where I could buy one downtown. (If you're curious, it utterly failed, with two of the three suggestions not selling men's clothing at all, let alone ties.) I would love to be able to be on the highway and ask how far the next rest stop is or if there are any places coming up where I could get a sandwich. Google has that information, including menus in many cases, but it's difficult to access, especially while driving.



What do you think an LLM should be able to do better than regular search results (possibly with speech-to-text input and text-to-speech output, which is how the interface to an LLM would work anyway) for this kind of use case?

If the answer is "regular search results are crap because of irrelevant sponsored placements being put before the thing I want", that's not a function of the search engine; that's a revenue-maximising choice the search engine provider makes... so why do you think that won't happen to LLM results from the company that provides them?


It would be excellent if it worked. Considering generative AI's current reputation, the skepticism is entirely warranted.


I was looking to buy a backgammon board locally. Searching Google was unsuccessful, but Google Maps oddly worked for me (excepting one big-box store which only had them online...).


Like the 10 previous AI-related Google announcements, this sounds great, but is unlikely to ever materialize.



