Hacker News

Not really: it's arguably quite a lot worse, because you can judge the trustworthiness of the source when you follow a link from Google (e.g. I place a lot of faith in pages at an .nhs.uk URL), but nobody knows exactly how a specific LLM response was generated.


Many of the big LLMs do RAG and will provide links to their sources, e.g. Bing/ChatGPT, Gemini 2.5 Pro, etc.
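For readers unfamiliar with the pattern: RAG (retrieval-augmented generation) means the system first retrieves relevant documents, then feeds them to the model as context, so the answer can cite the URLs it actually drew from. A minimal sketch, with a toy corpus and a naive word-overlap retriever standing in for a real embedding index and LLM call (all names and data here are hypothetical):

```python
# Toy sketch of RAG with source links. The corpus, scoring function, and
# answer step are stand-ins; a real system would use an embedding index
# and pass the retrieved context to an LLM API.

from dataclasses import dataclass

@dataclass
class Doc:
    url: str
    text: str

CORPUS = [
    Doc("https://www.nhs.uk/conditions/flu/",
        "Flu symptoms include a sudden fever, aches and a dry cough."),
    Doc("https://example.com/gardening",
        "Tomatoes grow best in full sun with regular watering."),
]

def retrieve(query: str, corpus: list[Doc], k: int = 1) -> list[Doc]:
    """Rank documents by naive word overlap with the query."""
    words = set(query.lower().replace("?", "").split())
    scored = sorted(
        corpus,
        key=lambda d: -len(words & set(d.text.lower().split())),
    )
    return scored[:k]

def answer_with_sources(query: str) -> dict:
    docs = retrieve(query, CORPUS)
    context = "\n".join(d.text for d in docs)
    # In a real pipeline, `context` plus the query would go to an LLM here;
    # the retrieved URLs come back alongside the generated answer.
    return {"answer": context, "sources": [d.url for d in docs]}

result = answer_with_sources("what are flu symptoms?")
print(result["sources"])
```

The key point for this thread is that the cited URLs let the reader apply exactly the source-trust judgment described above, rather than trusting the model's unattributed output.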



