To be more exact, the point was that the materials LLMs are trained on are pre-filtered by human perception, so it only makes sense that they converge on representations of reality as filtered by human perception.