
Yes, wholly agree. The special part is language itself. Both humans and AIs rely massively on language, so it's no wonder AIs can spontaneously solve so many tasks. The secret is in those trillions of training tokens, not in the neural architecture. Almost any neural net will do; even RNNs work (RWKV). People are still hung up on the "next token prediction" paradigm and completely forget the training corpus, which reflects a huge slice of our mental life.
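
For concreteness, the "next token prediction" objective itself is almost trivially simple. A rough sketch (assuming a PyTorch-style model; the names are just illustrative, not any specific codebase):

    import torch
    import torch.nn.functional as F

    def next_token_loss(model: torch.nn.Module, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) integer ids sampled from the training corpus
        inputs, targets = tokens[:, :-1], tokens[:, 1:]
        logits = model(inputs)  # (batch, seq_len - 1, vocab_size)
        # score each position's predicted next token against what the corpus actually says
        return F.cross_entropy(
            logits.reshape(-1, logits.size(-1)),
            targets.reshape(-1),
        )

This loop looks the same whether model is a transformer or an RNN like RWKV; everything interesting lives in tokens, i.e. in the corpus.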

People and LLMs are just fertile land where language can make a home and multiply. But language comes from far away and travels far beyond us. It is a self-replicator and an evolutionary process.



