
When I ask ChatGPT4 what the length of its context window is, it tells me (4096 tokens).

When I ask Gemini, it basically tells me "it depends" with a few paragraphs of things I generally don't care about and then suggests I ask for a ballpark estimate (1k - 3k tokens).




Beware of hallucinations with this kind of question. An LLM doesn't have knowledge about itself unless the developers fed that information into a system prompt somewhere on the backend, or unless it's Internet-connected and searches for information about itself. Developers often do include basic facts like the model's name and that it's an AI, but the context window starts veering into advanced details, so I'd much rather rely on the service's official documentation in this case.
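
To make "fed into a system prompt" concrete, here's a minimal sketch assuming the OpenAI Python SDK; the model name ("gpt-4o"), the 128,000-token figure, and the prompt wording are placeholders I'm assuming for illustration, not anything the vendors actually ship. The point is only that the model can repeat what the operator put in front of it, nothing more.

    # Hypothetical backend snippet, assuming the OpenAI Python SDK (pip install openai).
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # The backend decides what the model "knows" about itself by stating it here.
    # The window size is a placeholder chosen by the operator, not queried from anywhere.
    SYSTEM_PROMPT = (
        "You are an AI assistant. "
        "Your context window is 128,000 tokens."
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name, for illustration only
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": "How long is your context window?"},
        ],
    )

    # With the system line present, the answer echoes the injected figure;
    # without it, the model can only guess from its training data, which is
    # where stale numbers like "4096 tokens" tend to come from.
    print(response.choices[0].message.content)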



