Hacker News

ChatGPT gets the rules of the Pokémon trading card game wrong. It will tell you, convincingly, that you can use 4 energy a turn. Not sure how it hallucinates this. The rule is 1 per turn.



A few days ago I asked ChatGPT if “pannekake” and “kannepake” are anagrams of each other.

It correctly stated that they are, but when it went on to prove it, it generated a table of the frequencies of the individual letters in the two words, and the table looked like this:

    Letter | Frequency in | Frequency in
           | “pannekake”  | “kannepake”
    - - - - - - - - - - - - - - - - - - -
    a      | 2            | 2
    e      | 2            | 2
    k      | 2            | 2
    n      | 2            | 2
    p      | 2            | 2
This reminded me that yes indeed, AI just isn't quite there yet. It got it right, but then it didn't. It hallucinated the frequency count of the letter "p", which occurs only once, not twice, in each of those words.
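The correct table is easy to verify by hand, or with a quick Python sketch like this one:

```python
from collections import Counter

# Letter frequencies in the two words, and a check of the anagram claim
c1 = Counter("pannekake")
c2 = Counter("kannepake")
print(c1 == c2)   # True: same multiset of letters, so they are anagrams
print(c1["p"])    # 1, not 2: "p" occurs once in each word
```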


Anything that has to do with individual letters doesn't work well, but as I understand it, this is an artifact of the tokenization process. E.g. "pannekake" is internally 4 tokens: pan-ne-k-ake. And I don't think the mapping between tokens and letter sequences is part of the training data, so the model has to infer it.
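A minimal sketch of the point, using the hypothetical token split from the comment above (the real boundaries depend on the model's actual tokenizer):

```python
from collections import Counter

# Hypothetical split into pan-ne-k-ake, as described above;
# a real tokenizer may choose different boundaries.
tokens = ["pan", "ne", "k", "ake"]
word = "".join(tokens)  # "pannekake"

# A letter-frequency count is only straightforward once the tokens are
# joined back into characters, a view the model never directly sees.
print(Counter(word))  # "p" appears once; a, n, e, k appear twice each
```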


Could it have been referencing Blastoise's Deluge ability? Jacob Van Wagner used it in the 2015 championship to attach 4 Water Energy in one turn.


I just asked it, and it said you can attach 1 per turn. And then it continued with something about using Supporter cards to search for more Energy cards, and Trainer cards to switch them. (Which it also counts as using or playing those Energy cards.) Not familiar with the actual rules, though. :)


Ah, I was using my friend's server, which runs a slightly different model. Thanks. It's one of the davinci models, I think? Don't know much; it's code-oriented. So I guess it's not "ChatGPT" but a GPT model he built a chat on.


Isn't it just that garbage went in, got weighted as a more reliable source than it should have been, and thus garbage came out? Good old GIGO... It's just that ChatGPT, as much as I love it, is amazing at imparting the impression that its shit don't stink.





