I've encountered LLM-generated comments that don't reflect what the code is doing at all, or, worse, subtly describe the code inaccurately. The most insidiously disenchanting code I've ever seen has been exactly of this sort, and it's being produced by the boatload daily now.
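To make that failure mode concrete, here's a minimal hypothetical sketch of the kind of subtly wrong comment I mean (the function, its name, and the docstring are invented for illustration, not taken from any real codebase):

```python
def trim_outliers(values, k=1.5):
    """Remove values more than k standard deviations from the mean."""
    # The docstring above is subtly wrong: the code below actually applies
    # the IQR (interquartile range) rule, not a standard-deviation cutoff.
    # Anyone tuning k based on the comment will misjudge the thresholds.
    values = sorted(values)
    n = len(values)
    q1, q3 = values[n // 4], values[(3 * n) // 4]  # approximate quartiles
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if lo <= v <= hi]
```

The code runs fine and the comment sounds plausible, which is exactly what makes it insidious: nothing breaks until someone trusts the description instead of reading the implementation.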
I really don't understand what is going on. I try to: I read the papers and the threads, and I think about it. But I can't figure this out.
How can it be that people expect that pumping more energy into a closed system could do anything other than raise its entropy? Because that's what this is. You attach GPU farms to your code base and make them pump code into it? You're pumping energy into a closed system. The result cannot be anything but greater entropy.
Hmm... In theory, the closed system includes a database of most of humanity's written works, and the people who know how the thing works expect it to push some information from that database into the code. (Though I would argue that the people who know how the thing works barely use it.)
The reasons LLMs fail so often are not related to the fundamentals of "garbage in, garbage out".