
If you end up creating something sufficiently similar, yes in fact you do. Or rather, you have done a copyright infringement and retroactive payment may be one of the remedies.

This also applies to AI, just worse because:

A) AI is not a human brain, and pretending that the process of human authorship is the same as AI is either a massive misunderstanding of the mechanics and architecture of these systems, or plain disingenuous nonsense.

B) AI has no capability of original thought. Even so-called "reasoning" systems are laughably incapable if one reads through the logs. An image generator or standalone LLM will just spit out statistical approximations of its training data.

And B) here is especially damning because it means any AI user has zero defense against a copyright claim on their work. This creates enormous legal risks.

The model for copyright trolling is trivial: take a corpus of open-source code (GPL if you wish to be petty, though nearly all other licenses still demand attribution at minimum), then run a search against all the code generated by AI bots on GitHub, or against any repo with AI tooling config files in it.
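To make that concrete, here's a minimal sketch of the kind of scan described: index the licensed corpus as normalized line n-grams and flag any generated file that reproduces a long enough run verbatim. (This is a toy assumption of exact-line reuse; real clone detectors are fuzzier and token-based.)

```python
# Toy version of the scan: flag generated code that copies any
# n consecutive (whitespace-normalized) lines from a licensed corpus.

def normalize(code: str) -> list[str]:
    """Strip indentation and blank lines so reformatting can't hide a match."""
    return [ln.strip() for ln in code.splitlines() if ln.strip()]

def ngrams(lines: list[str], n: int) -> set[tuple[str, ...]]:
    """All runs of n consecutive normalized lines."""
    return {tuple(lines[i:i + n]) for i in range(len(lines) - n + 1)}

def flags_overlap(corpus_code: str, generated_code: str, n: int = 5) -> bool:
    """True if the generated code reproduces n consecutive lines of the corpus."""
    return bool(ngrams(normalize(corpus_code), n)
                & ngrams(normalize(generated_code), n))
```

A real troll would index millions of files and tune n to balance false positives, but the mechanics are that simple.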

Won't be long before the FSF does something similar.



But open models are only about 8 months behind closed models, so even aggressive copyright enforcement would only create an 8-month delay.

This is essentially a LimeWire problem. And OpenAI is essentially Spotify.

Even with revenue sharing, 99% of artists will get nothing (just like streaming), and total revenue will be far lower than in the record era (again, just like streaming).

Only IP giants like Disney would see any real income.



