That's meaningfully different. "Can't be copyrighted" doesn't mean "can't be sold" or "someone else owns the copyright". It just means someone can copy and resell the generated portions without payment or licensing.
I'm not sure. I'm not an expert, but it doesn't seem that different from including public domain text and art in your game.
I assume that, if it is true that Valve isn't allowing games with generated images, it's because (they feel) the legal status could change, not because of the current status.
There's also a quality argument. If Valve lets a bunch of slapdash AI hack jobs onto the store that were developed in a week by people who don't know anything about game development, and that makes it harder to discover well-made games, that's a meaningful business risk for them. They're responsible for curating the Steam store.
That is a shallow summary of their opinion, one that has been regurgitated out of context in headlines, and it misses their point. The Copyright Office's opinion is better summed up as:
1. Copyright protects work that humans create
2. Humans sometimes use tools to create their works; that is okay
3. Y'all make up your mind whether your AI is some sentient being or whether it's just a tool. We're just lawyers.
If the wind blows and your typewriter falls off a shelf and writes a novel, it isn't subject to copyright either. That doesn't mean that all works written using a typewriter aren't subject to copyright. It means a human must be part of the creative process.
But what if the wind blows and my laptop falls off a shelf and writes the source code for Windows 95, but reindented, with some implementation details and variable names changed?
It’s pretty clear that the “neural networks are just a tool” ruling is going to have to be revisited eventually (and probably soon).
> But what if the wind blows and my laptop falls off a shelf and writes the source code for Windows 95, but reindented, with some implementation details and variable names changed?
Simple. If it wasn't created by a human, it's not eligible for copyright. The law is quite clear about this.
Microsoft gets the copyright to Windows 95 because they wrote it with humans. You wouldn't get it because you didn't write it. Your laptop wouldn't get it because it isn't a human.
> It’s pretty clear that the “neural networks are just a tool” ruling
I think you misinterpreted the above. There is no “neural networks are just a tool” ruling.
The copyright office never said neural networks were or were not a tool.
They said if a human makes a creative work, and they happen to use a tool, then it is eligible for copyright. As it always has been.
All they said is what every lawyer already knows, which is that a work has to have an element of human creativity in order to be eligible for copyright.
But if my laptop’s implementation of Windows 95 is not eligible for copyright protection, then I can freely redistribute it, because no one can use copyright law to stop me, in a runaround of Microsoft’s copyright on Windows 95 (which the laptop-generated version is clearly a derivative of).
This is exactly the ambiguity Valve is concerned about.
But the hypothetical world in which your laptop falls off a shelf and randomly writes Windows 95 is a fake one.
LLMs aren't random number generators running in isolation.
They're trained on copyrighted material. If they regurgitate copyrighted material, we know where it came from. It came from the training material.
Valve is rightly concerned that non-lawyers have no clue what they're getting themselves into when using the current generation of AI models. The inability to determine whether an output is a substantial copy of an input is not a free pass to do whatever you want with it, it's a copyright infringement roulette.
There are way too many people in this industry who believe that building a technology which makes compliance impossible is the same thing as making compliance unnecessary.
> US Copyright Office has stated unequivocally that AI works cannot be copyrighted, or otherwise protected legally.
The “or otherwise protected legally” piece is outright false (and would be outside their scope of competence even if they had said it). The other part is true but potentially misleading: a work cannot be protected to the extent that AI, and not the human user, “determines the expressive elements of the work”, but a work made with some use of AI, where the human user does determine those elements, can be protected to the extent of the human contribution.
The duty in the same guidance to disclose elements created by generative AI is going to prove unworkable too, as generative AI is increasingly embedded into toolchains alongside other features, not sharply distinguished from them, and used in nontrivial workflows.
US Copyright Office has stated unequivocally that AI works cannot be copyrighted, or otherwise protected legally.
The US Patent Office is studying the effects of AI on the patent system and asking citizens and businesses for comment.
If that’s not enough for you, I don’t know what would be.