I don't see why AI wouldn't be able to copy that 'personality'.
Everything AI can read or watch, it can copy. So if someone shows their personality in YouTube videos, on Instagram, or even on TikTok, you can take all that video, feed it into an AI, and get an enhanced clone.
1. We are decades away from an AI being able to livestream convincingly, cheaply enough for the average person to use it.
2. At that point, identity systems will prevent an AI from just copying another creator’s content and passing it off as their own. YouTube, Facebook etc. have an interest in keeping real humans on their platforms and not just bots.
They have an interest in keeping real humans consuming content because they’re the ones watching the adverts. The creators are just a byproduct of the pipeline.
If YouTube could replace humans with AI, and thus reduce all that annoying overhead that comes with dealing with creators, and do so without reducing the number of ads that are watched, then I bet Google would be ok with it.
And what, every creator with a public presence and YouTube channel is going to be okay with Google stealing their identity?
This is a sci-fi fantasy with no connection to reality. In the real world, IP matters a lot and lawyers determine what’s allowed on YouTube, not demigod AIs.
Why would Google/YouTube need to copy existing identities? Given a sufficiently powerful AI, it could create a whole new world of characters and content.
I also think your comment is an example of the just-world fallacy (https://en.wikipedia.org/wiki/Just-world_fallacy). Google/YouTube has vastly more resources, lawyers, and lobbying power than individual creatives.
What do you think Hollywood actors, studios, and agents are going to do in this situation? Just disappear and be fine with Google copying their work and destroying their careers?
Not to mention the fact that the minute it becomes obvious to society writ large that YouTube is not filled with real people, and is only AI fake bots, it will immediately become passé and uninteresting to the vast majority of people. Humans like humans, not facsimiles of humans.
Way too much sci-fi AI worship on this site sometimes, it borders on deification. “An infinitely powerful being will have no restrictions.” Human culture isn’t a math problem that can be replaced and solved with computing power.
Google can’t even seem to make search results that work anymore. Color me dubious that they can create thousands, millions of unique characters and content that out-compete real people.
I’ve actually worked on GenAI in Hollywood so I have some insights here.
GenAI is making inroads there but only as tools. Like Avid plugins that enhance productivity rather than replacing the actors themselves.
However, Hollywood is a very, very different industry from the likes of YouTube. You cannot discuss the two industries as if they're equivalent. Stuff that would be banned in Hollywood would be perfectly acceptable on YouTube, and vice versa.
Google et al are already stealing their identities by using their content as part of the ML training data.
By the way, I wasn’t suggesting that YouTube would replace real content creators with AI copies of their originals. I was just making the point that YouTube doesn’t make its money from content creators; it makes it from advertising. So we shouldn’t assume that Google will defend the rights of those creators.
We have models now that will generate not-quite-convincing video of a talking head in real time on a consumer GPU. Given the pace of developments in the field, I think it's utterly ludicrous to suggest that getting this type of model from "not quite right" to "good enough to fool a lot of people" will take decades, when so many other types of model have achieved that feat in a matter of years or months.
I don’t find that to be “not quite convincing” at all. It’s pretty obviously fake, and not comparable at all to a real person reacting in real time in their own personalized way, to slang or other internet interactions.
1. I'm a computer science graduate and work in IT, although the boring enterprise kind. ~5 years ago I would have easily signed a document vouching that we would not have the kind of interactive experience ChatGPT gives us today within my lifetime :-). So declaring that automatically generated content of any kind is "decades away", while the field is wildly unstable, seems like a risky prophecy. I'll take that bet :-).
2. Social networks don't have any interest whatsoever in keeping creators human. Creators are the cost; viewers (or better yet, paid subscribers) and advertising buyers are the revenue. If the majority of people are happy to consume generated content, social networks are happy to indulge. We on HN are assuming and hoping for some kind of human revolt, but we have historically been terrible at predicting such things. The majority of people don't care a whit about the stuff HN cares about deeply (see: privacy, security, open source, etc.).
(This is not a hypothetical as far as I'm concerned: Spotify already promotes popular lo-fi / sleep / working-music channels that are generated cheaply by AI, or by session musicians who are one step away from being replaced by it. And YouTube and Facebook are full of clips that merge ChatGPT-generated text, AI voiceover, and short AI video, today, not decades from now.)
The ecosystem is dependent on real people. The idea that audiences are going to just sit by and watch AI generated content without any desire to connect to other human beings is a fundamental misunderstanding of culture.
I wish I had your optimism; given how many people enjoy, praise, and share AI content today, I'm worried. For example:
* Completely unrealistic fake rescue videos
* Drawings, photos, and paintings presented as someone's own work
* Videos which are AI voices from an AI script with AI animations
* And let's not get into the news/"news" aspect of it all
I fear Idiocracy is more in our future, but here's hoping you're right :)
> YouTube, Facebook etc. have an interest in keeping real humans on their platforms and not just bots.
They want a human audience, but I suspect they are fine with bot creators and are probably working on that themselves.
Ever notice all those channels on YouTube and Spotify that are hours and hours of lofi music? They aren’t all human and some of them have big audiences.