That is honestly just the same old big tech story I must have read a thousand times by now.
Sure, it is even a little more sinister than usual, but that's just because the company in question serves a market with some very dark corners, like the assault featured in the article.
People here argue pro or contra porn prohibition or platform specifics, but it comes down to this: The Western world decided that giant multinationals sitting above countries, laws, ethics and responsibility holding all the data are the ultimate expression of the American Dream and should therefore exist undisturbed forever, growing bigger and bigger. Shit like this is just fallout from the gargantuan power distance between megacorp and human you get, by design.
Sure, there are probably ways to get PornHub to be sorry for that, make amends, change some rules. Whatever. Last week it was Uber discriminating, here it's PornHub violating, and maybe next week it will again be someone whose business gets destroyed because he associated with a scammer in 2004.
Every time there is juicy drama and a micro-outcome in this or that direction, but for some reason the macro-problem gets mostly ignored.
It is hardly just changing some rules. Pornhub killed off the majority of their content, and removed the permissive copyright system that enabled them to become dominant in the market in the first place.
Mindgeek has a very interesting corporate past. This FT article is the closest I’ve ever found to an investigation but honestly, it reads more like a really bad soap opera than an FT article:
Well, up to a point; there are plenty of laws which apply to anyone deemed to be marketing to the US, especially financial services and online gambling, and Megaupload got raided by the FBI despite being in New Zealand.
That's true, but I don't know if that necessarily means U.S. law applies to those countries, as much as allies will oblige in certain cases such as the ones you mentioned and Huawei. And when I say I don't know, I very much actually don't know.
They're just following in the footsteps of the precedent set by the DMV et al., who themselves are standing on a long and storied tradition of bureaucratic ineptitude. Heck, this crap goes back at least as far as the ancient Roman administrative state, if not further. The fact that the bureaucratic run-around flies from department to department in the form of ones and zeros instead of dried tree pulp or waxed stone tablets doesn't really make a substantial difference. This isn't anything new.
I think the argument is that if companies feared consequences from an engaged government that actively protected its private citizens' rights, this specific issue would not exist (since the company's continued functioning could then depend on its efficiently cooperating with such requests).
I think the issue being discussed here is also, "how can we make a government whose bureaucracy will be able to be engaged?" So many examples show bureaucracy and the abdication of responsibility go hand in hand. In order for the government to be empowered, its workers need to be empowered, and we have few cultural narratives of how rule following enabled great outcomes.
Instead, almost fetishistically, we embrace rule breaking as the ultimate expression of self-actualization, from people forming businesses to bureaucrats who actually took time during their breaks to help.
IMO, working for a company and working for the government are both exercises in limiting personal liability, and I've not yet encountered a construct that allowed people to limit liability without also limiting their agency. I also think it is important that we as a modern democracy do work to solve this issue.
Though it starts out feeling a little dismissive, I think your comment does a good job of explaining that the problem we have now is not limited to the micro level; it's a macro-level problem.
Now it's time to start talking about how we're going to fix that problem you have so rightly pointed out.
It's clear the rules need to change, and it's clear we need to claw back the gargantuan power we gave over to mega-corps.
> Pornhub has recently removed that download button.
News flash: there is plenty of software out there that can 'catch'/download a video, mp3, etc. from a website.
I listen to a daily radio show. Not live, though. I wait till the recording is done, then wait till they upload it on the station's website, and then I use software to 'catch' the mp3, download it, and listen to it later at my convenience (and offline). I have written and asked them to make a podcast for each of their radio shows, and they're in the process of setting it up; I just cannot listen to them live, so I do this 'illegal workaround'.
That same software can catch videos (and streams) from YT, Vimeo, and most sites I visit. I don't 'pirate', but I do download some old-time music videos because I fear that one day they will disappear (e.g. Van Halen - Right now, Chicane - Saltwater). I want to be able to watch them when I'm (very) old, and I don't know how 'old music' videos will be treated. I do pay for Spotify, so I do pay for the music I enjoy (to avoid any responses to the contrary).
> Every time there is juicy drama and a micro-outcome
(not picking a fight with you): If that were your sister, mother, or daughter, you wouldn't even think of writing something like that. It is cases like this, though, that can bring a $50m penalty to the big bullies and their minions ("marios", "kevin"). And large corps don't give a poop about the little people. They only care about not losing money. Slap a $50m penalty on PornHub, and watch them change their tune within 24h and fix this problem within 2-3 months.
Or maybe it is just media companies trying to exert power and blowing issues out of proportion. Just because some media outlet writes a dramatic article doesn't mean politicians should rush to enact new laws to appease the journalists.
I'm not defending the hosting of revenge porn, but that specific issue does not seem to rely on the existence of porn sites.
It does rely on sites where people can upload stuff. Maybe media companies don't really like that such sites exist, because it destroys their information monopoly.
Perhaps a bit pedantic, but the American Dream isn't necessarily wholly applicable in Canada. Uber was very slowly allowed in, and developer salaries are kilometers behind the US.
What a nice manipulative article perfectly targeted at clueless general public.
So the criminal was her husband, the video was reposted by random users or just bots (probably not illegal, but it can start a long talk about personal moral obligations), and the blame is on… Pornhub (because all other porn sites are supposedly very noble). First, because it is silently implied that all good websites should collect and check the official IDs of all good citizens uploading any data to them (what a great perspective), and, second, because it had The Download Button (this lame Don't-Copy-That-Floppy point gets stressed in most of the articles in the anti-Pornhub campaign — bravo, incorruptible journalists). But what is the alternative? Of course, it's good old trusted porn studios that have all the papers to prove that the women (and men) pretending to have sex on camera, even destroying the functions of their body parts, have a legal contract. Hooray!
It is amusing how the porn industry leveraged anti-porn, women's rights, and other groups to kill the competition from amateur and no-name internet content. It is clear that Pornhub's owners understood that they could not refuse the offer, and the goal was reached: now Porn Site #1 is mostly a shop-front for big porn studios (and you can be sure those studios get their share of that horrible, horrible ad revenue money).
> First, because it is silently implied that all good websites should collect and check the official IDs of all good citizens uploading any data to them (what a great perspective)...
There's a significant difference between "uploading any data" and "uploading porn". Suggesting someone who is uploading porn should have to prove that they're entitled to, and that the people in the video have consented to be put online in the context of a porn video, is not unreasonable given the weight of evidence that "revenge porn" is incredibly damaging to victims.
It's a balance. Suppose you add this proof requirement. What happens then? It's not as simple as lots of legitimate amateur content producers facing additional friction. Most will make the explicit decision not to have their legal name and address associated with pornographic content. Others will upload their content anyway, without doing an audit of the different platforms' security postures or being aware of the likelihood of and outcomes from breaches.
What happens in a breach? Now, you're probably thinking: just require the platforms to be secure! But you can't simply will that into existence; guaranteed security is impossible. So ultimately you've made people vulnerable when they weren't before, and you're setting up a situation where it's pretty much inevitable that they'll be exposed and threatened by stalkers, deranged fans, and ideological zealots.
Let's suppose you bite the bullet, and say turning all the porn platforms into outlets for major porn production companies is the way to go. At least we've eliminated revenge porn, right? No. Large masses of people really hate that kind of mass produced porn, so they jump to foreign platforms that don't require complicated ID verification. These sites end up with the large majority of amateur porn content, along with the revenge porn. But they're now out of the reach of US law enforcement, so the victims of revenge porn would then be in a worse position than they are today.
Now we're getting to even more extreme measures. Maybe the US needs to build a national firewall to prevent the foreign criminals from penetrating our great nation and undermining our morals with porn that's unapproved by the Feds? China is way ahead of the game on this, in that pornography is outright illegal, and it has sophisticated internet security measures that are exceptional in breadth and depth by Western standards. How's its war on porn going? Spoiler alert: it's lost it. Perhaps it could be a moment to build cross-cultural empathy, as Chinese netizens exchange tips with Americans on how to use Shadowsocks to avoid the censors and access amateur porn.
Requiring ID verification is one of those things that sounds good and moral on its face, but has so many unintended downstream consequences that exacerbate the original problem while making things generally worse.
So all professional porn already requires identification and age checks. I'm...not convinced it's the end of the world if amateur ones do too. The considerations are the same.
Professional pornstars are the people that are okay being known in public as a pornstar. There are also vast swaths of people who make content, but don't want to be known as "the pornstar" when they show up to their job on Monday.
I would imagine it's quite hard to get a job if the top Google result for your name goes to PornHub because all their identification info got leaked.
I think the amateur producers deserve respect for their privacy. Having all your amateur porn tied back to you because of an identification leak is going to screw up your life a lot. I guess the question is, are we willing to risk the privacy of the amateur producers to offer some extra level of protection to revenge porn victims? I say some because there will basically always be sites that don't enforce ID requirements. As long as there's demand for that kind of thing, it will find a way. The war on drugs hasn't eliminated drugs, and I sincerely doubt the war on revenge porn is going to be drastically more successful.
I hate the DMCA, but I'd rather see a DMCA-like process where you can submit a request to take down the video. They can either take it down and avoid liability, or refuse to take it down and accept liability if the complaint turns out to be valid.
People appearing in professional porn today are almost by definition those who are most comfortable getting identification and age checks; the ability to maintain anonymity is a key concern for most amateurs. Adding the same verification process to amateur porn is more or less making all porn professional porn, not mildly tweaking the nature of amateur porn. Amateurs would be faced with a choice of 1) effectively going pro, 2) fleeing to a different platform, or 3) exiting porn production altogether. I'm fairly confident that 2) would be the largest proportion of people, closely followed by 3), with 1) a distant third.
There already are professionally produced things with amateurs seeking to maintain anonymity. And yet, documents are signed, checks are written. The paperwork is just kept offline.
Yes, it may make it harder for amateurs seeking to upload something they filmed themselves who are not looking to build a brand or collect a payment for it. I'm not sure how many of those there are. But as soon as you're looking to do one of those things, there are existing analogies you can look to to figure out how it might be enabled.
I'm not arguing that it's impossible to verify people: it's clearly not. The point is that it will end up being a painful and risky enough process that amateurs will respond by switching to other platforms hosted overseas, outside the reach of law enforcement. This will provide a space where the exact same problem as exists now--commingling of legitimate amateur porn and nonconsensual videos--reproduces itself, except with no tools available to expunge the videos that need to be expunged.
Every proactive approach has this downside. That's why we should focus more on making it easy for victims to report a video and have it and variations of it removed from all sites nearly immediately, as well as making sure laws are on the books making it easy to prosecute offenders.
What you’re arguing for can be classified under the “slippery slope fallacy”.
We have repeatedly seen that, while deplatforming does drive content creators to move to other platforms, these platforms will not have the same reach as the mainstream ones. I.e. it’s not a perfect remedy but it’s quite effective.
Not all pro porn requires age checks. That made in the US certainly does, but the requirement isn't universal. Nor is the definition of porn. Many European "art" websites (nudes, generally no sex acts) are not pornographic and feel no need to maintain such records. Playboy would probably not be considered "porn" in many jurisdictions, at least in terms of production. What it is when found on your hard drive while sitting in the US is another matter.
You're aware that the vast majority of amateur porn is some person sending some other person nudes, right? You're effectively arguing that everyone should have to provide ID with every "u up??". Let's not make the insane over-enforcement around nudes worse.
If you don’t check ID and age, then you are recklessly running the risk of hosting underage pornography and revictimizing minors with every impression.
If anyone recalls the old “Girls Gone Wild” days, when they sold videos/DVDs via late-night infomercials: there was a very famous criminal case and civil lawsuit that put them out of business because they filmed underage girls who had produced fake identification. While most crimes require intent, and the fake IDs would generally have made intent impossible to prove, underage sex/pornography is a strict liability crime, so if it happens it’s criminal regardless of intent.
Wikipedia seems to indicate that the Girls Gone Wild folks survived the lawsuit that you're talking about and were eventually driven out of business due to other legal problems.
You are reading far too much into the dates, and they didn’t survive the underage-girls criminal and civil cases.
Subsequent to those underage cases, which began in 2003, additional civil lawsuits were brought against the company for other issues, but the underage cases were still ongoing.
A different example would be the Hogan/Gawker case, which started in 2013. The company faced other lawsuits afterwards, but it was the Hogan case that pushed them into bankruptcy, because it didn’t end until 2016.
Wouldn't it be simpler to make porn a regulated commodity that only the government, or those it authorises with permits, can produce? Like alcohol. But why stop there? Let's make sex a regulated activity that only the government can approve. This will solve the whole consent problem in the regulated space, because every sexual interaction will be pre-approved and in writing, with a no-harm-no-foul escape clause if anyone changes their mind.
/S
But seriously, there are problems. If you want to ID people, it might be easier to make every video begin with the actors holding up a placard, like an anonymous crypto QR-token clapper that somehow only government censors can officially check, that's a hash of their face ID, all stored in a big government bloom filter of registered porn stars. But then you couldn't stop people prefixing an unapproved video with their identities... Maybe the tech angle is not the right way to deal with this...
The US' current ID and info collection requirements for sexual content exist on very thin and shaky constitutional grounds right now.
They've never produced a criminal indictment, and they saw very limited enforcement precisely because of the predictable constitutional challenge, which occurred anyway.
Simply put, dragnets that turn every producer into a potential criminal neither solve the problem, nor deter it, nor catch the actual criminal.
All you have now is an awkward duty to give anybody with a camera your driver's license or passport. Or did we ever actually think about the people who appear in the videos? And in this thread's suggestion: having them upload it to random websites. Fantastic.
Something tells me that the whole record-keeping system was introduced just to protect the studios from troubles. “See, we're legal, shut up and don't get in the way, we're doing business”.
And still, most people don't realize what “MET” in “MET-Art” stood for.
I mean, that's also true: representatives left to their own devices would come up with something that completely breaks any sense of practicality while still failing to accomplish whatever cause they thought they were addressing.
What's the significant difference? I've seen viral videos of people in compromising or embarrassing situations. Some have even lost their jobs as a result. Should all video uploading sites demand documentation that every person in the video consents to it?
I’m not comfortable being liable to be sued because my cousin decided to let us know, after the fact, that he didn’t want to be in any of the pictures uploaded to Facebook/Twitter/Instagram/Tumblr/Flickr/etc.
I admittedly didn’t read the article, so maybe your comment’s sentiment isn’t meant to include these scenarios.
What if your cousin is doing something frowned upon by society (wearing blackface, spouting racial epithets, etc.)? What if it's not frowned upon now, but is at a later date? What if you don't find anything objectionable, but they wouldn't want it shown, and their concerns are borne out later? What if it's completely benign but out of context appears problematic (goofing around with a good friend of a different race and it comes across as something different when only a portion of it is represented)? What if they just have a personal objection to having their picture posted?
There are many reasons why it's a good idea to at least check that the person is comfortable with it. You shouldn't be liable if they've said it's okay.
Absolutely. Good UX would be the subject receiving a notification that a photo of them has been uploaded, requesting whether they should be scrubbed/blurred from the photo or not before it’s made public. It’s about consent and agency.
I’d argue the tech to support this would require much more to be known about me by another entity and system than any picture someone takes of me would likely reveal. No thanks.
"If someone is in a public space, like a park, beach, or sidewalk, it is legal to photograph them. They shouldn’t expect privacy laws to prevent them from being photographed. That means a street photographer can publish candids taken in public spaces, as long as those images are only being used for editorial purposes."
It'd be impossible to photograph a protest, rally, sporting event, or basically any long shot of a public area if you had to get individual consent from every person in the frame.
I suspect there's a bit of nuance there which hasn't been identified here yet. Perhaps to do with whether people are in the background or immediately identifiable, or an active part of what's going on. I know consent forms are a part of television recording when talking to people on the street (otherwise they blur faces), but I don't think they bother to get everyone in the background to sign.
I agree with this stance, but I also think it's beneficial to explore whether "there's a significant difference between 'uploading any data' and 'uploading porn'" is true in part or whole, and whether the "porn" part of it is a red-herring.
If it's public and it's in the public interest (non commercial), I would expect statute to not require approval (see my sibling comment about US photograph release requirements). One would also hope that policy governance agencies would release the video publicly when necessary (as is happening with Chicago's Civilian Office of Police Accountability with the body cam footage from the shooting of an allegedly armed 13 year old in Chicago recently [1]). Chicago requires such materials be made public within 60 days of an incident, although a minor's involvement delayed release of the video in my example.
The whole question here is whether the law should require it. The reason why it's a strange concept is that it would severely limit the spread of information. The same avenues that a regular person can get content about them deleted can be used by politicians or criminals to do the same. Not only that, these levers can also be abused to simply target somebody with false claims. Furthermore, it also places trust onto the video hosting website to keep all of that information safe. If they get hacked, then all of that documentation and images of IDs get leaked.
>Why should someone be able to upload whatever they want to platforms with no responsibility for what they’re uploading?
This isn't in question though. The question is whether sites need to be preemptive or reactive. Illegal content is illegal even right now and these websites will delete it. If they don't then you can take them to court and the content will get deleted. However, the thread suggests that these websites should be forced to be preemptive - to demand documentation upfront. That's a whole different situation.
All systems are open to abuse, which is why it's important to build systems with robust measures to counter abuse. The spread of some information should be limited. Takedowns should be logged and reported on to regulatory bodies to ensure everyone involved is acting in good faith.
With regards to video hosting websites maintaining identification information, it's possible to provide this functionality without the sites themselves storing sensitive user identification data. Third-party providers (ID.me comes to mind) could be used, with adult content hosts treating this data similarly to payment data and its PCI requirements (store the verification/attestation token and timestamps, not the data itself). I would agree that I wouldn't want Pornhub storing my passport photo and metadata.
Finally, to your point about preemptive vs reactive, in the case of adult content platforms, it seems reasonable to require preemptive documentation considering the record keeping requirements for producing adult content. Refer to 18 U.S. Code § 2257 for specifics [1] [2].
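To make the "store the token, not the document" idea above concrete, here is a minimal Python sketch. It assumes a hypothetical third-party verifier; VerificationRecord and record_verification are invented names for illustration, not a real ID.me API.

    # PCI-style handling of identity data: the platform keeps only an opaque
    # attestation token and a timestamp, never the document itself.
    # All names here are hypothetical.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class VerificationRecord:
        uploader_id: str        # the platform's internal account ID
        attestation_token: str  # opaque token issued by the third-party verifier
        verified_at: datetime   # when verification succeeded
        # Deliberately absent: name, date of birth, document scans.

    def record_verification(uploader_id: str, token: str) -> VerificationRecord:
        return VerificationRecord(uploader_id, token, datetime.now(timezone.utc))

A breach of that table leaks account IDs and opaque tokens, not passport scans.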
How could that even work in practice? Everybody who is visible in a video has to do some kind of VideoIdent and clearly state their consent? YouTube should have to do the same, btw.
Are you saying that movies and porn are made without people's consent? Performers on set and people acting as background extras give their explicit consent on paper. So if you are not your average porn company, you should just send these contracts to the platform. And BY THE WAY, it's already like this in the webcamming industry.
The producers sign the contracts with the people they meet in person to make the videos, so they can verify that those same people signed the contracts. A website has no way to check whether the contracts are real. People can just upload made-up documents.
So you think showing NO proof is better? It's the real world we are talking about. The reason Backpage and co. got busted by the FBI is the amount of shady activity, and Pornhub is definitely high on the list of shady sites. Also, a lot of porn sites are owned by porn companies, so they already have the contracts. Just don't be surprised when you see FBI indictments.
You are probably guided by the false dilemma presented by that or similar articles. I doubt any porn service on the internet in the last 25 years did anything like that, just like no web hosting service has checked that each image and each piece of text on your website is legally owned or distributed by you. This is by no means a recent problem; naked pictures were leaked, intentionally or unintentionally, long before the internet existed. However, the discussion implies that all of a sudden porn sites appeared, you can end up on them, and wegottadosomethingfast!
I don't think there is much difference between your naked body and other information you might want to keep private. “Porn” triggers people (most often American people, I have to admit as a distant observer), makes them reason as if there were some inherent difference between “porn” and “not porn”, and makes manipulation easier. Basically, what you're saying is that there should be more (indirect) censorship and real-life identity matching on the Web, but it's for an Obvious Good Cause! The problem is that Good Causes get forgotten quite fast, and the one who benefits is the entity in control of the system that gained power. Here we have porn studios dictating to the previously independent Mindgeek (formerly a pain in their asses) which content should be on the biggest porn streaming sites under the new system. The speed and scale of the attack, and the concerned voices from various directions, hint that there was a lot of high-level lobbying involved.
Here's a no less thrilling example. Suppose that Rick Astley wakes up tomorrow and decides to point out that he has always been a singer and not some kind of internet joke. Perfectly understandable impulse. Then he proceeds to remove all the non-musical uses of his songs (let's forget about corporate ownership). How would you react to that? What would the results be? There are certainly more people with similar wishes; basically the whole genre of “viral videos” is one big zero-consent heap. When you are having a laugh at someone's expense, do you worry about them? When you watch videos of the Beirut explosion, do you think about all the dead people who never gave consent for their last moment to be your entertainment? Should we ban the uploading of such videos without explicit source checks, then?
> I don't think there is much difference between your naked body and other information you might want to keep private.
Absolutely. Swap "porn" for "credit card details" and the point I made is still true. There are some things that people don't want to be shared online and they should be in a position to stop that happening.
One individual's right to privacy is more important than another individual's freedom of expression except in very limited circumstances of the public interest. Reporting news should happen even without consent from the people the news is about because it's good for society to be able to share that information. Uploading porn or a prank video should not happen without consent from all parties.
As for your example of Rick Astley - the law already enables him to do that if he wanted to. There are many legal mechanisms for copyright holders to revoke access to their material.
There doesn't have to be one simple rule that covers everything. We're intelligent human beings. We can have a bit of nuance without society collapsing.
Then there should be no difference in how it is handled, and “porn” is not a wild card to prevent the critical assessment of proposed actions, just like “think of the children” isn't.
I suppose that some legal options are available to Rick Astley and to other people whose names are far less well known. But can they realistically do anything with them apart from causing another Streisand effect? Who is to blame for that? Who is going to pay billions of dollars? YouTube? 4chan? Tim Berners-Lee?
> Then there should be no difference in how it is handled, and “porn” is not a wild card to prevent the critical assessment of proposed actions, just like “think of the children” isn't.
The reason we're discussing porn is because that's what the article is about. The point that people should be prevented from sharing things that they don't have permission to share (unless it's in the public interest for it to be shared) is an obvious one and mostly how the law works already.
One more very important thing to remember here is that the woman in the article was the victim of a crime and the video in question is of that crime taking place. It's not 'simply' porn. It's a video of a sexual assault, and the victim is the one suffering because it's on the internet. Arguing that it should be allowed to stay on the web is arguing that the victim of a crime should continue to suffer forever. Freedom of speech, freedom of expression, etc have never covered criminal acts.
Yeah, I think you're missing the point, which is that young women are having their stuff put up online and it's systematically distributed by entities that may very well have the capability to do better.
That's a legit story.
Edit: We're talking about a story for the masses here, nuances are important, but I don't think most people really fathom that this is a common thing.
What happened to this woman is awful and her ex-husband should go to jail. I also think everyone agrees pornhub and other sites should be forced to remove this content.
He is making the argument that including a drivers license and signed agreement with every piece of nude media is not a good solution.
The obvious step would be to require the uploader to be responsible for their content. Even more than on YouTube, the nature of PH's content implies that the uploader should have secured the rights of its participants.
> What a nice manipulative article perfectly targeted at clueless general public.
This could describe approximately 100% of local news articles/segments on any topic even remotely political or controversial. And maybe 75% of the remaining.
Well, as your link states, this applies to producers of the 'visual imagery' (i.e. in this case, the woman's husband - but presumably he has a record of her name and age required by this clause, and presumably she's over 18), not all the sites distributing/redistributing user uploaded content.
No mention of it in the article, but can’t charges be brought against the (ex-)husband for this? Pornhub definitely bears some of the responsibility, but the dude that did this is the bad guy here.
It seems difficult for Pornhub to do anything about this (from a hosting perspective), considering people also upload completely consensual videos along the same lines (where the "victim" is just pretending to be passed out and is fully consenting).
Pornhub could make you sign an affidavit, but even then it is relying on trusting the uploading party: if they are already breaking the law by uploading non-consensual sexual assault videos, odds are they will also check the affidavit box saying that everyone in the video has consented.
Pornhub has already mostly fixed this problem by only allowing uploads from verified accounts. It won't 100% eliminate it, sure, but when the site has your real name, picture, ID etc. you will be much less likely to upload stuff that you aren't completely sure is legal and consensual.
"fixed the problem" by destroying community-generated porn in response to a PR campaign by an activist newspaper and business-destroying threats from pusillanimous payment providers.
It didn't destroy community-generated porn; it removed it from their servers (or at least removed the public's opportunity to download it from their servers).
Sure, but you can't just check the affidavit box when you have to have all parties in the video identified, registered with the site with government ID, and everyone in the video has to approve it before it goes live.
It's wild that we just offload all responsibility to the victims who have to scour the internet and issue takedown requests for videos because hosting sites literally can't be bothered to actually get consent beforehand.
If there is a husband living with his wife, and he drugs her and sexually assaults her, wouldn't he also have access to her ID to scan it and upload it? I guess at a certain point, any kind of roadblock lowers the chances of this person uploading the content.
What's the alternative to victims scouring the internet and issuing takedown requests? Some centralized porn database where you can type in someone's name and see what porn they've done?
> What's the alternative to victims scouring the internet and issuing takedown requests? Some centralized porn database where you can type in someone's name and see what porn they've done?
I mean this completely unironically: ContentID. It exists exactly for this use case, because copyright holders don't want to scour the internet for violations either. If you're a victim and you find that someone posted a video of you being assaulted online, you should be able to register that video with ContentID and have every site immediately and automatically take it down everywhere.
Yes it will only affect above-board sites but broadly speaking those are the sites with large audiences and the ones you really care about.
We already have an apparatus to do this for copyrighted material that people go far further out of their way to share and download. It should be far easier to get some amateur porn video off the web than a BDRip of Disney's Coco.
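For what it's worth, the core of such a registry is tiny. A rough sketch, assuming participating sites query a shared fingerprint list before publishing (the function names and the exact-match lookup are mine; real systems like ContentID use fuzzy perceptual matching rather than a plain set):

    # Hypothetical shared takedown registry: a victim registers a fingerprint
    # once, and every participating site checks new uploads against it.
    BLOCKED_FINGERPRINTS: set[str] = set()

    def register_takedown(fingerprint: str) -> None:
        # Done once, on behalf of the victim.
        BLOCKED_FINGERPRINTS.add(fingerprint)

    def should_reject_upload(fingerprint: str) -> bool:
        # Run by every participating site before an upload goes live.
        return fingerprint in BLOCKED_FINGERPRINTS

The hard parts are governance (who may register what) and the fuzzy matching, not the lookup itself.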
If the question is "who should shoulder the cost of takedown," my suggestion is that offenders and porn industry behemoths should pay into a fund that finances red-flagged content takedown efforts.
Requiring real names, ages, and contact info from uploaders and everyone in the video would discourage people well beyond a simple checkbox. Similarly, adding video fingerprinting should make it very easy to prevent someone from uploading the same video again.
Real name policies in this context have an obvious flaw when the host gets hacked and now the ostensibly private records fall into the hands of criminals who start blackmailing everyone to reveal to their bosses and families that they were involved in the creation of pornography.
The ability to remain pseudonymous is more important in this context than many others.
> Similarly, adding video fingerprinting should make it very easy to prevent someone from uploading the same video again.
Those systems don't really work, because uploaders can tell when an upload is rejected, so they can keep messing with the file until it's accepted. Or they upload it to a different host each time, or to a file host rather than a video host, which can't see the contents because the file was encrypted and the key is distributed to downloaders along with the link, but the host doesn't have it.
I personally "online"-know people where I have no doubt they share their porn with consent of everybody involved (e.g. they often hold signs up with messages for their fans), but who at the same time would never share their identity because of the social repercussions if certain neighbors or coworkers or their families learned about their "hobby".
Same as reddit's r/gonewild really, where posters would verify themselves with handwritten signs, but most would never even dream of handing over a copy of their driver's license to moderators or reddit.
I'd argue such a "sign-holding" verification method in regards to consent is far more conclusive and secure than any checkbox + copy-of-some-id method ever could be, and yet every campaigner/activist out there seems to rave on about real names and government id, which is a worse method that also comes with a huge chilling effect.
(I also happen to know that a lot of horny husbands share the IDs of their wives with other people. I am not condoning that in any way. But of course it is another way, besides hacks, in which verification by government ID can go horribly wrong.)
> What’s privileged about suggesting you shouldn’t upload videos of people without their consent.
You're suggesting that people shouldn't upload videos of themselves, made with the consent of everyone involved, unless they are willing to attach their full name in a way that could plausibly lead to their employer or entire extended family discovering it.
This is a serious concern for people in conservative religious families or who work for people who are. It increases their risk of unjust retaliation or violence and impairs their ability to express themselves when the increased risk induces self-censorship.
So I take it Retric is your full legal name? Or are you posting under a pseudonym? If you don't want to be associated with your comments maybe they shouldn't be made.
Is that the point you are trying to make? Why is porn "special" in this regard?
It’s video that’s special in this case, not porn. Comments come from one person; a video can include hundreds of people, but only one person needs to upload it.
In the case of mainstream movies, you can trace consent before they end up on the big screen. But online it seems like people want anyone with access to a file to be able to do anything with it. A hacker finds some cool footage: wow, let’s post it for everyone!
If society weren't a bunch of puritanical prudes walking around with sticks firmly implanted in their rectums, whether someone created pornography or not wouldn't be an issue at all.
This entire "problem" is the result of society's backwards thinking towards sex and sexuality.
There are many things that PH can do. They could simply require verification of both participants, or they could just reject any such video if there is even a doubt about consent. I think they recently axed the whole "community videos" section (only because Visa and Mastercard cut ties with them), so you must be a pro to upload. Or did that change already?
I think the right approach is for governments to regulate this like prostitution (in Europe at least). You can't just let anyone make porn videos, because women and children will get abused, same as with prostitution. You regulate it so that whoever wants to do it can do it safely. If you're not a licensed porn actor, then you can't upload.
If my fetish is people over-eating, does someone who stuffs their face with 10 Big Macs (completely clothed) need to get approval to upload to pornhub?
Community uploads to pornhub don't feel like prostitution to me, because for the most part no one is doing it for money, merely because they're horny or enjoy it.
Well, if it falls under the legal definition of porn then yes, otherwise no. I don't know what the legal definition is. Right now there is no regulation at all, which is crazy. Or there is, but somehow it doesn't apply to Pornhub. We could start with something simple and obvious that covers most videos. Something is better than nothing.
> Community uploads to pornhub don't feel like prostitution to me, because for the most part no one is doing it for money, merely because they're horny or enjoy it.
I meant that we can regulate it for the same reason prostitution is regulated: to make it safer.
But if there is no financial incentive in sex, then isn't it more like just attempting to regulate the private sex lives of individuals? Prostitution is regulated, but there is no regulation (at least in any Western country that I know of), that prevents a person from just going out and having sex with other consenting adults.
The issue I see with this is that once it's online... it's never going away. Not without some incredibly draconian, overarching system. What could even be done in situations like this, short of insisting that any content provider (Pornhub included) be liable for all content posted to their site in perpetuity? (Which seems like a really dangerous idea and a good way to force extreme censorship online.)
It may not be possible to eradicate it completely, but it's absolutely feasible to dramatically reduce its availability until there's no practical difference.
Maybe a thousand people downloaded it and added it to personal collections/datahoards. The vast majority of that thousand would never consider uploading to a public site (possibly as a response to a request). Of the remainder, most probably wouldn't bother unless the video in question is an outstanding example of that particular kink (in the story sex with someone unconscious).
The first step is undoubtedly getting and keeping it off public sites. Once that's done, even the few places it may remain effectively disappear in a vast sea of other content.
Pretty sure the parent corporation owns a disturbingly high percentage of the video sharing sites. At the very least they own Pornhub, Redtube, and YouPorn, as well as a bunch of others (see Wikipedia on MindGeek).
Maybe it is OK to ask money for such a service? I think that kind of service does exist in general. You can pay people or companies to try to remove as much of your content from the internet as possible.
> However, when she searched for the name of the video on Google in January, it still returned 1,900 results. It seems that although Pornhub had removed the video, it still kept thumbnails of the naked images. Because those thumbnails still existed, a Google search would find – and display - those naked images. She realized the only way to eliminate those was to get Pornhub to remove all traces of the thumbnail images.
It seems the problem was / is Google (and not Pornhub), because it makes it impossible to remove stuff from the internet fast enough to prevent damage. I wonder why the article doesn't consider that. Google should have an on-demand mechanism that instantly deletes all text/images that match a fingerprint.
Perhaps I’m not understanding, but why couldn’t PH just remove the thumbnails so they couldn’t be indexed anymore? Sure, Google should have a mechanism so material like that isn’t indexed, but Google isn’t the only search engine, and PH shouldn’t be hosting that content, not even as a thumbnail.
> why couldn’t PH just remove the thumbnails so they couldn’t be indexed anymore?
> Kevin responded again, insisting that Pornhub “can NOT” remove content from other sites. However, that doesn’t seem to be completely accurate. Pornhub offers something called its “exclusive model program,” which promises that it will send takedown notices to any website to “help protect your content from being uploaded to other websites.”
I am not sure that PornHub actually hosts those thumbnails or necessarily controls them. Sending a takedown notice indicates to me that they don't control that content. They would just be requesting removal.
Even if they did remove it, how long would it take for Google to remove its copy? That's why it's better to have an on-demand mechanism to remove content.
I don't even think they own all of them? There are a billionty porn site aggregators that pull from Pornhub and get ranked on Google, Bing, and other search engines.
Is removing _all_ traces of this video even a tractable problem?
That assumes all thumbnails or pictures hash to the same value. Why would that be? There can be different algorithms, resolutions, crops... And there can be millions of thumbnails to check.
Yeah, it's a gnarly problem for sure. I'm sure some sort of hash-based blocking would knock out 80% of the content related to a takedown request, but I could also see that last 20% being difficult af to purge.
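To illustrate why this is fuzzy matching rather than exact matching, here's a minimal average-hash sketch (assuming Pillow is installed; the 10-bit threshold is a common rule of thumb, not a standard): a resized re-upload keeps a similar hash, while a crop or heavy re-encode can shift enough bits to slip past any fixed threshold.

    # Average hash: downscale to 8x8 grayscale, threshold on the mean, pack bits.
    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    # Treat images within ~10 of 64 bits as "the same" picture.
    match = hamming(average_hash("original.jpg"), average_hash("reupload.jpg")) <= 10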
the internet definitely remembers forever, unfortunately, especially porn
If Google removed it from their search results, it is true that most people would not see/find it.
But her video/thumbnail is still all over the internet.
But if the (porn) sites removed the video/thumbnail, it would be gone from Google search as well.
It sucks that this is a case where no amount of money & no punishment of anyone involved can really make the victim whole. Some things require a time machine to fix.
It shows how much our societal norms and laws were - and still are - completely unprepared for the implications of everyone being able to effortlessly copy and distribute images and video all over the world. We're still in a sort of local to national mindset, but international and virtual issues completely shortcut a lot of the foundations of that, and the people who get caught in that friction get hurt, sometimes very badly.
If someone wants to ruin your reputation and cause you immense grief, they can very easily do that, and there is basically nothing you can do once the material is out there, especially if it is pornographic in nature.
Privacy is a basic human right, and yet we are trampling all over it. I wish I knew what we as humanity could do about this, because shitty people aren't just magically going to stop existing, and the internet is an extremely powerful tool for abuse in their hands.
Facilitating abuse by sharing abuse material should be punished just as harshly as uploading it in the first place.
The subjects of Nick Ut's “The Terror of War” didn't sign consent papers, and the same goes for countless other famous photos. There is no doubt you have seen it. Aren't you one of the “shitty people”? Will you repent?
You should really reword that, because right now it sounds like an incredibly insensitive thing to say. She was a child when it happened. Bad things hurt so much more when you're a child.
I'm confused. The article is talking about her ex-husband. Did she marry as a child? Or did some things happen to her as a child and later her husband of all people got his hands on some recordings? Or are you talking about some other person than the woman in the article?
I suspect you’ll be downvoted, but I agree. People take the internet waaaaaaay too seriously now. Not downplaying their trauma. Crazy stuff happens online, but everyone’s attention span lasts for 30 seconds, so just move on, get some therapy, and try to ‘let it be’. I suspect stories like this will facilitate additional mass-censorship via complaints similar to “think of the children!!”, but framed towards adults who don’t understand how web archival works.
But is this what we want? Don’t we love the internet because everything lives forever? Remove the story in the OP from your mind and respond without any emotion.
I know I sound like a contrarian, but I really miss when the internet felt more dangerous and unhinged. People starting these justice campaigns for every little thing about the internet is not only futile, but also short sighted. Do we want more “mass deletions” of content like Tumblr, Pornhub, etc ? This is what happens when the internet is slowly homogenized into this business friendly, marketable, “safe space” for casual users. Sorry for the rant, but I’ve noticed this so much lately.
YES, terrible stuff happens online. YES, it has been happening since before the author of this story was born. Deal with it the best you can, but stop using emotion to convince everyone that you won't be “fixed” until every “bad” site is fully regulated and monitored.
Asking someone experiencing sexual trauma to 'let it be' is such a dismissive take.
Being a victim of revenge porn means that you can be blackmailed at any time. Most people do not want to be sexualized, and it's humiliating to have to live with the fact that your coworkers will now have ammo to harass you if they discover your videos. Don't get me started on what happens if you have kids and their friends find out you were in porn.
Realistically, you can't scrub the internet of a video. I personally feel like the solution is that porn should be highly regulated. A formalized content upload process, licensing, required staff to deal with these types of complaints. It should be impossible to upload a video without a signed waiver from the participants. Sure, this might remove pure anonymity, but you can still have mechanisms in place to protect your identity if you want to make amateur content without divulging your identity to the public.
“Asking someone experiencing cyber bullying to 'let it be' is such a dismissive take. Realistically, you can't scrub the internet of cyber bullying. I personally feel like the solution is that comments should be highly regulated. A formalized posting process, licensing, required staff to deal with these types of complaints. It should be impossible to comment referencing someone else without a signed waiver from the participants. Sure, this might remove pure anonymity, but you can still have mechanisms in place to protect your identity if you want to post comments without divulging your identity to the public.”
Does this sound similar to any other regimes in recent memory? Porn isn’t the argument I’m responding to; it’s the short-sighted band-aid solutions which require ADDITIONAL regulation and government control. We all know these systems of regulation bleed into other parts of the internet. Are you willing to start this trend and put it into the hands of someone with whom you disagree?
You are equating mean YouTube comments to revenge porn. This is the whole "we should just ban cars" response when someone brings up gun control.
>It should be impossible to comment referencing someone else without a signed waiver from the participants
If the comment is a nude picture of me and I didn't give you permission to post it, the law already agrees that this should be illegal. All I'm asking is to make it enforceable.
Porn is a shady industry rife with abuse and exploitation. How do you know the person you are watching is over 18? Let me answer that: you don't. Some industries should be given more regulatory scrutiny than others.
Takedowns are a perfectly valid thing to have for something like pornography websites. It's not life-threatening if content vanishes for a few hours, but it can be devastating if it stays up long-term.
Another comment correctly pointed out that we would not treat this in such a laissez-faire way if the victim were a child, and we shouldn't here either. There is a victim who has had harm done to them during the creation of the video.
“Another comment correctly pointed out that we would not treat this in such a laissez-faire way if the victim were a child, and we shouldn't here either. There is a victim who has had harm done to them during the creation of the video.”
‘Think of the children!’ is such an overused moral bulwark. The people in power who would handle the regulation others in this thread are begging for are using that very argument to defeat encryption.
Submit a request to have it removed. Lawyer up if the site doesn’t comply. If the site is overseas and doesn’t comply, then you’ll just have to deal with it. Google even has its own methods to have a URL removed; I’ve done this myself. I agree content should have mechanisms which allow takedowns to occur. Taking Google’s approach and delisting certain websites from their search results seems a bit much (this is the mass censorship I referred to in my original comment).
Stories like this cater to people who aren’t savvy enough to do the work to get the content removed. I said I miss when the internet felt more dangerous and wasn’t catered to casuals. I think both of us have the same beliefs, I just think there should be a bit more freedom online, but these “platforms” are publicly traded companies now, so my thoughts don’t matter.
I agree that she should move on, but the porn industry also needs regulation; random people should not be able to upload anything they wish at the click of a button.
While I agree with your sentiment, and we will miss the early days, the internet was in its "teenager" phase: wild, chaotic, fun, but not sustainable.
The fact that small sites have been swallowed by Facebook & co., the knowledge bubbles caused by "smart" search engines, etc.: those are another topic entirely.
> I suspect you’ll be downvoted, but I agree. People take the internet waaaaaaay too seriously now. Not downplaying their trauma. Crazy stuff happens online, but everyone’s attention span lasts for 30 seconds, so just move on, get some therapy, and try to ‘let it be’. I suspect stories like this will facilitate additional mass-censorship via complaints similar to “think of the children!!”, but framed towards adults who don’t understand how web archival works.
This will stop being a big deal as the boomers die out and Gen X reaches nursing home age. My wife is a millennial. She grew up with the internet and cameras -- the number of photos she has that would make Gen Xers blush is insane, and that's nothing compared to what an average Gen Zer has.
When I try to respond “without any emotion” to the question “Do we want more ‘mass deletions’ of content like Tumblr, Pornhub, etc.?”, the answer that first comes up is “why not?”.
Spock couldn't care less about these websites -- they are mostly entertainment, after all -- while on the other hand the risk that they someday archive something that may damage you is not nil.
So there is one cold, clear, logical conclusion.
This problem is fundamentally impossible to solve. Once data has been created, it's trivial to make and transmit copies. Complete tyranny would be necessary to erase all illegal data from all computers in the world. The best people can hope for is removal of data from popular centralized services.
The 'good' news is that with deepfakes everyone will (soon) at least have a veneer of plausible deniability. I know there will always be ways to determine whether something is a deepfake via adversarial detection, but perhaps in the future so much content will be faked that unless it involves a head of state or some other renowned person, it won't even be worth checking.
This seems like an actual case where you could use machine learning to detect and remove instances of this video, or automate the sending of takedown requests. That would however require that the aggregators actually care.
CSAI Match – YouTube’s proprietary technology for combating child sexual abuse imagery. In 2020, we scanned all video content previously uploaded to Pornhub against YouTube’s CSAI Match and continue to scan all new uploads.
PhotoDNA – Microsoft’s technology that aids in finding and removing known images of child sexual abuse material. In 2020, we scanned all photos previously uploaded to Pornhub through Microsoft’s PhotoDNA, and continue to scan all new uploads.
Google's Content Safety API – Google’s artificial intelligence (AI) technology designed to help identify online child sexual abuse material. Initially built to help detect all “not safe for work content” as well as illegal content, Google’s Content Safety API attributes a score to content, which in turn serves as an additional tool available for our moderation team.
MediaWise – Vobile’s cyber “fingerprinting” software that scans all new user uploads in order to help prevent previously identified offending content from being re-uploaded.
Safeguard – Safeguard is Pornhub’s proprietary image recognition technology designed with the purpose of combatting both child sexual abuse imagery and non-consensual content, like revenge pornography, and helping to prevent the re-uploading of that content to our platform and any other platform that uses Safeguard. We believe in sharing this technology with other social media platforms, video sharing platforms, non-profits, and governmental organizations, free of charge, to help make our platform, as well as the Internet at large, a safer place for all by helping to limit the spread of this harmful content. We also will provide the Safeguard technology to our Trusted Flaggers so that our Trusted Flaggers can fingerprint content on behalf of victims or potential victims.
So yes, but I think the real answer would just be to have sites support ContentID, so that once a victim registers a video with the system it is automatically taken down everywhere.
> Rachel searched for the username that her husband had used. It led to a video uploaded to the world’s biggest porn site, Pornhub. In that video, it shows her, in her own bed, obviously unconscious. She says her husband’s hands can be seen reaching in to move her, touch her and sexually assault her. The video titles include “while sleeping” and “sleeping pills.”
Depends on the jurisdiction. The sexual assault is a crime, obviously, but very hard to prosecute even with this rare video evidence. Publishing nonconsensual porn surprisingly isn't a crime in a lot of places. Scotland recently criminalised it. Check your local jurisdiction.
> “You guys decided to host a non-consensually uploaded video of my assault on your site for three years. I want every trace of this video removed. Not just from Google; from the actual internet… From your site specifically and from your affiliated sites, such as Thumbzilla. The thumbnail - that you created and distributed - is literally my naked body… Like, dudes – you’ve even been contacted by the police in regards to my video and you still have a picture of my boobs on your site. WTF?”
> Kevin responded again, insisting that Pornhub “can NOT” remove content from other sites. However, that doesn’t seem to be completely accurate. Pornhub offers something called its “exclusive model program,” which promises that it will send takedown notices to any website to “help protect your content from being uploaded to other websites.”
> It offers this “protection” to clients who pay a fee for this exclusive service. The point being that Pornhub has certainly requested content to be removed from other sites.
I'm confused about whether said images are thumbnails hosted on PH or on third-party sites. If it's the former, then PH is clearly at fault. In the latter case, there's nothing PH can legally do for her without actually contracting something with the copyright owner of the images (I'm not sure about other laws regarding non-consenting depiction of people): broadly speaking, PH can request, with teeth, that content be taken down on other sites because they own or license the copyright. (IANAL)
It sounds like other sites (not pornhub.com) have the porn video she’s trying to remove, and she doesn’t understand that Pornhub isn’t in control of them. Doesn’t really seem like Pornhub’s at fault here, and I question the motives of a journalist who writes something like this.
If you've been paying attention to the news over the past few years, the lack of research and effort put into news posts has become increasingly obvious. We live in the era of social media, where the headline is the only thing that 99% of people read. The actual content of the article is irrelevant. People will be outraged for a second, continue scrolling, and completely forget it existed 5 minutes later. There is no economic incentive for journalists to do any real journalism.
> It sounds like other sites (not pornhub.com) have the porn video she’s trying to remove, and she doesn’t understand that Pornhub isn’t in control of them. Doesn’t really seem like Pornhub’s at fault here
(1) Given the nature of the pornography industry, most sites are owned by a small number of players, so it's definitely not clear that Pornhub isn't in control of them.
(2) As the article describes, Pornhub offers a service which is exactly what she is demanding: 'Pornhub offers something called its “exclusive model program,” which promises that it will send takedown notices to any website to “help protect your content from being uploaded to other websites.”' And so even if Pornhub is not in control of the sites where the video is hosted, they already advertise the ability to get content taken down from other sites.
>(1) Given the nature of the pornography industry, most sites are owned by a small number of players, so it's definitely not clear that Pornhub isn't in control of them.
It also doesn't mean they are in control of them. I think it's reasonable to assume that PornHub doesn't own every site the video was uploaded on.
>Pornhub offers a service which is exactly what she is demanding
This is almost certainly on the basis of copyright. As in, those 'exclusive models' sign over the copyright to their content to PornHub, or authorize PornHub to act on their behalf as copyright holders. PornHub then issues DMCA takedown notices or uses some other 'takedown notice'.
In order for this to work, she would have to be the copyright owner of the video. If anything, in a horribly ironic twist, her ex-husband is likely the copyright holder. She has no standing (speaking in terms of copyright) to get PH to protect her video like they do with the exclusive model program.
If you know more about how it works, then please inform me, but I can only assume it's copyright.
> I think it's reasonable to assume that PornHub doesn't own every site the video was uploaded on.
Yep, that is reasonable. It's also reasonable to assume they do own some of the sites the video was uploaded on, given they share videos between their own sites—and Pornhub seems unwilling to distinguish between the two.
> If you know more about how it works, then please inform me, but I can only assume it's copyright.
I have no idea, although telling her that would help, because she could pursue getting the copyright on the video from her ex-husband, given he's already going to be in serious legal trouble due to the videos. But instead, they pretended they had no way of handling anything like this.
3. If the human includes official IDs for all the participants when uploading a video, only a request also including the official ID of one of those participants can take down the video.
Rule one allows anonymous content. Rule two lets the aggrieved or regretful remove content without exposing themselves. Trolls could also take down professional content, so rule three protects videos that have documented consent (a rough sketch of the decision logic follows).
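For what it's worth, the decision logic of those three rules fits in a few lines. A rough Python sketch, with every name invented here:

    from dataclasses import dataclass, field

    @dataclass
    class Video:
        uploader: str                # rule 1: may be an anonymous handle
        # rule 3: optional verified IDs supplied at upload time
        participant_ids: set = field(default_factory=set)

    def takedown_allowed(video, requester_id=None):
        # Rule 2: no documented consent on file, so any takedown request
        # succeeds and the requester never has to identify themselves.
        if not video.participant_ids:
            return True
        # Rule 3: consent is documented, so only a verified participant's
        # ID can remove the video; a troll without one gets nowhere.
        return requester_id in video.participant_ids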
Once it's out there, all someone can do is disavow it. Deep fakes will help here.
The only way to get PH to act is to threaten their bottom line. Visa and MC threatened to stop processing transactions related to PH in response to a journalist's article, and in less than a few days PH decided to haphazardly delete all of the non-partnered content (which included innocuous videos as well).
I guess most posters have already forgotten the previous big case of Visa and Mastercard declining to process payments for a client organization, if they're praising this one as a good thing. And no, they couldn't care less about some article; they most likely deal with much sleazier companies when it's "a matter of national/international importance", or when a sufficiently high-ranking official politely asks for certain exemptions. Such corporate reactions are agreed upon in advance.
Even counter-terrorism arguments still mean that someone gets the right to define, on your behalf, who counts as a terrorist today. I can't see how this is good news. If you think this isn't a real danger: political activists in Russia have routinely been placed on "extremist" lists, which means all their bank accounts get frozen. I know, I know, just following the law, someone has to stop the bad guys, etc.
..and forced them onto Bitcoin right before the latest massive uptick. I'm sure only a tiny percentage will bother converting their accounts to BTC, likely not enough to make up the difference, but if they've been sitting on their coins, they could not have asked for better timing.
The monetary barrier to attaining the NFT. I doubt her husband would have uploaded it if he had to pay $100-200 to do so. In general this is not a practical solution.
Definitely not an easy issue to address, but for argument's sake, shouldn't the central claim be one of copyright? Is the claimant the copyright owner of the material, or can they become the copyright owner by virtue of being depicted in the material?
"Solving" a problem deeply intwined with bodily autonomy and consent with 'obfuscate it until its no longer your body and face to abdicate you of control' seems like... The wrong approach. If actors/actresses are okay with it then let's go right ahead.
You'd be happy with people jackin it to a video of you being sexually assaulted if they just obscured your face? In that case why bother with the fancy neural network stuff, just make sure to wear a paper bag over your head when you're being raped! Problem solved.
/s
Your proposal has merit as a privacy measure - but this isn't about privacy. This is about consent.
And now, with companies making it easy to produce self-made porn, we're going to see even more mental-health issues in the future when those women realize the mistakes they're making.
The whole "we're just a platform" thing makes it incredibly easy for Pornhub & co to just say "wasn't us, it was some user" and not even handle the deletion process on their affiliates' sites (to which they provided the video and images). It's like Megaupload, which was obviously made for copyright infringement but successfully hid behind the platform excuse for years.
I'm no fan of far-reaching regulation, but it seems we don't have sufficient processes in place to deal with these companies unless there's a gigantic financial interest that makes the state feel motivated to intervene.
I would argue the copyright argument goes the other way. If you're a big vertically integrated corporation who has greased the right cogs it's incredibly easy to get content taken down wholesale without any oversight or accountability. Whether it's having content removed or restored, the burden somehow always falls on the individual.
Pornography is bad for its consumers, bad for willing actors, and obviously terrible for people featured without consent. Tolerating it will be one of the great shames of our time.
If the war on drugs has taught us anything, it's that a war on porn will lead to far worse porn being spread further than ever before while innocent lives are ruined, much as teenagers are already being charged as adults for sexting while still minors. There are likely places the law can be improved, but a blanket ban would be a large step backwards even if we ignore the freedom-of-speech implications.
In a world where entire sites are pulled over some silly Q conspiracy theory, we can surely do more to combat porn. We've largely dealt with smoking, also addictive, also once prevalent, even while cigarettes remain available at every gas station and grocery store.
We can definitely do better, but you have to allow legal, consensual porn to remain, or else you will push everything to an underground market where the end result is far fewer rules controlling content.
One thing that would be nice is automated takedown. Anyone who no longer consents to their porn being hosted (or who never consented) could have the selected files added to a database, and all porn sites would have to take down content matching that database. The technology already exists in PhotoDNA for fighting known child porn (though I think the technical details are kept secret to stop people finding workarounds).
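Concretely, the site-side half of that could be a pre-publish hook that asks the shared database whether any fingerprint of the upload matches. A hypothetical sketch: the endpoint URL and payload shape are made up, and the fingerprints could come from anything PhotoDNA-like:

    import json
    import urllib.request

    # Invented endpoint for a shared non-consensual-imagery hash list.
    REGISTRY_URL = "https://example.org/ncii-hashlist/check"

    def upload_allowed(fingerprints):
        # Ask the shared takedown database whether any of this upload's
        # frame fingerprints are registered; refuse to publish on a match.
        payload = json.dumps({"fingerprints": fingerprints}).encode()
        req = urllib.request.Request(
            REGISTRY_URL, data=payload,
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            result = json.load(resp)
        return not result.get("matched", False)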
As long as one draws a line between consensual and nonconsensual porn then I think you'll be able to crack down on the non-consensual material without having to worry about the failures of a 'war on x'.
Think of it like the difference between cracking down on weed and cracking down on synthetic 'weed' that is killing people. Or just look at stores that are able to sell alcohol: because selling it is generally allowed, specific bans are much easier to enforce, since businesses like keeping their legal status.
Alcohol and weed are not good examples because these are failed interventions if you consider them harmful. Consumption of both exploded in the last few decades.
This is why I use smoking: it's also legal or semi-legal, it used to be prevalent but its popularity cratered.
"Consumption of both exploded in the last few decades." -> I don't know about weed, but it's definitely not true for alcohol, it's roughly stable; in USA there has been some decrease in per capita consumption since a peak in 1980s - see https://www.who.int/substance_abuse/publications/global_alco... for example.
Philip Zimbardo, the psychologist behind the Stanford Prison Experiment, seems to think so. He has written a couple of books on the subject of deteriorating development of young men, and has suggested a correlation to the advent of high-speed internet.
Not sure if he's right, but it's certainly a reasonable theory.
He also did a TED talk briefly touching on the subject.
Whether you think the experiment itself was good or bad (Zimbardo recognized its failures and ended it early), the lessons it taught were invaluable and that's why they brought him in as an expert witness for the Abu Ghraib trial. Your suggestion that "lots of people write books" is a silly pooh-poohing in that context, especially considering that Zimbardo was already a Stanford professor by that time in 1971, 50 years ago.
Trashing some offices is nowhere near overthrowing a government.
And pornographers did much, much worse. They trafficked underage women, misrepresented the contracts, routinely provided drugs to dull their actors' senses, and engaged in all kinds of underhanded or outright criminal conduct.
Those cases generally aren't entrapment. They get pretty close, but they don't include the final push. The other party is free to walk away without taking the bait. Granted, I've only read the details on a few cases but in the ones I read the FBI is clear to not cross the legal boundary.
Legally that's a fair argument, but there's an ethical hazard in law enforcement catalyzing a crime that may not otherwise occur, in order to bag a person who may not otherwise be a criminal.
Some of those setups discriminate based on ethnicity, such as those that target Islamic radicals and black nationalists. In my mind, this further deepens the ethical quandary.
Sometimes a solution in search of a problem is itself a problem.
"...the FBI and Joint Terrorism Task Forces (JTTFs) have approached multiple activists organizing for justice for George Floyd—who was killed by Minneapolis police officers—and have alternatively attempted to entrap them or pushed them to work as informants."
Given the state of the war on drugs and the war on human trafficking, do you think there would be less drug abuse and human trafficking if people were not allowed to watch porn, as you suggest?
I suspect that, because criminals tend to ignore the law anyway, placing restrictions on pornography will completely fail to reduce harm: bad people will continue doing those things regardless of whether PornHub exists.
If pornography were not legal, then production would move underground and would probably involve even more harm.
Some might suggest that there needs to be heavier regulation and more protection for the women involved, but banning porn would mean zero protection for the women and an unregulated trade.
You're right, fast food is probably not the primary offender, there are many causes (HFCS, seed oils, not cooking, abundance in general), and the outcome is overdetermined... but fast food definitely carries some responsibility for the obesity epidemic.
Same with porn and Pornhub. But it is the mental tobacco of our time.
You should check your history. Tobacco was native to South America and introduced to the rest of the world in the 1500s. You could argue it is prehistoric to South America, but that is very different from "barely more recent than drawing."
Hemp smoking, on the other hand, has been traced back to many prehistoric sites in Asia and Africa, which would fit your description much better.
I am not convinced that it is bad for consumers. Occasionally I see people talking about their porn addiction, but it can also boost people's mood. I don't know why some people get addicted, but it clearly doesn't happen to everybody.
Whether it is bad for willing actors also needs citations. I don't know why people do it. I can think of some cases where it seems to have solved a psychological need, and they went on to lead successful lives afterwards (like Sasha Grey, or Sibel Kekilli).
Edit: googled article about Sasha Grey, with her quote about women in porn: "the reality is that there are a huge amount of women who are very happy with their careers and where they have gone in their lives"
I'm very curious, because I think the existence and availability of porn and the sex industry are very important for the well-being of lots of people. There are definitely some problems, but I'd parallel those to the problems of prohibition and suppression.
Have you always held this opinion? Are you religious?
Especially in environments where anyone can upload anything without proof of consent: essentially you have a system where producing and distributing this material, regardless of consent, is maximized. However, even if Pornhub et al. required proof of consent for every video, it would still likely produce many situations where the actor was "willing" for bad reasons, which will play out over time with an increasingly negative effect.
As I get older, I increasingly expect a future generation to look back at our era with quite a dark opinion.
"A future generation" will think poorly of our mass incarceration and meat consumption, but they won't have our body image hangups and literally every second of their lives will be recorded from multiple angles so they won't care about porn.
I don't get how Pornhub still exists considering it hosts so many illegal videos: assaults, child porn, spy cams, etc. And this is the premier porn site; imagine what you can find on less mainstream porn sites.
The US government went to full-on war against torrent websites and closed every one of them no matter how small, yet this giant website that literally hosts rape videos is OK.
All providers serving PH should follow Visa and Mastercard and cut ties. I realize killing PH will not end the sharing of these videos, but at least there would be a lot less money made from them and a lot fewer viewers.
> The US government went to full-on war against torrent websites and closed every one of them no matter how small
A teacher in a Soviet provincial day care center tells the children: "The Soviet Union is where the best hospitals are found, the best schools, trains, airplanes, the best factories, the best candy and toys…" Some of the children cry out, "We want to live in the Soviet Union!"
Must be nice to live in a US that won its victory against torrents.
Does it actually host so many illegal videos? It is difficult to verify those claims (as certainly one can't simply go there to search for some child porn, and usually the videos will already have been removed when a story is published). Maybe actual criminal proceedings could be checked with the police? Are there articles that provide such references?
See, the comment above you is how the reader of that article was supposed to react, while you are the irregular one asking questions.
The article simply lies. Finding illegal content on a giant service like Pornhub is actually less likely, because the automated moderation and content-identification systems already have most of the unwanted content in their databases. In comparison, some small site or forum might still host something uploaded a decade ago and do nothing about it until someone complains. Also, if you think a company of that scale doesn't have behind-the-scenes agreements with law enforcement (in addition to whatever legal obligations it has in its native jurisdiction), you don't understand how the internet works today, or who the global policeman in it is.
Any serious internet user knows that stories of common people finding “loads of child porn” somewhere are obvious fantasies. Heck, even some other content that teens shared on CD-Rs 15 years ago is nowhere to be found today. But the article is clear about the type of people it addresses, and it's not actual internet users.
I bet that if someone uploaded a video of all the dodgy tax avoidance these crooks do and sprayed it all over the internet, these bastards would find a way to trace it and take it down in less than 24h. Maybe someone needs to start physically harassing the CEOs of these shit companies and documenting it on video, so that they can't live a single day of their lives without being violently attacked and harassed, before they will find a way to take harassment seriously.
Not trying to belittle this woman's struggle, but there are videos online where people are being hurt really badly, and those videos will forever be someone's entertainment.
WorldStarHipHop, and other sites like it, have almost made a business out of showing people being knocked out, kicked, punched and assaulted.
That's what the internet is. A global network that spreads information at light speed.
So I don't think attacking Pornhub specifically is the right thing to do here.
That sort of smells of someone trying to make waves by going after one of the more established players in the internet porn business.
What happened was awful but it has nothing to do with Pornhub. They're doing their best to police a giant platform that everyone in the world wants to use and abuse. They're not alone in this challenge.
I don't see why non-consensual gore isn't treated the same as child porn.
Both are done without consent.
Both require someone to be hurt to be created.
Both either have a victim who is dead or who is harmed by the continued spread of the video for entertainment purposes.
Both cross the threshold for obscenity.
Political and historical exceptions would still apply, just as the photo of Phan Thi Kim Phuc fleeing a napalm attack is legal because of its political and historical significance, despite being a literal picture of a naked child being harmed.
> I don't see why non-consensual gore isn't treated the same as child porn.
Because then the video of George Floyd's murder would never have surfaced the way it did. The dissemination was driven by social media and those "bad" gore websites at first, and only later picked up by the professional media.
If there was such a law, even if it had exemptions for cases like the Floyd murder, social media companies would have put a lid on it "just to be safe legally" severely hurting dissemination, and professional media would have maybe reported on it, but wouldn't have shown it because their legal would never have OK'ed showing it.
Yes, it's bad that people use such videos for entertainment, but in my opinion it's worse to hide or penalize publication of videos and pictures of murders (like George Floyd's murder), war atrocities (like the naked Vietnamese girl), or terrorist attacks (like 9/11).
Is a lot of gore footage created for the sake of selling videos? Allowing child porn has the consequence of incentivizing more to be produced. Is anyone producing gore videos in any quantity?
The incentivizing argument seems to be a red herring because in no universe would we legalize some subset of child porn that is shown to not incentivize more being produced, no matter how clearly such a case was shown.
Drawn and computer generated images of that kind are legal under the First Amendment in the USA. I mention it because they are, in contrast, illegal in Canada.
One argument is that it inflames and encourages the desire to assault children in a significant subset (in the sense of risk; i.e. the population doesn't have to be large, only the risk) of those who consume it, and that it does so in a unique way compared to other forms of media. The other argument is that it's a particularly grave violation of the child's privacy, one they cannot consent to.
Alternatively, we could just bite the bullet and conclude (perhaps rightly) that maybe porn in general has the same negative effects we allege CP to have. I'm not sure if that's true, but if it is, then I think it would make a good case for banning it.
> They're doing their best to police a giant platform that everyone in the world wants to use and abuse.
It doesn’t really sound like they’re doing their best though, when they offer an aggressive content take-down service to their paying customers but not to victims of exploitation from whom they’ve (knowingly or unknowingly) profited.
Sure but put yourself in their position for a minute.
How does Pornhub even receive notice of this video being posted? How many others contact PH through this channel? How many of those cases are bogus and lead nowhere?
Remember that this is the internet. If you open up any communication channel to your massive website you will be flooded with junk.
So just maintaining a communication channel with the outside world is an entire project in itself. It probably requires its own manager and employees working full time on nothing but handling cases.
And despite all this ph did respond on this case, they even tried sending takedown requests to OTHER SITES.
Imo they did truly do their best.
But the problem goes beyond pornhub. It's an internet problem. There is no simple resolution to this problem, unless you want to lock down the entire internet.
And despite all these difficulties the stories posted still mention Pornhub as the problem.
Pornhub is not the problem here.
But I would not be surprised if Pornhub comes up with a solution. If Coinbase can verify your identity to open an account, then surely Pornhub can do the same for uploaders.
They already have a program that verifies the identity of uploaders called Verified Amateurs, and last year after the NYT published their hit piece and Visa stopped processing payments to them, they removed all amateur videos that weren’t Verified Amateurs, which was most of them.
What’s sad is that you’re having to read this from me, instead of from the original article linked above, that someone who has a college degree in journalism was paid to write.
> What’s sad is that you’re having to read this from me, instead of from the original article linked above, that someone who has a college degree in journalism was paid to write.
Exactly. And the fact that none of their journalism colleagues will call out this bad piece of work reinforces that the news media really isn't trustworthy these days.
The article pretty clearly indicates that they're not "doing their best":
> Kevin responded again, insisting that Pornhub “can NOT” remove content from other sites. However, that doesn’t seem to be completely accurate. Pornhub offers something called its “exclusive model program,” which promises that it will send takedown notices to any website to “help protect your content from being uploaded to other websites.”
The logical step here would seem to be to extend that takedown program to victims as well as their models.
While I agree with your suggestion, the article notes that PH did request removals for her, yet it still concludes by blaming PH for the video being newly uploaded elsewhere (despite the fact that her ex-husband likely has a copy which he may have uploaded again).
If you have serious points to make about a porn issue, perhaps you should use a throwaway account rather than 'INTPenis'. A joke that's mildly amusing in other contexts seems tasteless when it shows up in a discussion of sexual assault, and that's probably outweighing the substance of your argument.