Pornhub, MindGeek hosted rape videos of teen sex-trafficking victims: lawsuit

A federal class action lawsuit against Pornhub’s parent company MindGeek alleges the smut purveyor hosted multiple rape videos of teen sex-trafficking victims — and profited from them while doing nothing to verify the ages or consent of those depicted, court records filed Friday show. 

The suit, filed in the Northern District of Alabama, centers on two victims — Jane Doe 1, who lives in the Southern state, and Jane Doe 2, who now lives in California. 

In 2018, when Jane Doe 1 was 16 years old, she was drugged and raped by a man in Tuscaloosa, who filmed the crime and uploaded it to Modelhub, a MindGeek subsidiary for “amateur” pornographers, the suit alleges. 

“MindGeek reviewed, categorized, tagged, and disseminated the images and videos depicting the rape and sexual exploitation of sixteen year old Jane Doe #1. One of the videos of Jane Doe #1 had been viewed over 2,400 times since MindGeek added it to its websites in early 2018,” the complaint states. 

“At no time did MindGeek or PornHub attempt to verify Jane Doe #1’s identity, age, inquire about her status as a victim of trafficking, or otherwise protect or warn against her traffickers before or while the video of her being drugged and raped was sold, downloaded, viewed and otherwise advertised on PornHub,” the complaint continues, adding the video’s title included the word “lil” to signify it depicted a kid. 

Jane Doe 2, who was 14 when she began being trafficked, was forced to appear in videos of adults raping her that were then uploaded to MindGeek’s websites, including Pornhub and Redtube, the suit states.

As with Jane Doe 1, the company never tried to verify Jane Doe 2’s age or consent, the suit alleges. At least four videos depicting the kid have been identified, and both of the survivors continue to be traumatized on a daily basis knowing their sexual abuse imagery is online and “permanent,” the records show. 

The suit, brought by more than a dozen attorneys from several law firms and advocacy groups, including the National Center on Sexual Exploitation, alleges MindGeek is completely ill-equipped to monitor the millions of new videos uploaded to the company’s many porn sites each year.

The sites bring in hundreds of millions in profit, the suit says. 

For Modelhub, where Jane Doe 1’s videos were uploaded, amateur pornographers are only required to submit a picture ID attesting that the subjects are over the age of 18. 

“Once a Modelhub account is created videos may be uploaded. If a video posted by an amateur pornographer includes other parties or individuals, MindGeek has no effective process to verify age,” the suit alleges. 

“Essentially, ‘moderators’ hired by MindGeek eyeball the performers in the video to see if they look young. If the performer is a child under the age of 12, it may be more likely that a moderator would flag that video or image,” the suit alleges.

“However, if the performer is 15, 16, 17, the moderator may be less likely, and less inclined, to flag that video or image due to the ineffective system PornHub implemented.” 

The system MindGeek uses to monitor content is woeful at best, the suit contends. 

The company maintains an “offshore ‘moderation team’” of around 10 individuals who are required to remove videos depicting child porn and other “inappropriate” content like bestiality, the suit says.

But given the sheer volume of content, it’s impossible to review all of the videos adequately, the plaintiffs say.

“The ten individuals on the ‘moderation team’ were each tasked by MindGeek to review approximately 800-900 pornographic videos per 8-hour shift, or about 100 videos per hour,” the suit alleges. 

“According to PornHub, there are approximately 18,000 videos uploaded daily, with an average length of approximately 11 minutes per video. Hence, each moderator is tasked with reviewing approximately 1,100 minutes of video each hour. This is an impossible task, and MindGeek knows that.” 

Further, MindGeek “incentivizes” employees not to remove content through a yearly bonus system based on the number of videos moderators approve, the complaint alleges. 

“This results in individuals fast-forwarding to the end of videos (or not reviewing them at all) and approving them, even if they depict sexual trafficking of children,” the complaint states. 

When users of the websites spot and flag inappropriate content, which tends to be the primary way most internet companies learn about harmful material, moderators can’t remove it themselves, the suit says.

The flagged content still needs to be removed by a team leader, a process that takes months, the suit says.

“There is an approximate backlog of five months between the time a video is reported by a user as ‘inappropriate’ and the time it is reviewed by a ‘team leader’ to determine whether it should be removed,” the complaint shows.

“Thus, for five months, such videos would sit on MindGeek’s sites, available for downloading and redistribution.” 

Representatives for Pornhub and MindGeek didn’t immediately return a request for comment.

source: nypost.com