Judge rules that Twitter can be sued for refusing to remove child pornography and block traffickers

A federal district judge in California has ruled that a civil lawsuit brought by two teenage victims of child pornography has merit and that Twitter can be sued for its lack of action in a child sex trafficking and exploitation scheme.

Quick Facts

The lawsuit alleges that Twitter violated several federal and state laws, including the Fight Online Sex Trafficking Act of 2017 (FOSTA); failed in its duty to report child sexual abuse material; and, by knowingly receiving, maintaining, and distributing this child pornography, caused the plaintiffs emotional, psychological, and reputational injuries.

The suit alleges these facts: When the first plaintiff, known as John Doe, was 13 years old, sex traffickers posing as a 16-year-old girl coerced him into sending explicit content. The traffickers then blackmailed him into sending more by threatening to share the material with his family and school. He initially gave in to their demands, which included sending additional explicit images and recruiting another teen to take part. When he later blocked the traffickers, they posted the images and videos on Twitter. He learned of the posts after his classmates saw them and subjected him to “teasing, harassment, vicious bullying,” leading him to become “suicidal,” according to court records.

His parents filed a police report, and the teenager filed a complaint with Twitter asking for the content to be removed. His mother also filed two complaints with Twitter. For a week, none of the complaints received a response. On January 28, 2020, Twitter finally responded, saying,

“Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time. If you believe there’s a potential copyright infringement, please start a new report. If the content is hosted on a third party website, you’ll need to contact that website’s support team to report it. Your safety is the most important thing, and if you believe you are in danger, we encourage you to contact your local authorities.”

The content was finally removed after an agent with the Department of Homeland Security issued a takedown order, but by that point it had accumulated 167,000 views and 2,223 retweets.

In response, Twitter officials told the New York Post,

“Twitter has zero-tolerance for any material that features or promotes child sexual exploitation. We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy. Our dedicated teams work to stay ahead of bad-faith actors and to ensure we’re doing everything we can to remove content, facilitate investigations, and protect minors from harm — both on and offline.”

Twitter argued to the court that it is immune from liability under Section 230 of the Communications Decency Act and claimed that it had no knowledge of, and did not benefit from, the activity. It further stated that “civil claims can only proceed against sex traffickers and those who knowingly benefit from their affirmative participation in a sex trafficking venture.”

In his order, Judge Joseph C. Spero disagreed with Twitter’s position, ruling that the plaintiffs can civilly sue Twitter, most notably under the FOSTA exemption to Section 230, writing that the allegations against Twitter “are sufficient to allege an ongoing pattern of conduct amounting to a tacit agreement with the perpetrators in this case to allow them to post videos and photographs it knew or should have known were related to sex trafficking without blocking their accounts or the Videos.”

Peter Gentala, senior legal counsel for the National Center on Sexual Exploitation Law Center, said,

“As John Doe’s situation makes clear, Twitter is not committed to removing child sex abuse material from its platform. Even worse, Twitter contributes to and profits from the sexual exploitation of countless individuals because of its harmful practices and platform design. Despite its public expressions to the contrary, Twitter is swarming with uploaded child pornography and Twitter management does little or nothing to prevent it.”

The order follows a Texas Supreme Court ruling in June that Facebook does not have blanket liability protection against sex trafficking activity conducted on its platform.

Twitter’s refusal to remove such disturbing and clearly criminal content even after multiple complaints is mind-boggling and reprehensible. Its claim that the explicit images and videos of minor children did not violate community standards is hard to fathom, especially since Twitter appears more than capable of flagging, censoring, and banning content posted by legitimate news organizations, medical doctors, scientists, politicians, and others who don’t agree with its ideological orthodoxy.

It is good that the courts are finally recognizing that Twitter and other social media companies can and should be held accountable for certain types of activity and content allowed on their platforms, but Congress needs to act by officially withdrawing these platforms’ liability protections.

In addition, regular Americans should strongly consider moving away from this social media platform and others.

This case should serve as yet another urgent warning to parents: You won’t be able to protect your children from all harm, but please continue to point out what can happen on the Internet and then continue to take extreme precautions with your child’s screen time and even your own. Social media companies will not protect your child, and in fact, they are often unwitting and indifferent allies in enabling crimes against children.

And teenagers, John Doe’s case is a cautionary tale that you should heed: No one ever truly knows who they are communicating with on the Internet, so please take extra care with what you share on social media or send via text or email. Once it’s out of your control, it can be seen and exploited by anyone.

So, before you press send, stop for a moment and ask yourself: How would I feel if my friends, my parents, or some creepy stranger ever saw this? Would I be proud or horrified? If your answer leans even a little toward the latter, don’t send it. You will save yourself a lot of future pain and embarrassment.