
Texas court rules that Facebook does not have blanket liability protection against sex trafficking activities on its platform
A recent Texas Supreme Court decision holding that tech giant Facebook can be held liable for sex trafficking activity conducted on its platform is likely to have significant implications for social media platforms.


Quick Facts



The ruling was made in response to three plaintiffs who filed separate civil lawsuits against Facebook under a Texas law that allows civil redress against “those who intentionally or knowingly benefit from participation in a sex-trafficking venture.” The three plaintiffs said they became victims of sex trafficking after meeting their victimizers on either Facebook or Instagram (which is owned by Facebook), and all three were minors at the time they were recruited.


Facebook sought to have all of the suits dismissed on the grounds that it is fully shielded by Section 230, a provision of the 1996 Communications Decency Act that protects tech companies from liability for content posted by third parties on their platforms.


The Texas Supreme Court rejected Facebook's motion to dismiss the suits. The judges further ruled that Section 230 does not provide Facebook blanket protection regardless of what happens on its site, noting that the provision does not permit Facebook to be a “lawless no-man’s-land.”


In their ruling, the judges explained,


“Holding internet platforms accountable for the words or actions of their users is one thing, and the federal precedent uniformly dictates that section 230 does not allow it. Holding internet platforms accountable for their own misdeeds is quite another thing. This is particularly the case for human trafficking.”


A spokesperson for Facebook told Fox Business, “We’re reviewing the decision and considering potential next steps. Sex trafficking is abhorrent and not allowed on Facebook. We will continue our fight against the spread of this content and the predators who engage in it.”


A 2020 report by the Human Trafficking Institute found that since 2000, 30 percent of all sex trafficking victims were recruited online. In 2020, Facebook accounted for 59 percent of all online recruitment of victims and 65 percent of child sex trafficking victims recruited on social media.


Facebook responded to the report by saying, “We have policies and technology to prevent these types of abuses and take down any content that violates our rules. We also work with safety groups, anti-trafficking organizations and other technology companies to address this and we report all apparent instances of child sexual exploitation to the National Center for Missing and Exploited Children.”


Whatever policies Facebook has in place, they are clearly not effective. As Victor Boutros, CEO of Human Trafficking Institute, explains,


“The Internet has become the dominant tool that traffickers use to recruit victims, and they often recruit them on a number of very common social networking websites. Facebook overwhelmingly is used by traffickers to recruit victims in active sex trafficking cases.”



This decision could have an impact on the future of Section 230 protections for Big Tech companies. Section 230 has been in conservatives’ crosshairs for some time due to Big Tech’s blatant censoring of political speech, prompting accusations that they are, in fact, publishers and should have their legal protections removed.


It is a salient point. If Facebook can devote the resources to ban the sitting President of the United States and to unceasingly police individual posts for any possible dissent regarding election fraud, COVID origins, vaccine safety, climate change, or even Hunter Biden’s laptop, then surely it can better monitor the illegal child trafficking and exploitation activities taking place on its platform.


Section 230 protections have been abused by tech giants, and it is appropriate that this court has recognized that fact and is willing to begin dismantling the heretofore impenetrable shield the law has provided.