920 Million Videos and Counting

What if I told you that for the past 25 years, there has been a federal law on the books that has allowed child pornography to flourish? I wouldn’t be spouting the latest conspiracy theory; I’m referring to Section 230 of the Communications Decency Act (CDA), which is and always has been a terrible law. Members of both parties have recently discussed repealing the law, not because of crimes against children but because it is costing politicians votes and elections. But I don’t really care why they want to repeal it; I just hope that they do.

In essence, Section 230 shields internet service providers (and – although they didn’t even exist when the law was passed – social media platforms) from legal liability for the content that users post on their platforms. Thus, if I were to post a defamatory remark on Twitter, no action could be brought against Twitter for hosting that remark. The theory is that ISPs and social media platforms are just gateways or conduits for people to communicate, sort of like Verizon or Optimum, with no ability to regulate or edit what people say, and therefore should bear no responsibility for those comments or materials.

As a law student in 1996, I wrote a legal article calling for ISPs – then in their infancy – to be held to the same standard of liability as a newspaper, radio station, or television station. To demonstrate why, I cited the proliferation of child pornography on the newly created internet to argue that, if left unchecked, the internet would become a vehicle for massive harm to society, and to children in particular. My article was published, used by Congress as support to rewrite the federal child pornography laws to account for new technology, cited by federal courts trying to understand this new thing called the internet, and used by law schools as material for their courses on the First Amendment. As a law student, I felt like a rock star, but unfortunately, Congress rejected my main argument when it afforded ISPs the shield of protection in Section 230 of the CDA.

Prior to the inception of the internet, the federal government reported that there was no way to easily access any child pornography; it was all underground and very closely monitored. My thesis was that with this new tool called the internet, law enforcement would never have the ability or resources to contain the problem and that the technology itself should be developed in a way that could contain the problem, lest it get too far out of control. Unfortunately, I was correct.

Even with the most sophisticated law enforcement agencies, superior technology, and excellent nonprofit watchdogs like the National Center for Missing & Exploited Children working day and night to try and combat it, Nicholas Kristof of the New York Times recently reported in his article “The Children of Pornhub” that a Google search for “young porn” still returns an astounding 920 million videos. Moreover, major payment platforms have made news by canceling their contracts amidst reports that so-called “mainstream” porn sites like Pornhub host child pornography. Indeed, while sites such as Pornhub employ moderators to remove “banned” content, including that depicting children or rape, Kristof reported that searches for terms such as “13yo” turn up 155,000 videos – often accompanied by suggested “related searches” such as “exxxtra small teens.” Users with unprintable names are allowed to freely post, using tags and titles that leave little question about the pedophiles the site attracts. (These horrific numbers don’t even include the material that resides on the “Dark Web.”) Thus, these companies must be forced to develop and manage their platforms differently to end this onslaught against our children once and for all.

Since the inception of Section 230, I have spoken to hundreds of groups about how to keep kids safe on the internet, reported thousands of offenders, donated tens of thousands of dollars, and volunteered thousands of hours on boards of nonprofit organizations committed to preventing child abuse and exploitation. Thousands of other private citizens and advocates have done the same and joined the fight, creating awareness campaigns worldwide. Unfortunately, we have simply not made a dent in this problem.

If we are going to fix this, the companies that control the platforms and the technology will need to be forced to do so. The technology exists to fix it; about 10 years ago, Microsoft created PhotoDNA, a digital “fingerprinting” program that can be used to scan online images, identify those that show known child pornography, and flag them for removal. Unfortunately, ISPs have no obligation to use the program, and without being forced to, will not police any content for fear they will lose their users and their beloved advertising revenues. A repeal of Section 230, however, would force those companies to use this technology to remove those materials from their platforms, because if they don’t, they could incur legal liability for hosting such materials (the way a magazine or movie studio would).

I fully understand and respect the need to protect free speech in our society, notwithstanding my critics (I was once rejected for a job at a large law firm that wrote in its rejection letter that it does not hire “Socialists who oppose free speech”). But the Supreme Court has rightly determined that child pornography is not protected speech. And, as the Court has also determined, when the sexual exploitation of a child is recorded, that child is victimized anew every time the image is viewed. Thus, with a single post, that child can be victimized millions of times – within seconds, and for years to come. How many more children need to be exploited before society makes a real commitment to protecting them?

I therefore implore Congress, and particularly our Long Island delegation, not to let this moment pass: repeal Section 230 of the Communications Decency Act in a unified and bipartisan way, and send a message to these technology companies that we care more about our children than their profits.