The Daily Tar Heel
Column: An incel fell, but Google dug the rabbit hole

DTH Photo Illustration: A student falls into an Internet addiction on Monday, Oct. 17, 2022.

Let’s talk about Section 230. 

Dude, stop scrolling. 

I know everyone but the political science kids left when I started talking about legislation, but this is important. Every meme in your group chat, every conspiracy theory on Twitter, every Nicki Minaj music video, every human organ sold on the black market (well, maybe not that last one) has a platform because Section 230 stands.

Section 230 of the Communications Decency Act says: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

From this, we can derive two things. First, internet platforms are not responsible for what others post on their platforms. For example, let’s say I tweeted a picture of Justin Bieber captioned “Justin kicks puppies,” and it blew up because social media algorithms recommended the post to every Belieber. Justin could sue me, but not Twitter. Section 230 gives Twitter immunity, even if it helped spread a lie.

The second part of Section 230 protects internet platforms when they selectively moderate content. Remember when Twitter permanently suspended Donald Trump’s account? Twitter was safe to do that under Section 230. Tech companies have the power to bolster or block any and all content without fear of catching a case.

So, who gave the Internet the ultimate “get out of jail free” card? This elusive clause has an even more obscure origin. You can actually trace Section 230 back to a 1950s erotic novel. Seriously, the adult book “Sweeter Than Life” catalyzed the court case that paved the way for the Communications Decency Act.

Eleazer Smith was a bookstore owner in Los Angeles. In 1956, Smith was sentenced to 30 days in jail for possessing lewd literature because a Los Angeles ordinance made it illegal to sell obscene material, even if the seller had no knowledge of it being in the store.

Soon, a civil liberties lawyer took the case, Smith v. California, to the U.S. Supreme Court. In December 1959, the Court ruled in favor of Smith: the Los Angeles ordinance violated the First Amendment’s guarantee of free speech.

Now fast forward a few decades to the 1990s, when the Internet was born. Courts heard an influx of cases about what was and wasn’t allowed online. Almost every ruling cited Smith v. California until the Communications Decency Act was signed into law in 1996.

Fast forward again to today, and Section 230 is back in deliberation. The impact of rabbit-hole algorithms isn’t just clickbait celebrity gossip. It’s also how radical groups emerge and recruit in modern internet culture, and how they gain the manpower to pose an increasingly dangerous threat.
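To make that mechanism concrete, here is a toy sketch, written in Python, of an engagement-maximizing recommender. To be clear, everything in it is hypothetical: the video titles, the “extremity” and “engagement” numbers and the scoring rule are invented for illustration and describe no real platform’s system.

    # A toy recommender that always serves whatever an engagement model
    # scores highest. All titles and numbers are invented for illustration.
    videos = [
        {"title": "Nightly news recap",         "extremity": 0.10, "engagement": 0.30},
        {"title": "Hot-take commentary",        "extremity": 0.40, "engagement": 0.55},
        {"title": "Shadowy-elites exposé",      "extremity": 0.70, "engagement": 0.80},
        {"title": "Join-the-cause recruitment", "extremity": 0.95, "engagement": 0.95},
    ]

    def recommend(history):
        """Return the unwatched video with the highest predicted engagement:
        similarity to the user's viewing habits, weighted by raw engagement."""
        taste = sum(v["extremity"] for v in history) / len(history)

        def score(v):
            similarity = 1.0 - abs(v["extremity"] - taste)
            return similarity * v["engagement"]

        return max((v for v in videos if v not in history), key=score)

    # One innocuous click on the news...
    history = [videos[0]]
    for _ in range(3):
        pick = recommend(history)
        print("Up next:", pick["title"])
        history.append(pick)

Run it, and “Up next” drifts from a news recap to recruitment content in three clicks. Each watch pulls the user’s average taste a little further from the mainstream, so the next recommendation can be a little more extreme, and that feedback loop is the rabbit hole.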

The Supreme Court recently agreed to hear Gonzalez v. Google, a case brought to the Court in April 2022 that questions the reach of the consequential clause. In 2015, Nohemi Gonzalez, a young American student, was killed in an ISIS attack in Paris. Her family blames YouTube for her death. They are suing Google, YouTube’s parent company, arguing that the algorithm’s recommendations of ISIS videos violated the Anti-Terrorism Act.

This case is not the first time Section 230 has been questioned. Politicians, red and blue, have publicly criticized the extensive immunity of big tech; both parties believe these companies have too much power. Because social media sites have become a sort of “town square,” Republicans believe it is unjust for platforms to delete some speech while amplifying other speech. They argue that the clause allows big tech to violate users’ freedom of speech. Democrats, meanwhile, argue that the spread of disinformation and hate speech is harmful and should be regulated.

Now, back to Gonzalez v. Google. The Supreme Court could rule in one of two ways. It could make a narrow decision applicable only to the Anti-Terrorism Act, which would result in relatively little change.

The second way it could rule would be revolutionary: the Supreme Court could say that catering content to users through algorithms is not covered by Section 230.

This second path would be interesting, considering every incel, flat Earther and conspiracy theorist was conceived in a rabbit hole. Personalized algorithms cause polarization, which, in many cases, manifests in acts of violence. From terrorist attacks to the Capitol riot to misinformation and conspiracy, it takes one click; then you could trip and fall down a rabbit hole.

But who’s to blame? You fell, but Google dug the hole.

@noelleharff

opinion@dailytarheel.com
