A Blog by Jonathan Low


Sep 24, 2024

US Appellate Court May Make Tech Companies Liable For Actions Of Their Algorithms

The Third Circuit's ruling may have removed the immunity under Section 230 that many tech companies claimed. 

It's one thing if a human posted. It's legally quite another if an algorithm created or purchased by a company does so. JL 

Andy Kessler reports in the Wall Street Journal:

Because algorithms are now an “expressive activity,” companies are publishers and not protected from liability for third-party content under Section 230. On her TikTok “For You Page,” 10-year-old Nylah Anderson was shown a “Blackout Challenge” video encouraging the recording of self-asphyxiation. She hanged herself. Her mother sued and the case wound its way to the appeals court, which noted that TikTok’s FYP algorithm “is not immunized by Section 230 because the algorithm is TikTok’s own expressive activity.” The court did suggest a person searching for videos would fall under Section 230. Humans post, AI publishes.

Is TikTok toast? Fearing Chinese spying, the U.S. decided last April to ban TikTok if it isn’t sold by January 2025. TikTok owner ByteDance refuses to sell and sued. At a hearing last week before the U.S. Court of Appeals for the District of Columbia Circuit, Judge Douglas Ginsburg asked, “Why is this any different, from a constitutional point of view, than the statute precluding foreign ownership of a broadcasting license?”

Why indeed. Broadcasters are publishers. Is TikTok?


Social-media companies flourished under the protection of Section 230 of the Communications Decency Act of 1996, which states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In other words, anyone hosting content on a server or platform can’t be sued by others for that content.

But we’re a long way from 1996. Online service Prodigy—remember it?—lost a 1995 defamation suit brought by bucket-shop brokerage Stratton Oakmont, a ruling that sparked passage of Section 230. Remember, no one even had a BlackBerry back then. Now, algorithms and artificial intelligence, not humans, decide what’s on our feeds. TikTok’s FYP, or “For You Page,” algorithm uses AI to serve up videos. The $200 billion question is: Does using AI or algorithms make you a publisher? I think so.

And so does the Third Circuit. On her TikTok “For You Page,” 10-year-old Nylah Anderson was shown a “Blackout Challenge” video encouraging the recording of self-asphyxiation. She hanged herself. Her mother sued and the case wound its way to the appeals court, which noted that TikTok’s FYP algorithm “is not immunized by Section 230 because the algorithm is TikTok’s own expressive activity.” The court did suggest a person searching for videos would fall under Section 230. Humans post, AI publishes.

Social-media companies are allowed to moderate content, removing or leaving up anything they want. The U.S. Supreme Court further established this as a First Amendment right in Moody v. NetChoice (2024). But because these algorithms are now an “expressive activity,” the companies are publishers and not protected from liability for third-party content under Section 230.

This changes everything. Last year I suggested we needed a Section 230.ai. I think we got it: an implicit word, “human,” has been added, so platforms won’t be treated as the publisher of “information provided by another human information content provider.” Congress should codify this.

It’s a new world. Large language models churn out speech by the mile. Chatbots hallucinate and write strange things, spewing statements they think are true but are false—like many politicians. Both OpenAI and Microsoft have been sued for defamation over their chatbots’ output. Congress and the courts should label generative-AI companies as publishers, which they are, with all the ensuing copyright and liability issues. They will fight it tooth and nail, but let’s call a bot a bot—they don’t host, they publish.


Sen. Tom Cotton (R., Ark.), a member of the Intelligence Committee who has seen more than most, told Fox News, “TikTok is a tool of Chinese Communist propaganda.” That “expressive activity” is a long way from a “human information content provider.” Let’s treat these companies as what they are: publishers. Russian bots feeding us lies aren’t human either. We need laws structured to shut them down.

Facebook, Instagram, Snapchat and X may have to change how they operate—how their algorithms and AI fill our feeds. Good. Their outputs are messy anyway, filled with useless ads and other garbage. To maintain their protections under an implicit or explicit Section 230.ai, they need to clean up their act, enabling a more human touch to what we see, with less reliance on algorithms or bots.

Under Moody v. NetChoice, these privately owned and operated platforms can still apply “community standards” to moderate content and take down whatever they want. Don’t like it? Start your own platform. Sadly, whether government can pressure social-media companies to censor is still an open issue. For lack of standing, the Supreme Court recently tossed a lawsuit trying to limit such interference.

Thirty years is a long time in Techworld. The norm of the mid-1990s was big clunky monitors, 28.8K dial-up modems and America Online. Written with no clue about smartphones or 5G, the legislation of that era inadvertently spawned the digital world we know today, good and bad. This time, legislators should think hard about a future world of chatbots and machine-learning algorithms, with laws that affect billions of digitally connected users. I can easily imagine an AI bot defaming whomever it wants or, gulp, whoever dares speak against it.
