Suing Social Media


One might have overlooked it, but last November there was a quadruple homicide in a sedate small town in Idaho. As police investigated what is a rare occurrence (homicide) in that little town, a number of individuals took to the Internet to express their angst and criticism of the town's police force. In fairness, the police knew they were in over their heads and quickly called for assistance from both state and federal agencies. 

Those who were on the Internet espousing thoughts, criticism, and suggestions ranged from the curious to the critical. They appeared in comment sections of news articles and on social media. I have observed in the past that social media can be a vicious place. I was never on Faceplace or Instapic or WhatsUp or Bluedit or SnapApp. But, I have been using YouTube and LinkedIn for years. I made a foray onto Twitter years ago and abandoned it in about 2021 over the venom and anger there. I returned to Twitter last fall, but not with enthusiasm. 

The Idaho murders drew me back to some of the social media platforms and I have to admit disappointment, or perhaps disgust is a better word. The Internet sleuths have been downright brutal in their willingness to reach conclusions, make allegations, and insult so many people. Everyone is driven, it seems, by the desire for response, debate, and affirmation (likes, follows, etc.). Some seemingly have no filter or restraint. There is a certain power that comes from anonymity and being able to hide behind a keyboard. See Anonymity and Emotional Intelligence (July 2022).  

But even those without anonymity seem to lack inhibitions. One self-anointed sleuth has already been sued for defamation over allegations of involvement in the killings. Though we may be disgusted, we nonetheless view their content, drawn to it, mesmerized. And, somehow their malicious content rises to the fore in our feeds, pages, and other places we participate.  

The potential for doing harm through social media is nothing new. See Public Harm and Social Media (February 2019). And, it can certainly be a particularly challenging venue for judges. See Judicial Commenting (October 2017). As a new day is envisioned for social media, those investing in the next iteration are focused on glitz and glamour, but largely seem ignorant of, or ambivalent about, the potential for challenges or harm in this desert. See The Metaverse (November 2021). They are not striving to make content or participation better, but only more appealing and addictive.  

The potential for harm on social media is well documented. Public health officials have been warning us about it for years. A posting at Columbia University is reasonably representative:

"social media can also provide platforms for bullying and exclusion, unrealistic expectations about body image and sources of popularity, normalization of risk-taking behaviors, and can be detrimental to mental health"

There are many benefits to social media. It has become a source of information. Pew reports that eighty percent of us get news from social media. The potential for good has to be recognized. But, we have to persistently remember that just because something is said on social media does not make it news. Just because it is posted on some app does not make it true, desirable, or worthy. Just because someone has followers does not make them wise, honest, or exemplary. Can we tell the difference? Nonetheless, it is a fixture in our world. An undeniable presence that will not change on its own and will not go away. 

This week, the Seattle public school system "filed a novel lawsuit" alleging that the "tech giants" behind some of these social platforms should be held responsible "for the mental health crisis among youth." It claims that social media has "created a public nuisance by targeting their products to children." The alleged results are broad: "including anxiety, depression, disordered eating and cyberbullying."  

The lawsuit alleges that efforts have been exerted by the software engineers and others to "exploit" and "hook" youth. It claims that the result is "excessive use and abuse of Defendants' social media platforms," and persistent exposure to "harmful and exploitive" information and data. There is acknowledgment that the Communications Decency Act protects much of the platforms' activity (or inactivity) as regards content. This lawsuit claims liability beyond the horrible postings of third parties (you, me, and the rest of the common people). Instead, it asserts liability based on the way the platforms identify, manipulate, and focus on the young and impressionable.  

Some of us are perhaps older, greyer, and calmer, but it is likely these platforms are striving just as hard to identify and manipulate us all.  

The hope of the school system is an order ending a "public nuisance." This is essentially injunctive relief, in which the court would tell the media giants to stop particular practices, programs, or processes. The lawsuit also seeks damages that the schools say are needed to address a variety of media-generated, or media-exacerbated, problems among the young and impressionable. Is their effort to attract and retain attention any different from any other marketing effort?  

It is reminiscent, perhaps, of the long and relentless campaign against those who pushed, prodded, and cajoled opioid dependency. Finally, in that national crisis, there was litigation success in one opioid case. One manufacturer was held responsible, then another. There were settlements, bankruptcy, and recrimination. Litigation continues. This is not because opioids are inherently evil or inappropriate, but because the manner of some in marketing and delivering them was wrong and actionable.  

Will social media companies be similarly held responsible? Will the implications be as broad and deep? Or, is this school district lawsuit destined to fail? Time will tell, and it will be interesting to watch. 

By Judge David Langham

Courtesy of Florida Workers' Comp


About The Author

Judge David Langham

David Langham is the Deputy Chief Judge of Compensation Claims for the Florida Office of Judges of Compensation Claims at the Division of Administrative Hearings. He has been involved in workers' compensation for over 25 years as an attorney, an adjudicator, and administrator. He has delivered hundreds of professional lectures, published numerous articles on workers' compensation in a variety of publications, and is a frequent blogger on Florida Workers' Compensation Adjudication. David is a founding director of the National Association of Workers' Compensation Judiciary and the Professional Mediation Institute, and is involved in the Southern Association of Workers' Compensation Administrators (SAWCA) and the International Association of Industrial Accident Boards and Commissions (IAIABC). He is a vocal advocate of leveraging technology and modernizing the dispute resolution processes of workers' compensation.