
Supreme Court To Consider Giving First Amendment Protections To Social Media Posts



Citizens have sometimes been surprised to find public officials blocking people from viewing their social media feeds. alashi/DigitalVision Vectors via Getty Images

 

by Lynn Greenky, Syracuse University

The First Amendment does not protect messages posted on social media platforms.

The companies that own the platforms can – and do – remove, promote or limit the distribution of any posts according to corporate policies. But all that might soon change.

The Supreme Court has agreed to hear five cases during this current term, which ends in June 2024, that collectively give the court the opportunity to reexamine the nature of content moderation – the rules governing discussions on social media platforms such as Facebook and X, formerly known as Twitter – and the constitutional limitations on the government to affect speech on the platforms.

Content moderation, whether done manually by company employees or automatically by a platform’s software and algorithms, affects what viewers can see on a digital media page. Messages that are promoted garner greater viewership and greater interaction; those that are deprioritized or removed will obviously receive less attention. Content moderation policies reflect decisions by digital platforms about the relative value of posted messages.

As an attorney, professor and author of a book about the boundaries of the First Amendment, I believe that the constitutional challenges presented by these cases will give the court the occasion to advise government, corporations and users of interactive technologies what their rights and responsibilities are as communications technologies continue to evolve.

Public forums

In late October 2023, the Supreme Court heard oral arguments on two related cases in which both sets of plaintiffs argued that elected officials who use their social media accounts either exclusively or partially to promote their politics and policies cannot constitutionally block constituents from posting comments on the officials’ pages.

In one of those cases, O’Connor-Ratcliff v. Garnier, two school board members from the Poway Unified School District in California blocked a set of parents – who frequently posted repetitive and critical comments on the board members’ Facebook and Twitter accounts – from viewing the board members’ accounts.

In the other case heard in October, Lindke v. Freed, the city manager of Port Huron, Michigan, apparently angered by critical comments about a posted picture, blocked a constituent from viewing or posting on the manager’s Facebook page.

Courts have long held that public spaces, like parks and sidewalks, are public forums, which must remain open to free and robust conversation and debate, subject only to neutral rules unrelated to the content of the speech expressed. The silenced constituents in the current cases insisted that in a world where a lot of public discussion is conducted in interactive social media, digital spaces used by government representatives for communicating with their constituents are also public forums and should be subject to the same First Amendment rules as their physical counterparts.

If the Supreme Court rules that public forums can be both physical and virtual, government officials will not be able to arbitrarily block users from viewing and responding to their content or remove constituent comments with which they disagree. On the other hand, if the Supreme Court rejects the plaintiffs’ argument, the only recourse for frustrated constituents will be to create competing social media spaces where they can criticize and argue at will.

Content moderation as editorial choices

Two other cases – NetChoice LLC v. Paxton and Moody v. NetChoice LLC – also relate to the question of how the government should regulate online discussions. Florida and Texas have both passed laws that modify the internal policies and algorithms of large social media platforms by regulating how the platforms can promote, demote or remove posts.

NetChoice, a tech industry trade group representing a wide range of social media platforms and online businesses, including Meta, Amazon, Airbnb and TikTok, contends that the platforms are not public forums. The group says that the Florida and Texas legislation unconstitutionally restricts the social media companies’ First Amendment right to make their own editorial choices about what appears on their sites.

In addition, NetChoice alleges that by limiting Facebook’s or X’s ability to rank, repress or even remove speech – whether manually or with algorithms – the Texas and Florida laws amount to government requirements that the platforms host speech they don’t want to host, which is also unconstitutional.

NetChoice is asking the Supreme Court to rule the laws unconstitutional so that the platforms remain free to make their own independent choices regarding when, how and whether posts will remain available for view and comment.


In 2021, U.S. Surgeon General Vivek Murthy declared misinformation on social media, especially about COVID-19 and vaccines, to be a public health threat. Chip Somodevilla/Getty Images

Censorship

In an effort to reduce harmful speech that proliferates across the internet – speech that supports criminal and terrorist activity as well as misinformation and disinformation – the federal government has engaged in wide-ranging discussions with internet companies about their content moderation policies.

To that end, the Biden administration has regularly advised – some say strong-armed – social media platforms to deprioritize or remove posts the government had flagged as misleading, false or harmful. Some of the posts related to misinformation about COVID-19 vaccines or promoted human trafficking. On several occasions, the officials would suggest that platform companies ban a user who posted the material from making further posts. Sometimes, the corporate representatives themselves would ask the government what to do with a particular post.

While the public might be generally aware that content moderation policies exist, people are not always aware of how those policies affect the information to which they are exposed. Specifically, audiences have no way to measure how content moderation policies affect the marketplace of ideas or influence debate and discussion about public issues.

In Missouri v. Biden, the plaintiffs argue that government efforts to persuade social media platforms to publish or remove posts were so relentless and invasive that the moderation policies no longer reflected the companies’ own editorial choices. Rather, they argue, the policies were in reality government directives that effectively silenced – and unconstitutionally censored – speakers with whom the government disagreed.

The court’s decision in this case could have wide-ranging effects on the manner and methods of government efforts to influence the information that guides the public’s debates and decisions.

 

Lynn Greenky is Professor Emeritus of Communication and Rhetorical Studies at Syracuse University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Why do I have a suspicion they’ll find in favor of the politicians...

11 hours ago, Elmira Telegram said:

If the Supreme Court rules that public forums can be both physical and virtual, government officials will not be able to arbitrarily block users from viewing and responding to their content or remove constituent comments with which they disagree.

If in fact they rule politicians can’t block users, can someone let Joe know? I can’t, Mr. Constitution blocked me.



Having thought about this more on a macro scale, I think there should be some limitations on the way social media can do business. Specifically, how their algorithms work. 

We've seen how toxic social media is, and we know how it's designed to tap into that anger. For God's sake, I click on one article about the officiating for Sunday's game, and now I don't just get a ton of those, oh no. I'm also getting every "Buffalo and their fans are trash" post made out there. It is designed to make me angry and get me more engaged. And that's just for football! Unless you've been living under a rock, you can see what it's done to our conversations about politics and social issues. 

On Twitter, I subscribe to the things that make me smile or are fun to engage in. Nostalgia stuff, Ricky Gervais, Mark Hamill (although he is super political sometimes) and a lot of writers and writing-oriented posts. So that should be what the algorithm picks for me to keep me engaged, right?

Oh, no.

Here's what I get under "For You", which is the feed that shows up first:

[Screenshot of the “For You” trending topics, Nov. 29, 2023]

It changes frequently, but you get the drift. The first version I saw this morning had “Mike Obama” trending, in reference to the belief some hold (certainly not enough that it should be trending) that Michelle Obama was originally a man.

This shit is designed and intended to cause division and anger. 

 


This could soon trickle down to school board members as well:

 


A California couple sued two school board members who blocked them on Facebook after they made critical remarks. OsakaWayne Studios via Getty Images

 

by Charles J. Russo, University of Dayton

If a school board member has a social media account, would it be wrong for them to block someone and delete their comments? That’s a question the Supreme Court has decided to take up after public officials, including two school board members, blocked constituents from seeing their accounts or removed critical comments.

At stake is what constitutes state action – or action taken in an official governmental capacity – on social media. Under the First Amendment, officials engaging in state action cannot restrict individuals’ freedom of speech and expression.

A ruling in the case, likely to come in spring or early summer 2024, could have broad implications for American society, where nearly three-fourths of the population use social media in their daily lives. The ruling could also establish whether social media accounts of public officials should be treated as personal or governmental.

In a joint oral argument, the Supreme Court heard two separate cases on the matter, including the one involving school board members, in late October 2023. Interestingly, lower courts reached opposite outcomes, prompting the question of whether a post on a personal social media page can be considered state action.

The school board case

Beginning around 2014, two school board candidates in the Poway Unified School District in San Diego created Facebook and Twitter (now X) pages as part of their campaigns for office. They continued to use them after they were elected to communicate with residents and seek their input.

In 2017, the school board members blocked a couple with children in the district from commenting on their pages. Christopher and Kimberly Garnier repeatedly posted criticism on those pages over such issues as the board members’ handling of race relations in the district and alleged financial wrongdoing by the then-superintendent. The Garniers responded to being blocked by filing a lawsuit.

In the resulting case, O'Connor-Ratcliff v. Garnier, the U.S. Court of Appeals for the 9th Circuit affirmed that the two school board members violated the Garniers’ First Amendment rights to free speech and expression. The court rejected the board members’ claims that their accounts were private because they were not controlled by their boards and their posts were not directly related to their official duties.


Christopher and Kimberly Garnier. Courtesy of Cory Briggs

The 9th Circuit judges made three points in ruling that the board members violated the First Amendment. First, the pages identified the board members as government officials and displayed their titles prominently. Second, the social media accounts provided information about school activities. And third, the board members solicited constituent input about school matters on the social media pages in question.

However, the court concluded that the board members were not liable for monetary damages. This is because at the time the school board members blocked the Garniers, no court had yet established whether the First Amendment applies to public officials’ speech in the context of social media. It was – and remains – a new frontier in the law.

Critical comments over COVID-19

Conversely, in a similar case in Port Huron, Michigan, the 6th Circuit made the opposite ruling.

Years before he was appointed city manager in 2014, a man named James Freed created a personal Facebook page that he eventually made public when he reached the limit of “friends” allowed on Facebook. Once in office, he used the page for both personal and professional reasons, posting updates about his family as well as policies he was working to implement. During the pandemic, constituent Kevin Lindke posted on Freed’s page, criticizing his handling of the public health crisis. Freed deleted Lindke’s comments and blocked him from the page. Lindke sued.

In Lindke v. Freed, the 6th Circuit affirmed that Freed did not violate the First Amendment in deleting and blocking Lindke’s comments. And like the 9th Circuit in O'Connor-Ratcliff v. Garnier, the court concluded that people’s First Amendment rights to comment on public officials’ social media pages had not yet been established.

The 6th Circuit ruled that Freed posted on his social media page as a private citizen, rather than as a governmental official. The court determined this for three reasons. First, no state law required him to run a social media page. Second, state funds and resources were not used to run the page. And third, the page belonged to Freed as an individual, rather than to the office of city manager – unlike the @POTUS page on X, for example. Therefore, the court concluded that the postings did not constitute state action subject to the First Amendment.

In April 2023, the Supreme Court agreed to intervene in both cases.

The future of the cases

Both cases have consequences not only for citizens’ First Amendment rights but also for social media companies and users. The court may decide whether social media platforms such as Facebook and X can be liable for allowing a public official to block private citizens from commenting on their accounts.

These cases might also establish rules and standards about how public officials can control their social media accounts and the role of the courts in these disputes.

In a brief supporting the city manager in Lindke v. Freed, the U.S. Department of Justice basically argued that if the government neither owns nor controls the personal social media accounts of public officials, their behavior on the platforms “will rarely be found to be state action.”

The DOJ added that preventing public officials from blocking some messages might make them less willing to speak out about important issues. It warned that this could reduce, rather than enhance, free speech and discourse on matters of public interest, whether in schools or other agencies.

On the other hand, organizations such as the ACLU argue that allowing public officials to restrict comments on social media would be detrimental to democracy by limiting free speech.

“The upshot of the government officials’ argument is that they should have a constitutional blank check to silence or retaliate against their constituents for expressing disfavored viewpoints on social media,” the ACLU wrote about the two cases. “This would give officials a way to short-circuit our most fundamental First Amendment protections.”

Depending on how the court rules, social media may be headed into a new era of who can access and comment on the accounts of public officials.

 

Charles J. Russo, Joseph Panzer Chair in Education and Research Professor of Law, University of Dayton

This article is republished from The Conversation under a Creative Commons license. Read the original article.


How will they go about stopping these entities from blocking posts? Several of our govt reps already do this, and it's in violation. Doesn't change anything. They continue to do it. Trying to file a complaint with one of the social media companies is almost impossible too.



Someone will have to have it happen to them and then file a lawsuit. And good freaking luck, unless they have deep pockets for a private attorney. Because as I found, the civil liberties groups out there don’t give a damn.

