All Activity

  1. Today
  2. Chris


    Yeah, I understand all that. My question was, or is, didn’t the county say there was a 2% increase in taxes on the county side?
  3. Adam


    2% is the maximum increase without having to go to voters, so many who either do not attend meetings or at least read the minutes wouldn't necessarily notice it... Of course, you could just do like some and claim no increases over the last 4 years; you'd be surprised how many don't question it.
  4. Yesterday
  5. The following was written by Seth Adams, owner of the Werdenberg building on the corner of Water and Main, in response to Legislator Morse's column (reposted here with permission):
  6. Hal

    First Arena

    Respectfully, I am ever so slightly confused here. “At current, the arena is owned by the IDA (not the county).” So, is the IDA a County-sanctioned private entity, or a straight-up part of the County Legislature itself? Who funds the CCIDA to run and maintain the Arena?
  7. In this week's Guest View at ElmiraTelegram.com, County Legislator Lawana Morse offers her thoughts on the Arena and what changed her mind to vote in support of funding out of ARP funds. Read it here.
  8. Not to belabor the point, but I thought this was an interesting article on the subject (it should be a freebie). Read the rest here.
  9. If the determination was just a couple weeks ago, then it's still possible that he'll be filing one of his lawsuits.
  10. Chris


    And the county is raising taxes another 2% I think I read?
  11. There was a full report given at the Republican committee meeting on 11/8. Both Joe and Art have a lifetime ban from the committee, and the report has been sent to the state election board.
  12. The city of Elmira raised taxes 1% in 2023, as you all must know. Did you also notice that, right after the recent elections, the mayor and city manager announced another tax increase of 2% for 2024?
  13. I honestly thought they were both dead already.
  14. If in fact they rule politicians can’t block users, can someone let Joe know? I can’t, Mr. Constitution blocked me.
  15. Nope. I think lawsuits, timed to file just before the statute of limitations expires, are in the works.
  16. Last week
  17. I was just talking to someone about this the other day. We never did hear what brought all this about, did we?
  18. The "officiating" this year has been bad across the board, but last night was the worst case of it I've ever seen. Absolutely ridiculous. Two missed field goals lost the Bills the game*, but the refs were not helping.

    * I believe McD's call to run out the 20 seconds left and go to OT was BS. You have a quarterback with an arm like a cannon. Play a down or two and see if anything opens up down in field goal range. If not, then take a knee.
  19. Citizens have sometimes been surprised to find public officials blocking people from viewing their social media feeds.

    by Lynn Greenky, Syracuse University

    The First Amendment does not protect messages posted on social media platforms. The companies that own the platforms can – and do – remove, promote or limit the distribution of any posts according to corporate policies. But all that might soon change.

    The Supreme Court has agreed to hear five cases during this current term, which ends in June 2024, that collectively give the court the opportunity to reexamine the nature of content moderation – the rules governing discussions on social media platforms such as Facebook and X, formerly known as Twitter – and the constitutional limitations on the government to affect speech on the platforms.

    Content moderation, whether done manually by company employees or automatically by a platform’s software and algorithms, affects what viewers can see on a digital media page. Messages that are promoted garner greater viewership and greater interaction; those that are deprioritized or removed will obviously receive less attention. Content moderation policies reflect decisions by digital platforms about the relative value of posted messages.

    As an attorney, professor and author of a book about the boundaries of the First Amendment, I believe that the constitutional challenges presented by these cases will give the court the occasion to advise government, corporations and users of interactive technologies what their rights and responsibilities are as communications technologies continue to evolve.
    Public forums

    In late October 2023, the Supreme Court heard oral arguments on two related cases in which both sets of plaintiffs argued that elected officials who use their social media accounts either exclusively or partially to promote their politics and policies cannot constitutionally block constituents from posting comments on the officials’ pages.

    In one of those cases, O’Connor-Radcliff v. Garnier, two school board members from the Poway Unified School District in California blocked a set of parents – who frequently posted repetitive and critical comments on the board members’ Facebook and Twitter accounts – from viewing the board members’ accounts.

    In the other case heard in October, Lindke v. Freed, the city manager of Port Huron, Michigan, apparently angered by critical comments about a posted picture, blocked a constituent from viewing or posting on the manager’s Facebook page.

    Courts have long held that public spaces, like parks and sidewalks, are public forums, which must remain open to free and robust conversation and debate, subject only to neutral rules unrelated to the content of the speech expressed. The silenced constituents in the current cases insisted that in a world where a lot of public discussion is conducted in interactive social media, digital spaces used by government representatives for communicating with their constituents are also public forums and should be subject to the same First Amendment rules as their physical counterparts.

    If the Supreme Court rules that public forums can be both physical and virtual, government officials will not be able to arbitrarily block users from viewing and responding to their content or remove constituent comments with which they disagree. On the other hand, if the Supreme Court rejects the plaintiffs’ argument, the only recourse for frustrated constituents will be to create competing social media spaces where they can criticize and argue at will.
    Content moderation as editorial choices

    Two other cases – NetChoice LLC v. Paxton and Moody v. NetChoice LLC – also relate to the question of how the government should regulate online discussions. Florida and Texas have both passed laws that modify the internal policies and algorithms of large social media platforms by regulating how the platforms can promote, demote or remove posts.

    NetChoice, a tech industry trade group representing a wide range of social media platforms and online businesses, including Meta, Amazon, Airbnb and TikTok, contends that the platforms are not public forums. The group says that the Florida and Texas legislation unconstitutionally restricts the social media companies’ First Amendment right to make their own editorial choices about what appears on their sites.

    In addition, NetChoice alleges that by limiting Facebook’s or X’s ability to rank, repress or even remove speech – whether manually or with algorithms – the Texas and Florida laws amount to government requirements that the platforms host speech they didn’t want to host, which is also unconstitutional. NetChoice is asking the Supreme Court to rule the laws unconstitutional so that the platforms remain free to make their own independent choices regarding when, how and whether posts will remain available for view and comment.

    In 2021, U.S. Surgeon General Vivek Murthy declared misinformation on social media, especially about COVID-19 and vaccines, to be a public health threat.

    Censorship

    In an effort to reduce harmful speech that proliferates across the internet – speech that supports criminal and terrorist activity as well as misinformation and disinformation – the federal government has engaged in wide-ranging discussions with internet companies about their content moderation policies.
    To that end, the Biden administration has regularly advised – some say strong-armed – social media platforms to deprioritize or remove posts the government had flagged as misleading, false or harmful. Some of the posts related to misinformation about COVID-19 vaccines or promoted human trafficking. On several occasions, the officials would suggest that platform companies ban a user who posted the material from making further posts. Sometimes, the corporate representatives themselves would ask the government what to do with a particular post.

    While the public might be generally aware that content moderation policies exist, people are not always aware of how those policies affect the information to which they are exposed. Specifically, audiences have no way to measure how content moderation policies affect the marketplace of ideas or influence debate and discussion about public issues.

    In Missouri v. Biden, the plaintiffs argue that government efforts to persuade social media platforms to publish or remove posts were so relentless and invasive that the moderation policies no longer reflected the companies’ own editorial choices. Rather, they argue, the policies were in reality government directives that effectively silenced – and unconstitutionally censored – speakers with whom the government disagreed. The court’s decision in this case could have wide-ranging effects on the manner and methods of government efforts to influence the information that guides the public’s debates and decisions.

    Lynn Greenky is Professor Emeritus of Communication and Rhetorical Studies at Syracuse University. This article is republished from The Conversation under a Creative Commons license. Read the original article.
  20. Read more here. What do you think about this game, or officiating in general this year?