Apr 22, 2024
 

Yesterday, I looked up the latest “Parody Project” – a parody of “Cat’s in the Cradle” called “Rats who enable.” And I realized how long it’s been since I checked the site. Here’s a link to all the videos, in date order, newest first. Something else may well appeal to you as much or more. I admit I’m influenced by loving the original so much. Also, Trinette was by, and we made some more space. She says hi to all. I couldn’t find another article (and ran out of time to look harder), so I’m putting in 2 videos of Beau. He’s been hot lately (and one of them scared the daylights out of me).

It wasn’t pretty – but I have to suspect she took him by surprise and he didn’t know what to do. It was just so different for a Republican to be minimally polite.


Everyday Erinyes #366

Apr 16, 2023
 

Experts in autocracies have pointed out that it is, unfortunately, easy to slip into normalizing the tyrant, hence it is important to hang on to outrage. These incidents which seem to call for the efforts of the Greek Furies (Erinyes) to come and deal with them will, I hope, help with that. As a reminder, though no one really knows how many there were supposed to be, the three names we have are Alecto, Megaera, and Tisiphone. These roughly translate as “unceasing,” “grudging,” and “vengeful destruction.”

Starting tomorrow at sunset and running through Tuesday is Yom HaShoah (“Holocaust and Heroism Remembrance Day”), and I was planning on sharing an article on it today. But tomorrow and Tuesday are also the last two days of tax season, and what I had in mind deserves more time and more attention than is likely to be available this weekend. So it will run next week. Instead, I’m sharing an article looking at the platform “Discord,” a (presumably unwitting) player in the most recent classified-document scandal to hit the news.
==============================================================

What is Discord? An internet researcher explains the social media platform at the center of Pentagon leak of top-secret intelligence

Some of the nation’s most closely guarded secrets were posted to a small online gaming community. (AP Photo/Jeff Chiu)

Brianna Dym, University of Maine

The Justice Department on April 14, 2023, charged Jack Teixeira, a 21-year-old Massachusetts Air National Guard member, with unauthorized retention and transmission of national defense information and unauthorized removal and retention of classified documents or material. Media reports suggest that Teixeira didn’t intend to leak the documents widely but rather shared them on a closed Discord community focused on playing war games.

Some of the documents were then shared to another Discord community with a larger following and became widely disseminated from there.

So what is Discord and should you worry about what people are encountering there?

Ever since the earliest days of the internet in the 1980s, getting online has meant getting involved in a community. Initially, there were dial-up chat servers, email lists and text-based discussion groups focused on specific interests.

Since the early 2000s, mass-appeal social media platforms have collected these small spaces into bigger ones, letting people find their own little corners of the internet, but only with interconnections to others. This allows social media sites to suggest new spaces users might join, whether it’s a local neighborhood discussion or a group with the same hobby, and sell specifically targeted advertising. But the small-group niche community is making a comeback with adults, and with kids and teens.

When Discord was initially released in 2015, many video games did not provide players with live voice chat to talk to one another while playing the game – or required them to pay premium prices to do so. Discord was an app that enabled real-time voice and text chatting, so friends could team up to conquer an obstacle, or just chat while exploring a game world. People do still use Discord for that, but these days most of the activity on the service is part of wider communities than just a couple of friends meeting up to play.

Examining Discord is part of my research into how scholars, developers and policymakers might design and maintain healthy online spaces.

A little bit old school

Discord first came onto my radar in 2017 when an acquaintance asked me to join a writer’s support group. Discord users can set up their own communities, called servers, with shareable links to join and choices about whether the server is public or private.

The writer’s group server felt like an old-school chat room, but with multiple channels segmenting out different conversations that folks were having. It reminded me of descriptions of early online chat and forum-based communities that hosted lengthy conversations between people all over the world.

The people in the writers’ server quickly realized that a few of our community members were teenagers under the age of 18. While the server owner had kept the space invite-only, he avoided saying “no” to anyone who requested access. It was supposed to be a supportive community for people working on writing projects, after all. Why would he want to exclude anyone?

He didn’t want to kick the teens out, but was able to make some adjustments using Discord’s server moderation system. Community members had to disclose their age, and anyone under 18 was given a special “role” that tagged them as a minor. That role prevented them from accessing channels that we marked as “not safe for work,” or “NSFW.” Some of the writers were working on explicit romance novels and didn’t want to solicit feedback from teenagers. And sometimes, adults just wanted to have their own space.
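
To make that mechanism concrete, here is a minimal sketch in Python using the discord.py library. The role name “Minor” and the channel name “nsfw-critique” are hypothetical, and the server described above was configured by hand through Discord’s own settings – this is only an illustration of the same role-plus-permissions idea.

    # Minimal illustrative sketch (discord.py): create a "Minor" role and
    # hide an NSFW channel from anyone who holds it. Names are hypothetical.
    import discord

    intents = discord.Intents.default()
    intents.members = True
    client = discord.Client(intents=intents)

    @client.event
    async def on_ready():
        guild = client.guilds[0]  # the server (guild) this bot manages
        minor_role = await guild.create_role(name="Minor")
        nsfw_channel = discord.utils.get(guild.text_channels, name="nsfw-critique")
        if nsfw_channel is not None:
            # Deny members with the Minor role the ability to even see the channel.
            await nsfw_channel.set_permissions(minor_role, view_channel=False)

    client.run("YOUR_BOT_TOKEN")  # placeholder token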

While we took care in constructing an online space safe for teens, there are still dangers present with an app like Discord. The platform is criticized for lacking parental controls. The terms of service state that no one under 13 should sign up for Discord, but many young people use the platform regardless.

Additionally, there are people who have used Discord to organize and encourage hateful rhetoric, including neo-Nazi ideologies. Others have used the platform to traffic child pornography.

However, Discord does maintain that these sorts of activities are illegal and unwelcome on its platform, and the company regularly bans servers and users it says perpetuate harm.

Options for safety

Every Discord server I’ve joined since then has had some safeguard around young people and inappropriate content. Whether it’s age-restricted channels or simply refusing to allow minors to join certain servers, the Discord communities I’m in share a heightened concern for keeping young people on the internet safe.

This does not mean that every Discord server will be safe at all times for its members, however. Parents should still take the time to talk with their kids about what they’re doing in their online spaces. Even something as innocuous as the popular children’s gaming environment Roblox can turn bad in the right setting.

And while the servers I’ve been involved in have been managed with care, not all Discord servers are regulated this way. In addition to servers lacking uniform regulation, account owners are able to lie about their age and identity when signing up for an account. And there are new ways for users to misbehave or annoy others on Discord, like spamming loud and inappropriate audio.

But, as with other modern social media platforms, there are safeguards to help administrators keep online communities safe for young people if they want to. Server members can label an entire server “NSFW,” going beyond single channel labels and locking minor accounts out of entire communities. But if they don’t, company officials can do it themselves. When accessing Discord on an iOS device, NSFW servers are not visible to anyone, even accounts belonging to adults. Additionally, Discord runs a Moderator Academy to support training up volunteer moderators who can appropriately handle a wide range of situations.

Discord is another way for people to gather and communicate online. (Screenshot: Discord)

Stronger controls

Unlike many other current popular social media platforms, Discord servers often function as closed communities, with invitations required to join. There are also large open servers flooded with millions of users, but Discord’s design integrates content moderation tools to maintain order.

For example, a server creator has tight control over who has access to what, and what permissions each server member can have to send, delete or manage messages. In addition, Discord allows community members to add automations to a server, continuously monitoring activity to enforce moderation standards.
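
As a rough illustration of what such an automation can look like, here is a small Python sketch of a bot that deletes messages containing terms on a blocklist. Discord’s built-in AutoMod rules are configured inside the app rather than in code, so this bot-based version, its blocklist, and its reply text are purely hypothetical.

    # Illustrative auto-moderation sketch (discord.py). Real servers typically
    # use Discord's built-in AutoMod or an established moderation bot.
    import discord

    BANNED_TERMS = {"spamlink.example", "buy followers"}  # hypothetical blocklist

    intents = discord.Intents.default()
    intents.message_content = True
    client = discord.Client(intents=intents)

    @client.event
    async def on_message(message: discord.Message):
        if message.author.bot:
            return  # ignore other bots (and this bot's own messages)
        if any(term in message.content.lower() for term in BANNED_TERMS):
            await message.delete()
            await message.channel.send(
                f"{message.author.mention}, that message broke the server rules."
            )

    client.run("YOUR_BOT_TOKEN")  # placeholder token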

With these protections, people use servers to form tight-knit, closed spaces safe from chaotic public squares like Twitter and less visible to the wider online world. This can be positive, keeping spaces safer from bullies, trolls and disinformation spreaders. In my own research, young people have mentioned their Discord servers as the safe, private space they have online in contrast to messy public platforms.

However, moving online activity to more private spaces also means that those well-regulated, healthy communities are less discoverable for vulnerable groups that might need them. For example, new fathers looking for social support are sometimes more inclined to access it through open subreddits rather than Facebook groups.

Discord’s servers are not the first closed communities on the internet. They are, essentially, the same as old-school chat rooms, private blogs and curated mailing lists. They will have the same problems and opportunities as previous online communities.

Discussion about self-protection

In my view, the solution to this particular problem is not necessarily banning particular practices or regulating internet companies. Research into youth safety online finds that government regulation aimed at protecting minors on social media rarely has the desired outcome, and more often results in disempowering and isolating youth instead.

Just as parents and caring adults tell the kids in their lives about recognizing dangerous situations in the physical world, talking about healthy online interactions can help young people protect themselves in the online world. Many youth-focused organizations, and many internet companies, have internet safety information aimed at kids of all ages.

Whenever young people hop onto the next technology fad, there will inevitably be panic over how the adults, companies and society may or may not be keeping young people safe. What is most important in these situations is to remember that talking to young people about how they use those technologies, and what to do in difficult situations, can be an effective way to help them avoid serious harm online.

This is an updated version of an article originally published on March 15, 2022. The Conversation

Brianna Dym, Lecturer of Computer Science, University of Maine

This article is republished from The Conversation under a Creative Commons license. Read the original article.

==============================================================
Alecto, Megaera, and Tisiphone, I am a firm believer that there is nothing wrong with the Internet which a race of humans smarter, better educated, and more conscientious than we are couldn’t handle. Unfortunately, that race is not what we have. And figuring out how to operate and regulate the Internet in such a way that those of us who are educated and conscientious have all the access we need and want, while those who are not are protected from it (and we from them), and which still fulfills the promise of the First Amendment – well, that is a nightmare. In fact, we need to protect ourselves, since there is really no one doing it for us.

The Furies and I will be back.

Mar 29, 2023
 

Glenn Kirschner – Donald Trump’s threats to NY DA Alvin Bragg VIOLATE BOTH NEW YORK AND FEDERAL criminal statutes.

PoliticsGirl – TikTok Ban

Ring of Fire – Jim Jordan’s Staffers Whine That Manhattan DA’s Office Keeps Hanging Up On Them (I’d be inclined to send everything to VMX, with an OGM which apologizes to legitimate callers and explains why. Put one person – maybe hire someone for it – to transcribe it all overnight.)

Mrs. Betty Bowers – The Game of Anti-Woke DISTRACTIONS

Hissing Feral Cat Falls In Love With The Guy Who Rescued Him

Beau – Let’s talk about a win at the Supreme Court for Americans with disabilities….


Everyday Erinyes #349

Dec 18, 2022
 

Experts in autocracies have pointed out that it is, unfortunately, easy to slip into normalizing the tyrant, hence it is important to hang on to outrage. These incidents which seem to call for the efforts of the Greek Furies (Erinyes) to come and deal with them will, I hope, help with that. As a reminder, though no one really knows how many there were supposed to be, the three names we have are Alecto, Megaera, and Tisiphone. These roughly translate as “unceasing,” “grudging,” and “vengeful destruction.”

Heaven knows we have a Second Amendment problem in the United States. But the magnitude of our Second Amendment problem partly stems from, and also distracts from, the huge First Amendment problem we also have – which we have had for a long time, but which has been made painfully obvious by the rise of the internet and social media.

To put it bluntly, hate speech leads to violence, and wide availability of guns leads to that violence being gun violence. To paraphrase the reasoning attributed to Karl Popper, a society cannot be a tolerant society if it tolerates intolerance. It’s easy to say – but it’s extremely hard to legislate and regulate. That’s why I was immediately drawn to this article about what regulating social media might need to look like.

Because we cannot afford THIS.
==============================================================

What social media regulation could look like: Think of pipelines, not utilities

Is the law coming for Twitter, Meta and other social media outlets? (new look casting/iStock via Getty Images)

Theodore J. Kury, University of Florida

Elon Musk’s takeover of Twitter, and his controversial statements and decisions as its owner, have fueled a new wave of calls for regulating social media companies. Elected officials and policy scholars have argued for years that companies like Twitter and Facebook – now Meta – have immense power over public discussions and can use that power to elevate some views and suppress others. Critics also accuse the companies of failing to protect users’ personal data and downplaying harmful impacts of using social media.

As an economist who studies the regulation of utilities such as electricity, gas and water, I wonder what that regulation would look like. There are many regulatory models in use around the world, but few seem to fit the realities of social media. However, observing how these models work can provide valuable insights.

Families across the U.S. are suing social media companies over policies that they argue affected their children’s mental health.

Not really economic regulation

The central ideas behind economic regulation – safe, reliable service at fair and reasonable rates – have been around for centuries. The U.S. has a rich history of regulation since the turn of the 20th century.

The first federal economic regulator in the U.S. was the Interstate Commerce Commission, which was created by the Interstate Commerce Act of 1887. This law required railroads, which were growing dramatically and becoming a highly influential industry, to operate safely and fairly and to charge reasonable rates for service.

The Interstate Commerce Act reflected concerns that railroads – which were monopolies in the regions that they served and provided an essential service – could behave in any manner they chose and charge any price they wanted. This power threatened people who relied on rail service, such as farmers sending crops to market. Other industries, such as bus transportation and trucking, would later be subjected to similar regulation.

Individual social media companies don’t really fit this traditional mold of economic regulation. They are not monopolies, as we can see from people leaving Twitter and jumping to alternatives like Mastodon and Post.

While internet access is fast becoming an essential service in the information age, it’s debatable whether social media platforms provide essential services. And companies like Facebook and Twitter don’t directly charge people to use their platforms. So the traditional focus of economic regulation – fear of exorbitant rates – doesn’t apply.

Fairness and safety

In my view, a more relevant regulatory model for social media might be the way in which the U.S. regulates electricity grid and pipeline operations. These industries fall under the jurisdiction of the Federal Energy Regulatory Commission and state utility regulators. Like these networks, social media carries a commodity – here it’s information, instead of electricity, oil or gas – and the public’s primary concern is that companies like Meta and Twitter should do it safely and fairly.

In this context, regulation means establishing standards for safety and equity. If a company violates those standards, it faces fines. It sounds simple, but the practice is far more complicated.

First, establishing these standards requires a careful definition of the regulated company’s roles and responsibilities. For example, your local electric utility is responsible for delivering power safely to your home. Since social media companies continuously adapt to the needs and wants of their users, establishing these roles and responsibilities could prove challenging.

Texas attempted to do this in 2021 with HB 20, a law that barred social media companies from banning users based on their political views. Social media trade groups sued, arguing that the measure infringed upon their members’ First Amendment rights. A federal appellate court blocked the law, and the case is likely headed to the Supreme Court.

President Biden named Lina Khan, a prominent critic of Big Tech companies, as chair of the Federal Trade Commission in 2021. The agency investigates issues including antitrust violations, deceptive trade practices and data privacy lapses. (AP Photo/Saul Loeb)

Setting appropriate levels of fines is also complicated. Theoretically, regulators should try to set a fine commensurate with the damage to society from the infraction. From a practical standpoint, however, regulators treat fines as a deterrent. If the regulator never has to assess the fine, it means that companies are adhering to the established standards for safety and equity.

But laws often inhibit agencies from energetically policing target industries. For example, the Office of Enforcement at the Federal Energy Regulatory Commission is concerned with safety and security of U.S. energy markets. But under a 2005 law, the office can’t levy civil penalties higher than US$1 million per day. In comparison, the cost to customers of the California power crisis of 2000-2001, fueled partially by energy market manipulation, has been estimated at approximately $40 billion.
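
A quick back-of-the-envelope calculation shows the scale of that mismatch, using only the two figures in the paragraph above (the $1 million-per-day penalty ceiling and the roughly $40 billion crisis cost):

    # Rough arithmetic on the deterrence gap described above.
    daily_cap = 1_000_000            # maximum civil penalty per day under the 2005 law
    crisis_cost = 40_000_000_000     # estimated customer cost of the 2000-01 crisis

    days = crisis_cost / daily_cap
    print(f"{days:,.0f} days of maximum penalties, roughly {days / 365:.0f} years")
    # -> 40,000 days of maximum penalties, roughly 110 years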

In 2022 the Office of Enforcement settled eight investigations of violations that occurred from 2017 to 2021 and levied a total of $55.5 million in penalties. In addition, it opened 21 new investigations. Clearly, the prospect of a fine from the regulator is not a sufficient deterrent in every instance.

From legislation to regulation

Congress writes the laws that create regulatory agencies and guide their actions, so that’s where any moves to regulate social media companies will start. Since these companies are controlled by some of the wealthiest people in the U.S., it’s likely that a law regulating social media would face legal challenges, potentially all the way to the Supreme Court. And the current Supreme Court has a strong pro-business record.

If a new law withstands legal challenges, a regulatory agency such as the Federal Communications Commission or the Federal Trade Commission, or perhaps a newly created agency, would have to write regulations establishing social media companies’ roles and responsibilities. In doing so, regulators would need to be mindful that changes in social preferences and tastes could render these roles moot.

Finally, the agency would have to create enforcement mechanisms, such as fines or other penalties. This would involve determining what kinds of actions are likely to deter social media companies from behaving in ways deemed harmful under the law.

In the time it would take to set up such a system, we can assume that social media companies would evolve quickly, so regulators would likely be assessing a moving target. As I see it, even if bipartisan support develops for regulating social media, it will be easier said than done. The Conversation

Theodore J. Kury, Director of Energy Studies, University of Florida

This article is republished from The Conversation under a Creative Commons license. Read the original article.

==============================================================
Alecto, Megaera, and Tisiphone, Heather Cox Richardson closed her Letter for December 14 with this: “[I]n June, the Supreme Court handed down the sweeping New York State Rifle & Pistol Association, Inc. v. Bruen decision requiring those trying to place restrictions on gun ownership to prove similar restrictions were in place when the Framers wrote the Constitution. Already, a Texas judge has struck down a rule preventing domestic abusers from possessing firearms on the grounds that domestic violence was permissible in the 1700s.” (Emphasis mine)

Originalism. If it isn’t checked, it will kill us all. And the founders would absolutely not have wanted it. They were not idiots – they knew that circumstances would change, and that government of, by, and for the people would need to change with them. They said so, including in the Constitution itself – if not, why would they have included a provision for amending it?

I do have one thought regarding the setting of the amounts of fines for non-compliance. Setting dollar amounts clearly doesn’t work – values change and fines simply become an accepted “cost of doing business.” We need to start setting fines not as “no more than X dollars” but instead as “not greater than Z percent of the defendant’s total net worth,” or some other indicator. “Y percent of the defendant’s gross annual profits in the most recent year” might work.
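
To make that suggestion concrete, here is a small Python sketch of a fine ceiling that scales with the defendant instead of being a fixed dollar figure. The percentages and the example company are hypothetical; the point is only that the ceiling grows with net worth or profits.

    # Hypothetical sketch: cap a fine as a share of net worth or profit
    # rather than as a fixed dollar amount.
    def fine_ceiling(net_worth: float, annual_profit: float,
                     worth_pct: float = 0.02, profit_pct: float = 0.10) -> float:
        """Return the larger of Z% of total net worth or Y% of the most
        recent year's gross profit (Z=2%, Y=10% here, purely illustrative)."""
        return max(net_worth * worth_pct, annual_profit * profit_pct)

    # A fixed $1,000,000 cap vs. a scaled cap for a hypothetical $100B company:
    print(fine_ceiling(net_worth=100e9, annual_profit=20e9))  # -> 2000000000.0 ($2 billion)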

The Furies and I will be back.

Sep 02, 2022
 

Glenn Kirschner – Lindsey Graham says there will be riots if DOJ charges Donald Trump for his crimes

Meidas Touch – BREAKING: Google BANS Trump’s Truth Social because NO MODERATION of HATE

The Lincoln Project – Tough on Crime

MSNBC – Familiar Trump Stall Strategy Plays Out In ‘Special Master’ Gambit

Ring of Fire – Kids-For-Cash Criminal Judges Ordered To Pay Victims Over $200 Million

Beau – Let’s talk about access to higher education….

Nov 26, 2021
 

Glenn Kirschner – Michael Cohen Tells CNN that New York and Federal Prosecutors are Working Toward Indictments

politicsrus – Insulin

Thom Hartmann – Why Won’t Social Media Stop “Tokyo Roses” From Destroying Our Democracy?

Vox – Why you don’t hear about the ozone layer anymore. How did we manage to not get this dismissed as a hoax? And how can we do it again?

MSNBC – ‘There’s Still Work To Do’ Says Rev. Al Sharpton On Arbery Trial Guilty Verdict

Rocky Mountain Mike – Auto Con-Bot 3000 (not a song. But funny.)

Beau – Let’s talk about hot takes, the Christmas parade, and intent….

Oct 04, 2021
 

Yesterday, I did my best to rest. I worked a little more on that last cotton project I mentioned … I had to undo some after discovering I was working on the wrong side. But not too much. And it gives me a second chance to get a color change over short rows right. So I’m not complaining.

Cartoon –

Short Takes –

AP News – NOT REAL NEWS: A look at what didn’t happen this week
Quote – Vaccinated people do not carry more coronavirus than the unvaccinate…. Social media users are misrepresenting comments made by Dr. Leana Wen, former Baltimore health commissioner, to make the false claim.
Click through for this and other stories. And maybe bookmark the home page of this feature, which comes out every Friday.

Law & Crime – Texas Man Accused of Throwing Molotov Cocktail at Democratic Party HQ While Wearing American Flag as Disguise
Quote – Ryan Faircloth, 30, stands accused of arson and possessing a prohibited weapon. He is currently being detained at the Travis County Jail in Austin after being arrested on Friday. He also has a federal charge pending according to the Austin American-Statesman.
Click through for data and reminders of the crime – which we all saw on video. So glad they got him. I hope they keep him.

The Guardian – Facebook whistleblower to claim company contributed to Capitol attack
Quote – A whistleblower at Facebook will say that thousands of pages of internal company research she turned over to federal regulators proves the social media giant is deceptively claiming effectiveness in its efforts to eradicate hate and misinformation and it contributed to the January 6 attack on the Capitol in Washington DC.
Click through for the story. This will have been on 60 Minutes last night. She’s not claiming financial contributions, but she is claiming a motivational contribution.

Food for Thought –
