Everyday Erinyes #366

 Posted by at 2:48 pm  Politics
Apr 16, 2023
 

Experts in autocracies have pointed out that it is, unfortunately, easy to slip into normalizing the tyrant, hence it is important to hang on to outrage. These incidents which seem to call for the efforts of the Greek Furies (Erinyes) to come and deal with them will, I hope, help with that. As a reminder, though no one really knows how many there were supposed to be, the three names we have are Alecto, Megaera, and Tisiphone. These roughly translate as “unceasing,” “grudging,” and “vengeful destruction.”

Starting tomorrow at sunset and running through Tuesday is Yom HaShoah (“Holocaust and Heroism Remembrance Day”), and I was planning on sharing an article on it today. But tomorrow and Tuesday are also the last two days of tax season, and what I had in mind deserves more time and more attention than is likely to be available this weekend. So it will run next week. Instead, I’m sharing an article looking at the platform “Discord,” a (presumably unwitting) player in the most recent classified document scandal to hit the news.
==============================================================

What is Discord? An internet researcher explains the social media platform at the center of Pentagon leak of top-secret intelligence

Some of the nation’s most closely guarded secrets were posted to a small online gaming community. (AP Photo/Jeff Chiu)

Brianna Dym, University of Maine

The Justice Department on April 14, 2023, charged Jack Teixeira, a 21-year-old Massachusetts Air National Guard member, with unauthorized retention and transmission of national defense information and unauthorized removal and retention of classified documents or material. Media reports suggest that Teixeira didn’t intend to leak the documents widely but rather shared them on a closed Discord community focused on playing war games.

Some of the documents were then shared to another Discord community with a larger following and became widely disseminated from there.

So what is Discord, and should you worry about what people are encountering there?

Ever since the earliest days of the internet in the 1980s, getting online has meant getting involved in a community. Initially, there were dial-up chat servers, email lists and text-based discussion groups focused on specific interests.

Since the early 2000s, mass-appeal social media platforms have collected these small spaces into bigger ones, letting people find their own little corners of the internet while staying interconnected with others. This allows social media sites to suggest new spaces users might join, whether it’s a local neighborhood discussion or a group with the same hobby, and to sell specifically targeted advertising. But the small-group niche community is making a comeback, with adults as well as kids and teens.

When Discord was initially released in 2015, many video games did not provide players with live voice chat to talk to one another while playing the game – or required them to pay premium prices to do so. Discord was an app that enabled real-time voice and text chatting, so friends could team up to conquer an obstacle, or just chat while exploring a game world. People do still use Discord for that, but these days most of the activity on the service is part of wider communities than just a couple of friends meeting up to play.

Examining Discord is part of my research into how scholars, developers and policymakers might design and maintain healthy online spaces.

A little bit old school

Discord first came onto my radar in 2017 when an acquaintance asked me to join a writers’ support group. Discord users can set up their own communities, called servers, with shareable links to join and choices about whether the server is public or private.

The writers’ group server felt like an old-school chat room, but with multiple channels segmenting out the different conversations that folks were having. It reminded me of descriptions of early online chat and forum-based communities that hosted lengthy conversations between people all over the world.

The people in the writers’ server quickly realized that a few of our community members were teenagers under the age of 18. While the server owner had kept the space invite-only, he avoided saying “no” to anyone who requested access. It was supposed to be a supportive community for people working on writing projects, after all. Why would he want to exclude anyone?

He didn’t want to kick the teens out, but was able to make some adjustments using Discord’s server moderation system. Community members had to disclose their age, and anyone under 18 was given a special “role” that tagged them as a minor. That role prevented them from accessing channels that we marked as “not safe for work,” or “NSFW.” Some of the writers were working on explicit romance novels and didn’t want to solicit feedback from teenagers. And sometimes, adults just wanted to have their own space.
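The role-based gating described above is simple enough to model in a few lines. Here is a minimal, library-free Python sketch of the idea: an age disclosure assigns a “minor” role, and that role blocks access to channels marked NSFW. The names and structure are illustrative only and are not Discord’s actual API.

```python
from dataclasses import dataclass


@dataclass
class Member:
    name: str
    age: int

    @property
    def roles(self) -> set:
        # Members who disclose an age under 18 get a "minor" role.
        return {"minor"} if self.age < 18 else set()


@dataclass
class Channel:
    name: str
    nsfw: bool = False

    def can_view(self, member: Member) -> bool:
        # The "minor" role locks members out of NSFW-marked channels.
        return not (self.nsfw and "minor" in member.roles)


general = Channel("general")
explicit_critique = Channel("explicit-critique", nsfw=True)

teen = Member("Sam", 16)
adult = Member("Alex", 32)

assert general.can_view(teen)                # open channel: visible to everyone
assert not explicit_critique.can_view(teen)  # NSFW channel hidden from minors
assert explicit_critique.can_view(adult)
```

In a real Discord server this mapping of roles to channel visibility is configured through the server settings rather than written as code, but the underlying logic is the same.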

While we took care in constructing an online space safe for teens, there are still dangers present with an app like Discord. The platform is criticized for lacking parental controls. The terms of service state that no one under 13 should sign up for Discord, but many young people use the platform regardless.

Additionally, there are people who have used Discord to organize and encourage hateful rhetoric, including neo-Nazi ideologies. Others have used the platform to traffic child pornography.

However, Discord does maintain that these sorts of activities are illegal and unwelcome on its platform, and the company regularly bans servers and users it says perpetuate harm.

Options for safety

Every Discord server I’ve joined since then has had some safeguard around young people and inappropriate content. Whether it’s age-restricted channels or simply refusing to allow minors to join certain servers, the Discord communities I’m in share a heightened concern for keeping young people on the internet safe.

This does not mean that every Discord server will be safe at all times for its members, however. Parents should still take the time to talk with their kids about what they’re doing in their online spaces. Even something as innocuous as the popular children’s gaming environment Roblox can turn bad in the wrong setting.

And while the servers I’ve been involved in have been managed with care, not all Discord servers are regulated this way. In addition to servers lacking uniform regulation, account owners are able to lie about their age and identity when signing up for an account. And there are new ways for users to misbehave or annoy others on Discord, like spamming loud and inappropriate audio.

But, as with other modern social media platforms, there are safeguards to help administrators keep online communities safe for young people if they want to. Server members can label an entire server “NSFW,” going beyond single-channel labels and locking minor accounts out of entire communities. If they don’t, company officials can do it themselves. When Discord is accessed on an iOS device, NSFW servers are not visible to anyone, even accounts belonging to adults. Additionally, Discord runs a Moderator Academy to train volunteer moderators to handle a wide range of situations appropriately.

Discord is another way for people to gather and communicate online. (Screenshot: Discord)

Stronger controls

Unlike many other current popular social media platforms, Discord servers often function as closed communities, with invitations required to join. There are also large open servers flooded with millions of users, but Discord’s design integrates content moderation tools to maintain order.

For example, a server creator has tight control over who has access to what, and what permissions each server member can have to send, delete or manage messages. In addition, Discord allows community members to add automations to a server, continuously monitoring activity to enforce moderation standards.
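The two features just described, per-member permissions and bot-style automations, can be sketched together in a short self-contained example. The permission names, the bit layout, and the banned-word rule below are all invented for illustration; they do not come from Discord’s API.

```python
# Permission bits: a member's role grants some combination of these.
SEND, DELETE, MANAGE = 0b001, 0b010, 0b100

role_perms = {
    "owner":     SEND | DELETE | MANAGE,
    "moderator": SEND | DELETE,
    "member":    SEND,
}


def has_perm(role: str, perm: int) -> bool:
    """Check whether a role's permission bits include the requested one."""
    return bool(role_perms.get(role, 0) & perm)


BANNED = {"spamword"}


def auto_moderate(author_role: str, message: str) -> str:
    """A bot-style automation: screen each message before it is posted."""
    if any(word in message.lower() for word in BANNED):
        return "deleted"   # automation enforces the server's content rules
    if not has_perm(author_role, SEND):
        return "rejected"  # role lacks permission to post at all
    return "posted"


assert has_perm("moderator", DELETE)
assert not has_perm("member", MANAGE)
assert auto_moderate("member", "hello everyone") == "posted"
assert auto_moderate("member", "buy SPAMWORD now") == "deleted"
```

In practice, Discord communities typically get this behavior by configuring server roles and adding existing moderation bots rather than writing code themselves.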

With these protections, people use servers to form tight-knit, closed spaces safe from chaotic public squares like Twitter and less visible to the wider online world. This can be positive, keeping spaces safer from bullies, trolls and disinformation spreaders. In my own research, young people have mentioned their Discord servers as the safe, private space they have online in contrast to messy public platforms.

However, moving online activity to more private spaces also means that those well-regulated, healthy communities are less discoverable for vulnerable groups that might need them. For example, new fathers looking for social support are sometimes more inclined to access it through open subreddits rather than Facebook groups.

Discord’s servers are not the first closed communities on the internet. They are, essentially, the same as old-school chat rooms, private blogs and curated mailing lists. They will have the same problems and opportunities as previous online communities.

Discussion about self-protection

In my view, the solution to this particular problem is not necessarily banning particular practices or regulating internet companies. Research into youth safety online finds that government regulation aimed at protecting minors on social media rarely has the desired outcome, and more often results in disempowering and isolating youth instead.

Just as parents and caring adults tell the kids in their lives about recognizing dangerous situations in the physical world, talking about healthy online interactions can help young people protect themselves in the online world. Many youth-focused organizations, and many internet companies, have internet safety information aimed at kids of all ages.

Whenever young people hop onto the next technology fad, there will inevitably be panic over how the adults, companies and society may or may not be keeping young people safe. What is most important in these situations is to remember that talking to young people about how they use those technologies, and what to do in difficult situations, can be an effective way to help them avoid serious harm online.

This is an updated version of an article originally published on March 15, 2022.

Brianna Dym, Lecturer of Computer Science, University of Maine

This article is republished from The Conversation under a Creative Commons license. Read the original article.

==============================================================
Alecto, Megaera, and Tisiphone, I am a firm believer that there is nothing wrong with the Internet which a race of humans smarter, better educated, and more conscientious than we are couldn’t handle. Unfortunately, that race is not what we have. And figuring out how to operate and regulate the Internet so that those of us who are educated and conscientious have all the access we need and want, while those who are not are protected from it (and we from them), and still fulfill the promise of the First Amendment – well, that is a nightmare. In fact, we need to protect ourselves, since there is really no one doing it for us.

The Furies and I will be back.
