Discord + War Thunder: The NatSec risks that blindsided America

With help from Derek Robertson

The recent leak of classified documents on Discord has raised uncomfortable questions in intelligence circles about how people’s online activities could undercut national security.

The leak offers a window into how loyalty to tight-knit online communities and messaging apps can trump the oath someone took to serve and protect their nation, and why similar security breaches could happen again. Brian Hughes, co-founder and associate director of the Polarization and Extremism Research and Innovation Lab (PERIL) at American University, said the recent leaks on Discord and on online forums for the video game War Thunder highlight this phenomenon.

The Discord leaks revealed a broad array of sensitive information about U.S. spying activities and plans related to the war in Ukraine, while the latest War Thunder leak involved restricted information about the F-16A fighter jet.

These “mid-sized online spaces” where membership can be both anonymous and selective can create “highly insular communities where loyalty and affinity transcend national identity,” Hughes said.

I sat down with Alex Golub, a professor of anthropology at the University of Hawaii at Mānoa and a gamer and tech-culture blogger himself, to learn why digital communities have become lightning rods for spilled military secrets. Read on to hear Golub’s take on how ordinary people build strong online bonds over a shared project, what factors feed into the creation of a digital persona and why people keep leaking sensitive information in these anonymous internet communities.

The interview has been edited for length and clarity.

MC: Do you play video games, professor Golub?

AG: I play video games all the time, yeah. The two games that I’m playing right now are Pathfinder: Wrath of the Righteous (which is a single-player role playing game) and Marvel StrikeForce — that’s a mobile game. So that’s the one I’m in Discord for most of the time. First thing I do in the morning: I look on my phone and do my errands in that game. And it’s usually the last thing I do before I go to bed at night. My main body of research is on World of Warcraft — which I played five hours a day for, like, two or three years. I’m still working on the book manuscript on that, but I don’t play WoW regularly anymore.

MC: I should tell you that I don’t often play video games. But by virtue of being married to someone who does, I’ve been inducted into the world of gaming through forced osmosis.

AG: You know, 95% of people play video games, according to some surveys. You’ve never played Words With Friends or Wordle or just played Candy Crush on your phone?

A lot of times, when people think about video games, they think about guys being racist and blowing each other to smithereens. But people often do play video games — they just don’t think about them as video games because they don’t fit that violent gamer stereotype.

MC: Point well taken. One thing that stood out to me in the discourse surrounding the leaked documents on Discord was that those “secretive” online communities that drew national attention are also simply where I sometimes go to hang out with my geographically distant friends, bonding over common interests.

AG: Discord is for normal people what Slack is for East Coast intellectual elites.

MC: Hah, I love that description. So what (if anything) actually sets a gamer chatroom apart culturally from other anonymous online spaces?

AG: You know, the categorization we should be using is not “gamer vs. non-gamer.” It’s kind of like saying: “a chatroom about books vs. a chatroom that isn’t about books.”

What I have found in my research is that in video games — like World of Warcraft and in many other games — there is a collaborative project that people undertake together. In World of Warcraft, it was raiding. In Marvel StrikeForce, well, it’s still raiding, although less serious. In War Thunder, that massively multiplayer online combat game prized for its realism, it’s historical accuracy.

So what you’re really looking at is a community of practice — a community of people who form around a project that is so compelling to them. It’s so central to their biographies and their sense of self that it becomes central to who they spend time with and hang out with.

These communities of practice are not just found inside of game worlds. They’re found everywhere. Some of Politico’s newsletters focus on communities of practice — where the practice is just passing legislation. Out of that collaborative act, a whole social network grows. It’s just that people are bonding over video games in a simulated space instead of the actual world.

MC: Jumping off of that, you recently wrote about how video game streams are creating new forms of community.

Raising funds for a dog surgery on a Twitch stream, rallying a fringe political movement on Telegram, spilling national security secrets on Discord — these are all concrete, real-world impacts of people just living their digital lives. But I’m not sure a 21-year-old airman would feel as empowered to divulge Pentagon docs in real life as he did in an anonymous online space. What is it about our sense of digital self that makes us reorient our risk framework like that?

AG: Well, those projects we’re committed to biographically — they can happen online or offline. Nothing about the digital space is different from the physical world in that sense. People can get involved in these projects that become really important to them. And those projects spill out from the original space to, say, Discord.

These digital worlds do not especially reorient your life. But there’s the Discord where, as you’ve said, you hang out with your friends and your kids. There’s also, you know, Nazi Discord. But the problem with that digital space is not Discord. It’s the Nazis.

MC: So the kind of community-building we’re talking about — it can be platform agnostic as a phenomenon?

AG: There might be a certain set of features — technical affordances — about a platform that incentivize certain behaviors. In South Asia, there’s a huge problem with people organizing lynchings over anonymous group chats. In that case, the anonymity of a social platform and its lack of accessibility for local authorities is a technical feature that enables some kinds of behavior and not others.

MC: Now we’re walking into a digital privacy debate where giving blanket access to user data that law enforcement can use to track down bad actors is also maybe a terrible idea for safeguarding people’s digital privacy. Do you have any thoughts on how to find that balance between protecting national interests and protecting civil liberties?

AG: There’s always that classical trade-off between having platforms control and curate the content users see versus the user being free from the platform’s regulation.

In the case of Discord, the fact that you’ve never met people face-to-face makes it easier to engage in flame wars. The structure of the space — the anonymity, the difficulty of reading emotional stances via text — can affect how pleasant the culture is. Many of the Discord servers I’m on, they tell you, “There’s a code of conduct here. You can’t do this or it will get you kicked off.” And you get kicked off if you do anything wrong.

That basic social contract is what many tech platforms don’t have. Maybe because they don’t want to lose members. Or they feel like it might threaten their ability to claim that they’re not a publisher, and just a platform, which deals with Section 230. Anyway, the culture of the local community creates a set of technical affordances. And some online spaces foster discourse where people don’t prioritize civility. They want the 4chan-ness of it all. And so you have those sorts of online places.

I mentioned earlier that the technical affordances shape behavior, but there’s also this concept of ethical affordance, where your life experiences give you new possibilities for who you’re going to be. A guy named Webb Keane has written a lot about this. For instance, in the 1960s, when second-wave feminists went around and explained to bored housewives that they weren’t nervous, they were actually oppressed and pissed off at the patriarchy — that gave women the ethical affordance to radically reconceptualize who they were. So the other thing about online spaces is that we’re constantly giving people options about what kind of role they want to inhabit. So, what ethical selves can secular working-class white guys in the United States inhabit online?

MC: One last question about defending digital personas. One commonality across the Discord and War Thunder intelligence leaks was this notion of wanting to prove one’s authenticity in an anonymous, online space. Any thoughts on that chest-thumping tendency to defend one’s online reputation?

AG: Reputation is something that everyone cares about. Trying to prove that you know what you’re talking about — and people’s need to reinforce that reputation — is a very common thing. It happens everywhere, from college dorm rooms to Congress. In that sense, these online spaces are not that different from the rest of life.

render unto chatgpt-aesar

Happy Tax Day! (Only some irony implied.)

And while you’re at it: Watch out for a new, relatively sophisticated tax scam, brought to you by none other than ChatGPT. As POLITICO’s Benjamin Guggenheim reported in Morning Tax yesterday, large language models have the potential to supercharge existing efforts to scam people out of their personal information around tax season.

“When Sergey Shykevich, a threat intelligence manager at the cybersecurity company Check Point Research, prompted ChatGPT to produce an example of a tax scam email containing malware, the AI chatbot spit back a grammatically immaculate email on the Employee Retention Credit that asked the recipient for their Employer Identification Number, payroll information and a list of their employees and their social security numbers,” Benjamin writes. Fun!

Shykevich proceeded to share with Morning Tax a few more ChatGPT-generated examples, all under the guise of producing examples meant to “educate” readers on the exact types of scams to avoid. Not that he had to try very hard to skirt the software’s guardrails: Shykevich told Benjamin he had “no problem” eliciting schemes similar to those on the “Dirty Dozen” list of scams the IRS publishes each year. — Derek Robertson

more metaverse rule-setting

The group of tech-world heavyweights who gathered informally last year to figure out how early metaverse developers can set common standards announced today that it’s officially incorporating as a nonprofit, a move its leaders said will help formalize its membership structure and accelerate that process of (eventual) rule-setting.

The Metaverse Standards Forum, a body that includes Meta, Microsoft, Epic Games and more than 2,400 other companies, has been quietly working since last summer to coordinate which topics (like safety and privacy, 3D graphics standards and digital identity) should be prioritized when it comes time to develop the metaverse’s standards. It’ll now redouble those efforts as an incorporated nonprofit, something I spoke about yesterday morning with Neil Trevett, the forum’s chair, who said the move is essentially a formal statement that these groups are in the process of metaverse development for the long haul.

I also asked him about a major elephant in the room when it comes to this process, and the Forum itself: the absence of Apple, which is widely expected to make its own entry into virtual reality later this year.

“I think their participation could be very, very significant,” Trevett said, pitching Apple’s lack of participation so far as replicating a longstanding dynamic between “open” and “closed” business models. “More participation from more companies is always welcome, but we shouldn’t stop fostering opportunity on [the open] end of the spectrum just because any company decides to be on the other end.” — Derek Robertson


Stay in touch with the whole team: Ben Schreckinger ([email protected]); Derek Robertson ([email protected]); Mohar Chatterjee ([email protected]); Steve Heuser ([email protected]); and Benton Ives ([email protected]). Follow us @DigitalFuture on Twitter.

If you’ve had this newsletter forwarded to you, you can sign up and read our mission statement at the links provided.