For years, the popular social media platform Reddit has played a game of cat and mouse with its controversial subreddits, banning communities only for successors to emerge—and this cycle shows no sign of slowing down.
In spite of moderation, fringe communities continue to proliferate on Reddit, often developing more covert vocabulary to avoid detection and censorship. Some corners of Reddit have become instrumental in promoting hate speech, radicalizing suggestible users, and circulating extremist worldviews linked to acts of terror.
Reddit users employing covert Boogaloo terms like “luau” and “pig roast” to refer to violence targeting law enforcement—discovered using Beacon
As Reddit becomes a more prevalent springboard for extremism, the platform is also a crucial resource for intelligence communities seeking to understand how these movements communicate, recruit, and operate.
How Reddit Breeds Extremism
In 2012, Reddit’s then-CEO stated: “We stand for free speech. This means we are not going to ban distasteful subreddits. We will not ban legal content even if we find it odious or if we personally condemn it.”
Even though the site’s content policies have somewhat tightened since then, the notion of “free speech” still governs many of its questionable subreddits—such as those populated by incels (involuntary celibates) and other groups falling under the umbrella of right-wing extremism, like the Boogaloo movement.
Mainstream platforms like Reddit are central to recruiting suggestible individuals from a wider audience than is available on unregulated networks like Gab and the dark web. In this sense, Reddit is often the gateway to more explicit forms of extremism—it’s popular, accessible, and hosts a variety of subreddits where disaffected individuals can find solidarity with those who have similar grievances and interests.
Incel post on Reddit—discovered using Beacon.
But in an online space where users encourage each other’s outrage and validate violent notions, it doesn’t take much for these grievances to escalate into entrenched hatred.
Take incel extremism, for example. While exact demographics are up for debate, many incels are teens and turn to Reddit to make sense of their plight and feel less alone. This vulnerable perspective is easily influenced by more extremist narratives circulating on the site. This kind of radicalization is common on Reddit, whether the initial grievance is prompted by misogyny, racism, or anti-establishment views.
Observing these online trends is necessary for intelligence professionals seeking to understand and address domestic radicalization and terrorism, and predict imminent public safety threats by high-risk individuals.
Incel post on Reddit. The term “go ER” refers to Elliot Rodger and often implies an intent to act out violently—discovered using Beacon.
Why Does This Matter?
Extremist communities, like incels and other right-wing offshoots, are often considered anomalous groups at the fringes of society. However, the prevalence of these movements—reflected in both their online participation and real-world acts of terror—has real, powerful implications.
For one, suggestions of violence and extremism on platforms like Reddit tend to become more widespread and influential in the context of social and political upheaval or following terrorist acts. This has the potential to affect public safety as extremists may be further inspired to carry out physical acts of violence. When extremists are able to reach a wider audience on mainstream social media, memes, disinformation, and other narratives can also have a significant impact on public opinion and discourse.
Reddit user exalting Elliot Rodger as a “supreme gentleman.” This keyword is often used to praise mass shooters on social media—discovered using Echosec
In fact, one needs to look no further than the US President’s Twitter activity to observe elements of the far-right creeping into mainstream American politics. Rather than an anomaly, extremist movements are more clearly becoming a symptom of the society we live in. Understanding how these movements function, particularly in an online space, is central to informing an effective response and support for vulnerable individuals—and Reddit is a valuable piece of this puzzle.
Reddit as a Threat Intelligence Feed
Redditors produced 199 million posts in 2019. While only a fraction of this is harmful content, the site’s role in radicalization necessitates its use as a threat intelligence feed for government, defense, and counter-terror organizations.
Because of Reddit’s post volume and the constant flux of new and banned subreddits, intelligence professionals require specialized tools to gather relevant Reddit data easily and efficiently.
Threat intelligence platforms and APIs support counter-extremism initiatives by gathering Reddit data in line with requirements unique to the intelligence community. These tools enable analysts to:
- Aggregate and filter pertinent, real-time content in an external interface with no need to manually navigate Reddit
- Access invite-only and semi-closed groups that would otherwise require account creation
- Access public posts that have since been removed from Reddit
- Identify trends in extremism and radicalization, as well as indicators of public violence and high-risk users across multiple subreddits simultaneously
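To make the filtering step above concrete, here is a minimal sketch of flagging posts that contain covert extremist terminology across multiple subreddits. The post data, subreddit names, and watchlist are hypothetical stand-ins drawn from the terms mentioned in this article—this is an illustration of the general approach, not Echosec’s actual implementation, which would operate on live crawled data at scale.

```python
import re
from dataclasses import dataclass

@dataclass
class Post:
    subreddit: str   # e.g. "r/example_a" (hypothetical)
    text: str

# Covert terms drawn from the examples in this article.
WATCHLIST = {"luau", "pig roast", "go er", "supreme gentleman"}

def flag(posts):
    """Return (subreddit, matched term) pairs for posts hitting the watchlist."""
    hits = []
    for post in posts:
        lowered = post.text.lower()
        for term in WATCHLIST:
            # Word-boundary match to reduce false positives
            # (e.g. avoid matching "luau" inside a longer word).
            if re.search(r"\b" + re.escape(term) + r"\b", lowered):
                hits.append((post.subreddit, term))
    return hits

posts = [
    Post("r/example_a", "Planning a luau this weekend"),
    Post("r/example_b", "Totally harmless cooking thread"),
]
print(flag(posts))  # [('r/example_a', 'luau')]
```

In practice, simple keyword matching like this produces many false positives—“luau” is usually just a party—which is why the context-aware classification discussed below matters.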
The Echosec Systems Platform offers Reddit as a data source, allowing users to aggregate and filter content in line with these requirements. To reduce time and resources spent manually separating false positives and contextualizing data, the Platform also uses machine learning algorithms to automatically detect and classify content in specific threat categories—like toxicity and hate speech.
Echosec Systems also offers an API to meet requirements beyond the scope of a pre-built UI. This enables intelligence teams to integrate Reddit data (and data from many other sources) crawled by Echosec Systems directly into their existing tooling and feeds.
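The integration pattern is straightforward in principle: pull crawled posts from the API and normalize them into records an existing feed can ingest. The sketch below uses an entirely hypothetical JSON payload shape—Echosec’s actual API schema is not documented here—to show the normalization step.

```python
import json

# Hypothetical payload shape; the real API schema may differ.
raw = json.dumps({
    "results": [
        {"source": "reddit", "subreddit": "r/example",
         "body": "sample post", "removed": True}
    ]
})

def to_feed_records(payload: str):
    """Normalize crawled posts into flat records for an existing intel feed."""
    data = json.loads(payload)
    return [
        {
            "platform": item["source"],
            "channel": item["subreddit"],
            "text": item["body"],
            # Crawled copies can outlive the original post,
            # so track whether it is still live on Reddit.
            "still_live": not item.get("removed", False),
        }
        for item in data["results"]
    ]

records = to_feed_records(raw)
print(records[0]["still_live"])  # False
```

Note the `removed` flag: because the crawler retains content that has since been deleted from Reddit, downstream tooling can distinguish live posts from archived-only ones.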
In theory, it would be easier for extremists to move off of Reddit to unregulated networks like the dark web to engage in hate speech. But this assumes that extremism is born on fringe websites—when in reality, it relies on platforms like Reddit to reach a wider audience and serve as a jumping-off point for those most vulnerable to radicalization.
As governments tackle the rise of domestic extremism in the West, streamlined access to social data feeds like Reddit helps intelligence personnel better understand these movements—and ultimately inform more effective de-escalation strategies.
Working in counter-terrorism?
Contact us to address any gaps in your threat intelligence feeds.