List of notable controversial communities on Reddit
On the social news site Reddit, some communities (known as "subreddits" or "subs") are devoted to explicit, violent, propagandist, or hateful material. These subreddits have been the topic of controversy, at times receiving significant media coverage. Journalists, attorneys, media researchers, and others have commented that such communities shape and promote biased views of international politics, the veracity of medical evidence, misogynistic rhetoric, and other disruptive concepts.
The founders of Reddit have claimed they did not intend the platform to be a "bastion of free speech", where even hate speech would be tolerated.[1] However, for a period of time, Reddit allowed these controversial communities to operate largely unrestricted. The site's General Manager, Erik Martin, has argued that objectionable material is a consequence of allowing free speech on the site.
Eventually, Reddit administrators instituted usage rules to allow for the banning of groups and members who stole or exposed personal information or images, or who promoted illegal activity, violence, shaming, racial or gender hatred, harassment, or extremist speech. Nevertheless, various active and heavily trafficked subreddits remain that skirt the edges of the rules.
Critics argue that while concerned Redditors and moderators often report these subs, they tend to remain open until a specific incident, or the actions of an individual, brings them under more intense scrutiny and forces administrators to decide between allowing distasteful content and suppressing dangerous or destructive communities. Critics have also charged that the site has been inconsistent in what it bans. Some banned users and communities have created or moved to other platforms, in some cases saving a duplicate of their subreddit elsewhere beforehand in case it was banned.
At least one controversial subreddit was started or maintained by a high-profile user, New Hampshire legislator Robert Fisher.
When Reddit was founded in 2005, there was only one shared space for all links, and subreddits did not exist. Subreddits were created later, but initially they could only be created by Reddit administrators. In 2008, subreddit creation was opened to all users.[2]
Reddit rose to infamy in October 2011, when CNN reported that the site was harboring the r/Jailbait community, a subreddit devoted to sharing suggestive or revealing photos of underage girls. Around the same time, an r/Jailbait user posted a provocative image of an underage girl, and a wave of Reddit users ("Redditors") sent private messages to the poster requesting more photos of her. Various news sources criticized r/Jailbait, and Reddit administrators closed the forum.[3]
In 2012, the subreddit r/Creepshots received major backlash for sharing suggestive or revealing photos of women taken without their awareness or consent. Adrian Chen wrote a Gawker exposé of one of the subreddit's moderators and identified the person behind the account, starting discussion in the media about the ethics of anonymity and outing on the Internet.[4]
In 2020, administrators banned the subreddit r/The_Donald for harassment, having previously taken steps to lower the sub's immediate visibility (such as creating an opt-in button).[citation needed]
In 2015, Reddit introduced a quarantine policy to make it more difficult to visit certain subreddits. To visit or join a quarantined subreddit, users must bypass a warning prompt.[5] To prevent users from viewing their content accidentally, quarantined subreddits also do not appear in non-subscription-based (aggregate) feeds such as r/all.[6] Quarantined subs do not generate revenue, and their subscriber counts are hidden. Since 2018, subreddits have been allowed to appeal a quarantine.[7]
Reddit is highly prone to spreading misinformation and disinformation due to its decentralized moderation, user anonymity, and lack of fact-checking systems.[8] A 2023 NPR article suggested that Redditors should exercise caution before taking user-created unsourced content as fact.[9] Reddit communities exhibit the echo chamber effect, in which repeated unsourced statements come to be accepted among the community as fact, leading to distorted worldviews among users.[10]
A 2021 letter from the United States Senate to Reddit CEO Steve Huffman expressed concern about the spread of COVID-19 misinformation on the platform.[11] A study the following year revealed an abundance of unsourced and potentially harmful medical advice on Reddit for urinary tract infections, like suggesting fasting as a cure.[12]
Critics have argued that since 2019, Russian-sponsored troll accounts and bots have taken over prominent left-wing and right-wing subreddits such as r/antiwar, r/greenandpleasant, and r/aboringdystopia, "suggest[ing] a Russian-led attempt to antagonize and influence Americans online, which is still ongoing."[13]
Some subreddits are dedicated to discussing unapproved or illegal drugs, including meth;[14] opioids;[15][16][17] novel psychoactive substances;[18][19] performance-enhancing drugs such as anabolic steroids and SARMs;[20] and 2,4-Dinitrophenol, a weight loss drug which the FDA declared unfit for human use in 1938 because it can cause fatal overdoses and cataracts.[21] However, drugs-related subreddits have also enabled research and could provide information that would be difficult or impossible to obtain otherwise.[18][21] Reddit also contains subreddits dedicated to addiction recovery.[22]
In snark subreddits, members (known as "snarkers") gossip about, express frustration towards, or "snark" on public figures.[23] Some of these subs specifically target female influencers such as YouTubers and TikTokers.[24][25][26]
Snark subreddits have been criticized, including by their targets, as an invasive form of cyberbullying.[citation needed]
Banned subreddits are those that Reddit has shut down indefinitely.
On June 9, 2014, Reddit closed a subreddit called r/beatingwomen. The community, which featured graphic depictions of violence against women, was banned after its moderators were found to be sharing users' personal information online. These moderators were also collaborating to protect one another from site-wide bans. After r/beatingwomen was banned, the community's founder rebooted the subreddit under the name r/beatingwomen2 in an attempt to circumvent the ban; in response, Reddit banned his user account.[27][28]
After r/Incels was banned in November 2017 (see below), r/Braincels took its place as the most popular subreddit for incels, or "involuntary celibates". Within five months, 16,900 users had joined the sub, which hosted content promoting rape and suicide. It was banned in 2019 for violating Reddit's content policy on bullying and harassment.[29][30][31]
r/ChapoTrapHouse was a subreddit dedicated to the leftist podcast Chapo Trap House. It is associated with the term dirtbag left.[32][33] The community had 160,000 regulars before being banned on June 29, 2020 because they "consistently host[ed] rule-breaking content and their mods ... demonstrated no intention of reining in their community."[34] Previously, the community had been quarantined for content that promoted violence.[33] The community of the subreddit later migrated to an instance of Lemmy, a Reddit alternative.[35]
The term "Chimpire" refers to a collection of subreddits and affiliated websites that promoted anti-black racism and frequently used racial slurs.[citation needed]
In June 2013, Reddit banned the subreddit r/niggers for engaging in vote manipulation, inciting violence, and disrupting other communities with racist content. Reddit general manager Erik Martin noted that the sub was given multiple chances to comply with site rules: "users can tell from the amount of warnings we extended to a subreddit as clearly awful as r/niggers that we go into the decision to ban subreddits with a lot of scrutiny".[36]
Following the ban of r/niggers, the subreddit r/Coontown grew to become the most popular "Chimpire" site, with over 15,000 members at its peak.[37] Many of the posters on these subreddits were formerly involved with r/niggers.[38][39][40]
r/Chodi, whose name is derived from a crude Hindi sexual slang term, was a right-wing Indian subreddit that claimed to be a "free speech sub for memes, jokes, satire, sarcasm and fun". By January 2022 the sub had over 90,000 subscribers, who openly called for genocide against Muslims and frequently propagated Islamophobic, anti-Christian, homophobic, and misogynistic content. According to a Time article, subscribers used intentional misspellings and slang to circumvent Reddit's anti-hate-speech software.[41][42] The Quint noted that Reddit is used as a haven for hate speech in India, citing r/Chodi's popularity as an example.[43] The sub was banned on March 23, 2022, for promoting hate, causing its users to move to Telegram.[44]
r/ChongLangTV, whose name is derived from the Great Wave off Kanagawa, was a Chinese-language subreddit that espoused extreme anti-Chinese sentiment. On March 2, 2022, when it had over 53,000 subscribers, Reddit administrators banned the sub for "exposing privacy of others." One subscriber told Radio Free Asia that the Reddit ban was due to Chinese long-arm internet censorship.[45] The community's founder rebooted the subreddit under the name r/CLTV in an attempt to circumvent the ban, but Reddit banned his user account in response.[citation needed]
A year after r/jailbait was closed, another subreddit called r/CreepShots drew controversy in the press for hosting sexualized images of women taken without their knowledge.[46] In the wake of this media attention, the user u/violentacrez was added to r/CreepShots as a moderator.[47] This user moderated dozens of controversial subreddits as well as a few hundred general-interest communities.
In late 2012, reports emerged that Adrian Chen of Gawker was planning an exposé which would reveal u/violentacrez's real-life identity. In response to the impending article, the account u/violentacrez was deleted and several major subreddits banned links to Gawker.[48][49][50] Moderators defended this decision, arguing that the impending article would constitute "doxxing," and that such exposure threatened the site's structural integrity.[50]
When Chen informed u/violentacrez about the impending exposé, the user pleaded with Chen not to publish it. He expressed concern about its potential impact on his employment and finances, noting that his wife was disabled and that he had a mortgage to pay. He also worried that he would be falsely labeled a child pornographer or antisemite due to some of the subreddits he had created. Despite u/violentacrez's offer to delete his postings and leave Reddit, Chen insisted he would still publish the piece.[4][51]
Chen published his exposé on October 12, 2012, revealing that u/violentacrez was a middle-aged programmer from Arlington, Texas named Michael Brutsch.[4][52] By the next day, Brutsch had been fired by his employer, and Reddit briefly banned the link to the exposé.[53][54] Brutsch wrote on Reddit that he received numerous death threats after the article was published.[55]
Reddit CEO Yishan Wong defended the content Brutsch contributed to the site, arguing that it constituted free speech, while criticizing efforts to ban the Gawker link on the same basis.[56] Wong stated that Reddit staff had considered a site-wide ban on the link, but rejected the idea for fear that it would be ineffective while also creating a negative impression of the site.[57] Later, Brutsch briefly returned to Reddit on a different account, criticizing what he stated were numerous factual inaccuracies in the Gawker exposé.[58]
A week after the exposé, Brutsch did an interview with CNN journalist Drew Griffin. In the interview, which aired on Anderson Cooper 360°, Brutsch was apologetic about his activity on Reddit. He explained that he enjoyed the appreciation he got from other Redditors, and that Reddit helped him relieve stress. Brutsch also described the support he had from administrators, stating that he had received an award for his contributions. Reddit responded that they regretted sending this award (for being named "Worst Subreddit" via a community vote); they also claimed that u/violentacrez had been banned on several occasions.[59][60] Brutsch subsequently noted on Reddit that he regretted doing the interview, and he criticized the accuracy of Reddit's statement to CNN.[61]
Chris Slowe, who was a lead Reddit programmer until 2010, said of the relationship between Brutsch and the Reddit staff: "We just stayed out of there and let him do his thing and we knew at least he was getting rid of a lot of stuff that wasn't particularly legal."[4]
Gawker's outing of Brutsch as u/violentacrez led to contentious discussion about privacy and anonymity on the Internet.[62] Some argued that such outing, or "doxing", was necessary to draw attention to objectionable content so it could be removed. Others claimed that fear of doxing and public retribution impeded people from exercising their right to legal free speech online.[63][64]
Writing for The Guardian, Jude Doyle (then known as Sady Doyle) argued that certain doxings may be justified, comparing Gawker's article to the outing of Amanda Todd's alleged blackmailer. On the other hand, he argued that by engaging in "sensationalism" at the expense of cultural reform, doxings may unduly focus attention on individuals without confronting the underlying problems.[65] In PC Magazine, Damon Poeter stated that while he had defended protecting anonymity on the Internet, he supported Brutsch being outed and felt the doxing was justifiable, as he thought the various subreddits that u/violentacrez contributed to were serious invasions of privacy regardless of legality.[64]
The public outpouring of hostility towards Brutsch following the exposé prompted commentators such as Danah Boyd of Wired and Michelle Star of CNET to question the morality of outing as a way to enforce societal standards online.[66][67] Several commentators expressed concern that the public shaming of Brutsch to serve as an example to others legitimizes Internet vigilantism and exposes individuals such as Brutsch to mass retribution.[66][67][68][69]
r/CringeAnarchy was a subreddit themed around "cringe" and "edgy", politically incorrect content. Originally an uncensored (hence "anarchy") spinoff of r/cringe,[70] its content later shifted to the far right, with anti-transgender and anti-"SJW" content taking over.[71][72][73] The subreddit was quarantined in September 2018, at which point it had over 400,000 subscribers.[74][75][76]
Following the 2019 Christchurch mosque shootings, anti-Muslim posts on r/CringeAnarchy increased.[76] The sub was banned on April 25, 2019, for violating Reddit's content policy regarding violent content.[77]
In 2015, federal agents[who?] asked Reddit employees to turn over the personal information of several users active in r/DarkNetMarkets, a darknet market discussion forum.[78][79] The federal investigation focused on illegal sales of drugs, weapons, and stolen financial details.[79]
r/Deepfakes was a controversial subreddit where subscribers used FakeApp to superimpose the faces of female celebrities, including Emma Watson and Daisy Ridley, onto pornographic videos without their consent.[82][83] After the subreddit attracted press attention, Gfycat and Discord banned its videos. Pornhub followed suit on February 6, 2018, and a day later the subreddit was banned.[84]
r/European was a far-right white nationalist subreddit focused on news relating to Europe. It was founded in 2013 after r/europe banned hate speech. Its users often promoted anti-Semitic, Islamophobic, and racist content, and an informal survey showed that 17% of the sub's users openly identified as Nazis.[85][73][86] The sub was set to private by its moderators, and in 2016 it was quarantined by the sitewide administrative staff in response to a post where a user bragged about assaulting a Muslim refugee. The users subsequently migrated to r/The_Donald, and then to r/Mr_Trump following a dispute with r/The_Donald's moderators.[86] Reddit banned r/European on March 12, 2018, for violating its content policies.[87]
On June 10, 2015, citing an anti-harassment policy, Reddit banned five subreddits: r/FatPeopleHate, r/hamplanethatred, r/neofag, r/transfags, and r/shitniggerssay.[88][89]
The largest of the five, r/FatPeopleHate, had an estimated 151,000 subscribers at the time.[88] The sub hosted photos of overweight people for the purpose of mocking them.[90] A Reddit admin said: "We will ban subreddits that allow their communities to use the subreddit as a platform to harass individuals when moderators don't take action".[88]
Following the ban, Reddit users flooded the site with pictures of overweight people, as well as photos of Reddit's interim CEO Ellen Pao.[91] Some users moved to Voat, a social aggregation website similar to Reddit,[92] although other fat-shaming forums continued to exist on Reddit.[93]
Following the Boston Marathon bombing in April 2013, members of the subreddit r/FindBostonBombers wrongly identified several people as suspects, including a 17-year-old track athlete and a 22-year-old Brown University student who had been missing since March.[94] The missing student's body was found in the Providence River in Rhode Island on April 25, 2013, as reported by the Rhode Island Health Department.[95][96] The cause of death was determined to be suicide.[97] The subreddit was later made private.[98]
Reddit general manager Erik Martin later issued an apology for this behavior, criticizing the "online witch hunts and dangerous speculation" that took place in these investigation-oriented communities.[99] In September 2013, Reddit admins banned a similar subreddit dedicated to finding the Navy Yard shooter(s).[100] These events were dramatized in TV shows The Newsroom[101][102] and The Good Wife.[103]
r/frenworld, whose title is derived from the alt-right meme "Clown World", attracted controversy over its use of Pepe the Frog edits and clown imagery to promote anti-Semitic and racist dog whistles. The Times of Israel and The Daily Dot found numerous references in the subreddit to Holocaust denialism, the USS Liberty incident, and alleged racial crime statistics.
A defining feature of the sub was its users' slang and childish diction, such as "nose-fren" and "longnose" for Jews, "bop" for committing violence or genocide, and "honk honk" as a euphemism for "Heil Hitler". On June 20, 2019, after it had accumulated 60,346 subscribers, the sub was banned for glorifying violence. r/Honkler, which hosted similar content, was banned on July 2, 2019.[75][104][105]
r/GasTheKikes was an antisemitic subreddit, the name of which alluded to the gas chambers used in the Holocaust. New York magazine described it as a "massive online Jew-hating community" among "the worst of the worst" subreddits.[106] The community was banned from Reddit,[106] after which a successor subreddit named r/KikeTown took its place.[107] In 2015 r/KikeTown was first quarantined, then banned.[108][109][110]
The subreddit r/GenderCritical, which had 64,400 users, described itself as "Reddit's most active feminist community" for "women-centred, radical feminists" to discuss "gender from a gender-critical perspective". Described by Jillian York of the Electronic Frontier Foundation as "a subreddit where transphobic commentary has thrived", the subreddit frequently hosted posts asserting that transgender women are not women. On June 29, 2020, the subreddit was "banned for violating Reddit's rule against promoting hate".[111][112][113][114] After r/GenderCritical was banned, several of its users migrated to Ovarit, a trans-exclusionary radical feminism-centered website similar to Reddit.[35]
In January 2014 Mother Jones published a story about gun sales on Reddit, suggesting that sellers were using the platform to exploit a loophole in U.S. federal law.[115] Nearly 100 AR-15s were engraved with the Reddit logo as part of a licensing deal made with the gun-sales subreddit in 2011.[116] The subreddit was banned on March 21, 2018, after Reddit updated its content policies to forbid subreddits that facilitate transactions involving certain goods and services.[117]
A subreddit founded for "involuntary celibates", r/Incels was a forum wherein members discussed their lack of romantic success.[118] The sub defined an "incel" as a person over age 20 who has unintentionally gone at least six months without a romantic partner.[119] Self-described incels are largely heterosexual men.[119]
Many members of r/incels adhered to the "black pill" ideology,[120][121] which espouses despondency often coupled with misogynistic views that condone, downplay, or advocate rape.[119] Notable black pill posts were titled "Reasons why women are the embodiment of evil" and "Proof that girls are nothing but trash that use men".[122] Posts often referred to women as "femoids", "foids", "cunts", "cum dumpsters", and "sluts".[119] Moderators banned users who were deemed too female-friendly or who claimed that women experienced inceldom to the same extent as men.[120] The subreddit's users intermittently revered or hated "normies" and "Chads" for their courtship abilities, while some admired murderers such as Elliot Rodger, a self-identified "incel" who committed the 2014 Isla Vista killings.[123][119][124]
In the summer of 2017, a petition on Change.org called for r/Incels to be banned for inciting violence against women.[123] Following the October implementation of a new Reddit policy that prohibited the incitement of violence, the subreddit was banned on November 7, 2017.[125] At that time, r/incels had around 40,000 subscribers.[122]
r/Braincels subsequently became the most popular subreddit for incels, gaining 16,900 followers by April 2018. After the 2018 Toronto van attack, posts appeared on this subreddit praising the actions of Alek Minassian, the alleged perpetrator. Later, the subreddit's leaders disavowed the attack and deleted some of the posts that praised Minassian.[126] r/Braincels was banned in September 2019.[127]
Reddit's staff was initially opposed to the addition of obscene material to the site, but they eventually became more lenient when prolific moderators, such as a user named u/violentacrez, proved capable of identifying and removing illegal content at a time when Reddit had insufficient paid staff to do so.[4]
Communities devoted to explicit material saw rising popularity, and in a 2008 "Best of Reddit" user poll, users chose r/Jailbait (a sub featuring provocative photos of underage teenagers) as "subreddit of the year".[4] At one point, "jailbait" was the second most common search term on Reddit.[4] Erik Martin, Reddit's general manager, defended r/Jailbait, arguing that such controversial pages were a consequence of allowing free speech on the site.[128]
r/Jailbait came to wider attention outside Reddit when Anderson Cooper of CNN devoted a segment of his program to condemning the subreddit and criticizing Reddit for hosting it.[129][130] Initially, this drove a spike in Internet traffic to the subreddit, which peaked at 1.73 million page views on the day of the report.[131]
In the wake of these news reports, a Reddit user posted an image of an underage girl to r/Jailbait and subsequently claimed to have nude images of her. In response, dozens of Reddit users posted requests for the nude photos to be shared with them via private message.[132] Other Reddit users drew attention to this discussion, and Reddit administrators closed the r/Jailbait forum on October 11, 2011.[132] Critics of the ban, such as r/Jailbait's creator, charged that Reddit administrators used the thread as an excuse to close down a controversial subreddit following the negative media coverage it had attracted.[3] Others claimed that members of the Something Awful forum, not r/Jailbait's regulars, had created the thread in an attempt to get the sub shut down.[133]
Following the closure of r/Jailbait, The Daily Dot declared the community's creator, u/violentacrez, "The Most Important Person on Reddit in 2011", calling the r/Jailbait controversy "the first major challenge to the site's voluntary doctrine of absolute free speech".[134]
In January 2019 a Philippine-based subreddit, r/Jakolandia, was accused of "distributing" posts with photos of women, including celebrities, that were apparently taken without their consent. A number of secret Facebook groups had been taking similar actions, engaging in illegal activity by sharing "obscene" photos of women and possibly child pornography.[135] r/Jakolandia was later banned as a result.[136]
r/MGTOW was a subreddit for Men Going Their Own Way, an anti-feminist, misogynistic, mostly online community advocating for men to separate themselves from women. It also advocates separation from society, which they believe feminism has corrupted.[137][138] In January 2020, a group of researchers published a preprint of an analysis of the manosphere, which listed r/MGTOW among a group of growing online communities involved in "online harassment and real-world violence".[139] Reddit quarantined the subreddit shortly afterward.[140] In August 2021, Reddit banned the subreddit for violating its policies prohibiting content that "incites violence or promotes hate based on identity or vulnerability".[141]
r/MillionDollarExtreme was dedicated to the comedy group Million Dollar Extreme, who were accused of having connections with the alt-right. Its users propagated various anti-Semitic conspiracy theories and heavily promoted racist, homophobic, and transphobic content. On September 10, 2018, when the sub had around 43,000 subscribers, it was banned for violating Reddit's content policy regarding violent content. Million Dollar Extreme's YouTube channel and Instagram account had already been terminated earlier that year.[142]
Associated subreddits r/BillionShekelSupreme, r/milliondollarextreme2, r/ChadRight, and several others were subsequently banned.[142][143]
r/NoNewNormal was a subreddit critical of the responses to the COVID-19 pandemic. It propagated various conspiracies about the pandemic and measures to control it, including lockdowns, masking, vaccines, and the implementation of a "new normal." It was quarantined for misinformation on August 12, 2021, when it had accumulated over 112,000 subscribers. Subreddits r/rejectnewnormal and r/refusenewnormal were subsequently banned for trying to circumvent the quarantine, and r/PandemicHoax and r/truthseekers, which hosted similar content, set themselves to private.[144][145]
In a thread on r/vaxxhappened, a community opposing vaccine misinformation, a Redditor called upon administrators to ban subreddits that primarily spread medical misinformation.[146] Admins responded that Reddit is a platform for free speech and discussion, and would continue to allow subreddits that challenge the consensus views on the pandemic.[147] In protest of Reddit's response, the moderators of 135 subreddits (including r/florida, r/futurology, r/pokemongo, r/startrek, and r/tifu) made their subreddits private.[148][149][150][151]
On September 1, 2021, Reddit banned r/NoNewNormal for brigading subreddits that criticized it,[152][153] and quarantined 54 other subreddits associated with COVID-19 denial.[154]
Members of r/Physical_Removal advocated for the forced deportation or physical removal of political leftists from the United States. Its name references a quote by right-wing libertarian philosopher Hans-Hermann Hoppe, who wrote: "There can be no tolerance toward democrats and communists in a libertarian social order. They will have to be physically separated and removed from society".
The sub was controversial for its promotion of violence against leftists and other groups. For instance, users would refer to throwing people from helicopters, an extrajudicial execution method used by Chilean dictator Augusto Pinochet. After the 2017 Unite the Right rally in Charlottesville, Virginia, r/Physical_Removal drew criticism after mocking the death of Heather Heyer, who was struck and killed by a car driven by a far-right terrorist at the rally.[155][156]
"Pizzagate" is a conspiracy theory that emerged from social media and fake news websites in early November 2016. It falsely alleged that a child trafficking ring existed which involved Democratic Party officials and restaurants such as Comet Ping Pong.
Over 20,000 subscribers joined r/pizzagate (a spinoff from r/The_Donald) to discuss this conspiracy theory.[157] Because users would post the personal details of people allegedly connected to Pizzagate, Reddit banned the sub on November 23, 2016 for violating its anti-doxing policy.[158]
Reddit attracted attention from mainstream publications in 2018 for the role it played in spreading the QAnon conspiracy theory from 4chan and 8chan to the wider internet. At QAnon's peak, tens of thousands of users were subscribed to various subreddits promoting the conspiracy theory. In response, Reddit began to ban these subreddits for breaking sitewide rules.[159][160][143]
In March 2018, the original QAnon sub r/CBTS_stream was banned for inciting violence and sharing confidential personal information. The sub (whose name refers to the "calm before the storm") had accumulated over 20,000 subscribers. r/GreatAwakening, which had a more active userbase with over 71,000 subscribers and an average of 10,000 comments per day, was banned in September that year for repeated content violations, such as harassing a user they misidentified as the suspect of the Jacksonville Landing shooting. Around 17 other subreddits, such as r/BiblicalQ, r/Quincels, and backup sub r/The_GreatAwakening, were also banned.[161][143][162] By 2020, these bans had significantly decreased QAnon-related discussions on Reddit, and the remaining discussions focused on criticisms of the conspiracy theory.[160]
r/SanctionedSuicide was a subreddit that approached the topic of suicide from a pro-choice perspective. It included discussions surrounding the ethics of suicide as well as posts containing rants from Reddit users.[163] Reddit banned the subreddit on March 14, 2018, for violating its guidelines;[164] this prompted the creation of a standalone website, Sanctioned Suicide, to which many of the subreddit's users migrated.[165]
On December 15, 2014, Reddit took the unusual step of banning a subreddit, r/SonyGOP, which was being used to distribute hacked Sony files.[166]
The subreddit r/Shoplifting was devoted to stories, tips, and questions about shoplifting from large commercial retail chain stores. It dissuaded people from shoplifting from smaller stores, which were presumed to suffer greater losses from theft. Users often posted pictures of items they had supposedly "lifted".[167]
Near the end of its existence, over 77,000 people were subscribed to the subreddit.[168] It was banned on March 21, 2018, because it violated an amendment to the Reddit User Agreement, added that same day, which states: "Users may not use Reddit to solicit or facilitate any transaction or gift involving certain goods and services, including: ... Stolen goods".[169][170]
r/The_Donald was a community created for supporters of Donald Trump's 2016 presidential campaign. In November 2016 Reddit banned many of the sub's "toxic" users, alleging that they harassed Reddit administrators and manipulated the site's algorithms in order to push content to Reddit's front page.[171] Reddit's CEO Steve Huffman (known as u/spez on Reddit) had recently admitted to silently editing comments in r/The_Donald which attacked him. Subsequently, the term "spez" entered The_Donald's terminology as a synonym for "edit".[172]
In response, Reddit modified the site's algorithms to specifically prevent the sub's moderators from gaming them.[173] Additionally, Reddit introduced a filtering feature which allowed individual users to block content from any sub. While Reddit had been developing this feature before problems with r/The_Donald arose, critics suggested that it was introduced specifically to allow users to block that community.[173] Huffman called r/The_Donald's users' complaints of harassment "hypocritical" because they had harassed others.[174]
After the Christchurch mosque shootings in 2019, many posts filled with anti-Muslim hate appeared in the subreddit arguing that the shootings were justified.[175]
The subreddit was quarantined by Reddit admins in June 2019 for "threats of violence against police and public officials".[176][177] On June 29, 2020, Reddit banned the subreddit for frequent rule-breaking, for antagonizing the company and other communities, and for failing to "meet our most basic expectations".[178]
In August 2014, Reddit users began sharing a large number of naked pictures of celebrities stolen via phishing from their private Apple iCloud accounts.[179][180] r/TheFappening was created as a hub to share and discuss these stolen photos. Most of the stolen images were posted within the subreddit.[181][182]
Victims of the scandal (called "CelebGate" by the media)[183] included Jennifer Lawrence, Kate Upton, Mary Elizabeth Winstead, and other high-profile individuals.[184][185] Several leaked photos of Liz Lee and McKayla Maroney may have been taken when the women were underage, which would constitute child pornography, though this remains controversial.[186]
Reddit administrators closed the subreddit in September 2014. The scandal led to wider criticisms from The Verge and The Daily Dot concerning the website's moderation.[187][188]
In January 2021, Reddit banned r/TruFemcels, a subreddit for female incels ("femcels"), for promoting hate.[189] Critics accused the sub of lookism, racism, transphobia, spreading alt-right conspiracy theories, and using incel terminology. After the ban, the community migrated to a dedicated website, ThePinkPill.co.[190]
In June 2022, Reddit banned r/TumblrInAction (TiA) for promoting hate. TiA was an anti-gender movement subreddit created to mock Tumblr "gender ideology" and "social justice warriors (SJWs)". At the time, the subreddit had over 470,000 members, including some who joined after r/GenderCritical was banned. r/SocialJusticeInAction, a sister subreddit to TumblrInAction, was also banned.
Reddit user Hatman, a former moderator of both communities, alleged that Reddit banned both subreddits because of their discussions about transgender politics.[191][unreliable source?] Months prior, in December 2021, Slate had referred to TumblrInAction as "a breeding ground for online hate...[linked] to Gamergate and all sorts of online harassment tactics".[192]
r/UncensoredNews was a far-right subreddit that claimed to be the "free speech" alternative to the more popular news-related subs. Founded before June 2016 by users who moderated several white nationalist subreddits, it saw a massive increase in subscribers following the Orlando nightclub shooting, as the moderators of r/news were accused of censoring the name, religion, and motive of perpetrator Omar Mateen.[85][193]
r/UncensoredNews primarily promoted stories about crimes committed by minorities or left-wing people, such as attacks on white farmers in South Africa. Its stories often had a xenophobic, Islamophobic, and racist bent. For example, a post stickied by one of the sub's moderators was titled "Here at uncensored news we love racism, bigotry, misogyny, hatred, xenophobia, transphobia, homo phobia [sic] etc." while another user compared miscegenation to bestiality.[73][194][87]
r/UncensoredNews and its moderators were banned on March 12, 2018 for inciting violence, possibly in response to a thread where users debated whether Jews or Muslims were more dangerous.[87]
The subreddit r/WatchPeopleDie featured media depicting real-life human deaths, such as workplace accidents, vehicular manslaughter, gun violence, suicides, and various forms of homicide. After it disseminated links to video of the 2019 Christchurch mosque shootings, the sub was banned.[195][196][197] The similar subreddit r/Gore was banned at the same time, as was r/WPDTalk, a subreddit for discussion on what went on in r/WatchPeopleDie.[198]
The sub had previously been quarantined for over half a year, but less than a day after the Christchurch shootings, Reddit banned it completely for violating Reddit's policy against "glorifying or encouraging violence." Moderators of the subreddit had initially allowed the video to be shared.[199]
Active subreddits are those that have been, or remain, contentious but have not been removed.
The subreddit r/antiwork was established in 2013.[200] The subreddit was intended for supporters of a society in which people did not have to work at all, or at least had a much smaller obligation to work, according to a longtime moderator. During the COVID-19 pandemic, new posters who were unhappy with working conditions joined.[201]
In 2019, the number of subscribers was 13,000,[200] which increased to 100,000 in early 2020.[201] The subreddit's popularity rose after people began posting text messages of employees giving notice to their employers that they no longer wanted their jobs.[200] In November 2021, the subscriber number exceeded one million.[201] By December 2021, that number had grown to 1.4 million,[200] and in January 2022, it had reached over 1.7 million. On 26 January, r/antiwork was the subreddit with the highest increase of traffic that was not one of Reddit's "default" front page subreddits.[202]
In January 2022, a longtime r/antiwork moderator agreed to be interviewed by Fox News host Jesse Watters, whom The Independent described as "openly contemptuous about the [anti-work] movement".[202] Members of the subreddit criticized the moderator, and the other moderators in turn temporarily made the subreddit private.[203] Ultimately, the interviewee was asked to give up her moderation duties. Noah Berlatsky, writing for The Independent, stated that the Fox News segment became "a publicity disaster for r/antiwork", and that r/antiwork became "widely ridiculed".[204]
Following this "publicity disaster," a similar subreddit called r/workreform was formed around similar ideas and content as r/antiwork.
r/aznidentity and r/AsianMasculinity are communities operated by and for Asian-American men. They discuss various topics related to lifestyle, dating, fitness, and world events from the perspective of the male Asian diaspora.
Users, who are sometimes called "Men's Rights Asians" or "MRAsians" (a pun on "men's rights activists"), argue that American culture emasculates Asian men sexually.[205] They claim that Asian-American women perpetuate these stereotypes, and thus uphold white supremacy, if they date white men.[206] However, the subreddits support interracial relationships between Asian men and white women.[207]
They also claim that anti-Asian racism is disproportionately committed by Black people and gets little attention, while coverage of anti-Black violence, especially when perpetrated by Asians, gets unfair attention.[205]
On April 1, 2019, r/BlackPeopleTwitter began requiring users to prove they were Black—by sending a photo with their forearm and their Reddit username—before allowing them to post comments. The moderators described this action as an April Fools' Day prank, albeit one with a "very real reason."[208] The April Fools' prank lasted only a few days, but the moderators now limit some contentious threads to a "country club" consisting of verified people of color and white people who complete an application process that includes writing "about what white privilege means to them." Additionally, verified Black commenters (but not other people of color) receive a check mark next to their username.[209][210]
r/FemaleDatingStrategy (FDS) was created in 2019.[211] It has been accused by r/AgainstHateSubreddits of promoting homophobia, transphobia, misandry, and discrimination against sex workers. The Verge described the advice given to women in the sub as socially conservative, sexually conservative, and oppressive to women. FDS posters must follow strict rules to avoid being banned, with support for consensual BDSM, pornography consumption, or casual sex being bannable offenses.[212] As of August 2021, FDS had about 179,000 members who were described as mostly heterosexual women.[211]
The group has a strict hierarchy, with moderators called "Ruthless Strategists" on top. Community is prioritized over the individual, and members are advised against speaking with journalists, practices which have been described as cult-like.[213] The subreddit advises against dating men with mental illnesses, and has banned members for believing men can be victims of sexual assault. The members oppose liberal feminism, or "libfems",[213][211] and endorse TERF-like views; transgender women are prohibited from posting. The sub has also been criticized for contradictory advice, such as encouraging independence from men while expecting them to pay for dates and be the primary breadwinners.[214]
Critics have compared r/FemaleDatingStrategy to the manosphere subreddits it was created to oppose.[214] Quoted in a 2022 Guardian article, a co-host for FDS's podcast said: "[FDS] isn't about trying to manipulate men into trying to behave a certain way ... it's more about finding a man who is comfortable with you having boundaries and standards, and who understands how to treat a woman."[211]
The subreddit historically made extensive use of female-incel ("femcel") language, but when the femcel jargon interfered with the recruitment of new members, users gradually adopted new terminology. This includes terms like "scrotes" for men and "pickmeisha" for women whom FDS claims degrade themselves for men. "Pickmeisha" has been used to label members who criticize the moderators or claim to enjoy banned behavior such as casual sex,[213] and it has been targeted at women in other subreddits for issues such as seeking advice on their partner's erectile dysfunction.[212]
Time identified r/GenZedong, a self-described "Dengist" subreddit focused on China, as a haven for anti-Uyghur racism and denial of oppression against Uyghurs.[41][44]
In 2022, the hacker group Anonymous hacked a server hosting Chinese government websites. The group uploaded a meme mocking r/GenZedong on a government site promoting tourism in China.[215]
The subreddit was quarantined on 23 March 2022 for spreading disinformation about the Russian invasion of Ukraine. At the time of its quarantine, the subreddit had over 57,000 subscribers.[44]
r/HermanCainAward awards the "Herman Cain Freedom Award" to people who "made public declaration of their anti-mask, anti-vax, or COVID-hoax views," but were later infected with COVID-19 and were hospitalized or died from it.[216][217][218][219]
According to Le Monde, "In its early days, HCA was primarily fueled by articles found in the press, [but] in recent months, the examples have been drawn directly from a Facebook page of COVID-19 victims. Publication after publication, the pattern invariably repeats itself: one person (anonymized to respect Reddit rules) says all the bad things they think about vaccines, masks, or sometimes even doubts the existence of the pandemic. Often the memes (humorous diversions) used to illustrate mistrust of the vaccine are the same. The following screenshot tells us that the person has just fallen ill, and sometimes that the illness does not really give them a break. Calls to pray for help may follow, before a loved one finally announces the death."[220]
F. Diane Bart, a psychotherapist writing for NBC News, described the subreddit as "a dark and sardonic corner of the internet" that "captures the rage and outrage of presumably vaccinated, mask-wearing individuals, many of whom have either been infected with COVID-19 in the past or have watched friends and family become ill—and even die."[218]
r/KotakuInAction was one of the main online hubs for participants in the misogynistic harassment campaign known as Gamergate.[221][222][223] When they join r/KotakuInAction, users are warned that they will be banned from other subreddits, including r/OffMyChest (where users express opinions and share personal thoughts); r/NaturalHair; and r/Rape (a support forum for rape survivors that was brigaded by r/KotakuInAction users).[224]
BuzzFeed's Joseph Bernstein reported that many of r/KotakuInAction's moderators also moderate other subreddits "devoted to either the physical and emotional degradation and humiliation of women, or in subreddits devoted to mocking and delegitimizing the arguments and appearances of feminists and 'social justice warriors'."[225]
In 2016, three scholars from the Georgia Institute of Technology wrote an academic paper analyzing r/KotakuInAction and "the border between controversial speech and harassment."[226]
On July 12, 2018, r/KotakuInAction's creator and head moderator removed all of the sub's other moderators and set the forum to private, alleging that the sub had become "infested with racism and sexism". A Reddit employee restored the forum and its moderators an hour later.[227][228]
A 2020 review analyzing ten discussion boards on r/KotakuInAction suggested a connection between Gamergate and right-wing extremism (RWE). According to the review, the three main themes in these discussion boards were "RWE bigotry", "always anti-left" and "hate speech is free speech".[229]
The antifeminist[230][231]: 323 subreddit r/MensRights was created in 2008. It had over 300,000 members as of April 2021.[230]
Media studies researcher Debbie Ging described the "extreme misogyny and proclivity for personal attacks" of several men's rights subreddits, including r/MensRights, as "the most striking features of the new antifeminist politics".[232]: 645–6
In the spring 2012 issue of the Southern Poverty Law Center's (SPLC) Intelligence Report (titled "The Year in Hate and Extremism"), r/MensRights was included in a section called "Misogyny: The Sites" along with 11 other websites. The SPLC reported that "although some of the sites make an attempt at civility and try to back their arguments with facts, they are almost all thick with misogynistic attacks that can be astounding for the guttural hatred they express".[233] Using a moderator's statements as an example, the SPLC feature charged that r/MensRights in particular "trafficks in various conspiracy theories" and shows anger "toward any program designed to help women".[234]
In a March 2012 interview, the issue's editor, Mark Potok, noted that while the SPLC "wrote about the subreddit Mens Rights ... we did not list it as a hate group" and probably never would. He added: "it's a diverse group, which certainly does include some misogynists—but I don't think that's [its basic] purpose".[235]
Later that year, the SPLC noted that the report "provoked a tremendous response among men's rights activists (MRAs) and their sympathizers."[236] They added: "it should be mentioned that the SPLC did not label MRAs as members of a hate movement; nor did our article claim that the grievances they air on their websites – false rape accusations, ruinous divorce settlements and the like – are all without merit. But we did call out specific examples of misogyny and the threat, overt or implicit, of violence."[236]
In April 2013, Reddit administrators threatened to shut down r/MensRights after subscribers gathered personal information on the purported author of a blog about feminist issues. The subreddit's moderators advised users on how to dox the blogger without running afoul of site rules.[237] In fact, the sub's users had identified the wrong woman, who subsequently received numerous death threats at her school and workplace. After Georgetown University received threatening messages, it confirmed that the woman named in the threats was not the blog's author.[237]
In mid-December 2013, users from r/MensRights and 4chan spammed the Occidental College Online Rape Report Form with hundreds of false rape reports. A user had recently complained that because the form could be submitted anonymously, it was vulnerable to abuse.[238][239] Men's rights activists made around 400 false rape accusations against members of the college, feminists, and fictional people.[238]
r/NoFap is a subreddit dedicated to supporting those who wish to give up pornography or masturbation.
Some journalists have described NoFap's forums as filled with misogyny: "there is a darker side to NoFap. Among the reams of Reddit discussions and YouTube videos, a 'fundamentally misogynistic rhetoric' regularly emerges".[240]
According to critics, r/NoFap idolizes testosterone and inherently masculine qualities, and "the NoFap community has become linked to wider sexism and misogyny, reducing women to sexual objects to be attained or abstained from, and shaming sexually active women."[241][unreliable source]
In 2019, Reddit threatened to ban r/piracy after receiving dozens of DMCA takedown notices. The moderators responded that Reddit had not investigated the claims to determine whether the reported content actually infringed copyright law. Often, they argued, users had merely been sharing URLs for streaming sites, asking whether such sites were working, or posting guides about installing programs. Ultimately r/Piracy's users voted to delete all content older than six months, as it was not feasible to review all past content.[242][243]
On August 17, 2022, Reddit banned r/PiratedGames, which focused specifically on pirated video games and was among the largest piracy-related subreddits with over 300,000 subscribers. Though the subreddit explicitly prohibited sharing pirated content, it was banned over excessive DMCA claims. Following an appeal from the moderators, it was restored the next day.[244] In several articles, TorrentFreak said the ban was part of Reddit's increasing crackdown on copyright infringement, noting that around 2,625 subreddits had been banned for similar reasons the year prior, and that DMCA takedowns on Reddit had increased by over 15,000% in the past five years.[244][245][246]
The subreddit r/Portugueses is often home to Portuguese nationalist and nativist rhetoric. It also contains racism, homophobia, sexism, and other Reddit policy violations. Moderators from other subreddits have received threats after removing or reporting policy violations like hate speech in r/Portugueses.[41]
On March 1, 2022, Reddit administrators quarantined Russia's national subreddit, r/Russia, and removed one of its moderators for spreading "disinformation about the Russian invasion of Ukraine". The sub's moderators had promoted various pieces of disinformation, including claims that the Ukrainian military was controlled by Nazis; that Ukraine was using human shields to raise the conflict's death toll; and that the Ukrainian leadership was refusing calls for peace negotiations. When it was quarantined, r/Russia had over 265,000 subscribers. Its sister sub, r/RussiaPolitics, was also quarantined for similar reasons.[247][248][249]
r/Sino is a subreddit focused on China which features pro-CCP propaganda.[41] As in r/GenZedong, users often express anti-Uyghur racism and denial of oppression against Uyghurs.[41]
In April 2014, a Daily Dot article revealed that moderators of r/Technology were using automatic filters to remove submissions that contained certain keywords, including "Aaron Swartz", "Tesla",[250] "Comcast", "NSA", and "Snowden".[251] At the time, the subreddit had 5 million subscribers.
The article engendered protest among Redditors, who raised concerns about censorship, and r/technology lost its default subreddit status.[252][253]
Alluding to the symbol of the "red pill" from the film The Matrix,[254][255] r/TheRedPill is devoted to discussions of male sexual strategy in which participants are ranked as "alpha" or "beta" males.[256] The subreddit promotes antifeminism,[256][232] rape culture,[232] hegemonic masculinity, and traditional gender roles.[254] Users discuss diet and physical fitness; share "pick-up" techniques for seducing women; and display varying levels of misogyny, ranging from virulent hatred of women to simple frustration with contemporary male experience.[255]
In 2018, the Southern Poverty Law Center described r/TheRedPill as one of several male supremacist subreddits featuring xenophobic discourse.[257] One critic compared the ideology expressed in r/TheRedPill to alt-right philosophy, highlighting attacks on feminism and mockery of rape as common threads.[258] The New Statesman described it as one of the most misogynistic subreddits on Reddit, intended to radicalize men.[124]
In 2017, The Daily Beast revealed that New Hampshire legislator Robert Fisher created r/TheRedPill and had posted demeaning comments about women there. Fisher resigned.[259]
r/WhitePeopleTwitter is a popular Reddit community that has attracted controversy following several posts which portrayed satirical or hoax tweets as legitimate, then went viral on social media. One such tweet joked that Twitter CEO Elon Musk was introducing a "special verification for users of the Jewish faith".[261] Another purported to be written by conservative commentator Matt Walsh and claimed he engaged in sexual assault multiple times.[262] A third feigned support for then-recently arrested social media personality Andrew Tate.[263]
In 2011, the site's general manager, Erik Martin, stated that Reddit would not ban communities solely for featuring controversial content. He noted that "having to stomach occasional troll [sub]reddits like r/picsofdeadkids or morally questionable [sub]reddits like r/jailbait are part of the price of free speech on a site like this."[264] He argued that it is not Reddit's place to censor its users.[264]
Similarly, the site's former CEO, Yishan Wong, argued that Reddit should not ban distasteful subreddits because, as a platform, it should serve the ideals of free speech.[265][266] Critics responded that Reddit had not been consistent in following its free speech philosophy.[267][268] In a 2015 discussion on the site's content policy, founder Steve Huffman stated that "neither Alexis [Ohanian] nor I created Reddit to be a bastion of free speech".[1]
When it banned r/The_Donald in 2020, Reddit expanded upon the kinds of content it was willing to ban, implementing new rules that directly prohibit hate speech.[269] The following year, despite moderator criticism, Reddit vowed to allow conversations that "question or disagree with popular consensus" regarding the COVID-19 pandemic: "dissent is a part of Reddit and the foundation of democracy." Reddit did eventually ban r/NoNewNormal after moderator protests, but the ban was leveled for unduly influencing other communities, not for promoting misinformation.[153]