Allison and her fiancé have a shared public Instagram account that they use to give their followers a glimpse into their life together in Chicago, often alongside the hashtags #love and #interracialcouple.
They’d grown accustomed to the occasional racist comment under their photos. But in early May, those comments became more frequent and took on a more sinister, even threatening tone, often containing references to obscure white supremacist or incel memes.
On May 5, Allison, whose real name is being withheld for her safety, received a strange DM. It was from a woman she didn’t know, who informed her that she was on a disturbing website that was compiling information about white women in interracial relationships.
When she went to the website, she found her name, photos, and social media handles under the label “traitors.”
“It was weird, and strange, and creepy,” said Allison, 28. “I was thinking, ‘Who takes the time to do this?’”
The website names, shames, and effectively promotes violence against interracial couples and families — and it’s been circulated in some of the darkest corners of the internet, including in neo-Nazi Discord servers and accelerationist Telegram channels.
White supremacists have long invoked “racial purity” to justify horrific racism and brutal acts of violence against nonwhite people. And in recent years, they’ve relied on fringe social media or forums to mount mass harassment or doxxing campaigns targeting people of color, Jews, women, and journalists.
“A website like this is concerning for reasons even beyond the repulsive hate it promotes. The site is yet another example of how certain online spaces are being designed to literally facilitate harassment,” said Oren Segal, vice president of the Anti-Defamation League’s Center on Extremism. “They have real-world impacts on real people. The online systems that a site like this requires to operate should take steps to respond.”
The website was created in April but was taken offline after its initial hosting provider cut ties. It then found a home with one of Russia’s largest domain registrars, R01. VICE News contacted R01 on Tuesday to ask whether the site violated the company’s policies. An hour later, the site was taken offline, but as of Wednesday morning it was back up. Tatiana Agafonova, a spokesperson for R01, wrote in an email that the company would “diligently render its services to customers” unless a court rules otherwise or it is contacted by law enforcement.
The owner of the website shields their identity and location through Cloudflare, a U.S.-based security company that protects customers from DDoS attacks (attempts to crash a website by overwhelming it with data). VICE News contacted Cloudflare to ask how this particular website squared with their policies. They declined to comment on individual websites but directed us to their blog from February 2019, where they “address complaints about content.” Their bottom line was that Cloudflare is a security company, and content moderation isn’t really their responsibility.
Cloudflare has made some exceptions in the past, however. After the violent white supremacist rally in Charlottesville in August 2017, big tech companies, including Cloudflare, found themselves scrambling to rein in the extremists who’d used their services to organize, share propaganda, and recruit. Cloudflare diverged from its long-standing policy to remain content-neutral when it decided to terminate services for the neo-Nazi website the Daily Stormer, after it mocked the death of Heather Heyer, who was killed when a neo-Nazi drove his car into a crowd of protesters during the rally. “This was my decision,” Cloudflare CEO Matthew Prince told Gizmodo at the time. “This is not Cloudflare’s general policy now, going forward.”
Cloudflare again took action after a white supremacist targeted Latinos in a mass shooting at a Walmart in El Paso in August 2019. The alleged shooter had shared his manifesto on 8chan, an imageboard site known for its links to far-right terrorists. Cloudflare decided to sever ties with 8chan. Prince said they “reluctantly tolerate content that we find reprehensible, but we draw the line at platforms that have demonstrated they directly inspire tragic events and are lawless by design.”
The push to deplatform some of the major players behind the Charlottesville rally (which left Heyer dead and dozens injured) was enormously successful; most of them are broke, still entangled in lawsuits, and have largely faded into obscurity. But in the years since, other online extremists have gotten very good at evading tech crackdowns by employing an ever-evolving shared language of memes and euphemisms that signal the same racist views.
The website in question uses the same strategy, which appears carefully crafted to shield the owner from liability. The owner even explicitly states on the site that they do not encourage violence; all they’re doing, they claim, is listing names and social media accounts as part of a database of “white women who have an interest in black men.” One section, titled “toll paid,” lists women who were in interracial relationships and then suffered something horrible, such as injury or death.
Allison said that the “toll paid” reference was one of the things that kept showing up in comments on her Instagram account. “Things like, ‘Can’t wait until you pay the price when he beats you,’” Allison said. “Or ‘the toll will be paid one day when he slams your big forehead into the ground.’”
The owner of the website claims that the “toll paid” section is intended to catalog incidents where white women are victims of black violence, and isn’t an incitement.
But “all the disclaimers in the world” may not be enough to protect them from a lawsuit some day, especially if someone is harassed or harmed as a result, says Subodh Chandra, a former federal prosecutor who has handled high-profile civil rights cases, including a recent case against the Daily Stormer.
“The creators of this content are subjecting themselves to civil liability risk, despite all the nonsensical disclaimers,” Chandra said. “They could very well find themselves subject to prosecution, should anyone be harassed or otherwise harmed as a result of this activity.”
About a quarter of the approximately 80 women targeted by the site as “traitors” were already internet famous as models or influencers. One is Zienna Sonne, a Danish model who was previously married to a black man. Sonne said she’s been trolled a lot over the years, and so wasn’t particularly fazed when she found out she was on this website. “When we were still together, I would get messages almost daily,” Sonne said. “Unfortunately we live in a world where this is normal.”
But for the majority of the women on the site, like Allison, who works for the city of Chicago, this is the first time they’ve been targeted as part of a mass harassment campaign. And it feels random. There are millions of people who are in interracial relationships; according to the last U.S. census, 15% of all new marriages were between spouses of a different race or ethnicity.
The randomness stems from the fact that it’s not just one person identifying and targeting couples; the owner of the website solicits submissions from visitors. One woman, who is working a food-service job while studying to be a teacher, was targeted along with two other women she attended high school with. She thinks that the person who submitted their pictures was probably a former student there. Still, the fact she might know the person who did this doesn’t really worry her. “I’m stronger than I look, I’m doing good,” she said. “However, it disgusts me very much … I despise racism.”
These kinds of crowdsourced harassment efforts carry another legal risk, said Chandra. Should someone get their facts wrong about an individual they’re targeting, or misidentify a person entirely, they could be subject to even greater liability.
So far, the women that VICE News spoke to said that the level of harassment or threats hadn’t made them so worried for their safety that they felt compelled to contact law enforcement. In some other countries, including the U.K. and France, “incitement of ethnic or racial hatred” — which could include running a racist website — is considered a crime. Australian authorities are reportedly investigating the website, after they were contacted by a teen from Melbourne who was targeted by the site.
Cover: Stock image of a woman with a smartphone. Getty Images.