

Byrne joined Meta in September 2021.

She and her team helped draft the rulebook that applies to the world’s most diabolical people and groups: the Ku Klux Klan, cartels, and of course, terrorists. Meta bans these so-called Dangerous Organizations and Individuals, or DOI, from using its platforms, but further prohibits its billions of users from engaging in “glorification,” “support,” or “representation” of anyone on the list.

Byrne’s job was not only to keep dangerous organizations off Meta properties, but also to prevent their message from spreading across the internet and spilling into the real world. The ambiguity and subjectivity inherent in these terms have made the DOI policy a perennial source of over-enforcement and controversy.

A full copy of the secret list obtained by The Intercept in 2021 showed it was disproportionately composed of Muslim, Arab, and Southeast Asian entities, hewing closely to the foreign policy crosshairs of the United States. Much of the list is copied directly from federal blacklists like the Treasury Department’s Specially Designated Global Terrorist roster.

Byrne tried to focus on initiatives and targets that she could feel good about, like efforts to block violent white supremacists from using the company’s VR platform or running Facebook ads. At first she was pleased to see that Meta’s in-house list went further than the federal roster in designating white supremacist organizations like the Klan — or the Azov Battalion.

She was also unsure whether Meta was up to the task of maintaining a privatized terror roster. “We had this huge problem where we had all of these groups and we didn’t really have … any sort of ongoing check or list of evidence of whether or not these groups were terrorists,” she said, a characterization the company rejected.

Byrne quickly found that the blacklist was flexible: Meta’s censorship systems are “basically an extension of the government...”

[–] geneva_convenience@lemmy.ml 3 points 2 weeks ago

"As a Counterterrorism and Dangerous Organizations policy manager, Byrne’s entire job was to help form policies that would most effectively thwart groups like Azov. Then one day, this was no longer the case. “They’re no longer neo-Nazis,” Byrne recalls a policy manager explaining to her somewhat shocked team, a line that is now the official position of the White House.

Shortly after the delisting, The Intercept reported that Meta rules had been quickly altered to “allow praise of the Azov Battalion when explicitly and exclusively praising their role in defending Ukraine OR their role as part of the Ukraine’s National Guard.” Suddenly, billions of people were permitted to call the historically neo-Nazi Azov movement “real heroes,” according to policy language obtained by The Intercept at the time.

Byrne and other concerned colleagues were given an opportunity to dissent and muster evidence that Azov fighters had not in fact reformed. Byrne said that even after she gathered photographic evidence to the contrary, Meta responded that while Azov may have harbored Nazi sympathies in recent years, posts violating the company’s rules had sufficiently tapered off."