Facebook actually stopped harmful content from spreading.
The company said in a press release Monday that it removed or added a content warning to 1.9 million pieces of "ISIS and al-Qaeda" content in January through March — twice the amount it removed in the previous three months.
Supposedly, 99 percent of that content was removed because Facebook's technology and employees found it, not because users reported it.
"In most cases, we found this material due to advances in our technology, but this also includes detection by our internal reviewers," wrote Monika Bickert, Facebook's vice president of global policy management, and Brian Fishman, its global head of counterterrorism policy.
Facebook's counterterrorism team has grown to 200 people, up from 150 last June, the company said. Overall, terrorist material posted to Facebook was typically removed within a minute, according to the press release.
Facebook also made the unusual (dare we say editorial?) decision to define terrorism on its platform: “Any non-governmental organization that engages in premeditated acts of violence against persons or property to intimidate a civilian population, government, or international organization in order to achieve a political, religious, or ideological aim.”
No doubt cognizant that critics are sensitive about the social network's political leanings — in a Congressional hearing with Mark Zuckerberg earlier this month, Sen. Ted Cruz (R-Texas) trotted out familiar, paranoid concerns about its supposed anti-conservative bias — Facebook also went out of its way to say its "definition is agnostic to the ideology or political goals of a group."
Terrorist organizations have used Facebook in the past to recruit new members, boast about attacks, and even share gruesome images of acts of violence, such as beheadings. The U.S. Department of Justice has claimed that ISIS uses Facebook, Twitter and YouTube to target isolated young people in Europe, the United States, and Canada with recruitment messages.
Meanwhile, Facebook is still under attack for allowing the spread of propaganda and misinformation on its platform. It makes sense that it would want to show that technology and minor tweaks to staffing — as opposed to government regulation and changes to its business model — can prevent harmful messages, photos, and videos from going viral. Perhaps not coincidentally, Facebook will report its latest earnings this Wednesday.
"We’re under no illusion that the job is done or that the progress we have made is enough," the company wrote. "Terrorist groups are always trying to circumvent our systems, so we must constantly improve. Researchers and our own teams of reviewers regularly find material that our technology misses. But we learn from every misstep, experiment with new detection methods and work to expand what terrorist groups we target."