Facebook removed 14 million pieces of terrorist content this year, and the numbers are rising

Facebook must exert constant vigilance to prevent its platform from being taken over by ne’er-do-wells, but how exactly it does that is only really known to itself. Today, however, the company has graced us with a bit of data on what tools it’s using and what results they’re getting — for instance, more than 14 million pieces of “terrorist content” removed this year so far.

More than half of those 14 million pieces were old content posted before 2018, some of which had been sitting around for years. But as Facebook points out, that content may well have gone unviewed the whole time. It’s hard to imagine a terrorist recruitment post going unreported for 970 days (the median age of content removed in Q1) if it was seeing any kind of traffic.

Perhaps more importantly, the amount of newer content being removed (with, to Facebook’s credit, a quickly shrinking delay) appears to be growing steadily. In Q1, 1.2 million items were removed; in Q2, 2.2 million; in Q3, 2.3 million. User-reported removals are growing as well, though they are much smaller in number — around 16,000 in Q3. Indeed, 99 percent of removed content, Facebook proudly reports, is taken down “proactively.”
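That “99 percent” figure follows directly from the numbers above. As a rough illustration (the 2.3 million and 16,000 figures are the approximate Q3 totals from this article, not exact counts), the proactive share is simply everything not reported by users divided by the total:

```python
# Rough back-of-envelope check on Facebook's "99 percent proactive" claim,
# using the approximate Q3 figures cited in the article.
total_takedowns = 2_300_000   # total pieces removed in Q3 (approx.)
user_reported = 16_000        # user-reported removals in Q3 (approx.)

proactive = total_takedowns - user_reported
proactive_rate = proactive / total_takedowns
print(f"Proactive removal rate: {proactive_rate:.1%}")
```

With these figures the rate works out to a little over 99 percent, consistent with Facebook’s stated number.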

Something worth noting: Facebook is careful to avoid positive or additive verbs when talking about this content. For instance, it won’t say that “terrorists posted 2.3 million pieces of content,” but rather that this was the number of “takedowns” or pieces of content “surfaced.” This phrasing is more conservative and technically correct, since the company can really only be sure of its own actions, but it also serves to soften the fact that terrorists are posting hundreds of thousands of items monthly.

The numbers are hard to contextualize. Is this a lot or a little? Both, really. The amount of content posted to Facebook is so vast that almost any number looks small next to it, even a scary one like 14 million pieces of terrorist propaganda.

It is impressive, however, to hear that Facebook has greatly expanded the scope of its automated detection tools:

Our experiments to algorithmically identify violating text posts (what we refer to as “language understanding”) now work across 19 languages.

And it fixed a bug that was massively slowing down content removal:

In Q2 2018, the median time on platform for newly uploaded content surfaced with our standard tools was about 14 hours, a significant increase from Q1 2018, when the median time was less than 1 minute. The increase was prompted by multiple factors, including fixing a bug that prevented us from removing some content that violated our policies, and rolling out new detection and enforcement systems.

The Q3 number is two minutes. It’s a work in progress.

No doubt we all wish the company had applied this level of rigor somewhat earlier, but it’s good to know the work is being done. Notably, a great deal of this machinery is focused not on removing content outright, but on putting it in front of the constantly growing moderation team. So the most important bit is still, thankfully and heroically, done by people.
