Thursday, June 29, 2017

Facebook gives moderators "full access" to user accounts suspected of terror links

A Facebook data center. (Image: CNET/CBS Interactive)

Facebook has a fleet of low-paid contractors tasked with investigating possible connections to terrorism on its site.

The key takeaway: moderators are granted "full access" to any account once it has been flagged by the social network's algorithms, which look for details or connections that might suggest a terror link. Moderators can then track a person's location and read their private messages.

The news comes from The Guardian, just days after Facebook chief executive Mark Zuckerberg announced the social network now has two billion users.

"The counter-terrorism unit has special clearance to carry out investigations into user accounts if they are suspected of having links to terrorist groups identified by the US State Department," says the report. "Moderators will then access the individual's private messages, see who they are talking to and what they are saying, and view where they have been."

The move appears to go well beyond the company's recently outlined efforts to use artificial intelligence and human reviewers to counter terrorism on its platform. It comes in response to growing pressure from several governments, in the wake of terror attacks in the UK and elsewhere in Europe, for tech companies to battle terrorism on their platforms.

Facebook declined to comment or answer several questions we had.

Among the chief problems with this largely secret internal surveillance is that Facebook doesn't define "terrorism" or "terrorist content." There is no single definition or hard-and-fast rule to follow, making the process of removing content arbitrary. Facebook says only that each company facing this kind of challenge "will continue to apply its own policies and definitions of terrorist content when deciding whether to remove content."

The only thing known about the rules governing what content Facebook allows on its site is that they are secret.

ProPublica this week published a trove of leaked documents detailing the largely arbitrary approach the company takes to deciding what is and isn't allowed on the site. Even then, much of the enforcement of those rules is left to the individual moderator, who makes the final call.

Facebook's Community Operations team of about 3,000 staff includes, according to the company, 150 counter-terrorism experts, among them academics and former law enforcement officials, who are working to crack down on extremist content. Exactly how Facebook will moderate the moderators isn't known, largely because the company refuses to say.

At least with government surveillance, there are rules and some oversight (even if it's deeply flawed at the best of times). Unlike the US intelligence community, private companies like Facebook are not bound by Fourth Amendment protections against warrantless searches of Americans. There is almost nothing legally stopping Facebook from reading your messages or terminating your account for any reason at any time.

Facebook now employs a largely secret group of unaccountable staff, working from a set of arbitrary and unknown rules, with access to the accounts of two billion people. What could possibly go wrong?

Without a shred of transparency, there's no telling who is or isn't under the watchful eye of Facebook's own internal surveillance.

Contact me securely

Zack Whittaker can be reached securely on Signal and WhatsApp at 646-755-8849, and his PGP fingerprint for email is: 4D0E 92F2 E36A EC51 DAAE 5D97 CB8C 15FA EB6C EEA5.


