
Zuckerberg urges privacy carve-outs to compete with China

Facebook’s founder said last month that the company is open to being regulated. But today he was asked by the US Senate what sort of legislative changes he would (and wouldn’t) like to see as a fix for the problems that the Cambridge Analytica data scandal has revealed.

Zuckerberg’s response to this — and to another question about his view on European privacy regulations — showed in the greatest detail yet how he hopes data handling and privacy rules will evolve in the US, including a direct call for regulatory carve-outs to — as he couched it — avoid the US falling behind Chinese competitors.

Laying out “a few principles” that he said he believes would be “useful to discuss and potentially codify into law”, Zuckerberg first advocated for having “a simple and practical set of ways that you explain what you’re doing with data”, revealing an appetite to offload the problem of tricky privacy disclosures via a handy universal standard that can apply to all players.

“It’s hard to say that people fully understand something when it’s only written out in a long legal document,” he added. “This stuff needs to be implemented in a way where people can actually understand it.”

He then talked up the notion of “giving people complete control” over the content they share — claiming this is “the most important principle for Facebook”.

“Every piece of content that you share on Facebook, you own and you have complete control over who sees it and how you share it — and you can remove it at any time,” he said, without mentioning how far from that principle the company has been at earlier times in its history.

“I think that that control is something that’s important — and I think should apply to every service,” he continued, making a not-so-subtle plea for no other platforms to be able to leak data like Facebook’s platform historically has (and thus to close any competitive loopholes that might open up as a result of Facebook tightening the screw on developer access to data now in the face of a major scandal).

His final and most controversial point in response to the legislative changes question was about what he dubbed “enabling innovation”.

“Some of these use cases that are very sensitive, like face recognition for example,” he said carefully. “And I think that there’s a balance that’s extremely important to strike here where you obtain special consent for sensitive features like facial recognition. But don’t — but that we still need to make it so that American companies can innovate in those areas.

“Or else we’re going to fall behind Chinese competitors and others around the world who have different regimes for different, new features like that.”

Zuckerberg did not say which Chinese competitors he was thinking of specifically. But earlier this week ecommerce giant Alibaba announced another major investment in a facial recognition software business, leading a $600M Series C round in Hong Kong-based SenseTime — to give one possible example of such a rival.

A little later in the session, Zuckerberg was also directly asked whether European privacy regulations should be applied in the US. And here again he showed more of his hand — once again refusing to confirm whether Facebook will implement “the exact same regulation” for North American users, as some consumer groups have been calling on it to do.

“Regardless of whether we implement the exact same regulation — I would guess it would be somewhat different because we have somewhat different sensibilities in the US, as do other countries — we’re committed to rolling out the controls and the affirmative consent, and the special controls around sensitive types of technologies like face recognition that are required in GDPR, we’re doing that around the world,” he reiterated.

“So I think it’s certainly worth discussing whether we should have something similar in the US but what I would like to say today is that we’re going to go forward and implement that [the same controls and affirmative consent] regardless of what the regulatory outcome is.”

Given that’s now the third refusal by Facebook to confirm GDPR will apply universally, it looks pretty clear that users in North America will get some degree of second-tier privacy vs international users — unless or until US lawmakers forcibly raise standards on the company and the industry as a whole.

That is perhaps to be expected. But it’s still a tricky PR message for Facebook to be having to deliver in the midst of a major data scandal — hence Zuckerberg’s attempt to reframe it as a matter of domestic vs foreign “sensibilities”.

Whether North American Facebook users buy into his repackaging of coach class privacy standards vs the rest of the world as just a little local flavor remains to be seen.

Check Also

Undercover report shows the Facebook moderation sausage being made

An undercover reporter with the UK’s Channel 4 visited a content moderation outsourcing firm in Dublin and came away rather discouraged at what they saw: queues of flagged content waiting, videos of kids fighting staying online, orders from above not to take action on underage users.

It sounds bad, but the truth is there are pretty good reasons for most of it and in the end the report comes off as rather naive. Not that it’s a bad thing for journalists to keep big companies (and their small contractors) honest, but the situations called out by Channel 4’s reporter seem to reflect a misunderstanding of the moderation process rather than problems with the process itself. I’m not a big Facebook fan, but in the matter of moderation I think they are sincere, if hugely unprepared.

The bullet points raised by the report are all addressed in a letter from Facebook to the filmmakers. The company points out that some content needs to be left up because, abhorrent as it is, it isn’t in violation of the company’s stated standards and may be informative; underage users and content have some special requirements but in other ways can’t be assumed to be real; popular pages do need to exist on different terms than small ones, whether they’re radical partisans or celebrities (or both); hate speech is a delicate and complex matter that often needs to be reviewed multiple times; and so on.

The biggest problem doesn’t at all seem to be negligence by Facebook: there are reasons for everything, and as is often the case with moderation, those reasons are often unsatisfying but effective compromises. The problem is that the company has dragged its feet for years on taking responsibility for content and as such its moderation resources are simply overtaxed. The volume of content flagged by both automated processes and users is immense and Facebook hasn’t staffed up. Why do you think it’s outsourcing the work?

By the way, did you know that this is a horrible job? Short film ‘The Moderators’ takes a look at the thankless job of patrolling the web.

Facebook in a blog post says that it is working on doubling its “safety and security” staff to 20,000, among which 6,500 will be on moderation duty. I’ve asked what the current number is, and whether that includes people at companies like this one (which has about 650 reviewers), and will update if I hear back.

Even with a staff of thousands, the judgments that need to be made are often so subjective, and the volume of content so great, that there will always be backlogs and mistakes. It doesn’t mean anyone should be let off the hook, but it doesn’t necessarily indicate a systematic failure other than, perhaps, a lack of labor.

If people want Facebook to be effectively moderated they may need to accept that the process will be done by thousands of humans who imperfectly execute the task. Automated processes are useful but no replacement for the real thing. The result is a huge international group of moderators, overworked and cynical by profession, doing a messy and at times inadequate job of it.

