
Palmer Luckey, political martyr?

In the middle of testimony over Facebook’s privacy scandal, Sen. Ted Cruz of Texas took a moment to grill Mark Zuckerberg over his company’s political loyalties.

In the course of the testy exchange, Cruz brought up the dismissal of Palmer Luckey, the controversial founder of virtual reality pioneer Oculus.

It was part of Cruz’s broader questioning about whether Facebook is politically biased in the ways it moderates members’ posts and accounts, as well as in its staffing policies.

Here’s the exchange:

Cruz: Do you know the political orientation of those 15,000 to 20,000 people engaged in content review?

Zuckerberg: No, senator, we do not generally ask people about their political orientation when they’re joining the company.

Cruz: So, as CEO, have you ever made hiring or firing decisions based on political positions or what candidates they supported?

Zuckerberg: No.

Cruz: Why was Palmer Luckey fired?

Zuckerberg: That is a specific personnel matter that seems like it would be inappropriate to speak to here.

Cruz: You just made a specific representation that you didn’t make decisions based on political views. Is that accurate?

Zuckerberg: I can commit that it was not because of a political view.

Luckey left Facebook last March, after a lengthy absence from public view brought about by a Daily Beast piece revealing his involvement with, and funding of, a pro-Trump troll group called Nimble America. News of his support came at a time when very few figures in Silicon Valley were publicly backing candidate Trump, the most notable being Peter Thiel, an early investor in Facebook who started the VC firm Founders Fund, which also backed Oculus.

Though Luckey initially denied funding the group, he ultimately took to social media to apologize in the midst of an upheaval that had many developers threatening to leave the platform. His last public statement (on Facebook, of course) was a mixture of regret and defense, reading, in part, “I am deeply sorry that my actions are negatively impacting the perception of Oculus and its partners. The recent news stories about me do not accurately represent my views… my actions were my own and do not represent Oculus. I’m sorry for the impact my actions are having on the community.”

