
Facebook still wants to be a media company

Facebook may have disbanded its “Trending” news section, but the social network is not abandoning its media company ambitions, despite whatever CEO Mark Zuckerberg said to Congress. In fact, the company is now doing the opposite of “not being a media business”: Facebook is directly paying news publishers to create video, even as it claims its focus is on you and your “time well spent” on its site.

Sorry, Facebook, but you can no longer claim you’re just a platform, just a technology enabler, when you are directly funding journalism.

And you can’t claim you care about our time when you’re funding all these new videos meant to draw us in daily and keep us watching.

Facebook funds the news

Facebook recently announced that it will roll out a series of news video shows from select partners, including TV news organizations CNN, ABC News, Fox News Channel and Univision, along with local news publisher Advance Local, and digital companies ATTN: and Mic. The shows will include a mix of live and breaking news as well as longer-form series and features.

The shows are being funded by Facebook for at least a year, though the (undisclosed) terms will vary by network.

Even though publishers have had the rug pulled out from underneath them before — when, suddenly, Facebook decided it was time to focus on “quality time” on its network, and decreased publisher content in the News Feed as a result — they seem happy to create content for Facebook yet again.

I know, it’s baffling.

In addition, Facebook is building out a game-streaming competitor to battle Amazon’s Twitch and Google’s YouTube.

But what’s even worse is that Facebook continues to claim some sort of “we’re just a platform” sentiment — and one that cares about users’ time, no less! — even as it pursues these initiatives.

Time well spent… watching our news and videos

The move to fund news videos not only invalidates Facebook’s claims on the “just a tech platform” front, it calls into question how serious the company is about its “time well spent” focus.

This newer set of product development guidelines aims to increase the visibility of personal content in the News Feed at the expense of publisher posts and other clutter.

The company is not alone in thinking about time well spent, even if it doesn’t seem to know what to do about it.

Amid a growing backlash over technology addiction and its effects on our brains, our emotional and social development and our quality of life, other tech companies, including both Apple and Google, have now announced notable new efforts to regain control over our phones’ ability to interrupt, stress and addict us. Both are rolling out new digital wellness tools in their next mobile operating system updates that will allow users to monitor and control their phone and app usage like never before.

Facebook, to some extent, has been attempting to participate in this movement as well, even as Apple in particular targets it as one of the apps we should all cut down on.

To its credit, Facebook reduced publisher content on the News Feed and the presence of viral videos, and saw its daily active users decline as a result.

Today, its latest “time well spent”-associated feature is arriving: “Memories,” a section where you can fondly look back on all the personal sharing and connections Facebook has enabled, and celebrate those moments with family and friends. (To be clear, Facebook is not calling Memories a part of “time well spent,” we are.)

The feature aims to remind users that the social network is truly about your personal connections, not the browsing of third-party content. It ties into Facebook’s larger self-image: The company still likes to envision itself, ever optimistically, as a force for good in the world. A platform that brings people together.

Yes, the platform used for Russian election meddling in the U.S. Yes, the one where millions of users had their data misappropriated through lax data handling policies. Yes, the one that contributes to genocide.

But look, have some old birthday party pictures! We care about you!

If ‘connecting people’ was the goal, Facebook would look a lot different

Look, there’s nothing wrong with Facebook pushing users to revisit their memories with family and friends, and many will even appreciate Facebook’s Memories feature and find joy in using it.

But it’s increasingly hard to take Facebook seriously when it claims “connecting people” and “quality time” are its larger goals while it puts its money elsewhere.

To date, Facebook has squandered so many opportunities to innovate on its platform around the subject of personal sharing, and has instead largely turned into a cloning machine where it adopts the innovations of others.

Really, what has it done lately that’s not a copy of Instagram copying Snapchat?

Even Apple now has a better Photos product than Facebook. Apple’s is infused with AI smarts and automatic sharing prompts in iOS 12, while Facebook is still figuring out where to stick its Stories module.

And why can’t Facebook users easily search back through their memories and photos, in the robust machine learning-infused ways that Apple and Google can?

(Screenshots: Facebook can’t find old photos from the search bar; Google can with ease.)

Really, why hasn’t Facebook — at least more recently — built us anything useful with the data we provided?

After all those check-ins and posts about which books we’re reading or what we’re watching on TV, all we get are more targeted ads.

If the company cared about connecting us with our friends, it could have built dozens of features on the back of this data by now:

  • Robust search features that turned our shared data into our own private, personal search engines
  • “What to watch” recommendations and reminders for our favorite streaming services
  • TV Time-like tools for tracking our binges and meeting fellow fans
  • Book clubs based on what your friends are reading
  • Notifications about restaurant openings nearby based on where you’ve eaten before
  • Collaborative photo albums (yes, it tried this through its Moments app, which spiked in popularity, but instead of doubling down on the app it’s allowed Google Photos to dominate and Apple to catch up with AI features and iCloud advancements)
  • Personalized travel guides (another experiment that died)
  • Private family groups that offered things like digitization services for sharing photos from old albums (it could have partnered with third parties on this), grandma’s recipes, private updates, family histories and more, instead of pointing families to its general-purpose “groups” product, which isn’t built with the specific needs of families in mind

I mean, these are just a few off the top of my head. I’m sure you can think of a dozen more.

Instead we’ve got Facebook launching some round-up of old personal sharing features (and remember it stole On This Day from Timehop) and investing heavily in everything video by funding news and cloning Twitch — both of which aim to suck up your time.

I know, I know — it’s too late for Facebook to go back to being just a social network.

It would require a radical revamp of what Facebook is and does. It would have to remove publisher content, destroy its video business and completely arrest the viral spread of news — fake and otherwise — by restricting URL-laden posts from being viewable by anyone but your friends or the Facebook Groups with which they were shared.

Facebook can’t do this. It won’t do this. Facebook wants to survive.

So instead, let’s just insist Facebook be honest about itself: Yes, we’re a media company AND a tech platform AND a video network AND a social network.

Anything claiming otherwise is a lie.


Check Also

Facebook’s new AI research is a real eye-opener

There are plenty of ways to manipulate photos to make you look better, remove red eye or lens flare, and so on. But so far the blink has proven a tenacious opponent of good snapshots. That may change with research from Facebook that replaces closed eyes with open ones in a remarkably convincing manner.

It’s far from the only example of intelligent “in-painting,” as the technique is called when a program fills in a space with what it thinks belongs there. Adobe in particular has made good use of it with its “context-aware fill,” allowing users to seamlessly replace undesired features, for example a protruding branch or a cloud, with a pretty good guess at what would be there if it weren’t.

But some features are beyond the tools’ capacity to replace, one of which is eyes. Their detailed and highly variable nature makes it particularly difficult for a system to change or create them realistically. Facebook, which probably has more pictures of people blinking than any other entity in history, decided to take a crack at this problem.

It does so with a Generative Adversarial Network, essentially a machine learning system that tries to fool itself into thinking its creations are real. In a GAN, one part of the system learns to recognize, say, faces, and another part of the system repeatedly creates images that, based on feedback from the recognition part, gradually grow in realism.

(Image, from left to right: “exemplar” images, source images, Photoshop’s eye-opening algorithm and Facebook’s method.)

In this case the network is trained to both recognize and replicate convincing open eyes. This could be done already, but as you can see in the examples above, existing methods left something to be desired. They seem to paste in the eyes of the people without much consideration for consistency with the rest of the image. Machines are naive that way: they have no intuitive understanding that opening one’s eyes does not also change the color of the skin around them. (For that matter, they have no intuitive understanding of eyes, color, or anything at all.)

What Facebook’s researchers did was to include “exemplar” data showing the target person with their eyes open, from which the GAN learns not just what eyes should go on the person, but how the eyes of this particular person are shaped, colored, and so on. The results are quite realistic: there’s no color mismatch or obvious stitching, because the recognition part of the network knows that that’s not how the person looks.

In testing, people mistook the fake eyes-opened photos for real ones, or said they couldn’t be sure which was which, more than half the time. And unless I knew a photo was definitely tampered with, I probably wouldn’t notice it while scrolling past it in my news feed. Gandhi looks a little weird, though.

It still fails in some situations, creating weird artifacts if a person’s eye is partially covered by a lock of hair, or sometimes failing to recreate the color correctly. But those are fixable problems. You can imagine the usefulness of an automatic eye-opening utility on Facebook that checks a person’s other photos and uses them as reference to replace a blink in the latest one. It would be a little creepy, but that’s pretty standard for Facebook, and at least it might save a group photo or two.
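The adversarial loop described above — a generator producing fakes and a discriminator grading them — can be sketched in miniature. The toy below is NOT Facebook’s image model (which works on photos with exemplar conditioning); it’s a one-dimensional GAN in plain NumPy where the “real data” is just numbers drawn around 4.0, the generator is a tiny affine map, and the discriminator is logistic regression. The learning rate and step count are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def real_batch(n):
    # Toy "real photos": samples clustered around 4.0
    return rng.normal(4.0, 0.5, size=(n, 1))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator: fake = g_w * noise + g_b   (starts producing samples near 0)
g_w, g_b = 1.0, 0.0
# Discriminator: score = sigmoid(d_w * x + d_b)   (1 = "real", 0 = "fake")
d_w, d_b = 0.0, 0.0
lr = 0.05

for step in range(2000):
    z = rng.normal(size=(32, 1))
    fake = g_w * z + g_b
    real = real_batch(32)

    # Discriminator step: push scores on real data toward 1, on fakes toward 0
    p_real = sigmoid(d_w * real + d_b)
    p_fake = sigmoid(d_w * fake + d_b)
    d_w += lr * np.mean((1 - p_real) * real - p_fake * fake)
    d_b += lr * np.mean((1 - p_real) - p_fake)

    # Generator step: adjust fakes so the discriminator scores them as real
    p_fake = sigmoid(d_w * fake + d_b)
    g_w += lr * np.mean((1 - p_fake) * d_w * z)
    g_b += lr * np.mean((1 - p_fake) * d_w)

# After training, generated samples should have drifted toward the real
# data's mean of 4.0, purely from the discriminator's feedback.
samples = g_w * rng.normal(size=(1000, 1)) + g_b
```

The key dynamic is the same one the researchers exploit at scale: the generator never sees the real data directly — it only learns from the gradient of the discriminator’s judgment, which is why adding exemplar images to that judgment lets the network match a specific person’s eyes rather than generic ones.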
