By Shira Ovide
It’s easy to imagine that powerful technology companies are all-knowing geniuses. Often, though, technology superpowers are merely a collection of mortals making best guesses in response to external threats or backlash. And when that doesn’t work, they try something else.
That’s a useful framing to understand the shake-up Facebook, Inc. disclosed last week. The company is trying to change behavior so people engage in less aimless scrolling through their streams of posts and more meaningful interaction. Only time will tell whether Facebook’s changes will achieve the company’s goals and whether its goals are worth pursuing.
But it’s already clear that Facebook is rebooting itself because those who run the company are deeply worried. Yes, they’re worried about damaging people who are wallowing in a Facebook cesspool. But they’re also worried about potentially lasting damage to the franchise after more than a year of headlines about Facebook’s use as a tool for uncivil conduct or worse, attempted voter manipulation and entrenching despots.
This all hews to Facebook’s pattern of changing its priorities — and taking its users and partners along for the ride — when it’s under pressure. And then it dumps those priorities when the heat fades, when the changes do more harm than good or when other external threats become more dominant.
If you don’t believe me, take a trip into Facebook history. The company started out as a digital meeting place for friends, family and acquaintances, but that began to change a few years ago in response to the threat from Twitter, Inc. I know, that seems hard to imagine.
But Twitter was scary because everything there could be seen by the world in real time. That helped it become the place where politicians posted videos announcing their candidacies, where fans trash-talked a football team live during the Super Bowl and where people got an up-close look at world events such as the Arab Spring protests.
So a few years ago, Facebook rewrote its own rule book to amp up the volume of these in-the-moment activities and other “public content” — posts and video from politicians, celebrities, news organizations and other groups. It encouraged public figures to use Facebook, especially for Web videos, and the company’s computer systems prioritized their messages above those of friends and family. Facebook allowed social network posts to be inserted into other Web sites to encourage messages to spread more widely.
Facebook changed into a mix of baby photos, news, gossip and viral videos. Facebook celebrated when the Ice Bucket Challenge, a stunt for charity, spread like wildfire in the summer of 2014. It was a sign that Facebook was no longer completely ceding real-time and public conversation to Twitter.
And then came the backlash. It turned out the Ice Bucket Challenge was inescapable on Facebook, and big news happening at the same time — notably the civil rights protests in Ferguson, Missouri — was far less visible. What did Facebook do? It corrected its priorities.
The company reordered the stream of posts to emphasize “timeliness” — news over more evergreen ice buckets, for example. It started a feature to show “trending news” handpicked by humans rather than computers to ensure that people found important information on Facebook. You can predict what happened next. Backlash again. Facebook took heat in 2016 for reportedly excluding right-leaning news sources from its approved news categories. Facebook was chagrined, and trending news was gutted.
ARE YOU DETECTING A PATTERN?
This short history shows Facebook’s fixes to problems sometimes need fixes of their own. And note that the changes Facebook is making now to double down on “meaningful interactions” are the opposite of the changes it made to counter Twitter’s rising influence. Back then Facebook fixed itself by doubling down on public content. Now the fix is less public content.
I’m not saying Facebook shifts gears repeatedly to be malicious, or even that its continual tinkering is bad. Companies should change priorities quickly when business and social needs dictate. I believe CEO Mark Zuckerberg when he says, as he did last week, that he wants Facebook to be a force for good. Whether he is capable of doing that, or of taking the right steps to do so, is a different matter.
Facebook always has high-minded principles behind the changes it makes, but it’s important to see the full context. Like any organism, Facebook responds to an external stimulus and then responds again — perhaps back in the original direction — to the next one. And it doesn’t necessarily know what it’s doing.
This is what all humans and human-run organizations do, of course. But it’s useful to remember that nothing at Facebook is permanent, not even its principles. That means Zuckerberg’s current clear-eyed worldview and his shake-up of Facebook’s strategy may not stick. — Bloomberg
This column does not necessarily reflect the opinion of Bloomberg LP and its owners.