Facebook's Corporate Reputation is in Free Fall

In August 2018, we wrote that Facebook stood on the edge of a reputational precipice: If the company made the right choices and avoided any more public failures on privacy and data security, it could likely emerge from a three-year storm of scandal with its overall reputation intact. 

But following recent revelations that hundreds of millions of user passwords were accessible—even searchable—to Facebook employees as far back as 2012, we can reveal that Facebook now has one of the lowest scores in our U.S. RepTrak rankings.

The only company ranked lower is the Trump Organization; directly above Facebook sits a tobacco company, part of an industry long cast as a corporate villain in the American imagination.

How did this happen? 

How did the darling of Silicon Valley become less trustworthy in the public eye than a tobacco company? Looking back at the scandals of the past year, a few connecting trends emerge.

The company tried to protect its image instead of its reputation. 

Among the most damaging revelations of the past year was a New York Times investigation detailing the extent to which Facebook used political lobbyists and other tactics to disguise and downplay its role in the manipulation of the 2016 U.S. presidential election. The investigation showed Facebook, through a public relations firm, engaging in smear tactics to silence or discredit its critics.

The article revealed that COO Sheryl Sandberg was angry with then-Chief Security Officer Alex Stamos for investigating the Russia allegations without her permission and discussing them with select board members, and that she wanted references to Russia removed from a 2017 white paper about foreign interference on Facebook.

That doesn’t sound like a company engaging in a good-faith effort to understand how its product was used to manipulate an election. It sounds like a company looking to cover up its own mistakes and avoid accountability.  

This attempt to downplay the seriousness of the election interference backfired; the steady drip of damaging information that followed proved far more devastating than a full and clear accounting from the company itself would have been.

Mark Zuckerberg seems increasingly out of touch with Facebook users.

Last year, we said that Zuckerberg was an asset to Facebook, and that his actions would be crucial to Facebook’s ability to weather the reputational storm. We’re sorry to say the CEO’s behavior hasn’t lived up to our expectations. 

In January 2019, Zuckerberg took to the pages of the Wall Street Journal to answer Facebook’s critics. Rather than own up to Facebook’s many failures, Zuckerberg argued the platform was merely misunderstood. 

Tech journalist Kara Swisher published a response in the New York Times taking Zuckerberg to task for an op-ed that, in her view, elided as much as it admitted. Her rebuttal was a glaring example of how Facebook has also lost the confidence of the press, which once wrote glowing articles about the company's culture and leadership.

In his op-ed, Zuckerberg often argues semantics, asserting that Facebook would never, ever “sell” its users’ data. Rather, he explains, it merely charges companies a fee to use the vast trove of information it has accumulated over the past 15 years to target ads at certain segments of users (a process that most people would call selling).

Zuckerberg also falls back on his oft-repeated anecdote that “people” are always telling him that, if they have to see ads, they want them to be relevant. Ask nearly anyone who is not Zuckerberg, however, and they will tell you how creeped out they are by the same ads following them around the internet, sometimes for things they have never even searched for. The story simply does not stand up to scrutiny, and his repeated use of it to justify Facebook’s massive data collection makes him seem at best disconnected from users and at worst willfully blind.

So, what should Facebook do? We have a few ideas.

Be honest with users about the tradeoffs of using the platform. 

Facebook has a dilemma it can’t easily solve: Its enormous market valuation rests on an extremely lucrative advertising business, which depends on mining the massive amounts of data it collects from its 2.7 billion users worldwide. At the same time, it is precisely that data collection that makes users so nervous.

Instead of relying on meaningless distinctions, as he did in his WSJ piece, Zuckerberg should simply be honest: We make money through advertising and through the use of your data. If Facebook means that much to them, as Zuckerberg believes, many users may simply agree that the tradeoff is worth it. Others may beg off, but they will do so feeling that Facebook is at least being forthright. 

Rather than hiding privacy settings in a sub-sub-sub-menu, Facebook should make it easy for users to opt out of much of this data collection, especially of their personal information. And the platform should be up-front about how that data is used. 

Zuckerberg talks a lot about his commitment to transparency and choice. He needs to start walking the walk, which may require tough choices for his company. While he may have started Facebook in 2004 because he wanted to “connect people,” the platform’s raison d’être today is much broader—and there’s no point in pretending otherwise. 

He has shown signs this week that he understands Facebook needs to change. At the F8 developer conference on April 30, he declared, with characteristic understatement: "We don't have the best reputation on privacy right now." He presented the conference as part of Facebook's plan to build a "privacy-focused social platform," with a particular focus on private messaging. That seems like a start, but it is far from enough.

Get serious—really, really serious—about security.

If you’re going to be transparent with users about their data and its value, you need to make it your No. 1 priority to keep that data safe. 

Whatever processes are currently in place at Facebook, they aren’t good enough. That’s clear from the steady stream of failures we’ve seen over the past two years. Facebook needs to perform a serious internal audit (or, even better, bring in someone from the outside to do it) and it needs to publish the findings of that audit as well as a report on how it intends to safeguard user privacy going forward. 

It must be brutally honest about its own shortcomings and willing to invest serious money in fixing the problem. If it doesn't, regulators will do it instead. Already, Democratic presidential candidate Elizabeth Warren is talking about breaking up Facebook. (In another blunder, Facebook set off a fresh firestorm by taking down one of Warren’s Facebook ads...about breaking up Facebook, making her argument for her.)

This reputation issue will not be resolved until Facebook has gone at least a year—heck, even a month or two—without some new revelation about its security practices. 

Invest big-time in content moderators and technology.

One of the most damaging scandals of the past year was the way Facebook, along with Twitter and YouTube, has been used by foreign governments and fringe movements to incite violence and spread hate. Whether the targets were the Rohingya in Myanmar, dissidents in Iran, or vulnerable voters in the lead-up to the 2016 election, again and again bad actors were able to use Facebook’s tools for nefarious ends, sometimes with deadly consequences.

In response, Facebook has expressed regret while projecting a general sense of powerlessness, paying only lip service to quelling this tidal wave of propaganda.

That can no longer continue. Facebook is one of the wealthiest companies in the world. It absolutely has the resources to seriously crack down on the misinformation and hate speech that proliferates on its platform. 

Now, following the March 2019 shooting at the mosque in New Zealand, which the white nationalist shooter streamed on Facebook Live, Facebook can no longer afford to throw up its hands about what gets published on the platform. It has to take responsibility—and be willing to put a lot of money into fixing the problem. This will probably mean investing in humans, rather than algorithms, to police the vast amount of content published on the site every day. 

In 1982, Johnson & Johnson spent more than $100 million, a huge sum at the time, to pull every bottle of Tylenol from U.S. shelves in the wake of a string of poisonings. Facebook must show the same resolve when it comes to toxic ideologies that spread hate and violence.

What’s Next for Facebook?

Despite its shortcomings, billions of people worldwide still use Facebook and love it for the way it allows them to keep up with long-lost friends and family members. 

However, that love will be tested if Facebook does not address its many internal problems. People will become increasingly wary of being associated with a company they perceive as profiting off their personal data, lying to their faces about it, and contributing—however unintentionally—to the decline of democracy and the rise of extremist movements. 

Facebook rocketed to success by following the dictum “Move fast and break things.” For the company to continue to thrive, it must now work to fix what, in its haste, it has broken. And that includes its own reputation.

Learn more in our guide about reputation and how trust has declined in recent years.

Download Guide

Stephen Hahn-Griffiths
 Executive Partner, Chief Reputation Officer
 Reputation Institute
 shahn@reputationinstitute.com
 @shahngriff
