
Rethinking Facebook: Why We Need To Ensure ‘Good For The World’ Is More Important Than ‘Good For Facebook’


(Hypebot) — Frances Haugen’s recent bombshell testimony revealed that Facebook knows exactly what it’s doing, and that the company needs to stop acting in ways that put its own growth above everything else.

Op-ed by Mike Masnick of Techdirt

I’m sure by now most of you have either seen or read about Facebook whistleblower Frances Haugen’s appearance on 60 Minutes discussing in detail the many problems she saw within Facebook. I’m always a little skeptical about 60 Minutes these days, as the show has an unfortunately long history of misrepresenting things about the internet, and similarly a single person’s claims about what’s happening within a company are not always the most accurate. That said, what Haugen does have to say is still kind of eye-opening, and certainly concerning.

The key takeaway that many seem to be highlighting from the interview is Haugen noting that Facebook knows damn well that making the site better for users will make Facebook less money.

Frances Haugen: And one of the consequences of how Facebook is picking out that content today is it is — optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions.

Scott Pelley: Misinformation, angry content– is enticing to people and keep–

Frances Haugen: Very enticing.

Scott Pelley: –keeps them on the platform.

Frances Haugen: Yes. Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.

Of course, none of this should be surprising to anyone. Mark Zuckerberg himself said as much in an internal email that was revealed a few years ago, in which he noted in response to a suggestion to make Facebook better: “that may be good for the world but it’s not good for us.”

Over the last few years that line has stuck with me, and I’ve had a few conversations trying to think through what it actually means. There is one argument, which partly makes sense to me, that much of this falls back on the problem being Wall Street and the (false) idea that a company’s fiduciary duty is solely to its shareholders above all else. This kind of thinking has certainly damned many companies: they become so focused on hitting their quarterly numbers that it becomes impossible to focus on long-term sustainability, and on how, in the long run, being “good for the world” should also be “good for the company.” So many companies have been destroyed by the need to keep Wall Street happy.


And, of course, it’s tempting to blame Wall Street. And we’ve certainly seen it happen in other situations. The fear of “missing our numbers” drives so many stupid decisions in corporate America. I’m still nervous about how Wall St. is pressuring Twitter to make some questionable decisions. However, blaming Wall Street conveniently lets Facebook off the hook, and that would also be wrong. As Haugen admits in the interview, she’s worked at other internet companies that weren’t like that.

I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before.


So what is it about Facebook that leads them to believe that, when given the choice between “good for the world” and “good for Facebook,” it must lean in on “good for Facebook” at the cost of the world? That aspect has been less explored, and unfortunately Haugen’s revelations don’t tell us that much about why Facebook is so uniquely bad at this. I think some of it may be tied to what I wrote last week: Facebook’s internal hubris about what the company can and cannot accomplish — including a belief that maybe it can walk the fine line between pleasing Wall Street and beating its numbers… and not supporting genocide in Myanmar.

Part of me, though, wonders if the problem is not just the drive to meet Wall St.’s numbers, but that Zuckerberg, senior management, and (perhaps more importantly) Facebook’s Board actually believe that short-term fiduciary duty to shareholders really is more important than being good for the world. Facebook’s Board isn’t exactly stocked with people you’d expect to step up and argue that “doing the right thing for society” outweighs keeping Wall Street happy. And that’s particularly disappointing given that Zuckerberg doesn’t need to keep Wall Street happy. The corporate structure of Facebook allows him to do basically what he wants (within certain limits) and still retain pretty much full control. He could come out and say that Facebook is going to stop worrying about its growth and focus on being a better steward. But he doesn’t seem interested in doing so.

This is obviously armchair psychologizing someone I do not know, but one of the most interesting traits that I’ve observed about Zuckerberg is that — more than just about any other CEO since Andy Grove — he truly seems to have internalized Grove’s mantra that “only the paranoid survive.” I’ve talked before about how Facebook really seems to have completely bought into the idea of the Innovator’s Dilemma, and how competition can come from unexpected places and completely overwhelm incumbents before they even realize it. That very clearly explains why Facebook seemed to “overpay” for Instagram and WhatsApp (and then desperately tried to buy out Snapchat, TikTok and others).

But that same thinking might easily apply to some of its other decisions as well, including a belief that if you’re not growing, you’re dying. And, as the NY Times notes, some of the recently leaked documents show real cracks in Facebook’s monolithic facade:

But if these leaked documents proved anything, it is how un-Godzilla-like Facebook feels. The documents, shared with The Journal by Frances Haugen, a former Facebook product manager, reveal a company worried that it is losing power and influence, not gaining it, with its own research showing that many of its products aren’t thriving organically. Instead, it is going to increasingly extreme lengths to improve its toxic image, and to stop users from abandoning its apps in favor of more compelling alternatives.

You can see this vulnerability on display in an installment of The Journal’s series that landed last week. The article, which cited internal Facebook research, revealed that the company has been strategizing about how to market itself to children, referring to preteens as a “valuable but untapped audience.” The article contained plenty of fodder for outrage, including a presentation in which Facebook researchers asked if there was “a way to leverage playdates to drive word of hand/growth among kids?”

It’s a crazy-sounding question, but it’s also revealing. Would a confident, thriving social media app need to “leverage playdates,” or concoct elaborate growth strategies aimed at 10-year-olds? If Facebook is so unstoppable, would it really be promoting itself to tweens as — and please read this in the voice of the Steve Buscemi “How do you do, fellow kids?” meme — a “Life Coach for Adulting?”

So if you’ve been brought up to believe with every ounce of your mind and soul that growth is everything, and that the second you take your eye off the ball it will stop, decisions that are “good for Facebook, but bad for the world” become the norm. Going back to my post on the hubris of Facebook, it also feels like Mark thinks that once Facebook passes some imaginary boundary, then they can go back and fix the parts of the world they screwed up. It doesn’t work like that, though.

And that’s a problem.

So what can be done about that? At an absolute first pass, it would be nice if Mark Zuckerberg realized that he can make some decisions that are “good for the world, but bad for Facebook,” and did so publicly and transparently, clearly explaining why he knows this will hurt Facebook’s growth or bottom line but is still the right thing to do. To some small extent he tried to do something like that with the Oversight Board, but it was a half measure, with limited power. But it was something. He needs to be willing to step up and do more things like that, and if Wall Street doesn’t like it, he should just say he doesn’t care, this is too important. Other CEOs have done this. Hell, Jeff Bezos spent the first decade or so of Amazon’s life as a public company constantly telling Wall Street that’s how things were going to work (people now forget just how much Wall Street hated Amazon, and just how frequently Bezos told them he didn’t care, he was going to build a better customer experience). Google (perhaps somewhat infamously) launched its IPO with a giant middle finger to Wall Street, noting that it wasn’t going to play the bankers’ games (though… that promise has mostly disappeared from Google, along with the founders).

Of course, even if he did, most people would dismiss whatever Zuckerberg decides to do as a cynical nothingburger. And they should. He hasn’t done nearly enough to build up public trust on this. But if he can actually follow through and do the right thing over and over again, especially when it’s bad for Facebook, that would at least start things moving in the right direction.


There are plenty of other ideas on how to make Facebook better — and Haugen actually has some pretty good suggestions herself, first noting that the tools most people reach for won’t work:

While some have called for Facebook to be broken up or stripped of content liability protections, she disagrees. Neither approach would address the problems uncovered in the documents, she said—that despite numerous initiatives, Facebook didn’t address or make public what it knew about its platforms’ ill effects.

That is, breaking up the company won’t make a difference for reasons we’ve discussed before, and taking away Section 230 will only give Facebook way more power — since smaller companies will be wiped out by the lack of liability protections.

Instead, Haugen notes, there needs to be way more transparency about how Facebook is doing what it’s doing:

In Ms. Haugen’s view, allowing outsiders to see the company’s research and operations is essential. She also argues for a radical simplification of Facebook’s systems and for limits on promoting content based on levels of engagement, a core feature of Facebook’s recommendation systems. The company’s own research has found that “misinformation, toxicity, and violent content are inordinately prevalent” in material reshared by users and promoted by the company’s own mechanics.

Tragically, Facebook has been going in the other direction and trying to make it harder for researchers to understand what’s going on there and study the impact.

I think there are some other structural changes that would also have some impact (a bunch of which I’ll lay out in an upcoming paper), but getting Zuckerberg, the Board, and the senior management team to be okay with focusing on something other than short-term growth would be a huge step forward. Years back I noted that human beings have an unfortunate habit of optimizing for what we can measure and downplaying what we can’t. Engagement. Revenue. Daily active users. These are all measurable. What’s good for humanity is not measurable. It’s easy to prioritize one over the other — and somehow that needs to change.

There have been a few external steps in that direction. The Long Term Stock Exchange is an interesting experiment in getting companies past the “meet the quarterly numbers” mindset, and two big tech companies recently listed there — including Asana, which was founded by Zuckerberg’s co-founder and former right-hand man, Dustin Moskovitz. That’s not a solution in and of itself, but it does show a direction in which we can look for solutions that might get past the constant focus on growth at the expense of everything else.

In the end, there are many complicating factors, but as noted earlier, Facebook seems pretty extreme in its unwillingness to actually confront many of these issues. Some of that, no doubt, is that many people are complaining about things that are unfixable, or are blaming Facebook for things totally outside of its control. But there are many things that Facebook does control and could do a much better job of dealing with. Yet, Facebook to date has failed to make it clear on a companywide basis that “good for the world, but bad for Facebook” is actually okay, and maybe it should be the focus for a while.


This is not to say that there aren’t people within the company who are working on doing such things — because there clearly are. The problem is that when a big issue needs a decision from the top, the end result is always to choose what’s good for Facebook over what’s good for the world. And however that can change, it needs to change. And that’s really up to one person.
