Facebook and Its Secret Policies

The Wall Street Journal reported this week that Facebook had studied for two years whether its social network makes people more polarized.

Researchers concluded that it does, and recommended changes to the company’s computerized systems to steer people away from vilifying one another. But the Journal reports that the company’s top executives declined to implement most of the proposed changes.

Fostering open dialogue among people with different viewpoints isn’t easy, and I don’t know if Facebook was right in shelving ideas like creating separate online huddles for parents arguing about vaccinations. But I do want to talk over two nagging questions sparked by this article and others:

  • Are politics rather than principles driving Facebook’s decisions?

  • Facebook said it didn’t want to make important policy decisions on its own. Then why did it make these important policy decisions alone and in secret?

These questions are important because Facebook is not an ordinary company. Whether we are aware of it or not, the ways the company designs its online hangouts shape how we behave and what we believe.

In an extreme example, Facebook has acknowledged that it failed to prevent its social network from being used to incite a genocide in Myanmar. That’s why it’s crucial that Facebook make good policy choices in fair-minded and transparent ways.

On the first question, The Wall Street Journal reported that Facebook decided not to make most of the suggested changes aimed at reducing the spread of divisive content, in part because more material from the right than the left would have been affected and the company was worried about triggering claims (again) that it was biased against conservatives.

If Facebook made these decisions on the merits, that would be one thing. But if Facebook picked its paths based on which political actors would get angry, that should make people of all political beliefs cringe.

This is not a partisan point. Facebook is obsessed with staying in the good graces of people with power, period. This is natural, to a point. (President Trump’s anger at Twitter for adding fact-checking notices to two of his tweets this week shows there are consequences to such decisions.) But there should be a line between understanding the political reality and letting politics dictate what happens on your site.

People at Facebook say the company doesn’t bend to politics. And in a blog post on Wednesday, Facebook detailed investments and changes that it said were intended to reduce polarization.

It also unnerves me that multiyear research into Facebook’s impact on the world stayed entirely inside its walls. What happens at Facebook is too important to stay secret.

Five years ago, Facebook’s chief executive, Mark Zuckerberg, touted research by the company’s data scientists that found the social network doesn’t worsen the problem of “filter bubbles,” in which people see only information that confirms their beliefs. Because that research was made public, people could evaluate the data and discuss an important question affecting much of our society.

Keeping the new research secret is the opposite of Facebook’s stance that it wants input from lawmakers and the public about important topics like what speech is harmful and how to prevent cyberattacks. It also runs counter to Facebook’s efforts to work with independent fact checkers and create a quasi-independent board to adjudicate disputes over posts that violate the company’s rules.

“People shouldn’t have to rely on individual companies addressing these issues by themselves,” Zuckerberg wrote in an opinion piece for The Washington Post last year. “We should have a broader debate about what we want as a society and how regulation can help.”

It’s a good principle — but not if Facebook believes it only when it’s convenient.


The world of streaming video has inherited many of the entertainment industry’s money feuds and other battles. That is one reason it’s confusing to find something you want to watch.

Please admire this flow chart created by Edmund Lee, my colleague who writes about how the entertainment industry is changing. It shows how complex it can be for people to buy HBO Max, a Netflix-like video service from AT&T, the company that now owns HBO and the movie studio Warner Brothers.

If you’re not one of the tens of millions of people who already subscribe to TV or online versions of HBO, you can buy HBO Max directly if you like. (Although you might not be able to watch it on your TV set.) If you subscribe to a version of HBO already, consult Ed’s chart.

Why are there four versions of HBO? Why do you need to consult a flow chart to watch TV?

Well, it’s messy for companies to transition from the rabbit-ears era of home entertainment to the Netflix age. Some of this is natural in any industry evolution. But the HBO tangle also shows that we are at the mercy of companies fighting over the reordering of entertainment.

For the first time, companies like Disney and HBO’s parent company AT&T are trying to control everything from words on a script page to the pixels on a TV screen. It’s as if Ford made all the parts that go into its cars, assembled them in its factories and sold vehicles from its showrooms. (For non-car people, that’s not how it works.)

This has set off fights. Cable companies that have been the primary gateway to our living rooms don’t want to give an inch to companies like AT&T, which wants you to watch “Friends” (a show it owns) on digital spots that it alone controls.

“Friends” and “Star Wars” will increasingly be locked behind their corporate bosses’ digital walls. If you want to see both, you’ll probably have to figure out who owns them and pay two different companies for the privilege of watching. Isn’t TV fun and easy?!

  • That thing about policy having political consequences: In what would most likely be an unenforceable set of actions, the Trump administration is preparing an executive order that in part would make it easier for regulators to punish Facebook, Twitter and other internet companies for enforcing their policies on harmful posts. Please don’t let this distract us from consequential topics like how dangerous conspiracies spread online.

  • The key to internet success is making the stars $$$: My colleague Taylor Lorenz writes about the changes Facebook’s Instagram is making to let popular personalities make money more easily on the app, including allowing them to sell merchandise and share in the money Instagram generates from some video commercials. This hews closer to YouTube’s revenue options, which keep the stars (and their fans) glued to the site.

  • The retail king backs secondhand clothing: Walmart said it’s teaming up with a clothing resale website to let people buy previously owned clothing and accessories. Secondhand clothing websites have caught on recently among fashion lovers and those trying to buy less new stuff for environmental reasons. Walmart’s support could make this trend much bigger.

This is an incredibly elaborate Rube Goldberg contraption for shooting a basketball.


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at [email protected].