No, Facebook Does Not Reflect Reality

This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.

Mark Zuckerberg is the world’s most powerful unelected person, and it drives me bonkers when he misrepresents what’s happening on Facebook.

In an interview that aired on Tuesday, Zuckerberg was asked big and thorny questions about his company: Why are people sometimes cruel to one another on Facebook, and why do inflammatory, partisan posts get so much attention?

Zuckerberg told “Axios on HBO” that Americans are angry and divided right now, and that’s why they act that way on Facebook, too.

Zuckerberg and other Facebook executives consistently say that Facebook is a mirror on society. An online gathering that gives a personal printing press to billions of people will inevitably have all the good and the bad of those people. (My colleague Mike Isaac has talked about this view before.)

It’s true but also comically incomplete to say that Facebook reflects reality. Instead, Facebook presents reality filtered through its own prism, and this affects what people think and do.

Facebook regularly rewrites its computer systems to meet the company’s goals; the company might make it more likely that you’ll see a friend’s baby photo than a news article about wildfires. That doesn’t mean that wildfires aren’t real, but it does mean that Facebook is creating a world where the fires are not in the forefront.

Facebook’s ability to shape, not merely reflect, people’s preferences and behavior is also how the company makes money. The company might suggest to a video game developer that tweaking its social media ads — changing the pitch language or tailoring the ad differently for Midwestern college students than for 40-somethings on the West Coast — can help it sell more app downloads.

Facebook sells billions of dollars in ads each year because what people see there, and how Facebook chooses to prioritize that information, can influence what people believe and buy.

Facebook knows it has the power to shape what we believe and how we act. That’s why it has restricted false information about the coronavirus, and why it doesn’t allow people to bully one another online.

Further proof: An internal team of researchers at Facebook concluded that the social network made people more polarized, The Wall Street Journal reported in May. American society is deeply divided, but Facebook contributes to this, too.

So why does Zuckerberg keep saying that Facebook is a mirror of society? Maybe it’s a handy media talking point that is intentionally uncomplicated.

There are no easy fixes to make Facebook or much of the world less polarized and divided, but it’s dishonest for Zuckerberg to say his company is a bystander rather than a participant in what billions of people on its site believe and how they behave.

Zuckerberg knows — as we all do — the power that Facebook has to remake reality.


Your Lead

A reader from El Dorado Hills, California, emailed a follow-up question to last week’s newsletter about Utah’s flawed, but still promising, virus-alert app. Why does any health authority need to persuade us to download another app, when our phones already follow our movements and could be redeployed to figure out whom we might have exposed to the coronavirus?

Yup, fair question. First, I would say that it’s not great for a zillion apps to already collect information about where we go and what we do. But it’s true that one flaw of many coronavirus-tracing apps around the world is that people have to be persuaded to download yet another app, and trust what it does.

Google and Apple are working together on technology that would make it easier for states to notify people who may have been exposed to the coronavirus by detecting phones that come close to one another. With this technology, the states would not necessarily have to create separate health apps.

People still need to trust this virus-alert technology and give it permission to log which phones have come near theirs. Trust in both technology companies and public health authorities has been sorely lacking in this pandemic.

Google and Apple’s technology is also still in development, and some elected officials and public health authorities in the United States and other countries decided they needed to create their own apps to give people more information about the coronavirus or to help track possible exposures. It’s a good bet that some of those states and countries will eventually fold Google and Apple’s virus-alert system into their own apps.

Public health experts have said this kind of virus exposure notification technology will be useful for as long as we’re battling the coronavirus. And most people who have followed Google and Apple’s work have said the companies are (mostly) doing the right things to listen to health authorities and also protect people’s privacy.

This virus-alert technology will be flawed, possibly creepy and not a silver bullet, but we need it.


  • Online school stinks. So does in-person school. Crashing websites, cyberattacks and a tangle of technology complicated the early days of the virtual school year for many American schoolchildren, my colleagues Dan Levin and Kate Taylor wrote. The online learning problems were a symptom of a lack of guidance from state and federal education officials, one expert told them.

    And at colleges that opted to reopen for classes in person, my colleague Natasha Singer reported that administrators have sometimes failed to help or effectively isolate students infected with or exposed to the coronavirus.

  • Don’t buy a new phone expecting it to be magically faster: The next generation of wireless technology promises to make our phones zippier and connect our cars and factory equipment to the internet more easily. But right now, the claims about 5G wireless are a lot of hot air. A Washington Post columnist found that smartphones connected to 5G networks surfed the internet at roughly the same speeds as phones on older networks, and sometimes even slower.

  • I’m sorry. It’s pointless to make your canned beans look beautiful. If you’ve been on Instagram, you’ve seen that aesthetic of hyper-organized and color-coded food pantries, closets and sock drawers. Go read this New York Times Magazine article about the two people most responsible for this look, and how they reflect an online subculture that fetishizes control over some aspects of life, like stylish junk drawers, while also reveling in imperfection.

Gus the hamster is going on a JOURNEY.


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at [email protected].

If you don’t already get this newsletter in your inbox, please sign up here.
