Russian trolls are back. Here's what you need to know

As CNN’s Donie O’Sullivan reports: The disrupted operation used fake personas with realistic-looking, computer-generated photos of people; a network of Facebook accounts and pages that had only a small amount of engagement and influence when it was taken down; and a website set up to look and operate like a left-wing news outlet.

The What Matters newsletter posed a few key questions to Donie, who covers the intersection of disinformation, politics and technology, about what this really means.

What Matters: Can you explain why this announcement from Facebook is a big deal and what it means as we head into the final months of the presidential election?

DO: There’s good news and bad news here. The good news is this particular operation seemed to be in its infancy and had a small following. It was just trying to get off the ground. The bad news is there’s probably a lot more of this happening that hasn’t been detected. And even though this operation was small, the people behind it still managed to convince real, unwitting freelance writers, including Americans, to write for them.

Is Facebook doing better?

What Matters: You report in the story that these trolls had far more luck gaining followers and engagement in 2016. Is that because Facebook is getting better at spotting misinformation campaigns?

DO: Facebook and the US intel community were caught totally off guard in 2016. There were some fake Russian Black Lives Matter Facebook pages, for instance, that had more than 300,000 followers. So Facebook and the US government have more resources dedicated to rooting out this stuff now, and I think that means it would be a lot harder for a covert operation to gain such a huge following.

But, no doubt, the trolls are evolving. The campaign that was uncovered Tuesday showed signs of greater sophistication than some previous attempts at covering its tracks and concealing who was behind it. We also saw them use “deepfake” computer-generated images as profile photos on fake accounts: literally pictures of people who do not exist.

Before, a good way to tell whether an account was fake was to check if it was using a photo stolen from somewhere else. Now there are tools freely available online to create thousands of faces of fake people, and they can look very real.

Sharing information

What Matters: Facebook actually shared its findings with Twitter. How rare is that?

DO: The companies say they are sharing information with each other, and a few weeks ago companies including Google, Facebook, Twitter and Reddit met with officials from the FBI, DHS and DNI (one company that wasn’t there was TikTok!). So there is collaboration happening, but I know from speaking to people at the companies that there are tensions, too. The companies have very different approaches to how they handle misinformation, even when it comes from the President.

Domestic information matters, too

What Matters: Anything else people should know about this?

DO: Foreign interference is obviously a huge story, but I think how this year’s election plays out online (and we are all online a lot more right now) will be characterized by misinformation from domestic actors and what tech companies do or don’t do about it. … Look at how many deceptive and misleading videos top Republicans shared over two days earlier this week!

source: cnn.com