The big tech giants at the gates of media

A senior executive at Facebook created something of a shitstorm after tweeting a series of comments about the Russia investigation.

The U.S. Justice Department indicted 13 Russian nationals for their alleged involvement in Russian meddling in the 2016 election. Rob Goldman, Facebook’s VP of advertising, tweeted to say that he was not only excited by the indictment, but that it showed “very definitively that swaying the election was not the main goal.” In doing so, Goldman unravelled months of policy work that Facebook had been doing behind the scenes to stay above partisan politics. It got worse: Trump retweeted him.

Cue YouTube, which was caught displaying ads by anonymous attackers that leeched your CPU power and electricity while they covertly mined cryptocurrency — and while you watched cat videos.

People were only alerted when their antivirus programs detected cryptocurrency mining code. Most cases used a crypto mining service called Coinhive, which is controversial because it allows users to profit while surreptitiously using other people’s computers. The scripts use 80% of your CPU, which explains why your laptop is now a beachball that sounds like an 18-wheeler truck.
Ars Technica

Zuckerberg is on a roll. In the third major change this year (and we’ve only just started!), he announced that Facebook will start showing more stories from “your local town or city.”

At the start of the year, it was about friends and family, and less news (because it’s yucky to police). And then it was a two-question survey to find out which news sources people trust. And this week, it’s all about local news — whatever that means. This time, he makes a key assumption: that people reading local news care about civic engagement. This may work well in markets that have a diversity of local, city- or district-level media, but that’s rare in Asia. It also assumes that local news is less prone to fakeness. Right. Prediction: you’re going to see a pivot-to-local — publishers will start creating local content to ride the change in the algo.

Facebook, in its ongoing retreat from the publisher space, is now asking its users to identify what they consider trustworthy media sources.

It doesn't want to be the one deciding what people should read (I don't want it doing that either), so it's asking people to tell it. What could possibly go wrong? Let’s reframe this: after deciding that people were susceptible to fake news and misinformation, Facebook now wants those same people to self-declare which sources they trust. I’m not sure how we’re defining trust in media, but I’m absolutely sure it doesn’t translate well around the world.