Thursday, January 31, 2019

Facebook's Catch-22

John Herrman's How Secrecy Fuels Facebook Paranoia takes the long way round to a very simple conclusion. My shorter version of Herrman's conclusion is that, in order to make money, Facebook needs to:
  1. Convince advertisers that it is an effective means of manipulating the behavior of the mass of the population.
  2. Avoid regulation by convincing governments that it is not an effective means of manipulating the behavior of the mass of the population.
The dilemma is even worse because individual politicians and political parties, both big advertisers, are among those Facebook needs to convince of its effectiveness. This Catch-22 is the source of Facebook's continuing PR problems, listed by Ryan Mac. Follow me below the fold for details.

Facebook has tried to finesse the dilemma by lying to both sides:
  • As Max Read describes in How Much of the Internet Is Fake? Turns Out, a Lot of It, Actually, they have lied to advertisers about what their money is buying, exaggerating the effectiveness of their platform:
    Metrics should be the most real thing on the internet: They are countable, trackable, and verifiable, and their existence undergirds the advertising business that drives our biggest social and search platforms. Yet not even Facebook, the world’s greatest data-gathering organization, seems able to produce genuine figures. In October, small advertisers filed suit against the social-media giant, accusing it of covering up, for a year, its significant overstatements of the time users spent watching videos on the platform (by 60 to 80 percent, Facebook says; by 150 to 900 percent, the plaintiffs say). According to an exhaustive list at MarketingLand, over the past two years Facebook has admitted to misreporting the reach of posts on Facebook Pages (in two different ways), the rate at which viewers complete ad videos, the average time spent reading its “Instant Articles,” the amount of referral traffic from Facebook to external websites, the number of views that videos received via Facebook’s mobile site, and the number of video views in Instant Articles.
    Josh Marshall has more on this side of the dilemma in How Facebook Punked and then Gut Punched the News Biz.
  • Roger McNamee's devastating I Mentored Mark Zuckerberg. I Loved Facebook. But I Can't Stay Silent About What's Happening describes how they have lied to governments, minimizing the effectiveness of their platform:
    when confronted with evidence that disinformation and fake news had spread over Facebook and may have influenced a British referendum or an election in the U.S., Facebook followed a playbook it had run since its founding: deny, delay, deflect, dissemble. Facebook only came clean when forced to, and revealed as little information as possible. Then it went to Plan B: apologize, and promise to do better.
    Trevor Timm has more on this side of the dilemma in How Facebook Borrows From the NSA Playbook.
The subtitle of Kieren McCarthy's Mark Zuckerberg did everything in his power to avoid Facebook becoming the next MySpace – but forgot one crucial detail… is No one likes a lying asshole, which pretty much sums it up:
Facebook, its CEO Mark Zuckerberg, and its COO Sheryl Sandberg, and its public relations people, and its engineers have lied. They have lied repeatedly. They have lied exhaustively. They have lied so much they've lost track of their lies, and then lied about them.
By any measure, Facebook as an organization has knowingly, willingly, purposefully, and repeatedly lied. And two reports this week demonstrate that the depth of its lying was even worse than we previously imagined.
The two reports were by Gabriel J.X. Dance, Michael LaForgia and Nicholas Confessore in the New York Times:
  • As Facebook Raised a Privacy Wall, It Carved an Opening for Tech Giants, which McCarthy summarizes thus:
    Facebook cut data-exchange deals with all sorts of companies based on this premise: give them what they want, and in return they would be hauled onto Zuckerberg's internet reservation.

    For example, Yahoo! got real-time feeds of posts by users' friends – reminding us of Cambridge Analytica gathering information on millions of voters via a quiz app, and using it to target them in contentious political campaigns in the US and Europe.

    Microsoft's Bing was able to access the names of nearly all Facebook users’ friends without permission, and Amazon was able to get at friends' names and contact details. Russian search engine Yandex had Facebook account IDs at its fingertips, though it claims it didn't even know this information was available. Facebook at first told the New York Times Yandex wasn't a partner, and then told US Congress it was.
  • Facebook Gave Device Makers Deep Access to Data on Users and Friends, which McCarthy summarizes thus:
    Facebook got in bed with smartphone manufacturers, such as Apple, Amazon, BlackBerry, Microsoft, and Samsung. Facebook secretly gave the device makers access to each phone user's Facebook friends' profiles, when the handheld was linked to its owner's account, bypassing protections.
McCarthy concludes:
Faced with evidence of its data-sharing agreements where – let's not forget this – Facebook provided third parties access to people's personal messages, and more importantly to their contacts lists and friends' feeds, the company claims it broke no promises because it defined the outfits it signed agreements with as "service providers." And so, according to Facebook, it didn't break a pact it has with the US government's trade watchdog, the FTC, not to share private data without permission, and likewise not to break agreements it has with its users.
As for the question of potential abuse of personal data handed to third parties, Facebook amazingly used the same line that it rolled out when it attempted to deflect the Cambridge Analytica scandal: that third parties were beholden to Facebook's rules about using data. But, of course, Facebook doesn't check or audit whether that is the case.
Trying to sell diametrically opposed stories to two different audiences just isn't effective PR. Lying to both sides at the same time just makes it safe to assume that, whatever Facebook is admitting to, the truth is far worse.

Cory Doctorow reports on a wonderful example of Facebook's speaking out of both sides of its mouth:
Though Facebook's lobbying associations spent the whole debate over the EU Copyright Directive arguing (correctly) that algorithmic filters to catch copyright infringement would end up blocking mountains of legitimate speech (while still letting through mountains of infringement), Facebook secretly told the EU Commission that it used filters all the time, had utmost confidence in them, and couldn't see any problems with their use.  
Doctorow's piece is based on Laura Kayali's Inside Facebook’s fight against European regulation, which is a detailed analysis of Facebook's lobbying the EU from documents released following a freedom of information request.

Natasha Lomas' The case against behavioral advertising is stacking up provides another example of Facebook's duplicity, based on research by Alessandro Acquisti, Carnegie Mellon University professor of IT and public policy, which showed that:
behaviourally targeted advertising had increased the publisher’s revenue but only marginally. At the same time they found that marketers were having to pay orders of magnitude more to buy these targeted ads, despite the minuscule additional revenue they generated for the publisher.

“What we found was that, yes, advertising with cookies — so targeted advertising — did increase revenues — but by a tiny amount. Four per cent. In absolute terms the increase in revenues was $0.000008 per advertisement,” Acquisti told the hearing. “Simultaneously we were running a study, as merchants, buying ads with a different degree of targeting. And we found that for the merchants sometimes buying targeted ads over untargeted ads can be 500% times as expensive.”
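Acquisti's figures imply a striking gap between what targeting earns the publisher and what it costs the advertiser. A minimal back-of-the-envelope sketch: the $0.000008-per-ad and 500% figures come from the quote above, but the $2 CPM baseline for untargeted ads is my assumption for illustration, and "500% times as expensive" is read as five times the untargeted price:

```python
# Back-of-the-envelope comparison of publisher gain vs. advertiser cost
# for behaviourally targeted ads, per 1,000 impressions.

baseline_cpm = 2.00                    # ASSUMED untargeted cost per 1,000 impressions
targeted_cpm = baseline_cpm * 5        # reading "500%" as 5x the untargeted price

publisher_gain_per_ad = 0.000008       # extra publisher revenue per ad (from the study)
publisher_gain_per_1000 = publisher_gain_per_ad * 1000   # per 1,000 impressions

advertiser_extra_cost = targeted_cpm - baseline_cpm      # per 1,000 impressions

print(f"Publisher gains ${publisher_gain_per_1000:.3f} per 1,000 impressions")
print(f"Advertiser pays ${advertiser_extra_cost:.2f} extra per 1,000 impressions")
print(f"Advertiser's extra cost is roughly "
      f"{advertiser_extra_cost / publisher_gain_per_1000:.0f}x the publisher's gain")
```

Under these assumptions the advertiser's extra spend exceeds the publisher's extra revenue by about three orders of magnitude; the exact multiple depends on the assumed baseline CPM, but the asymmetry survives any plausible choice.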
The alternative to targeted ads is DuckDuckGo-style contextual ads. Lomas continues:
If Acquisti’s research is to be believed ... there’s little reason to think [contextual] ads would be substantially less effective than the vampiric microtargeted variant that Facebook founder Mark Zuckerberg likes to describe as “relevant”.

The ‘relevant ads’ badge is of course a self-serving concept which Facebook uses to justify creeping on users while also pushing the notion that its people-tracking business inherently generates major extra value for advertisers. But does it really do that? Or are advertisers buying into another puffed up fake?

Facebook isn’t providing access to internal data that could be used to quantify whether its targeted ads are really worth all the extra conjoined cost and risk. While the company’s habit of buying masses of additional data on users, via brokers and other third party sources, makes for a rather strange qualification. Suggesting things aren’t quite what you might imagine behind Zuckerberg’s drawn curtain.
Given Facebook's history of lying about metrics, even if it did produce internal data showing that the targeted ads were worth the cost, why should advertisers believe it? And if the internal data doesn't show that, Facebook can't afford to release it.

Cory Doctorow reports on another of Facebook's unending attempts to prevent outsiders finding out about what it is actually doing as opposed to what it says it is doing:
Propublica is one of many organizations, mainly nonprofits, whose "ad transparency" tools scrape Facebook ads and catalog them, along with the targeting data that exposes who is paying for which messages to be shown to whom.

Facebook previously warned these watchdogs that their tools violated Facebook's terms of service and that their days were numbered, and now Facebook has made technical changes to its site that lock out ad transparency tools operated by Propublica, Mozilla, and Who Targets Me.
Facebook says that its limited database of political ads makes the independent tools redundant, but Facebook's own database contains important omissions, including ads from the NRA, ads targeting Bernie Sanders supporters, and more. Moreover, this database is only available in three of the dozens of countries where Facebook is a significant political player.
So Facebook is lying about its database of political ads, too. ProPublica's full report is here.

It is actually hard to keep up with Facebook. The very same day that Doctorow penned the report above, he followed up with Project Atlas: Facebook has been secretly paying Iphone users to install an all-surveilling "VPN" app. This is an area where Facebook has form:
Facebook previously faced disgrace and crisis when it was revealed that Onavo, a so-called VPN app, was actually grabbing a huge tranche of data from users; Apple subsequently removed Onavo from its app store.
Undeterred, Facebook launched "Project Atlas":
The program recruits users aged 13 to 35, and has been running since 2016. Facebook confirmed that it uses the app to "gather data on usage habits."
How is the app gathering data?
The "Facebook Research" VPN is an app that circumvents Apple's ban on certain kinds of surveillance by cloaking itself as a beta app and distributing through the Applause, Betabound and Utest services, rather than Apple's App Store: users get up to $20/month, plus referral fees, to run the app, which comes with a man-in-the-middle certificate that lets Facebook intercept "private messages in social media apps, chats from in instant messaging apps – including photos/videos sent to others, emails, web searches, web browsing activity, and even ongoing location information by tapping into the feeds of any location tracking apps you may have installed."
Facebook does not distribute the "Research" app through Apple's own beta-test program, choosing instead to launder it through third parties. Facebook is pretty clearly violating Apple's policies in doing this.
So Facebook appears to be lying to the "users aged 13 to 35" by not telling them what data it is collecting, and it was lying to Apple:
Apple says Facebook broke an agreement it made with Apple by publishing a “research” app for iPhone users that allowed the social giant to collect all kinds of personal data about those users, TechCrunch reported Tuesday. The app allowed Facebook to track users’ app history, their private messages, and their location data. Facebook’s research effort reportedly targeted users as young as 13 years old.

As of last summer, apps that collect that kind of data are against Apple’s privacy guidelines. That means Facebook couldn’t make this research app available through the App Store, which would have required Apple approval. 

Instead, Facebook apparently took advantage of Apple’s “Developer Enterprise Program,” which lets approved Apple partners, like Facebook, test and distribute apps specifically for their own employees. In those cases, the employees can use third-party services to download beta versions of apps that aren’t available to the general public.
Not many 13-year-old employees at Facebook, so Apple banned the app:
Apple won’t let Facebook distribute the app anymore ... Apple’s statement also mentions that Facebook’s “certificates” — plural — have been revoked. That implies Facebook cannot distribute other apps to employees through this developer program right now, not just the research app.
As Rob Price reported, actual Facebook employees were dependent upon the certificates too:
Apple revoked the developer certificate that Facebook was using to power the research apps — and, in doing so, crippled Facebook's workforce.

This is because the same certificate that authenticated the research apps was also used in the key internal Facebook apps its tens of thousands of employees use every day.

The move dramatically escalated tensions between Facebook and Apple, and has left Facebook employees unable to communicate with colleagues, access internal information, and even use company transportation.
Affected apps include internal builds of Workplace, Facebook's internal version of Facebook for employee communications; Workplace Chat; Instagram; Messenger; "and other internal apps like Mobile Home and the Ride app."
Facebook still has two years' worth of data to work with. The price was chaos inside the company but hey, check the stock price!

TechCrunch has also discovered that:
Google has been running an app called Screenwise Meter, which bears a strong resemblance to the app distributed by Facebook Research that has now been barred by Apple, TechCrunch has learned.

In its app, Google invites users aged 18 and up (or 13 if part of a family group) to download the app by way of a special code and registration process using an Enterprise Certificate. That’s the same type of policy violation that led Apple to shut down Facebook’s similar Research VPN iOS app, which had the knock-on effect of also disabling usage of Facebook’s legitimate employee-only apps — which run on the same Facebook Enterprise Certificate — and making Facebook look very iffy in the process.
However, at least:
Unlike Facebook, Google is much more upfront about how its research data collection programs work, what’s collected and that it’s directly involved. It also gives users the option of “guest mode” for when they don’t want traffic monitored, or someone younger than 13 is using the device.
I suppose that's something, but if Facebook's app violates Apple's policies, so does Google's.


David. said...

Benedict Carey's This Is Your Brain Off Facebook reports on a major study of the psychological effects of quitting Facebook:

"So what happens if you actually do quit? A new study, the most comprehensive to date, offers a preview.

Expect the consequences to be fairly immediate: More in-person time with friends and family. Less political knowledge, but also less partisan fever. A small bump in one’s daily moods and life satisfaction. And, for the average Facebook user, an extra hour a day of downtime.

The study, by researchers at Stanford University and New York University, helps clarify the ceaseless debate over Facebook’s influence on the behavior, thinking and politics of its active monthly users, who number some 2.3 billion worldwide."

David. said...

Ron Amadeo reports that what is sauce for the goose is sauce for the gander:

"According to a report from The Verge, Apple has shut down Google's internal iOS apps for doing the exact same thing Facebook was doing—distributing enterprise apps outside of the company."

David. said...

Roger McNamee's book, Zucked: Waking Up to the Facebook Catastrophe is well worth reading.

David. said...

In German Regulators Just Outlawed Facebook's Whole Ad Business, Emily Dreyfuss describes how Germany's cartel office cuts through Facebook's obfuscation (my emphasis):

"Facebook’s massively lucrative advertising model relies on tracking its one billion users—as well as the billions on WhatsApp and Instagram—across the web and smartphone apps, collecting data on which sites and apps they visit, where they shop, what they like, and combining all that information into comprehensive user profiles. Facebook has maintained that collecting all this data allows the company to serve ads that are more relevant to users’ interests. Privacy advocates have argued that the company isn’t transparent enough about what data it has and what it does with it. As a result, most people don’t understand the massive trade-off they are making with their information when they sign up for the “free” site.

On Thursday, Germany’s Federal Cartel Office, the country’s antitrust regulator, ruled that Facebook was exploiting consumers by requiring them to agree to this kind of data collection in order to have an account, and has prohibited the practice going forward."

David. said...

The headline says it all in Matthew Hughes' Facebook lets you search for pictures of your female friends, but not your male ones:

"If you’re feeling an overwhelming sense of deja vu, you’re not alone. The predecessor to Facebook was a deeply unsavory site called Facemash that allowed Harvard University students to rate their female colleagues based on perceived physical attractiveness."

David. said...

To no-one's great surprise, in Gambling, porn, and piracy on iOS: Apple’s enterprise certificate woes continue, Samuel Axon reports that Google and Facebook are far from the only abusers of certificates from Apple's Enterprise Developer Program:

"the Reuters report describes the use of enterprise certificates to distribute pirated versions of popular iOS software like Minecraft, Spotify, and Pokémon Go. ... The distributors impersonate legitimate businesses to gain access to Apple’s enterprise certification program and tools. ... Earlier this week, a TechCrunch investigation also discovered a "dozen hardcore-pornography apps and a dozen real-money gambling apps that escaped Apple's oversight."

Signing up developers is the priority for the program, not auditing whether they are abiding by the agreements they signed. And the agreements almost certainly absolve Apple of any responsibility for what is done with the certificates.

David. said...

The UK House of Commons isn't happy with Facebook:

"Facebook deliberately broke privacy and competition law and should urgently be subject to statutory regulation, according to a devastating parliamentary report denouncing the company and its executives as “digital gangsters”.

The final report of the Digital, Culture, Media and Sport select committee’s 18-month investigation into disinformation and fake news accused Facebook of purposefully obstructing its inquiry and failing to tackle attempts by Russia to manipulate elections."

David. said...

The Economist speculates What would happen if Facebook were turned off?:

"Were Mark Zuckerberg to turn off his creation, another, similar platform might be propelled to dominance. But the Facebook era could instead be the product of unique, fleeting historical circumstances. In that case, a sunnier social-network ecology might be achievable—if only the citizens of Facebook could be nudged to seek something better."