Thursday, January 31, 2019

Facebook's Catch-22

John Herrman's How Secrecy Fuels Facebook Paranoia takes the long way round to reach a very simple conclusion. My shorter version of Herrman's conclusion is this: in order to make money, Facebook needs to:
  1. Convince advertisers that it is an effective means of manipulating the behavior of the mass of the population.
  2. Avoid regulation by convincing governments that it is not an effective means of manipulating the behavior of the mass of the population.
The dilemma is even worse because among the advertisers Facebook needs to convince of its effectiveness are individual politicians and political parties, both big spenders. This Catch-22 is the source of Facebook's continuing PR problems, listed by Ryan Mac. Follow me below the fold for details.

Facebook has tried to finesse the dilemma by lying to both sides:
  • As Max Read describes in How Much of the Internet Is Fake? Turns Out, a Lot of It, Actually, they have lied to advertisers about what their money is buying, exaggerating the effectiveness of their platform:
    Metrics should be the most real thing on the internet: They are countable, trackable, and verifiable, and their existence undergirds the advertising business that drives our biggest social and search platforms. Yet not even Facebook, the world’s greatest data–gathering organization, seems able to produce genuine figures. In October, small advertisers filed suit against the social-media giant, accusing it of covering up, for a year, its significant overstatements of the time users spent watching videos on the platform (by 60 to 80 percent, Facebook says; by 150 to 900 percent, the plaintiffs say). According to an exhaustive list at MarketingLand, over the past two years Facebook has admitted to misreporting the reach of posts on Facebook Pages (in two different ways), the rate at which viewers complete ad videos, the average time spent reading its “Instant Articles,” the amount of referral traffic from Facebook to external websites, the number of views that videos received via Facebook’s mobile site, and the number of video views in Instant Articles.
    Josh Marshall has more on this side of the dilemma in How Facebook Punked and then Gut Punched the News Biz.
  • Roger McNamee's devastating I Mentored Mark Zuckerberg. I Loved Facebook. But I Can't Stay Silent About What's Happening describes how they have lied to governments, minimizing the effectiveness of their platform:
    when confronted with evidence that disinformation and fake news had spread over Facebook and may have influenced a British referendum or an election in the U.S., Facebook followed a playbook it had run since its founding: deny, delay, deflect, dissemble. Facebook only came clean when forced to, and revealed as little information as possible. Then it went to Plan B: apologize, and promise to do better.
    Trevor Timm has more on this side of the dilemma in How Facebook Borrows From the NSA Playbook.
The subtitle of Kieren McCarthy's Mark Zuckerberg did everything in his power to avoid Facebook becoming the next MySpace – but forgot one crucial detail… is No one likes a lying asshole, which pretty much sums it up:
Facebook, its CEO Mark Zuckerberg, and its COO Sheryl Sandberg, and its public relations people, and its engineers have lied. They have lied repeatedly. They have lied exhaustively. They have lied so much they've lost track of their lies, and then lied about them.
By any measure, Facebook as an organization has knowingly, willingly, purposefully, and repeatedly lied. And two reports this week demonstrate that the depth of its lying was even worse than we previously imagined.
The two reports were by Gabriel J.X. Dance, Michael LaForgia and Nicholas Confessore in the New York Times:
  • As Facebook Raised a Privacy Wall, It Carved an Opening for Tech Giants, which McCarthy summarizes thus:
    Facebook cut data-exchange deals with all sorts of companies based on this premise: give them what they want, and in return they would be hauled onto Zuckerberg's internet reservation.

    For example, Yahoo! got real-time feeds of posts by users' friends – reminding us of Cambridge Analytica gathering information on millions of voters via a quiz app, and using it to target them in contentious political campaigns in the US and Europe.

    Microsoft's Bing was able to access the names of nearly all Facebook users’ friends without permission, and Amazon was able to get at friends' names and contact details. Russian search engine Yandex had Facebook account IDs at its fingertips, though it claims it didn't even know this information was available. Facebook at first told the New York Times Yandex wasn't a partner, and then told US Congress it was.
  • Facebook Gave Device Makers Deep Access to Data on Users and Friends, which McCarthy summarizes thus:
    Facebook got in bed with smartphone manufacturers, such as Apple, Amazon, BlackBerry, Microsoft, and Samsung. Facebook secretly gave the device makers access to each phone user's Facebook friends' profiles, when the handheld was linked to its owner's account, bypassing protections.
McCarthy concludes:
Faced with evidence of its data-sharing agreements where – let's not forget this – Facebook provided third parties access to people's personal messages, and more importantly to their contacts lists and friends' feeds, the company claims it broke no promises because it defined the outfits it signed agreements with as "service providers." And so, according to Facebook, it didn't break a pact it has with the US government's trade watchdog, the FTC, not to share private data without permission, and likewise not to break agreements it has with its users.
As for the question of potential abuse of personal data handed to third parties, Facebook amazingly used the same line that it rolled out when it attempted to deflect the Cambridge Analytica scandal: that third parties were beholden to Facebook's rules about using data. But, of course, Facebook doesn't check or audit whether that is the case.
Trying to sell diametrically opposed stories to two different audiences just isn't effective PR. Lying to both sides at the same time just makes it safe to assume that, whatever Facebook is admitting to, the truth is far worse.

Cory Doctorow reports on a wonderful example of Facebook's speaking out of both sides of its mouth:
Though Facebook's lobbying associations spent the whole debate over the EU Copyright Directive arguing (correctly) that algorithmic filters to catch copyright infringement would end up blocking mountains of legitimate speech (while still letting through mountains of infringement), Facebook secretly told the EU Commission that it used filters all the time, had utmost confidence in them, and couldn't see any problems with their use.  
Doctorow's piece is based on Laura Kayali's Inside Facebook’s fight against European regulation, a detailed analysis of Facebook's lobbying of the EU, drawn from documents released following a freedom of information request.

Natasha Lomas' The case against behavioral advertising is stacking up provides another example of Facebook's duplicity, based on research by Alessandro Acquisti, Carnegie Mellon University professor of IT and public policy, which showed that:
behaviourally targeted advertising had increased the publisher’s revenue but only marginally. At the same time they found that marketers were having to pay orders of magnitude more to buy these targeted ads, despite the minuscule additional revenue they generated for the publisher.

“What we found was that, yes, advertising with cookies — so targeted advertising — did increase revenues — but by a tiny amount. Four per cent. In absolute terms the increase in revenues was $0.000008 per advertisment,” Acquisti told the hearing. “Simultaneously we were running a study, as merchants, buying ads with a different degree of targeting. And we found that for the merchants sometimes buying targeted ads over untargeted ads can be 500% times as expensive.”
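The mismatch in Acquisti's numbers is worth making concrete. A minimal back-of-the-envelope sketch, using his per-ad publisher figure; the $1 untargeted CPM is a purely illustrative assumption, since the quote gives only the ~500% cost ratio:

```python
# Acquisti's figures: targeting raises publisher revenue ~4%, i.e. about
# $0.000008 per ad, while a targeted ad can cost the advertiser ~5x an
# untargeted one.

def publisher_gain_per_ad(revenue_lift_usd=0.000008):
    """Extra publisher revenue per targeted ad impression (Acquisti's figure)."""
    return revenue_lift_usd

def advertiser_extra_cost_per_ad(untargeted_cpm_usd=1.00, targeting_multiplier=5.0):
    """Extra advertiser cost per impression, assuming a hypothetical $1 CPM
    for untargeted ads and the ~500% premium Acquisti observed."""
    per_impression = untargeted_cpm_usd / 1000.0  # CPM is cost per 1,000 impressions
    return per_impression * (targeting_multiplier - 1.0)

gain = publisher_gain_per_ad()
cost = advertiser_extra_cost_per_ad()
print(f"publisher gains ${gain:.6f}/ad; advertiser pays ${cost:.6f}/ad extra")
print(f"extra advertiser cost is ~{cost / gain:.0f}x the extra publisher revenue")
```

Under these assumptions, the advertiser's extra spend per impression is hundreds of times the extra revenue it generates for the publisher, which is Lomas' point: the gap is going somewhere, and it isn't to the publisher.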
The alternative to targeted ads is DuckDuckGo-style contextual ads. Lomas continues:
If Acquisti’s research is to be believed ... there’s little reason to think [contextual] ads would be substantially less effective than the vampiric microtargeted variant that Facebook founder Mark Zuckerberg likes to describe as “relevant”.

The ‘relevant ads’ badge is of course a self-serving concept which Facebook uses to justify creeping on users while also pushing the notion that its people-tracking business inherently generates major extra value for advertisers. But does it really do that? Or are advertisers buying into another puffed up fake?

Facebook isn’t providing access to internal data that could be used to quantify whether its targeted ads are really worth all the extra conjoined cost and risk. While the company’s habit of buying masses of additional data on users, via brokers and other third party sources, makes for a rather strange qualification. Suggesting things aren’t quite what you might imagine behind Zuckerberg’s drawn curtain.
Given Facebook's history of lying about metrics, even if it did produce internal data showing that the targeted ads were worth the cost, why should advertisers believe it? And if the internal data doesn't show that, Facebook can't afford to release it.

Cory Doctorow reports on another of Facebook's unending attempts to prevent outsiders finding out about what it is actually doing as opposed to what it says it is doing:
Propublica is one of many organizations, mainly nonprofits, whose "ad transparency" tools scrape Facebook ads and catalog them, along with the targeting data that exposes who is paying for which messages to be shown to whom.

Facebook previously warned these watchdogs that their tools violated Facebook's terms of service and that their days were numbered and now Facebook has made technical changes to its site that locks out ad transparency tools operated by Propublica, Mozilla, and Who Targets Me.
Facebook says that its limited database of political ads makes the independent tools redundant, but Facebook's own database contains important omissions, including ads from the NRA, ads targeting Bernie Sanders supporters, and more. Moreover, this database is only available in three of the dozens of countries where Facebook is a significant political player.
So Facebook is lying about its database of political ads, too. ProPublica's full report is here.

It is actually hard to keep up with Facebook. The very same day that Doctorow penned the report above, he followed up with Project Atlas: Facebook has been secretly paying Iphone users to install an all-surveilling "VPN" app. This is an area where Facebook has form:
Facebook previously faced disgrace and crisis when it was revealed that Onavo, a so-called VPN app, was actually grabbing a huge tranche of data from users; Apple subsequently removed Onavo from its app store.
Undeterred, Facebook launched "Project Atlas":
The program recruits users aged 13 to 35, and has been running since 2016. Facebook confirmed that it uses the app to "gather data on usage habits."
How is the app gathering data?
The "Facebook Research" VPN is an app that circumvents Apple's ban on certain kinds of surveillance by cloaking itself as a beta app and distributing through the Applause, Betabound and Utest services, rather than Apple's App Store: users get up to $20/month, plus referral fees, to run the app, which comes with a man-in-the-middle certificate that lets Facebook intercept "private messages in social media apps, chats from in instant messaging apps – including photos/videos sent to others, emails, web searches, web browsing activity, and even ongoing location information by tapping into the feeds of any location tracking apps you may have installed."
Facebook does not distribute the "Research" app through Apple's own beta-test program, choosing instead to launder it through third parties. Facebook is pretty clearly violating Apple's policies in doing this.
So Facebook appears to be lying to the "users aged 13 to 35" by not telling them what data it is collecting, and it was lying to Apple:
Apple says Facebook broke an agreement it made with Apple by publishing a “research” app for iPhone users that allowed the social giant to collect all kinds of personal data about those users, TechCrunch reported Tuesday. The app allowed Facebook to track users’ app history, their private messages, and their location data. Facebook’s research effort reportedly targeted users as young as 13 years old.

As of last summer, apps that collect that kind of data are against Apple’s privacy guidelines. That means Facebook couldn’t make this research app available through the App Store, which would have required Apple approval. 

Instead, Facebook apparently took advantage of Apple’s “Developer Enterprise Program,” which lets approved Apple partners, like Facebook, test and distribute apps specifically for their own employees. In those cases, the employees can use third-party services to download beta versions of apps that aren’t available to the general public.
Not many 13-year-old employees at Facebook, so Apple banned the app:
Apple won’t let Facebook distribute the app anymore ... Apple’s statement also mentions that Facebook’s “certificates” — plural — have been revoked. That implies Facebook cannot distribute other apps to employees through this developer program right now, not just the research app.
As Rob Price reported, actual Facebook employees were dependent upon the certificates too:
Apple revoked the developer certificate that Facebook was using to power the research apps — and, in doing so, crippled Facebook's workforce.

This is because the same certificate that authenticated the research apps was also used in the key internal Facebook apps its tens of thousands of employees use every day.

The move dramatically escalated tensions between Facebook and Apple, and has left Facebook employees unable to communicate with colleagues, access internal information, and even use company transportation.
Affected apps include internal builds of Workplace, Facebook's internal version of Facebook for employee communications; Workplace Chat; Instagram; Messenger; "and other internal apps like Mobile Home and the Ride app."
Facebook still has two years' worth of data to work with. The price was chaos inside the company, but hey, check the stock price!

Techcrunch has also discovered that:
Google has been running an app called Screenwise Meter, which bears a strong resemblance to the app distributed by Facebook Research that has now been barred by Apple, TechCrunch has learned.

In its app, Google invites users aged 18 and up (or 13 if part of a family group) to download the app by way of a special code and registration process using an Enterprise Certificate. That’s the same type of policy violation that led Apple to shut down Facebook’s similar Research VPN iOS app, which had the knock-on effect of also disabling usage of Facebook’s legitimate employee-only apps — which run on the same Facebook Enterprise Certificate — and making Facebook look very iffy in the process.
However, at least:
Unlike Facebook, Google is much more upfront about how its research data collection programs work, what’s collected and that it’s directly involved. It also gives users the option of “guest mode” for when they don’t want traffic monitored, or someone younger than 13 is using the device.
I suppose that's something, but if Facebook's app violates Apple's policies so does Google's.


David. said...

Benedict Carey's This Is Your Brain Off Facebook reports on a major study of the psychological effects of quitting Facebook:

"So what happens if you actually do quit? A new study, the most comprehensive to date, offers a preview.

Expect the consequences to be fairly immediate: More in-person time with friends and family. Less political knowledge, but also less partisan fever. A small bump in one’s daily moods and life satisfaction. And, for the average Facebook user, an extra hour a day of downtime.

The study, by researchers at Stanford University and New York University, helps clarify the ceaseless debate over Facebook’s influence on the behavior, thinking and politics of its active monthly users, who number some 2.3 billion worldwide."

David. said...

Ron Amadeo reports that what is sauce for the goose is sauce for the gander:

"According to a report from The Verge, Apple has shut down Google's internal iOS apps for doing the exact same thing Facebook was doing—distributing enterprise apps outside of the company."

David. said...

Roger McNamee's book, Zucked: Waking Up to the Facebook Catastrophe is well worth reading.

David. said...

In German Regulators Just Outlawed Facebook's Whole Ad Business, Emily Dreyfuss describes how Germany's cartel office cuts through Facebook's obfuscation (my emphasis):

"Facebook’s massively lucrative advertising model relies on tracking its one billion users—as well as the billions on WhatsApp and Instagram—across the web and smartphone apps, collecting data on which sites and apps they visit, where they shop, what they like, and combining all that information into comprehensive user profiles. Facebook has maintained that collecting all this data allows the company to serve ads that are more relevant to users’ interests. Privacy advocates have argued that the company isn’t transparent enough about what data it has and what it does with it. As a result, most people don’t understand the massive trade-off they are making with their information when they sign up for the “free” site.

On Thursday, Germany’s Federal Cartel Office, the country’s antitrust regulator, ruled that Facebook was exploiting consumers by requiring them to agree to this kind of data collection in order to have an account, and has prohibited the practice going forward."

David. said...

The headline says it all in Matthew Hughes' Facebook lets you search for pictures of your female friends, but not your male ones:

"If you’re feeling an overwhelming sense of deja vu, you’re not alone. The predecessor to Facebook was a deeply unsavory site called Facemash that allowed Harvard University students to rate their female colleagues based on perceived physical attractiveness."

David. said...

To no-one's great surprise, in Gambling, porn, and piracy on iOS: Apple’s enterprise certificate woes continue Samuel Axon reports that Google and Facebook are far from the only abusers of certificates from Apple's Enterprise Developer Program:

"the Reuters report describes the use of enterprise certificates to distribute pirated versions of popular iOS software like Minecraft, Spotify, and Pokémon Go. ... The distributors impersonate legitimate businesses to gain access to Apple’s enterprise certification program and tools. ... Earlier this week, a TechCrunch investigation also discovered a "dozen hardcore-pornography apps and a dozen real-money gambling apps that escaped Apple's oversight."

Signing up developers is the priority for the program, not auditing whether they are abiding by the agreements they signed. And the agreements almost certainly absolve Apple of any responsibility for what is done with the certificates.

David. said...

The UK House of Commons isn't happy with Facebook:

"Facebook deliberately broke privacy and competition law and should urgently be subject to statutory regulation, according to a devastating parliamentary report denouncing the company and its executives as “digital gangsters”.

The final report of the Digital, Culture, Media and Sport select committee’s 18-month investigation into disinformation and fake news accused Facebook of purposefully obstructing its inquiry and failing to tackle attempts by Russia to manipulate elections."

David. said...

The Economist speculates What would happen if Facebook were turned off?:

"Were Mark Zuckerberg to turn off his creation, another, similar platform might be propelled to dominance. But the Facebook era could instead be the product of unique, fleeting historical circumstances. In that case, a sunnier social-network ecology might be achievable—if only the citizens of Facebook could be nudged to seek something better."

David. said...

Lauren Finer's Facebook reportedly gets deeply personal info, such as ovulation times and heart rate, from some apps starts:

"Facebook receives highly personal information from apps that track your health and help you find a new home, testing by The Wall Street Journal found. Facebook can receive this data from certain apps even if the user does not have a Facebook account, according to the Journal."

Of course, Facebook had nothing to do with this, it only buried this in its SDK:

"A Facebook spokesperson told CNBC, "Sharing information across apps on your iPhone or Android device is how mobile advertising works and is industry standard practice. The issue is how apps use information for online advertising. We require app developers to be clear with their users about the information they are sharing with us, and we prohibit app developers from sending us sensitive data. We also take steps to detect and remove data that should not be shared with us."

So that's all OK, isn't it.

David. said...

Inevitably, Facebook was lying about who was using their "research" app. Josh Constine reports that Facebook admits 18% of Research spyware users were teens, not <5%:

"In the response from Facebook’s VP of US public policy Kevin Martin, the company admits that (emphasis ours) “At the time we ended the Facebook Research App on Apple’s iOS platform, less than 5 percent of the people sharing data with us through this program were teens. Analysis shows that number is about 18 percent when you look at the complete lifetime of the program, and also add people who had become inactive and uninstalled the app.” So 18 percent of research testers were teens. It was only less than 5 percent when Facebook got caught. Given users age 13 to 35 were eligible for Facebook’s Research program, 13 to 18 year olds made of 22 percent of the age range. That means Facebook clearly wasn’t trying to minimize teen involvement, nor were they just a tiny fraction of users."

No-one should believe a word Facebook says about anything.

David. said...

The Register's Kieren McCarthy has an overview of the myriad lies Facebook told about its "research" app.

David. said...

Revealed: Facebook’s global lobbying against data privacy laws by Carole Cadwalladr and Duncan Campbell is based on documents from the Six4Three case that the UK Parliament seized. It reveals that:

"Facebook has targeted politicians around the world – including the former UK chancellor, George Osborne – promising investments and incentives while seeking to pressure them into lobbying on Facebook’s behalf against data privacy legislation, an explosive new leak of internal Facebook documents has revealed.

The documents, which have been seen by the Observer and Computer Weekly, reveal a secretive global lobbying operation targeting hundreds of legislators and regulators in an attempt to procure influence across the world, including in the UK, US, Canada, India, Vietnam, Argentina, Brazil, Malaysia and all 28 states of the EU."

David. said...

Sue Halpern's Mark Zuckerberg’s Plans to Capitalize on Facebook’s Failures is a suitably skeptical take on Zuckerberg's plan for a privacy-focused (part of) Facebook:

"In Wednesday’s announcement, Zuckerberg, who is famous for shambling apologies, seemed to have finally run out of mea culpas. “I understand that many people don’t think Facebook can or would even want to build this kind of privacy-focused platform,” he wrote, “because frankly we don’t currently have a strong reputation for building privacy protective services.” On that point, at least, he’s right."

David. said...

See also Isobel Asher Hamilton's Mark Zuckerberg's boast about Facebook's data storage was torn apart by human rights groups:

"In his blockbuster blog, Mark Zuckerberg boasted that Facebook has chosen not to store data in countries that "have a track record of violating human rights like privacy or freedom of expression."

But just months ago, Facebook announced plans to open a data center in Singapore, a country with a poor record when it comes to freedom of expression, according to human rights groups."

David. said...

In A New Privacy Constitution for Facebook Bruce Schneier and Adam Shostack lay out what Zuckerberg would need to do to deliver on his "pivot to privacy":

"There is ample reason to question Zuckerberg’s pronouncement: The company has made — and broken — many privacy promises over the years. And if you read his 3,000-word post carefully, Zuckerberg says nothing about changing Facebook’s surveillance capitalism business model. All the post discusses is making private chats more central to the company, which seems to be a play for increased market dominance and to counter the Chinese company WeChat.

In security and privacy, the devil is always in the details — and Zuckerberg’s post provides none. But we’ll take him at his word and try to fill in some of the details here. What follows is a list of changes we should expect if Facebook is serious about changing its business model and improving user privacy."

The first link is to Ryan Nakashima's Promises, promises: Facebook's history with privacy, an 11-year timeline of broken privacy promises from a year ago.

David. said...

Facebook’s Data Deals Are Under Criminal Investigation continues the New York Times' investigation of Facebook's data-sharing deals:

"Federal prosecutors are conducting a criminal investigation into data deals Facebook struck with some of the world’s largest technology companies, intensifying scrutiny of the social media giant’s business practices as it seeks to rebound from a year of scandal and setbacks.

A grand jury in New York has subpoenaed records from at least two prominent makers of smartphones and other devices, according to two people who were familiar with the requests and who insisted on anonymity to discuss confidential legal matters. Both companies had entered into partnerships with Facebook, gaining broad access to the personal information of hundreds of millions of its users."

And also: Facebook has been down for hours, Instagram is back in service.

David. said...

Kieren McCarthy's Facebook blames 'server config change' for 14-hour outage. Someone run that through the universal liar translator is appropriately skeptical of Facebook's sole tweet about their outage:

"The other big question is how a "server configuration change" led to not just Facebook but also its Messenger, WhatsApp, and Instagram services going down. That would strongly suggest that Facebook has either connected them up or attempted to connect them up at a low level, merging them into one broad platform. In January, it emerged that CEO Mark Zuckerberg had ordered that his instant-chat applications and social network be intertwined. And this month, Zuck alluded to this in an otherwise aimless blog post: "With all the ways people also want to interact privately, there's also an opportunity to build a simpler platform that's focused on privacy first."

So, that "server configuration change" may have been more conspiracy than cockup, a move to bring together Facebook's individual components. An effort so large and complex, it resulted in 14 hours of downtime. That may help explain why the biz is being so secretive about the cause of the outage. Bringing together everything under one roof is certainly one way to avoid potential regulatory break-up."

David. said...

Carole Cadwalladr's Facebook faces fresh questions over when it knew of data harvesting looks like yet another case of Facebook's inability to tell the truth:

"Facebook has repeatedly refused to say when its senior executives, including Zuckerberg, learned of how Cambridge Analytica had used harvested data from millions of people across the world to target them with political messages without their consent. But Silicon Valley insiders have told the Observer that Facebook board member Marc Andreessen, the founder of the venture capital firm Andreessen Horowitz and one of the most influential people in Silicon Valley, attended a meeting with [Christopher] Wylie held in Andreessen Horowitz’s office two years before he came forward as a whistleblower."

David. said...

Brian Krebs reports that Facebook Stored Hundreds of Millions of User Passwords in Plain Text for Years. Facebook claims:

"Facebook says an ongoing investigation has so far found no indication that employees have abused access to this data. ... the investigation so far indicates between 200 million and 600 million Facebook users may have had their account passwords stored in plain text and searchable by more than 20,000 Facebook employees. The source said Facebook is still trying to determine how many passwords were exposed and for how long, but so far the inquiry has uncovered archives with plain text user passwords dating back to 2012.

My Facebook insider said access logs showed some 2,000 engineers or developers made approximately nine million internal queries for data elements that contained plain text user passwords."

David. said...

Julia Carrie Wong's Facebook acknowledges concerns over Cambridge Analytica emerged earlier than reported describes yet another example of Facebook's inability to tell the truth:

"Facebook employees were aware of concerns about “improper data-gathering practices” by Cambridge Analytica months before the Guardian first reported, in December 2015, that the political consultancy had obtained data on millions from an academic. The concerns appeared in a court filing by the attorney general for Washington DC and were subsequently confirmed by Facebook.

The new information “could suggest that Facebook has consistently mislead [sic]” British lawmakers “about what it knew and when about Cambridge Analytica”, tweeted Damian Collins, the chair of the House of Commons digital culture media and sport select committee (DCMS) in response to the court filing."

David. said...

Cecilia Kang's The Mounting Federal Investigations Into Facebook lists the FTC, the SEC, the DoJ, the US Attorney for the Eastern District of NY, and HUD as the agencies currently investigating Facebook.

David. said...

All you need to know is in the headline of Kevin Poulsen's ‘Beyond Sketchy’: Facebook Demanding Some New Users’ Email Passwords. Well, why not, if you can get away with it?

David. said...

Rob Price reports on Facebook's use of coin-op media to spread their lies in Facebook is partnering with a big UK newspaper to publish sponsored articles downplaying 'technofears' and praising the company:

"Facebook has found a novel solution to the never-ending deluge of negative headlines and news articles criticizing the company: Simply paying a British newspaper to run laudatory stories about it.

Facebook has partnered with The Daily Telegraph, a broadsheet British newspaper, to run a series of features about the company, Business Insider has found — including stories that defend it on hot-button issues it has been criticised over like terrorist content, online safety, cyberbullying, fake accounts, and hate speech."

Fake news!

David. said...

Jake Kanter's Facebook's activist shareholders are making another dramatic bid to oust Mark Zuckerberg and abolish the firm's share structure explains why:

"Zuckerberg's weighty power is why activist shareholders want to abolish the share structure. At the annual investor meeting, they will have the chance to vote on a proposal, which calls for the introduction of "fair and appropriate mechanisms through which disproportionate rights of Class B shareholders could be eliminated."

It said: "Fake news, election interference, and threats to our democracy -- shareholders need more than deny, deflect, and delay. We urge shareholders to vote FOR a recapitalization plan for all outstanding stock to have one vote per share."

It is not clear which investor has drawn up the proposal, but Facebook again calls for it to be dismissed by shareholders, as they have during the last five annual meetings. "We believe that our capital structure is in the best interests of our stockholders and that our current corporate governance structure is sound and effective," it said."

Yeah, sure.

David. said...

Today's drops in the bucket are Mark Zuckerberg leveraged Facebook user data to fight rivals and help friends, leaked documents show by Olivia Solon and Cyrus Farivar:

"Mark Zuckerberg oversaw plans to consolidate the social network’s power and control competitors by treating its users’ data as a bargaining chip, while publicly proclaiming to be protecting that data, according to about 4,000 pages of leaked company documents largely spanning 2011 to 2015 and obtained by NBC News."

and 15 Months of Fresh Hell Inside Facebook by Nicholas Thompson and Fred Vogelstein:

"If you want to promote trustworthy news for billions of people, you first have to specify what is trustworthy and what is news. Facebook was having a hard time with both. To define trustworthiness, the company was testing how people responded to surveys about their impressions of different publishers. To define news, the engineers pulled a classification system left over from a previous project—one that pegged the category as stories involving “politics, crime, or tragedy.”

That particular choice, which meant the algorithm would be less kind to all kinds of other news—from health and science to technology and sports—wasn’t something Facebook execs discussed with media leaders in Davos. And though it went through reviews with senior managers, not everyone at the company knew about it either."

David. said...

Today's drop in the bucket is Facebook admits harvesting contacts of the 1.5m email passwords it asked for:

"A few weeks ago, we learned that Facebook asked for the personal email passwords of some users logging in. Today, it admits that it used the passwords to harvest 1.5m users' email contacts without consent. Facebook claims that doing this was "unintentional," despite contact harvesting being the plainly obvious purpose of demanding people's email passwords and notifications in Facebook informing users that their contacts were being imported."

As Cory Doctorow writes:

"Facebook's cycle of promises and lies depends upon journalistic objectivity being warped into a perverse assumption of Facebook's good faith. When we fail to report each privacy abuse in the context of all the other ones, we simply fail."

David. said...

Another drop in the bucket for today! msmash at /. reports:

"Last month, Facebook disclosed that hundreds of millions of users on its platform had their account passwords stored in plain text -- in some cases going back to 2012 -- and searchable by thousands of Facebook employees. Today, the company quietly updated that blog post to reveal that Instagram users are also impacted."

What is the "credibility" of which you speak?