Back in February, Stephen Balkam's Guardian article What will happen when the internet of things becomes artificially intelligent? sparked some discussion on Dave Farber's IP list, including this wonderfully apposite Philip K. Dick citation from Ian Stedman via David Pollak. It roused Mike O'Dell to respond with Internet of Obnoxious Things, a really important insight into the fundamental problems underlying the Internet of Things. Just go read it. Mike starts:
The PKDick excerpt cited about a shakedown by a door lock is, I fear, more prescient than it first appears.

I very much doubt that any "Internet of Things" will become Artificially Impudent because long before that happens, all the devices will be co-opted by The Bad Guys who will proceed to pursue shakedowns, extortion, and "protection" rackets on a coherent global scale.

Whether it is even possible to "secure" such a collection of devices empowered with such direct control over physical reality is a profound and, I believe, completely open theoretical question. (We don't even have a strong definition of what that would mean.)

Even if it is theoretically possible, it has been demonstrated in the most compelling possible terms that it will not be done for a host of reasons. The most benign fall under the rubric of "Never ascribe to malice what is adequately explained by stupidity" while others will be aggressively malicious. ...

A close second, however, is a definition of "security" that reads, approximately, "Do what I should have meant." Eg, the rate of technology churn cannot be reduced just because we haven't figured out what we need it to do (or not do) - we'll just "iterate" every time Something Bad(tm) happens.

Charlie goes further, and follows Philip K. Dick more closely, by pointing out that the causes of Something Bad(tm) are not just stupidity and malice, but also greed:
The evil business plan of evil (and misery) posits the existence of smart municipality-provided household recycling bins. ... The bin has a PV powered microcontroller that can talk to a base station in the nearest wifi-enabled street lamp, and thence to the city government's waste department. The householder sorts their waste into the various recycling bins, and when the bins are full they're added to a pickup list for the waste truck on the nearest routing—so that rather than being collected at a set interval, they're only collected when they're full.

Charlie sets out the basic requirements for business models like this:
But that's not all.
Householders are lazy or otherwise noncompliant and sometimes dump stuff in the wrong bin, just as drivers sometimes disobey the speed limit.
The overt value proposition for the municipality (who we are selling these bins and their support infrastructure to) is that the bins can sense the presence of the wrong kind of waste. This increases management costs by requiring hand-sorting, so the individual homeowner can be surcharged (or fined). More reasonably, households can be charged a high annual waste recycling and sorting fee, and given a discount for pre-sorting everything properly, before collection—which they forfeit if they screw up too often.
The covert value proposition ... local town governments are under increasing pressure to cut their operating budgets. But by implementing increasingly elaborate waste-sorting requirements and imposing direct fines on households for non-compliance, they can turn the smart recycling bins into a new revenue enhancement channel, ... Churn the recycling criteria just a little bit and rely on tired and over-engaged citizens to accidentally toss a piece of plastic in the metal bin, or some food waste in the packaging bin: it'll make a fine contribution to your city's revenue!
Some aspects of modern life look like necessary evils at first, until you realize that some asshole has managed to (a) make it compulsory, and (b) use it for rent-seeking. The goal of this business is to identify a niche that is already mandatory, and where a supply chain exists (that is: someone provides goods or service, and as many people as possible have to use them), then figure out a way to colonize it as a monopolistic intermediary with rent-raising power and the force of law behind it.

and goes on to use speed cameras as an example. What he doesn't go into is what the IoT brings to this class of business models: reduced cost of detection, reduced possibility of contest, reduced cost of punishment. A trifecta that means profit! But Charlie brilliantly goes on to incorporate:
the innovative business model that Yves Smith has dubbed "crapification". A business that can reduce customer choice sufficiently then has a profit opportunity; it can make its product so awful that customers will pay for a slightly less awful version.

He suggests:
Sell householders a deluxe bin with multiple compartments and a sorter in the top: they can put their rubbish in, and the bin itself will sort which section it belongs in. Over a year or three the householder will save themselves the price of the deluxe bin in avoided fines—but we don't care, we're not the municipal waste authority, we're the speed camera/radar detector vendor!

Cory Doctorow just weighed in, again, on the looming IoT disaster. This time he points out that although the Roomba's limited on-board intelligence does mean poor obstacle avoidance, solving the problem by equipping them with cameras and an Internet connection to an obstacle-recognition service is an awesomely bad idea:
Roombas are pretty useful devices. I own two of them. They do have real trouble with obstacles, though. Putting a camera on them so that they can use the smarts of the network to navigate our homes and offices is a plausible solution to this problem.

Looking back through the notes on my October post, we see that Google is no longer patching known vulnerabilities in Android before 4.4. There are only about 930 million devices running such software. More details on why nearly a billion users are being left to the mercy of the bad guys are here.
But a camera-equipped networked robot that free-ranges around your home is a fucking disaster if it isn't secure. It's a gift to everyone who wants to use cameras to attack you, from voyeur sextortionist creeps to burglars to foreign spies and dirty cops.
The Internet of Things With Wheels That Kill People has featured extensively. First, Progressive Insurance's gizmo that tracks their customers' driving habits has a few security issues:
"The firmware running on the dongle is minimal and insecure," Thuen told Forbes.

Second, a vulnerability in BMWs, Minis and Rolls-Royces:
"It does no validation or signing of firmware updates, no secure boot, no cellular authentication, no secure communications or encryption, no data execution prevention or attack mitigation technologies ... basically it uses no security technologies whatsoever."
What's the worst that can happen? The device gives access to the CAN bus.
"The CAN bus had been the target of much previous hacking research. The latest dongle similar to the SnapShot device to be hacked was the Zubie device which examined for mechanical problems and allowed drivers to observe and share their habits."
"Argus Cyber Security researchers Ron Ofir and Ofer Kapota went further and gained control of acceleration, braking and steering through an exploit."
"BMW has plugged a hole that could allow remote attackers to open windows and doors for 2.2 million cars."
"Attackers could set up fake wireless networks to intercept and transmit the clear-text data to the cars but could not have impacted vehicle acceleration or braking systems."

What were they thinking?
"BMW's patch also updated its patch distribution system to use HTTPS."
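Part of why access to the CAN bus is so dangerous is that classic CAN frames carry no sender authentication: any node that can put frames on the bus can impersonate any controller, and receivers trust the arbitration ID blindly. A sketch of the frame layout, following the Linux SocketCAN `struct can_frame` (the ID and payload below are hypothetical, not taken from any of the hacks described):

```python
import struct

# Linux SocketCAN frame layout: 32-bit arbitration ID, 8-bit payload length,
# 3 padding bytes, then up to 8 data bytes. Note there is no field for the
# sender's identity and no signature -- the bus is trust-by-default.
CAN_FRAME = struct.Struct("=IB3x8s")

def make_frame(can_id: int, data: bytes) -> bytes:
    """Pack a raw CAN frame; any node on the bus can emit any ID."""
    assert len(data) <= 8
    return CAN_FRAME.pack(can_id, len(data), data.ljust(8, b"\x00"))

# Hypothetical forged frame on an ID some ECU listens to:
frame = make_frame(0x220, b"\x01\x00")
can_id, length, payload = CAN_FRAME.unpack(frame)
assert (can_id, payload[:length]) == (0x220, b"\x01\x00")
```

Once a dongle or telematics unit bridges this bus to the Internet, every frame an attacker can inject looks exactly as legitimate as one from the brake controller.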
Third, Senator Ed Markey has been asking auto makers questions and the answers are not reassuring. No wonder he was asking questions. At an industry-sponsored hackathon last July a 14-year-old with $15 in parts from Radio Shack showed how easy it was:
"Windshield wipers turned on and off. Doors locked and unlocked. The remote start feature engaged. The student even got the car's lights to flash on and off, set to the beat from songs on his iPhone."

Key to an Internet of Things that we could live with is, as Vint Cerf pointed out, a secure firmware update mechanism. The consequences of not having one can be seen in Kaspersky's revelations of the "Equation group" compromising hard drive firmware. Here's an example of how easy it can be. To be fair, Seagate at least has deployed a secure firmware update mechanism, initially to self-encrypting drives but now, I'm told, to all their current drives.
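What "secure firmware update" means can be made concrete. A minimal sketch in Python, using HMAC-SHA256 as a stand-in for the asymmetric signature (RSA, Ed25519) a real vendor would use; the key and image bytes here are hypothetical:

```python
import hashlib
import hmac

# Hypothetical vendor key. On a real device this would be a public key burned
# into boot ROM, and the vendor would sign with the matching private key.
VENDOR_KEY = b"vendor-root-key"

def sign_firmware(image: bytes) -> bytes:
    """What the vendor's build system does: sign a digest of the image."""
    digest = hashlib.sha256(image).digest()
    return hmac.new(VENDOR_KEY, digest, hashlib.sha256).digest()

def verify_and_install(image: bytes, signature: bytes) -> bool:
    """What the device's boot loader must do before flashing anything."""
    digest = hashlib.sha256(image).digest()
    expected = hmac.new(VENDOR_KEY, digest, hashlib.sha256).digest()
    # Constant-time comparison; a naive == leaks timing information.
    return hmac.compare_digest(expected, signature)

firmware = b"\x7fELF...firmware image bytes..."
good_sig = sign_firmware(firmware)
assert verify_and_install(firmware, good_sig)          # legitimate update
assert not verify_and_install(b"malicious", good_sig)  # tampered image rejected
```

The hard part, as the drive-firmware revelations show, is not this logic but deploying it everywhere and protecting the signing key.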
Cooper Quintin at the EFF's DeepLinks blog weighed in with a typically clear overview of the issue entitled Are Your Devices Hardwired For Betrayal?. The three principles:
- Firmware must be properly audited.
- Firmware updates must be signed.
- We need a mechanism for verifying installed firmware.
"None of these things are inherently difficult from a technological standpoint. The hard problems to overcome will be inertia, complacency, politics, incentives, and costs on the part of the hardware companies."

Among the Things in the Internet are computers with vulnerable BIOSes:
"Though there's been long suspicion that spy agencies have exotic means of remotely compromising computer BIOS, these remote exploits were considered rare and difficult to attain.

Legbacore founders Corey Kallenberg and Xeno Kovah's Cansecwest presentation ... automates the process of discovering these vulnerabilities. Kallenberg and Kovah are confident that they can find many more BIOS vulnerabilities; they will also demonstrate many new BIOS attacks that require physical access."

GCHQ has the legal authority to exploit these BIOS vulnerabilities, and any others it can find, against computers, phones and any other Things on the Internet wherever they are. It's likely that most security services have similar authority.
Useful reports appeared, including this two-part report from Xipiter, this from Veracode on insecurities, this from DDOS-protection company Incapsula on the now multiple botnets running on home routers, and this from the SEC Consult Vulnerability Lab about yet another catastrophic vulnerability in home routers. This last report, unlike the industry happy-talk, understands the economics of IoT devices:
"the (consumer) embedded systems industry is always keen on keeping development costs as low as possible and is therefore using vulnerability-ridden code provided by chipset manufacturers (e.g. Realtek CVE-2014-8361 - detailed summary by HP, Broadcom) or outdated versions of included open-source software (e.g. libupnp, MiniUPnPd) in their products."

And just as I was finishing this rant, Ars Technica posted details of yet another botnet running on home routers, this one called Linux/Moose. It collects social network credentials.
That's all until the next rant. Have fun with your Internet-enabled gizmos!
Even Apple can screw up the firmware update process. A newly revealed bug in older Macs related to how they wake up from sleep mode leaves their BIOS-equivalent firmware open to compromise by the bad guys.
Cindy Cohn at EFF has a great post using the analogy of the Tylenol poisoning to show how completely irrelevant to the real problem current discussions of "cybersecurity" are. Go read it! One sentence sums it up:
"We need better incentives for companies who store our data to keep it secure."
Just as with the problems of peer review, the problem with cybersecurity is that the incentives for the actors who could actually fix the problem are to not fix it. Fixing it would cut off information that governments like because it enables them to blackmail people into doing what the government wants, and would cost companies money.
And the beat goes on. A group of Spanish masters students found 60 vulnerabilities in 22 models of home routers, mostly ones provided by Spanish ISPs. The vulnerabilities included hardwired "admin" accounts and cross-site request forgery flaws and the ability for remote attackers to view, modify or delete files from USB storage.
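Cross-site request forgery works against these routers because their admin pages accept any request that arrives carrying the browser's session cookie, whether or not the user intended to send it. The standard defence the flawed firmware omits is a per-session secret token that an attacking page cannot know; a minimal sketch (all names hypothetical):

```python
import hashlib
import hmac
import secrets

# Hypothetical router-side secret, generated fresh at boot.
SESSION_SECRET = secrets.token_bytes(32)

def csrf_token(session_id: str) -> str:
    """Token the router embeds in every admin form it serves."""
    return hmac.new(SESSION_SECRET, session_id.encode(), hashlib.sha256).hexdigest()

def accept_request(session_id: str, submitted_token: str) -> bool:
    """Reject any state-changing request whose token doesn't match the session."""
    return hmac.compare_digest(csrf_token(session_id), submitted_token)

sid = "abc123"
assert accept_request(sid, csrf_token(sid))  # genuine form submission
assert not accept_request(sid, "")           # forged cross-site request
```

A malicious web page can make the victim's browser send the cookie, but it cannot read the token out of the router's pages, so the forged request fails.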
Gotta love The Register. Alexander Martin reports on a Gallic breakthrough in IoT technology.
And also the Internet of Things With Wings That Kill People. Cora Currier at The Intercept reports on a new book by Chris Woods: Sudden Justice: America's Secret Drone Wars:
"Bored drone pilots sometimes smuggled simple computer games onto the drone operating systems — chess, solitaire, Battleship. That stopped in 2011, after a computer virus got into the drones’ operating systems, likely from the games, former pilots told Woods."
Skynet, here we come.
The Economist takes up the issue of The Internet of Things With Wheels That Kill People with a good piece entitled Deus ex vehiculum.
Eugene Kaspersky issues a blunt warning about what he calls the Internet of Threats, and a group of Senators asks the GAO to report on the IoT. Primarily on what the government can do to encourage the business and, incidentally:
"What is the projected impact of ubiquitous IoT on consumer privacy and security?"
In the light of the recent OPM compromises, perhaps they should ask not just about the impact on consumers, but also on Federal privacy and security.
On the subject of the Internet of Things That Kill People:
- Via Ars Technica, Wired reports that Chrysler's Uconnect allows attackers to take total control of a car remotely via the cellular network.
- And there's this undoubtedly vulnerable device.
Who could ever have guessed? Honeywell's "Smart Home" devices have two vulnerabilities which are easily exploited to take control of your "Smart Home" and do things like unlock your doors.
The Economist continues to draw attention to the problem with Their own devices, quoting Cambridge's Ross Anderson and Graham Steel of Cryptosense:
For those markets where bugs and hacks are more annoying than fatal, though, things may take longer to improve. “I might be happy to pay a bit extra to make sure my car is safe,” says Dr Steel. “But would I pay more to make sure my fridge isn’t doing things that annoy other people, rather than me?”
The Ethernet switches at the heart of the networks that control power generators, pipelines, and other industrial processes are riddled with elementary and difficult to patch vulnerabilities:
"While these companies are working to fix the problem, the actual process of patching the switches can take several years and piles of money to accomplish, leaving large numbers of industrial facilities open to attacks on their network today. A major part of the researchers' work has been developing mitigation techniques to defend against IES attackers even before a patch is implemented.
The vulnerabilities on industrial switches covered in the new research include the widespread use of default passwords, hard-coded encryption keys, and a lack of proper authentication for firmware updates. These three fundamental failures of security combine to make it easier for attackers to gain access to industry devices and networks, change what they please, and take control."
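The default-password failure in particular is trivially scriptable from the attacker's side, which is why it matters so much at industrial scale. A hedged sketch of the kind of audit an operator could run against its own inventory (the credential list and device behaviors below are illustrative, not taken from the research):

```python
# Audit devices for factory-default credentials. The list is a small
# hypothetical sample; real default-credential lists run to thousands of entries.
DEFAULT_CREDENTIALS = [
    ("admin", "admin"),
    ("admin", "password"),
    ("root", ""),
]

def uses_default_credentials(check_login) -> bool:
    """check_login(user, password) -> bool models the device's login check."""
    return any(check_login(user, pw) for user, pw in DEFAULT_CREDENTIALS)

# An unconfigured switch still accepting admin/admin:
factory_switch = lambda u, p: (u, p) == ("admin", "admin")
# A switch whose operator set a real per-site password:
hardened_switch = lambda u, p: (u, p) == ("admin", "s3cr3t-per-site")

assert uses_default_credentials(factory_switch)
assert not uses_default_credentials(hardened_switch)
```

The same few lines, pointed at the Internet instead of an inventory, are essentially how the home-router botnets described above recruit their members.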
ZDNet makes the obvious point that Chrysler's fix for their remote vulnerability, mailing their users a USB stick and telling them to plug it into their car, is an open invitation to the bad guys.
Even Tesla's "computer on wheels" isn't immune from vulnerabilities. Researchers found 6, but overall gave Tesla high marks for their software:
'Consumer’s safety was still preserved even in cases, like the hand-brake issue, where the system ran foul of bugs.
Despite uncovering half a dozen security bugs the two researchers nonetheless came away impressed by Tesla's infosec policies and procedures as well as its fail-safe engineering approach.
“Tesla takes a software-first approach to its cars, so it’s no surprise that it has key security features in place that minimised and contained the risk of the discovered vulnerabilities,” the researchers explain.'
Jeff Atwood points out that router compromises are so prevalent that it's become an unacceptable risk to use someone else's WiFi without using a VPN.
The New York Times editorial page weighs in with Why ‘Smart’ Objects May Be a Dumb Idea by Zeynep Tufekci.
Today's vulnerability caused by connecting a vehicle's CAN bus to the Internet is here. Admittedly, this one comes from a third-party device plugged into the OBD2 port rather than from the manufacturer's software, but it still gives the bad guy the ability to mess with the vehicle's brakes.
Sean Gallagher's piece at Ars Technica entitled Highway to Hack is a must-read.
In order to provide the essential service of displaying your calendar, a Samsung fridge needs your Google credentials. Because it doesn't check the SSL certificate, it leaks them over WiFi. But I guess that's no big deal now you have your calendar on your fridge.
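The fridge's mistake is the classic one: encrypting the connection without authenticating the other end, so any man-in-the-middle can impersonate Google's servers and collect the credentials. In Python's `ssl` module the difference between the two behaviors is a couple of lines; a sketch:

```python
import ssl

# What the fridge effectively does: encryption without authentication.
# Any access point in between can present a self-signed certificate and
# read the Google credentials sent over the "secure" channel.
broken = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
broken.check_hostname = False
broken.verify_mode = ssl.CERT_NONE

# What it should do: the default context validates the certificate chain
# against trusted roots AND checks the certificate matches the hostname.
correct = ssl.create_default_context()
assert correct.verify_mode == ssl.CERT_REQUIRED
assert correct.check_hostname
```

Skipping verification is usually a development-time shortcut that ships by accident; here it shipped on an appliance holding the keys to the owner's email.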
David Kravets at Ars Technica reports on a Rand study that, among other interesting things, asks whether police should have the capability to take control of driverless cars.
The article doesn't point out that this is just like mandating backdoors in encryption. If the police can do it, so can the bad guys.
It's not just your car. Exploits called FacePlant and RoadRash can take over your electric skateboard because the BlueTooth LE connection to the handheld remote isn't encrypted.
Anyone remember Hiro Protagonist's motorbike in Snow Crash?
Richard Chirgwin at The Register nicely sums up the organizational imperatives that lead to "features" such as Samsung's fridge leaking your Google credentials.
The appalling risks of the Internet of Things are covered in a new book, Abusing the Internet of Things: Blackouts, Freakouts, and Stakeouts, by Nitesh Dhanjani. It is reviewed by sh0wstOpper at /.
Glyn Moody at Techdirt makes a really important point about VW's engine control software defeating emissions testing:
"Assuming some form of DRM was employed, it would not anyway have been possible to spot the cheating algorithm of the emissions control code because it would have been illegal to circumvent the software protection. This emphasizes once more the folly of allowing the DMCA to apply to such systems, where problems could be found much earlier by inspecting the software, rather than waiting for them to emerge in use, possibly years later.
The revelation about VW's behavior once more concerns code in cars, but there is a much larger issue here. As software starts to appear routinely in an ever-wider range of everyday objects, so the possibility arises for them to exhibit different behaviors in different situations. Thanks to programming, these objects no longer have a single, fixed set of features, but are malleable, which makes checking their conformance to legal standards much more problematic."
Zeynep Tufekci appears to have coined the appropriate term: The Internet of cheating things. All the Things in your Internet can cheat, and unless they are open source there's no legal way to detect the cheat.
Zeynep Tufekci's New York Times op-ed is well worth reading. She points out that there is one application area where software is regulated to prevent cheating - casinos.
More on the vulnerabilities of the Internet of Things With Wheels That Kill People. Craig Smith has demonstrated how a car can infect a mechanic's tools with malware that spreads to all the cars that mechanic services, which then spread the malware to all the mechanics' tools that service those cars, which then spread the malware to all the cars those mechanics service, ...
Cory Doctorow points to The price of the Internet of Things will be a vague dread of a malicious world by Marcelo Rinesi. It's well worth a read:
"The intrinsic challenge to our legal framework is that technical standards have to be precisely defined in order to be fair, but this makes them easy to detect and defeat. They assume a mechanical universe, not one in which objects get their software updated with new lies every time regulatory bodies come up with a new test. And even if all software were always available, checking it for unwanted behavior would be unfeasible — more often than not, programs fail because the very organizations that made them haven’t or couldn’t make sure it behaved as they intended."
Rebecca Wexler at Slate writes in Convicted By Code:
"It’s time to address one of the most urgent if overlooked tech transparency issues—secret code in the criminal justice system. Today, closed, proprietary software can put you in prison or even on death row. And in most U.S. jurisdictions you still wouldn’t have the right to inspect it. ...
Take California. Defendant Martell Chubbs currently faces murder charges for a 1977 cold case in which the only evidence against him is a DNA match by a proprietary computer program. Chubbs ... asked to inspect the software’s source code in order to challenge the accuracy of its results. Chubbs sought to determine whether the code properly implements established scientific procedures for DNA matching and if it operates the way its manufacturer claims. But the manufacturer argued that the defense attorney might steal or duplicate the code and cause the company to lose money. The court denied Chubbs’ request, leaving him free to examine the state’s expert witness but not the tool that the witness relied on."
The surveillance cameras you install to feel that your home is secure are just as vulnerable as everything else in the IoT.
The saga continues at this new post.