Tuesday, August 18, 2015

Progress in solid-state memories

Last week's Storage Valley Supper Club provided an update on developments in solid state memories.

First, the incumbent technology, planar flash, has reached the end of its development path at the 15nm generation. Planar flash will continue to be the majority of flash bits shipped through 2018, but the current generation is the last.

Second, all the major flash manufacturers are now shipping 3D flash, the replacement for planar. Stacking the cells vertically provides much greater density; the cost is a much more complex manufacturing process and, at least until the process is refined, much lower yields. This has led to much skepticism about the economics of 3D flash, but it turns out that the picture isn't as bad as it appeared. The reason is, in a sense, depressing.

It is always important to remember that, at bottom, digital storage media are analog. Because 3D flash is much denser, there are many more cells. Because of the complexity of the manufacturing process, the quality of each cell is much worse. But because there are many more cells, the impact of the worse quality is reduced. More flash controller intelligence, adapting to the poor quality or even non-functionality of individual cells, and more of the cells devoted to error correction, mean that 3D flash can survive lower yields of fully functional cells.
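To make the trade-off concrete, here is a minimal back-of-the-envelope sketch. Every number in it is an illustrative assumption, not a published figure; the point is only that a large increase in raw cell count can absorb both a lower yield of good cells and a heavier error-correction overhead:

```python
# Back-of-the-envelope model: usable capacity after discarding bad cells
# and reserving a fraction of the good ones for error correction.
# All numbers below are illustrative assumptions, not published figures.

def usable_bits(raw_cells, good_fraction, ecc_overhead):
    """Normalized usable capacity given cell yield and ECC spare fraction."""
    return raw_cells * good_fraction * (1 - ecc_overhead)

# Planar: fewer cells, high yield, modest ECC overhead.
planar = usable_bits(raw_cells=1.0, good_fraction=0.98, ecc_overhead=0.10)
# 3D: say 4x the raw cells, worse yield, heavier ECC overhead.
flash_3d = usable_bits(raw_cells=4.0, good_fraction=0.85, ecc_overhead=0.25)

print(f"planar usable: {planar:.2f}")    # 0.88
print(f"3D usable:     {flash_3d:.2f}")  # 2.55 -- still ~3x planar
```

Even with these pessimistic assumptions about yield and overhead, the 3D part ends up with roughly three times the usable capacity of the planar one.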

The advent of 3D means that flash prices, which had stabilized, will resume their gradual decrease. But anyone hoping that 3D will cause a massive drop will be disappointed.

Third, the post-flash solid state technologies such as Phase Change Memory (PCM) are increasingly real but, as expected, they are aiming at the expensive, high-performance end of the market. HGST has demonstrated a:
PCM SSD with less than two microseconds round-trip access latency for 512B reads, and throughput exceeding 3.5 GB/s for 2KB block sizes.
which, despite the near-DRAM performance, draws very little power.
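For a sense of what those numbers imply, here is a rough arithmetic sketch. A caveat: the latency is quoted for 512B reads and the throughput for 2KB blocks, so combining them, and the queue-depth inference, is my assumption, not anything HGST has stated:

```python
# Rough arithmetic on the quoted HGST figures. The latency is quoted for
# 512B reads and the throughput for 2KB blocks, so the queue-depth
# inference below is my assumption, not an HGST claim.

latency_s   = 2e-6      # <2 us round-trip for a 512B read
block_bytes = 2048      # 2KB blocks
bandwidth   = 3.5e9     # >3.5 GB/s

serial_iops = 1 / latency_s             # one request at a time: 500K IOPS
needed_iops = bandwidth / block_bytes   # ~1.7M IOPS to sustain 3.5 GB/s
queue_depth = needed_iops / serial_iops # implied outstanding requests

print(f"IOPS at queue depth 1:   {serial_iops:,.0f}")
print(f"IOPS implied by 3.5GB/s: {needed_iops:,.0f}")
print(f"implied parallelism:     ~{queue_depth:.1f} requests in flight")
```

In other words, a handful of requests in flight suffices to saturate the device, which is what near-DRAM latency buys you.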

But the big announcement was Intel/Micron's 3D XPoint. They are very cagey about the details, but it is a resistive memory technology claimed to be 1000 times faster than NAND, with 1000 times NAND's endurance and 10 times the density of DRAM. They see the technology initially being deployed, as shown in the graph, as an ultra-fast but non-volatile layer between DRAM and flash, but it clearly has greater potential once it gets down the price curve.
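A sketch of where such a layer would sit in the hierarchy, with order-of-magnitude access latencies. The NAND figure is a typical read latency (my assumption), and the XPoint figure simply takes the "1000 times faster than NAND" claim at face value; Intel and Micron have published no latency numbers:

```python
# Order-of-magnitude latency sketch of the proposed hierarchy. The NAND
# figure is a typical read latency (my assumption); the XPoint entry just
# takes the "1000x faster than NAND" claim at face value.

nand_read = 100e-6                       # ~100 us NAND read (assumption)
hierarchy = [
    ("DRAM",       100e-9),              # ~100 ns
    ("3D XPoint",  nand_read / 1000),    # ~100 ns if the claim holds
    ("NAND flash", nand_read),
    ("Hard disk",  10e-3),               # ~10 ms seek + rotation
]
for name, latency in hierarchy:
    print(f"{name:<12} ~{latency * 1e9:>12,.0f} ns")
```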

Thursday, August 13, 2015

Authors breeding like rabbits

The Wall Street Journal points to another problem with the current system of academic publishing with an article entitled How Many Scientists Does It Take to Write a Paper? Apparently, Thousands:
In less than a decade, Dr. Aad, who lives in Marseilles, France, has appeared as the lead author on 458 scientific papers. Nobody knows just how many scientists it may take to screw in a light bulb, but it took 5,154 researchers to write one physics paper earlier this year—likely a record—and Dr. Aad led the list.

His scientific renown is a tribute to alphabetical order.
The article includes this amazing graph from Thomson Reuters, showing the spectacular rise in papers with enough authors that their names had to reflect alphabetical order rather than their contribution to the research. And the problem is spreading:
“The challenges are quite substantial,” said Marcia McNutt, editor in chief of the journal Science. “The average number of authors even on a typical paper has doubled.”
Of course, it is true that in some fields doing any significant research requires a large team, and that some means of assigning credit to team members is necessary. But doing so by adding their names to an alphabetized list of authors on the paper describing the results has become an ineffective way of doing the job. If each author gets 1/5154 of the credit for a good paper, it is hardly worth having compared to the whole credit for a single-author bad paper. If each of the 5154 authors gets full credit, the paper generates 5154 times as much credit as it is due. And if the list is alphabetized but is treated as reflecting contribution, Dr. Aad is a big winner.
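A toy sketch of the three credit schemes just described; the numbers are only for illustration:

```python
# Toy model of the three credit schemes for a 5,154-author paper.
n_authors   = 5154
paper_value = 1.0   # normalize: one paper's worth of credit

fractional_share = paper_value / n_authors   # even split: ~0.000194 each
total_minted     = paper_value * n_authors   # full credit to all: 5154x due
# Third scheme: read the alphabetized list as a contribution ranking,
# and all the credit flows to whoever sorts first -- Dr. Aad.

print(f"even split per author: {fractional_share:.6f}")
print(f"credit minted if everyone gets full credit: {total_minted:,.0f}x")
```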

How long before the first paper is published with more authors than words?

Tuesday, August 11, 2015

Patents considered harmful

Although at last count I'm a named inventor on at least a couple of dozen US patents, I've long believed that the operation of the patent system, like the copyright system, is profoundly counter-productive. Since "reform" of these systems is inevitably hijacked by intellectual property interests, I believe that at least the patent system, if not both, should be completely abolished. The idea that an infinite supply of low-cost, government enforced monopolies is in the public interest is absurd on its face. Below the fold, some support for my position.

Friday, July 24, 2015

Amazon owns the cloud

Back in May I posted about Amazon's Q1 results, the first in which they broke out AWS, their cloud services, as a separate item. The bottom line was impressive:
AWS is very profitable: $265 million in profit on $1.57 billion in sales last quarter alone, for an impressive (for Amazon!) 17% net margin.
Again via Barry Ritholtz, Re/Code reports on Q2:
Amazon Web Services, ... grew its revenue by 81 percent year on year in the second quarter. It grew faster and with higher profit margins than any other aspect of Amazon’s business.

AWS, which offers leased computing services to businesses, posted revenue of $1.82 billion, up from $1 billion a year ago, as part of its second-quarter results.

By comparison, retail sales in North America grew only 26 percent to $13.8 billion from $11 billion a year ago.

The cloud computing business also posted operating income of $391 million — up an astonishing 407 percent from $77 million at this time last year — for an operating margin of 21 percent, making it Amazon’s most profitable business unit by far. The North American retail unit turned in an operating margin of only 5.1 percent.
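The quoted margin and growth figures are easy to reproduce; here is a quick sanity-check sketch using only the numbers in the article:

```python
# Sanity-check of the Re/Code figures quoted above.
aws_revenue       = 1.82e9   # Q2 2015 AWS revenue
aws_operating_inc = 391e6    # Q2 2015 AWS operating income
prior_operating   = 77e6     # operating income a year earlier

margin = aws_operating_inc / aws_revenue
growth = (aws_operating_inc - prior_operating) / prior_operating

print(f"operating margin:  {margin:.1%}")   # 21.5%
print(f"YoY income growth: {growth:.0%}")   # ~408% (Re/Code says 407%)
```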
Revenue growing at 81% year-on-year at a 21% (and growing) margin despite:
price competition from the likes of Google, Microsoft and IBM.
Amazon clearly dominates the market; the competition is having no effect on its business. As I wrote nearly a year ago, based on Benedict Evans' analysis:
Amazon's strategy is not to generate and distribute profits, but to re-invest their cash flow into starting and developing businesses. Starting each business absorbs cash, but as they develop they turn around and start generating cash that can be used to start the next one.
Unfortunately, S3 is part of AWS for reporting purposes, so we can't see the margins for the storage business alone. But I've been predicting for years that if we could, we would find them to be very generous.

Wednesday, July 15, 2015

Be Careful What You Wish For

Richard Poynder has a depressing analysis of the state of Open Access entitled HEFCE, Elsevier, the “copy request” button, and the future of open access and Bjoern Brembs has a related analysis entitled What happens to publishers that don’t maximize their profit?. They contrast vividly with Director Zhang's vision for China's National Science Library, and Rolf Schimmer's description of the Max Planck Institute's plans. I expressed doubt that Schimmer's plan would prevent Elsevier ending up with all the money. Follow me below the fold to see how much less optimistic Brembs and Poynder are than I was.

Tuesday, July 7, 2015

IIPC Preservation Working Group

The Internet Archive has by far the largest archive of Web content but its preservation leaves much to be desired. The collection is mirrored between San Francisco and Richmond in the Bay Area, both uncomfortably close to the same major fault systems. There are partial copies in the Netherlands and Egypt, but they are not synchronized with the primary systems.

Now, Andrea Goethals and her co-authors from the IIPC Preservation Working Group have a paper entitled Facing the Challenge of Web Archives Preservation Collaboratively that reports on a survey of Web archives' preservation activities in the following areas: Policy, Access, Preservation Strategy, Ingest, File Formats and Integrity. They conclude:
This survey also shows that long term preservation planning and strategies are still lacking to ensure the long term preservation of web archives. Several reasons may explain this situation: on one hand, web archiving is a relatively recent field for libraries and other heritage institutions, compared for example with digitization; on the other hand, web archives preservation presents specific challenges that are hard to meet.
I discussed the problem of creating and maintaining a remote backup of the Internet Archive's collection in The Opposite of LOCKSS. The Internet Archive isn't alone in having less than ideal preservation of its collection. It's clear the major challenges are the storage and bandwidth requirements for Web archiving, and their rapid growth. Given the limited resources available, and the inadequate reliability of current storage technology, prioritizing collecting more content over preserving the content already collected is appropriate.

Tuesday, June 30, 2015

Blaming the Victim

The Washington Post is running a series called Net of Insecurity. So far it includes:
  • A Flaw In The Design, discussing the early history of the Internet and how the difficulty of getting it to work at all and the lack of perceived threats meant inadequate security.
  • The Long Life Of A Quick 'Fix', discussing the history of BGP and the consistent failure of attempts to make it less insecure, because those who would need to take action have no incentive to do so.
  • A Disaster Foretold - And Ignored, discussing L0pht and how they warned a Senate panel 17 years ago of the dangers of Internet connectivity but were ignored.
Perhaps a future article in the series will describe how successive US administrations consistently strove to ensure that encryption wasn't used to make systems less insecure, and that the encryption that was used was as weak as possible. They prioritized their (and their opponents') ability to spy over mitigating the risks that Internet users faced, and they got what they wanted. We see the results in the compromise of the Office of Personnel Management and the possibly related compromise of health insurers including Anthem, breaches that revealed the kind of information that renders everyone with a security clearance vulnerable to phishing and blackmail. Be careful what you wish for!

More below the fold.