research effort has risen by a factor of 18 since 1971. This increase occurs while the growth rate of chip density is more or less stable: the constant exponential growth implied by Moore’s Law has been achieved only by a massive increase in the amount of resources devoted to pushing the frontier forward.

Below the fold, some commentary on this and other relevant research.
Assuming a constant growth rate for Moore’s Law, the implication is that research productivity has fallen by this same factor of 18, an average rate of 6.8 percent per year.
If the null hypothesis of constant research productivity were correct, the growth rate underlying Moore’s Law should have increased by a factor of 18 as well. Instead, it was remarkably stable. Put differently, because of declining research productivity, it is around 18 times harder today to generate the exponential growth behind Moore’s Law than it was in 1971.
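The factor-of-18 rise and the 6.8 percent average annual decline are two views of the same compound-decay arithmetic. A minimal sketch (the 1971–2014 sample window is my assumption; the paper's exact endpoints and compounding convention may differ slightly):

```python
import math

factor = 18          # cumulative rise in effective research effort since 1971
years = 2014 - 1971  # assumed 43-year sample window

# With output growth held fixed, if effort grows as e^(r*t) then productivity
# (output growth per unit of effort) falls at the same continuous rate r.
r = math.log(factor) / years
print(f"implied average decline rate: {r:.1%} per year")
```

On these assumptions the sketch gives about 6.7 percent per year, within rounding of the paper's quoted 6.8 percent.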
Actually, of course, in recent years Moore's Law has slowed as the technology approaches its physical limits. This slowing increases the rate at which research productivity falls.
The implications of their finding are disturbing for the economy as a whole [Page 44]:
Taking the aggregate economy number as a representative example, research productivity declines at an average rate of 5.3 percent per year, meaning that it takes around 13 years for research productivity to fall by half. Or put another way, the economy has to double its research efforts every 13 years just to maintain the same overall rate of economic growth.
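The 13-year figure follows directly from the 5.3 percent rate via the usual halving-time formula. A one-line check, treating the decline as continuously compounded (my simplification):

```python
import math

decline_rate = 0.053  # average annual fall in aggregate research productivity
halving_time = math.log(2) / decline_rate  # rule of thumb: ln(2) / r
print(f"research productivity halves roughly every {halving_time:.1f} years")
```

This gives about 13.1 years, matching the paper's "around 13 years".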
The rate at which research productivity has fallen in semiconductors is significantly higher than in other areas of the economy (6.8% vs. 5.3%) [Page 46]:
Research productivity for semiconductors falls so rapidly, not because that sector has the sharpest diminishing returns — the opposite is true. It is instead because research in that sector is growing more rapidly than in any other part of the economy, pushing research productivity down. A plausible explanation for the rapid research growth in this sector is the “general purpose” nature of information technology. Demand for better computer chips is growing so fast that it is worth suffering the declines in research productivity there in order to achieve the gains associated with Moore’s Law.

Or even the smaller gains associated with growth significantly slower than Moore's Law.
Industry projections for the Kryder rate of both SSDs and HDDs depend heavily on rapid progress in density, i.e. on the products of R&D investment. Flash is a very competitive market, and although hard disk is down to 2.5 manufacturers, which might suggest improving margins, hard disks are under sustained margin pressure from flash.
Thus falling research productivity has a particular impact on the future of storage, because neither the SSD nor the HDD market can sustain the large increases in R&D spending needed to increase, or even sustain, their Kryder rates. Bloom et al's result bolsters the case for low and falling Kryder rates.
This paragraph on [Page 48] is perhaps the most interesting in the whole paper:
The only reason models with declining research productivity can sustain exponential growth in living standards is because of the key insight from [endogenous growth theory]: ideas are nonrival. And if research productivity were constant, sustained growth would actually not require that ideas be nonrival; Akcigit, Celik and Greenwood show that fully rivalrous ideas in a model with perfect competition can generate sustained exponential growth in this case. Our paper therefore clarifies that the fundamental contribution of endogenous growth theory is not that research productivity is constant or that subsidies to research can necessarily raise growth. Rather it is that ideas are different from all other goods in that they do not get depleted when used by more and more people. Exponential growth in research leads to exponential growth in [research expenditure]. And because of nonrivalry, this leads to exponential growth in per capita income.

It is a strong argument for open source and open science.
I believe that the problem of declining research productivity is related to "cost disease", as explained by Scott Alexander in Considerations on Cost Disease which starts:
Tyler Cowen writes about cost disease. ... Cowen seems to use it indiscriminately to refer to increasing costs in general – which I guess is fine, goodness knows we need a word for that.

Alexander shows that inflation-adjusted costs have increased rapidly with no corresponding increase in output in several US areas, including:
There was some argument about the style of this graph, but as per Politifact the basic claim is true. Per student spending has increased about 2.5x in the past forty years even after adjusting for inflation.
At the same time, test scores have stayed relatively stagnant. You can see the full numbers here, but in short, high school students’ reading scores went from 285 in 1971 to 287 today – a difference of 0.7%
NB: not inflation-adjusted
Inflation-adjusted cost of a university education was something like $2000/year in 1980. Now it’s closer to $20,000/year. No, it’s not because of decreased government funding, and there are similar trajectories for public and private schools.
I don’t know if there’s an equivalent of “test scores” measuring how well colleges perform, so just use your best judgment. Do you think that modern colleges provide $18,000/year greater value than colleges did in your parents’ day? Would you rather graduate from a modern college, or graduate from a college more like the one your parents went to, plus get a check for $72,000?
The cost of health care has about quintupled since 1970. It’s actually been rising since earlier than that, but I can’t find a good graph; it looks like it would have been about $1200 in today’s dollars in 1960, for an increase of about 800% in those fifty years. ... This study attempts to directly estimate a %GDP health spending to life expectancy conversion, and says that an increase of 1% GDP corresponds to an increase of 0.05 years life expectancy. That would suggest a slightly different number of 0.65 years life expectancy gained by healthcare spending since 1960.
I worry that people don’t appreciate how weird this is. I didn’t appreciate it for a long time. I guess I just figured that Grandpa used to talk about how back in his day movie tickets only cost a nickel; that was just the way of the world. But all of the numbers above are inflation-adjusted. These things have dectupled in cost even after you adjust for movies costing a nickel in Grandpa’s day. They have really, genuinely dectupled in cost, no economic trickery involved.

The fields Alexander uses as examples have a lot of human input, but they should have reaped significant cost benefits from technology and globalization. As he writes about health care:
And this is especially strange because we expect that improving technology and globalization ought to cut costs.
Patients can now schedule their appointments online; doctors can send prescriptions through the fax, pharmacies can keep track of medication histories on centralized computer systems that interface with the cloud, nurses get automatic reminders when they’re giving two drugs with a potential interaction, insurance companies accept payment through credit cards – and all of this costs ten times as much as it did in the days of punch cards and secretaries who did calculations by hand.

Note that R&D is also a human-intensive business that should have reaped significant cost savings from technology and globalization. But like these other fields, it has increased massively in price. In fact, 18-fold instead of 10-fold. Cowen and Alexander are on to a really significant problem for the economy as a whole.
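To put Alexander's examples on the same footing as Bloom et al's decline rates, each cumulative real-cost increase can be converted to an implied average annual real growth rate. A sketch, where the time spans are my rough readings of the examples quoted above, not exact figures:

```python
import math

# (label, cumulative real-cost multiple, rough span in years)
examples = [
    ("K-12 per-student spending", 2.5, 40),
    ("university tuition", 10, 37),   # ~$2,000/yr in 1980 to ~$20,000/yr now
    ("health care", 5, 47),           # roughly quintupled since 1970
]

# Continuous-compounding approximation: multiple = e^(rate * years)
rates = {label: math.log(mult) / yrs for label, mult, yrs in examples}

for label, rate in rates.items():
    print(f"{label}: about {rate:.1%} per year in real terms")
```

On these rough assumptions the annual rates land between about 2 and 6 percent, the same band as the 5.3–6.8 percent research-productivity declines — which is the parallel I read Cowen and Alexander as drawing.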