who hasn’t finished a non-fiction book and thought “Gee, that could have been half the length and just as informative. If that.”
Yet every now and then you read something that provokes the exact opposite feeling. Where all you can do after reading a tweet, or an article, is type the subject into Google and hope there’s more material out there waiting to be read.
So it was with Alphaville this Tuesday afternoon reading a research paper from last year entitled The changing structure of American innovation: Some cautionary remarks for economic growth by Arora, Belenzon, Patacconi and Suh (h/t to KPMG’s Ben Southwood, who highlighted it on Twitter).
The exhaustive work of the Duke University and UEA academics traces the roots of American academia through the golden age of corporate-driven research, which roughly encompasses the postwar period up to Ronald Reagan’s presidency, before its steady decline up to the present day.
Arora et al argue that a cause of the decline in productivity is that:

The past three decades have been marked by a growing division of labor between universities focusing on research and large corporations focusing on development. Knowledge produced by universities is not often in a form that can be readily digested and turned into new goods and services. Small firms and university technology transfer offices cannot fully substitute for corporate research, which had integrated multiple disciplines at the scale required to solve significant technical problems.

As someone with many friends who worked at the legendary corporate research labs of the past, including Bell Labs and Xerox PARC, and who myself worked at Sun Microsystems' research lab, this is personal. Below the fold I add my 2c-worth to Arora et al's extraordinarily interesting article.
The authors provide a must-read, detailed history of the rise and fall of corporate research labs. I lived through their golden age; a year before I was born the transistor was invented at Bell Labs:
The first working device to be built was a point-contact transistor invented in 1947 by American physicists John Bardeen and Walter Brattain while working under William Shockley at Bell Labs. They shared the 1956 Nobel Prize in Physics for their achievement.[2] The most widely used transistor is the MOSFET (metal–oxide–semiconductor field-effect transistor), also known as the MOS transistor, which was invented by Egyptian engineer Mohamed Atalla with Korean engineer Dawon Kahng at Bell Labs in 1959.[3][4][5] The MOSFET was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses.[6]
Arora et al, Fig 2.
Bell Labs had been separated from its parent company AT&T and placed under Lucent in 1996; Xerox PARC had also been spun off into a separate company in 2002. Others had been downsized: IBM under Louis Gerstner re-directed research toward more commercial applications in the mid-90s ... A more recent example is DuPont’s closing of its Central Research & Development Lab in 2016. Established in 1903, DuPont research rivaled that of top academic chemistry departments. In the 1960s, DuPont’s central R&D unit published more articles in the Journal of the American Chemical Society than MIT and Caltech combined. However, in the 1990s, DuPont’s attitude toward research changed and after a gradual decline in scientific publications, the company’s management closed its Central Research and Development Lab in 2016.

Arora et al point out that the rise and fall of the labs coincided with the rise and fall of anti-trust enforcement:
Historically, many large labs were set up partly because antitrust pressures constrained large firms’ ability to grow through mergers and acquisitions. In the 1930s, if a leading firm wanted to grow, it needed to develop new markets. With growth through mergers and acquisitions constrained by anti-trust pressures, and with little on offer from universities and independent inventors, it often had no choice but to invest in internal R&D. The more relaxed antitrust environment in the 1980s, however, changed this status quo. Growth through acquisitions became a more viable alternative to internal research, and hence the need to invest in internal research was reduced.

Lack of anti-trust enforcement, pervasive short-termism driven by Wall Street's focus on quarterly results, and management's focus on manipulating the stock price to maximize the value of their options killed the labs:
Large corporate labs, however, are unlikely to regain the importance they once enjoyed. Research in corporations is difficult to manage profitably. Research projects have long horizons and few intermediate milestones that are meaningful to non-experts. As a result, research inside companies can only survive if insulated from the short-term performance requirements of business divisions. However, insulating research from business also has perils. Managers, haunted by the spectre of Xerox PARC and DuPont’s “Purity Hall”, fear creating research organizations disconnected from the main business of the company. Walking this tightrope has been extremely difficult. Greater product market competition, shorter technology life cycles, and more demanding investors have added to this challenge. Companies have increasingly concluded that they can do better by sourcing knowledge from outside, rather than betting on making game-changing discoveries in-house.

They describe the successor to the labs as:
a new division of innovative labor, with universities focusing on research, large firms focusing on development and commercialization, and spinoffs, startups, and university technology licensing offices responsible for connecting the two.

An unintended consequence of abandoning anti-trust enforcement was thus a slowing of productivity growth, because this new division of labor wasn't as effective as the labs:
The translation of scientific knowledge generated in universities to productivity enhancing technical progress has proved to be more difficult to accomplish in practice than expected. Spinoffs, startups, and university licensing offices have not fully filled the gap left by the decline of the corporate lab. Corporate research has a number of characteristics that make it very valuable for science-based innovation and growth. Large corporations have access to significant resources, can more easily integrate multiple knowledge streams, and direct their research toward solving specific practical problems, which makes it more likely for them to produce commercial applications. University research has tended to be curiosity-driven rather than mission-focused. It has favored insight rather than solutions to specific problems, and partly as a consequence, university research has required additional integration and transformation to become economically useful.

In Sections 5.1.1 through 5.1.4 Arora et al discuss in detail four reasons why the corporate labs drove faster productivity growth:
- Corporate labs work on general purpose technologies. Because the labs were hosted by the leading companies in their market, they believed that technologies that benefited their product space would benefit them the most:
Claude Shannon’s work on information theory, for instance, was supported by Bell Labs because AT&T stood to benefit the most from a more efficient communication network ... IBM supported milestones in nanoscience by developing the scanning tunneling microscope, and furthering investigations into electron localization, non-equilibrium superconductivity, and ballistic electron motions because it saw an opportunity to pre-empt the next revolutionary chip design in its industry ... Finally, a recent surge in corporate publications in Machine Learning suggests that larger firms such as Google and Facebook that possess complementary assets (user data) for commercialization publish more of their research and software packages to the academic community, as they stand to benefit most from advances in the sector in general.
My experience of Open Source supports this. Sun was the leading player in the workstation market and was happy to publish and open source infrastructure technologies such as NFS that would buttress that position. On the desktop it was not a dominant player, which (sadly) led to NeWS being closed-source.

- Corporate labs solve practical problems. They quote Andrew Odlyzko:
“It was very important that Bell Labs had a connection to the market, and thereby to real problems. The fact that it wasn’t a tight coupling is what enabled people to work on many long-term problems. But the coupling was there, and so the wild goose chases that are at the heart of really innovative research tended to be less wild, more carefully targeted and less subject to the inertia that is characteristic of university research.”
Again, my experience supports this contention. My work at Sun Labs was on fault-tolerance. Others worked on, for example, ultra high-bandwidth backplane bus technology, innovative cooling materials, optical interconnect, and asynchronous chip architectures, all of which are obviously "practical problems" with importance for Sun's products, but none of which could be applied to the products under development at the time.

- Corporate labs are multi-disciplinary and have more resources. As regards the first of these, the authors use Google as an example:
Researching neural networks requires an interdisciplinary team. Domain specialists (e.g. linguists in the case of machine translation) define the problem to be solved and assess performance; statisticians design the algorithms, theorize on their error bounds and optimization routines; computer scientists search for efficiency gains in implementing the algorithms. Not surprisingly, the “Google translate” paper has 31 coauthors, many of them leading researchers in their respective fields
Again, I would agree. A breadth of disciplines was definitely a major contributor to PARC's successes.
As regards extra resources, I think this is a bigger factor than Arora et al suggest. As I wrote in Falling Research Productivity Revisited:
the problem of falling research productivity is like the "high energy physics" problem - after a while all the experiments at a given energy level have been done, and getting to the next energy level is bound to be a lot more expensive and difficult each time.
Information Technology at all levels is suffering from this problem. For example, Nvidia got to its first working silicon of a state-of-the-art GPU on $2.5M from the VCs, which today wouldn't even buy you a mask set. Even six years ago system architecture research, such as Berkeley's ASPIRE project, needed to build (or at least simulate) things like this:
Firebox is a 50kW WSC building block containing a thousand compute sockets and 100 Petabytes (2^57B) of non-volatile memory connected via a low-latency, high-bandwidth optical switch. ... Each compute socket contains a System-on-a-Chip (SoC) with around 100 cores connected to high-bandwidth on-package DRAM.
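To get a sense of the scale those numbers imply, here is a minimal back-of-the-envelope sketch in Python, using only the figures quoted above; the per-socket numbers are my own arithmetic, not part of the ASPIRE specification:

```python
# Rough scale of a single Firebox, computed from the numbers quoted above.
# The per-socket figures are back-of-the-envelope estimates, not ASPIRE's specs.
SOCKETS = 1_000          # compute sockets per Firebox
CORES_PER_SOC = 100      # roughly 100 cores per SoC
NVM_BYTES = 2 ** 57      # ~100 Petabytes of non-volatile memory, as quoted
POWER_WATTS = 50_000     # 50 kW per Firebox

total_cores = SOCKETS * CORES_PER_SOC
nvm_per_socket_tib = NVM_BYTES / SOCKETS / 2 ** 40
power_per_socket = POWER_WATTS / SOCKETS

print(f"total cores:      {total_cores:,}")               # 100,000
print(f"NVM per socket:   {nvm_per_socket_tib:.0f} TiB")  # ~131 TiB
print(f"power per socket: {power_per_socket:.0f} W")      # 50 W
```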
Clearly, AI research needs a scale of data and computation that only a very large company can afford. For example, Waymo's lead in autonomous vehicles is based to a large extent on the enormous amount of data it has accumulated over years of a fleet of vehicles driving all day, every day.

- Large corporate labs may generate significant external benefits. By "external benefits", Arora et al mean benefits to society and the broader economy, but not to the lab's host company:
One well-known example is provided by Xerox PARC. Xerox PARC developed many fundamental inventions in PC hardware and software design, such as the modern personal computer with graphical user interface. However, it did not significantly benefit from these inventions, which were instead largely commercialized by other firms, most notably Apple and Microsoft. While Xerox clearly failed to internalize fully the benefits from its immensely creative lab ... it can hardly be questioned that the social benefits were large, with the combined market capitalization of Apple and Microsoft now exceeding 1.6 trillion dollars.
Two kinds of company generate these external benefits. PARC had both spin-offs, in which Xerox held equity, and startups that built on its ideas and hired its alumni but in which it held no equity. Xerox didn't do spin-offs well:
As documented by Chesbrough (2002, 2003), the key problem there was not Xerox’s initial equity position in the spin-offs, but Xerox’s practices in managing the spin-offs, which discouraged experimentation by forcing Xerox researchers to look for applications close to Xerox’s existing businesses.
But Cisco is among the examples of how spin-offs can be done well: it acted as an internal VC, incentivizing a team by giving them equity in a startup which, if successful, Cisco would later acquire.
Sun Microsystems is an example of exceptional fertility in external startups. Nvidia was started by a group of frustrated Sun engineers. It is currently worth almost 30 times what Oracle paid to acquire Sun. It is but one entry in a long list of such startups whose aggregate value dwarfs that of Sun at its peak. As Arora et al write:
A surprising implication of this analysis is that the mismanagement of leading firms and their labs can sometimes be a blessing in disguise. The comparison between Fairchild and Texas Instruments is instructive. Texas Instruments was much better managed than Fairchild but also spawned far fewer spin-offs. Silicon Valley prospered as a technology hub, while the cluster of Dallas-Fort Worth semiconductor companies near Texas Instruments, albeit important, is much less economically significant.
An important additional external benefit that Arora et al ignore is the Open Source movement, which was spawned by Bell Labs and the AT&T consent decree. AT&T was forced to license the Unix source code. Staff at institutions, primarily Universities, which had signed the Unix license could freely share code enhancements. This sharing culture grew and led to the BSD and Gnu licenses that underlie the bulk of today's computational ecosystem.
A lab is a cost, so it's externalized and put up for sale until someone buys it, and then on again to the next buyer. Probably the way to account for it should be different, taking into account other non-apparent benefits for the mother company (brand/know-how/reputation/etc...)
rgds! M
A thought ... the business model changing to "providing services" instead of "selling products" will, perhaps, again shift the research back to corporations. R&D in that case makes a more visible contribution to the bottom line profits within the company.
I also would like to add my opinion that academia cannot fully be of service to the larger economy when its funding is so tightly controlled and directed to whatever political idea is flying around at the moment.
However, this was a great read!
I worked in corporate R&D labs for 9 years in the 1980's and early 1990's at GTE Labs and Digital Equipment Corporation. A large issue we constantly dealt with was technology transfer to other more business-oriented parts of the company. The technology we were trying to transfer was groundbreaking and state-of-the-art, but was also often not at a production usage level. And the staff in the receiving organizations often did not have Masters or PhD level Computer Science training, though they were quite proficient at MIS. As a result, they were not well equipped to receive 50K of Lisp code written within an object-oriented framework that ran on this "weird" (to them) Lisp machine. So there was always this technology transfer disconnect. As researchers, we published extensively, so we contributed to the greater good of computer science advancement. And the level of publication was a large part of how we were measured, as in academia. But it would have also been gratifying to see more use made of the cool technology we were producing. Usage was not zero percent, but I don't think it exceeded 10-15% either.
Matthew Hutson reports more evidence for falling research productivity, this time in AI, in Eye-catching advances in some AI fields are not real:
ReplyDelete"Researchers are waking up to the signs of shaky progress across many subfields of AI. A 2019 meta-analysis of information retrieval algorithms used in search engines concluded the “high-water mark … was actually set in 2009.” Another study in 2019 reproduced seven neural network recommendation systems, of the kind used by media streaming services. It found that six failed to outperform much simpler, nonneural algorithms developed years before, when the earlier techniques were fine-tuned, revealing “phantom progress” in the field. In another paper posted on arXiv in March, Kevin Musgrave, a computer scientist at Cornell University, took a look at loss functions, the part of an algorithm that mathematically specifies its objective. Musgrave compared a dozen of them on equal footing, in a task involving image retrieval, and found that, contrary to their developers’ claims, accuracy had not improved since 2006.
I worked in Silicon Valley (software industry) for 17 years, then as a full Professor of Software Engineering for 17 years. I retired at the end of 2016. I ran a research lab at school and spent a lot of time arranging funding. My lab, and several others that I knew, were perfectly willing to create a research stream that applied / extended our work in a direction needed by a corporation. The government grants kept us at a pretty theoretical level. Mixing corporate and government work let us explore some ideas much more thoroughly.
The problem with corporate funding was that almost every corporate representative who wanted to contract with us wanted to pay minimum wage to my students and nothing to me--and they wanted all rights (with the scope "all" very broadly defined). They seemed to assume that I was so desperate for funding that I would agree to anything. Negotiating with these folks was unpleasant and unproductive. Several opportunities for introducing significant improvements in the efficiency and effectiveness of software engineering efforts were squandered because the large corporations who contacted me wanted the equivalent of donations, not research contracts. I heard similar stories from faculty members at my school and other schools.
It is possible for university researchers to steer some of their work into directions that are immediately commercially useful. But we can't do it with government research money because that's not what they agree to pay for. And we can't do it for free. And we can't agree to terms that completely block graduate students from continuing to work on the research that their graduate work focused on, because that would destroy their careers. Companies that won't make a serious effort to address those issues won't get much from university researchers. But don't pin that failure on the university.
Meanwhile, no-one cares that China outspends us 4:1 on research.
We have a real missile gap opportunity to get our R&D back on track.
Why Corporate America Gave Up on R&D by Kaushik Viswanath is an interview with Ashish Arora and Sharon Belenzon, the authors of the paper behind this post.
I spent 37 years in an Analytical Sciences Division that, depending upon the whim of management, was either loosely or tightly integrated with the Research Labs. Our mission was to help researchers and technologists understand the chemistry and performance of the materials that they were integrating into products. It was fascinating and rewarding work. The cost of the instruments and the skills needed to properly interpret the results made it a frequent target for budget cuts when money was tight. Our clients valued the data because it helped understand the chemistry and materials science that determined how well the final product would perform.
Excellent post! I'm still not sure how antitrust plays into this. Doesn't the value of research increase to a firm if they have many acquired companies which could make use of that research?
In the long-gone days when anti-trust was enforced, firms could not buy competitors to acquire their newly-developed products. So they had to invest in their own internal research and development, or they wouldn't have any new products.
Now that there is no anti-trust enforcement, there's no point in spending dollars that could go to stock buybacks on internal research labs or developing new products. Instead, you let the VCs fund R&D, and if their bets pay off, you buy the company. See for example the pharma industry. In software it is particularly effective, because you can offer the VC-funded startup a choice between being bought at a low valuation, or facing a quick-and-dirty clone backed by a much bigger company with a much bigger user base. See for example Facebook.
After spending 16 years in Silicon Valley R&D, I am working for Toyota on zero emission vehicle technology, and I can tell you that the Japanese have not abandoned the corporate research model. I don't know if it is tradition or necessity, or simply that it works for them, but it is refreshingly old fashioned. In my mind, economics as a measure of success is as good as any other metric, because it represents a sort of minimization of effort, energy, what have you. R&D will always be resource limited in some way, electricity, money, time, personnel; and so we have to learn to be efficient within our constraints. The yin to that yang, is that innovation can not happen outside of a creative environment. It is the responsibility, and the sole responsibility, of leadership to maintain a dynamic balance between creativity/innovation and resource constraint.
Rob Beschizza's Explore an abandoned research lab points to this video, which provides a suitable coda for the post.
Ex-Google boss Eric Schmidt: US 'dropped the ball' on innovation by Karishma Vaswani starts:
ReplyDelete"In the battle for tech supremacy between the US and China, America has "dropped the ball" in funding for basic research, according to former Google chief executive Eric Schmidt.
...
For example, Chinese telecoms infrastructure giant Huawei spends as much as $20bn (£15.6bn) on research and development - one of the highest budgets in the world.
This R&D is helping Chinese tech firms get ahead in key areas like artificial intelligence and 5G."
Daron Acemoglu makes good points in Antitrust Alone Won’t Fix the Innovation Problem, including:
ReplyDelete"In terms of R&D, the McKinsey Global Institute estimates that just a few of the largest US and Chinese tech companies account for as much as two-thirds of global spending on AI development. Moreover, these companies not only share a similar vision of how data and AI should be used (namely, for labor-replacing automation and surveillance), but they also increasingly influence other institutions, such as colleges and universities catering to tens of thousands of students clamoring for jobs in Big Tech. There is now a revolving door between leading institutions of higher education and Silicon Valley, with top academics often consulting for, and sometimes leaving their positions to work for, the tech industry."
As a late comment, most corporate management realized that researchers in corporate labs were too old and too expensive, with permanent positions, plus benefits, pensions, etc. and decided to go for much lower cost alternatives:
* A lot of research labs were set up in cheaper offshore locations, with younger researchers not demanding high wages, pensions and benefits, and much easier to fire.
* A lot of research was outsourced, via research grants, to universities, casualizing research work, because universities can very cheaply put together teams of young, hungry PhDs and postdocs on low pay and temporary contracts, helped by an enormous increase in the number of PhD and postdoc positions, itself driven in part by those same industry outsourcing contracts.