Tuesday, May 19, 2020

The Death Of Corporate Research Labs

In American innovation through the ages, Jamie Powell wrote:
who hasn’t finished a non-fiction book and thought “Gee, that could have been half the length and just as informative. If that.”

Yet every now and then you read something that provokes the exact opposite feeling. Where all you can do after reading a tweet, or an article, is type the subject into Google and hope there’s more material out there waiting to be read.

So it was with Alphaville this Tuesday afternoon reading a research paper from last year entitled The changing structure of American innovation: Some cautionary remarks for economic growth by Arora, Belenzon, Patacconi and Suh (h/t to KPMG’s Ben Southwood, who highlighted it on Twitter).

The exhaustive work of the Duke University and UEA academics traces American innovation from its academic roots through the golden age of corporate-driven research, which roughly spans the postwar period up to Ronald Reagan's presidency, and then through its steady decline to the present day.
Arora et al argue that a cause of the decline in productivity is that:
The past three decades have been marked by a growing division of labor between universities focusing on research and large corporations focusing on development. Knowledge produced by universities is not often in a form that can be readily digested and turned into new goods and services. Small firms and university technology transfer offices cannot fully substitute for corporate research, which had integrated multiple disciplines at the scale required to solve significant technical problems.
I have many friends who worked at the legendary corporate research labs of the past, including Bell Labs and Xerox PARC, and I worked at Sun Microsystems' research lab myself, so this is personal. Below the fold I add my 2c-worth to Arora et al's extraordinarily interesting article.

The authors provide a must-read, detailed history of the rise and fall of corporate research labs. I lived through their golden age; a year before I was born the transistor was invented at Bell Labs:
The first working device to be built was a point-contact transistor invented in 1947 by American physicists John Bardeen and Walter Brattain while working under William Shockley at Bell Labs. They shared the 1956 Nobel Prize in Physics for their achievement.[2] The most widely used transistor is the MOSFET (metal–oxide–semiconductor field-effect transistor), also known as the MOS transistor, which was invented by Egyptian engineer Mohamed Atalla with Korean engineer Dawon Kahng at Bell Labs in 1959.[3][4][5] The MOSFET was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses.[6]
Arora et al Fig 2.
Before I was 50, Bell Labs had been euthanized as part of the general massacre of labs:
Bell Labs had been separated from its parent company AT&T and placed under Lucent in 1996; Xerox PARC had also been spun off into a separate company in 2002. Others had been downsized: IBM under Louis Gerstner re-directed research toward more commercial applications in the mid-90s ... A more recent example is DuPont’s closing of its Central Research & Development Lab in 2016. Established in 1903, DuPont research rivaled that of top academic chemistry departments. In the 1960s, DuPont’s central R&D unit published more articles in the Journal of the American Chemical Society than MIT and Caltech combined. However, in the 1990s, DuPont’s attitude toward research changed and after a gradual decline in scientific publications, the company’s management closed its Central Research and Development Lab in 2016.
Arora et al point out that the rise and fall of the labs coincided with the rise and fall of anti-trust enforcement:
Historically, many large labs were set up partly because antitrust pressures constrained large firms’ ability to grow through mergers and acquisitions. In the 1930s, if a leading firm wanted to grow, it needed to develop new markets. With growth through mergers and acquisitions constrained by anti-trust pressures, and with little on offer from universities and independent inventors, it often had no choice but to invest in internal R&D. The more relaxed antitrust environment in the 1980s, however, changed this status quo. Growth through acquisitions became a more viable alternative to internal research, and hence the need to invest in internal research was reduced.
Lack of anti-trust enforcement, pervasive short-termism, driven by Wall Street's focus on quarterly results, and management's focus on manipulating the stock price to maximize the value of their options killed the labs:
Large corporate labs, however, are unlikely to regain the importance they once enjoyed. Research in corporations is difficult to manage profitably. Research projects have long horizons and few intermediate milestones that are meaningful to non-experts. As a result, research inside companies can only survive if insulated from the short-term performance requirements of business divisions. However, insulating research from business also has perils. Managers, haunted by the spectre of Xerox PARC and DuPont’s “Purity Hall”, fear creating research organizations disconnected from the main business of the company. Walking this tightrope has been extremely difficult. Greater product market competition, shorter technology life cycles, and more demanding investors have added to this challenge. Companies have increasingly concluded that they can do better by sourcing knowledge from outside, rather than betting on making game-changing discoveries in-house.
They describe the successor to the labs as:
a new division of innovative labor, with universities focusing on research, large firms focusing on development and commercialization, and spinoffs, startups, and university technology licensing offices responsible for connecting the two.
An unintended consequence of abandoning anti-trust enforcement was thus a slowing of productivity growth, because this new division of labor wasn't as effective as the labs:
The translation of scientific knowledge generated in universities to productivity enhancing technical progress has proved to be more difficult to accomplish in practice than expected. Spinoffs, startups, and university licensing offices have not fully filled the gap left by the decline of the corporate lab. Corporate research has a number of characteristics that make it very valuable for science-based innovation and growth. Large corporations have access to significant resources, can more easily integrate multiple knowledge streams, and direct their research toward solving specific practical problems, which makes it more likely for them to produce commercial applications. University research has tended to be curiosity-driven rather than mission-focused. It has favored insight rather than solutions to specific problems, and partly as a consequence, university research has required additional integration and transformation to become economically useful.
In Sections 5.1.1 through 5.1.4 Arora et al discuss in detail four reasons why the corporate labs drove faster productivity growth:
  1. Corporate labs work on general purpose technologies. Because the labs were hosted by the leading companies in their markets, they believed that technologies that benefited their product space would benefit them the most:
    Claude Shannon’s work on information theory, for instance, was supported by Bell Labs because AT&T stood to benefit the most from a more efficient communication network ... IBM supported milestones in nanoscience by developing the scanning electron microscope, and furthering investigations into electron localization, non-equilibrium superconductivity, and ballistic electron motions because it saw an opportunity to pre-empt the next revolutionary chip design in its industry ... Finally, a recent surge in corporate publications in Machine Learning suggests that larger firms such as Google and Facebook that possess complementary assets (user data) for commercialization publish more of their research and software packages to the academic community, as they stand to benefit most from advances in the sector in general.
    My experience of Open Source supports this. Sun was the leading player in the workstation market and was happy to publish and open source infrastructure technologies such as NFS that would buttress that position. On the desktop it was not a dominant player, which (sadly) led to NeWS being closed-source.
  2. Corporate labs solve practical problems. They quote Andrew Odlyzko:
    “It was very important that Bell Labs had a connection to the market, and thereby to real problems. The fact that it wasn’t a tight coupling is what enabled people to work on many long-term problems. But the coupling was there, and so the wild goose chases that are at the heart of really innovative research tended to be less wild, more carefully targeted and less subject to the inertia that is characteristic of university research.”
    Again, my experience supports this contention. My work at Sun Labs was on fault-tolerance. Others worked on, for example, ultra high-bandwidth backplane bus technology, innovative cooling materials, optical interconnect, and asynchronous chip architectures, all of which are obviously "practical problems" with importance for Sun's products, but none of which could be applied to the products under development at the time.
  3. Corporate labs are multi-disciplinary and have more resources. As regards the first of these, the authors use Google as an example:
    Researching neural networks requires an interdisciplinary team. Domain specialists (e.g. linguists in the case of machine translation) define the problem to be solved and assess performance; statisticians design the algorithms, theorize on their error bounds and optimization routines; computer scientists search for efficiency gains in implementing the algorithms. Not surprisingly, the “Google translate” paper has 31 coauthors, many of them leading researchers in their respective fields
    Again, I would agree. A breadth of disciplines was definitely a major contributor to PARC's successes.

    As regards extra resources, I think this is a bigger factor than Arora et al do. As I wrote in Falling Research Productivity Revisited:
    the problem of falling research productivity is like the "high energy physics" problem - after a while all the experiments at a given energy level have been done, and getting to the next energy level is bound to be a lot more expensive and difficult each time.
    Information Technology at all levels is suffering from this problem. For example, Nvidia got to its first working silicon of a state-of-the-art GPU on $2.5M from the VCs, which today wouldn't even buy you a mask set. Even six years ago system architecture research, such as Berkeley's ASPIRE project, needed to build (or at least simulate) things like this:
    Firebox is a 50kW WSC building block containing a thousand compute sockets and 100 Petabytes (2^57B) of non-volatile memory connected via a low-latency, high-bandwidth optical switch. ... Each compute socket contains a System-on-a-Chip (SoC) with around 100 cores connected to high-bandwidth on-package DRAM.
    Clearly, AI research needs a scale of data and computation that only a very large company can afford. For example, Waymo's lead in autonomous vehicles rests to a large extent on the enormous amount of data that its fleet of vehicles, driving all day, every day, has taken years to accumulate.
  4. Large corporate labs may generate significant external benefits. By "external benefits", Arora et al mean benefits to society and the broader economy, but not to the lab's host company:
    One well-known example is provided by Xerox PARC. Xerox PARC developed many fundamental inventions in PC hardware and software design, such as the modern personal computer with graphical user interface. However, it did not significantly benefit from these inventions, which were instead largely commercialized by other firms, most notably Apple and Microsoft. While Xerox clearly failed to internalize fully the benefits from its immensely creative lab ... it can hardly be questioned that the social benefits were large, with the combined market capitalization of Apple and Microsoft now exceeding 1.6 trillion dollars.
    Two kinds of company generate these external benefits. PARC had both spin-offs, in which Xerox had equity, and startups that built on its ideas and hired its alumni but in which it had none. Xerox didn't do spin-offs well:
    As documented by Chesbrough (2002, 2003), the key problem there was not Xerox’s initial equity position in the spin-offs, but Xerox’s practices in managing the spin-offs, which discouraged experimentation by forcing Xerox researchers to look for applications close to Xerox’s existing businesses.
    But Cisco is among the examples of how spin-offs can be done well: it acted as an internal VC, incentivizing a team by giving them equity in a startup which, if successful, Cisco would later acquire.

    Sun Microsystems is an example of exceptional fertility in external startups. Nvidia was started by a group of frustrated Sun engineers. It is currently worth almost 30 times what Oracle paid to acquire Sun. It is but one entry in a long list of such startups whose aggregate value dwarfs that of Sun at its peak. As Arora et al write:
    A surprising implication of this analysis is that the mismanagement of leading firms and their labs can sometimes be a blessing in disguise. The comparison between Fairchild and Texas Instruments is instructive. Texas Instruments was much better managed than Fairchild but also spawned far fewer spin-offs. Silicon Valley prospered as a technology hub, while the cluster of Dallas-Fort Worth semiconductor companies near Texas Instruments, albeit important, is much less economically significant.
    An important additional external benefit that Arora et al ignore is the Open Source movement, which was spawned by Bell Labs and the AT&T consent decree. AT&T was forced to license the Unix source code. Staff at institutions, primarily universities, that had signed the Unix license could freely share code enhancements. This sharing culture grew and led to the BSD and GNU licenses that underlie the bulk of today's computational ecosystem.
Jamie Powell was right that Arora et al have produced an extremely valuable contribution to the study of the decay of the vital link between R&D and productivity of the overall economy.

3 comments:

miguel said...

a Lab is a cost, so it's externalized and sold, then sold on again to the next buyer. Probably it should be accounted for differently, taking into account the non-appearing benefits for the mother company (brand/know-how/reputation/etc...)
rgds!M

Unknown said...

A thought ... the business model changing to "providing services" instead of "selling products" will, perhaps, shift research back to corporations. R&D in that case makes a more visible contribution to the bottom-line profits within the company.

I also would like to add my opinion that academia cannot fully be of service to the larger economy when its funding is so tightly controlled and directed to whatever political idea is flying around at the moment.

However, this was a great read!

AlanG01 said...

I worked in corporate R&D labs for 9 years in the 1980s and early 1990s at GTE Labs and Digital Equipment Corporation. A large issue we constantly dealt with was technology transfer to other, more business-oriented parts of the company. The technology we were trying to transfer was groundbreaking and state-of-the-art, but was also often not at a production usage level. And the staff in the receiving organizations often did not have Masters or PhD level Computer Science training, though they were quite proficient at MIS. As a result, they were not well equipped to receive 50K of Lisp code written within an object-oriented framework that ran on this "weird" (to them) Lisp machine. So there was always this technology transfer disconnect. As researchers, we published extensively, so we contributed to the greater good of computer science advancement. And the level of publication was a large part of how we were measured, as in academia. But it would have also been gratifying to see more use made of the cool technology we were producing. Usage was not zero percent, but I don't think it exceeded 10-15% either.