Thursday, December 18, 2008

The Price of Forgoing Basic Research

In the "good old days," the industrialized world was peppered with corporate research labs. At the same time, universities were generally well funded. Curiosity-driven research, a key component of innovation, was the ethic of the academy. The university produced great minds who were encouraged to think deeply and creatively, without regard for commercial relevance. Industry selected appropriate candidates from among this cohort and gave them a home where they could generate the proprietary intellectual property on which that company's future could be based. Along the way, all did some outstanding basic science.

But since about 1970, we have been on a path where industry's investment in basic research has been in decline. At the same time, there has been a significant shift toward applied, "industry-relevant" research within academia. I believe that these trends do not augur well for the future of industry, academia, or society as a whole.

The Decline of the Corporate Lab

Some might argue that the decline in the number of corporate research labs is no bad thing—that the market made a correction and halted investment in things that did not provide an adequate return. I can even hear someone bringing up Xerox PARC (where I worked) as an example: "Hey, they developed the laser printer, local area networks, and personal workstations, and were still not a player in personal computing!" Well, if you want to argue that a failure in a particular technology transfer is sufficient to condemn the whole notion of corporate research, then we will just have to disagree. The inventions of Nylon, Lycra (spandex), Teflon, and Kevlar provide a clear illustration of how investment in research can sustain the long-term viability of a corporation (in that case, DuPont (DD)).

Others might argue that corporate research has simply moved to other parts of the organization—to places where it can be better integrated with the rest of the company and therefore accelerate the adoption of research results. They might even support such a conclusion by referring to the data reported in sources such as the OECD Science, Technology & Industry Scoreboard 2007, which indicates that, in general, reasonable investments in research and development are being made by industry, academia, and government.

But the term R&D is so broad as to border on useless for purposes of analysis, since it covers the whole gamut of activities, from basic research to product development. It ignores the significant difference between the work of a Nobel laureate and that of a junior programmer. As early as 1980, the economist Edwin Mansfield showed that throwing everything into one R&D bucket obscured the fact that corporate investment in basic research, and even advanced development, was in decline. In a now-classic paper, Mansfield surveyed the R&D spending of 119 firms, which together accounted for about half of all R&D expenditures in the U.S. He found that their investment in basic research fell by roughly 25% between 1967 and 1977.

That may seem like a long time ago—but think how long it can take for research or development to play out (for more on this topic, see my previous column, "The Long Nose of Innovation"). All of a sudden, the issue becomes contemporary. As a result, we should be skeptical of reports that lull us into believing that our R&D bucket is adequately full.

There will still be those who argue that industry can no longer afford to undertake basic research and that any investment is best made in applied research and development.


