If there is any consensus in the debate over how to revitalize the American economy, it is over innovation. Innovation, we can all readily concur, is the only way for an advanced economy like the United States – which cannot grow by copying and imitating others – to continue to boost productivity and raise living standards. But any discussion of why useful innovations occur, and what if anything governments can do to foster them, quickly degenerates into a clash between free market absolutists and industrial policy aficionados.
In their book Producing Prosperity: Why America Needs a Manufacturing Renaissance, Harvard Business School professors Gary Pisano and Willy Shih cut through the confusion. In just 138 pages – a perfect read for the Washington-to-New York Acela – they offer the most compelling case I have read for why making things matters, even if it will produce very few manufacturing jobs in the future. Pisano is an economist with particular expertise in the biotechnology industry, and Shih is a professor of management practice who spent a career in senior executive positions at IBM, Digital Equipment, Silicon Graphics, and Kodak. I had the pleasure of hosting both at a roundtable meeting at the Council on Foreign Relations in New York on February 1. (We had originally scheduled the meeting for several months earlier, but our plans were blown away by Hurricane Sandy.)
They demolish the comforting story that many economists have offered to dismiss concerns over the shrinking role of manufacturing in the U.S. economy. The conventional argument goes like this: it makes more economic sense to locate the actual production of goods in lower-wage countries, while the United States maintains the skilled parts of the supply chain – R&D, branding, marketing, etc. The classic example here is Apple: most of the value of an iPhone or iPad comes from the design, software, branding and retailing, not from the assembly. Therefore, U.S.-headquartered Apple can become the most valuable company in the world even while making virtually nothing in the United States.
But it turns out this model is not very replicable (and may not even work very well for Apple in the longer run). The reason is that new technological innovations often come from what is learned in the manufacturing and development of earlier technologies. It is not enough to have a good idea: Bell Labs invented the photovoltaic (PV) cell, but production now takes place almost entirely in Asia, where all the key component suppliers are located and the manufacturing knowledge now largely resides.
The loss of manufacturing production can have lasting knock-on effects. Willy Shih, who led the consumer digital business for Kodak in the late 1990s, tells the story of how Kodak missed the digital camera revolution. It was not through ignorance – in fact the company had long been working on digital technologies and produced one of the first consumer digital cameras, in 1994. But Kodak had largely exited the camera business in the 1960s, deciding (quite logically at the time) that the real profits were in film. The camera business moved offshore to Japan. As a result, when Kodak decided to begin making digital cameras in the United States, there was no supplier network; all the critical components were being made in Japan. In 1998, Kodak shut down its digital assembly line in Rochester and moved it to Japan to be closer to suppliers.
One of the compelling things about their analysis is that they do not argue that it always, or even mostly, makes sense for U.S. companies to manufacture in the United States. Only in certain sorts of industries is it critical that research and manufacturing be kept in close proximity. For mature technologies with established production processes – such as desktop computers, consumer electronics, and commodity semiconductors – outsourcing is a sensible business strategy. But where production processes are rapidly evolving – advanced semiconductors, biotech drugs, and nanomaterials, to name just a few – the loss of production can quickly lead to the loss of any innovative edge. And over time the research capabilities will follow production. Applied Materials, for example, moved its chief technology officer to Asia in 2010 because it made more sense to locate research capabilities closer to the company’s largest customers in China, Taiwan, and South Korea.
Nor do Pisano and Shih promise that manufacturing will again become a big jobs engine for the United States. Companies that are investing in U.S. manufacturing are also investing heavily in automation. Manufacturing will still produce some good jobs, but the compelling reason to retain and attract manufacturing is not for employment, but to retain the production know-how and supplier networks that are the key to future innovation – which will in turn spin off new job opportunities.
The book is not particularly aimed at government. The authors are largely focused on business strategy, and a key goal is to make corporate managers think carefully about the long-term costs of outsourcing, and to decide wisely where it makes sense and where it does not. But there is, they argue, a role for government – not as a venture capitalist picking winners and losers (see Solyndra), but in laying the foundations for the United States to remain an innovation hub. This includes large investments in basic and applied research as was done throughout the Cold War, support for big, collaborative research such as the Human Genome Project, and developing human capital through education and more sensible immigration laws. Those sorts of recommendations ought to find support even in a divided Washington.