Why University Research Is More Important Than Ever
from Energy, Security, and Climate and the Energy Security and Climate Change Program

A dangerous ideological current is coursing through the intellectual circuit, a political conviction dressed up as an empirical theory. Its proponents argue that public funding of basic scientific research is, at best, a waste of money and, at worst, an actively counterproductive endeavor that crowds out the private sector’s innovative instincts. And the institutions in the crosshairs of these broadsides are U.S. research universities, the country’s most valuable assets in a global economy driven by innovation.

A Backward Theory of Innovation

Last week, Matt Ridley wrote in The Wall Street Journal that “the linear dogma so prevalent in the world of science and politics—that science drives innovation, which drives commerce—is mostly wrong. It misunderstands where innovation comes from. Indeed, it generally gets it backward.” He continues, “technological advances are driven by practical men who tinkered until they had better machines; abstract scientific rumination is the last thing they do.” That is, the private sector’s inventions drive research in basic science, not the other way around.

Similarly, Lev Grossman’s Time Magazine cover article, an encomium to start-ups commercializing fusion reactors, quotes an entrepreneur disdainful of university research: “Fusion is in the end an application, right? The problem with fusion typically is that it’s driven by science, which means you take the small steps.” Crystallizing this view of public research as methodically misguided in its quest to understand basic science, Grossman asserts, “Understanding is all well and good, in an ideal world, but the real world is getting less ideal all the time. The real world needs clean power and lots of it.”

This strain of virulent vitriol against basic scientific inquiry hit home over the weekend when I spoke with a champion of university research, Stanford University President John Hennessy. Mulling his legacy, President Hennessy glowed with pride, noting that Stanford can count more Nobel Laureates over his fifteen-year tenure than any other university. But he warned that uncertainty over future federal research support to universities poses a grave risk to the prolific advances that have helped Stanford fuel the Silicon Valley innovation engine.

Indeed, as the figure below illustrates, federal spending on basic university research has declined in real terms since the one-time windfall of President Obama’s 2009 stimulus package. And depending on the outcome of the 2016 election, further cuts could loom large.

[Figure: Federal spending on basic university research over time, in real terms. Source: National Science Foundation]

Nonlinear ≠ Linear in Reverse!

Writing a timely rejoinder to Ridley in The Guardian, Jack Stilgoe concurs that innovation is nonlinear. But he correctly calls out Ridley for leaping from the premise that basic science does not linearly drive innovation to the conclusion that the reverse must be true: that private innovation drives basic science. Reality is considerably more complicated, and Ridley’s fantasy world of basic research following in the wake of private inventions is as simplistic as the linear model he derides.

Innovation is, in fact, nonlinear. The path from basic science to commercial product can span decades, traverse disciplinary boundaries, and meander back and forth between academia and industry. Nevertheless, the causal role of university research is indisputable: it provides a theoretical framework and a body of empirical observations that constrain an otherwise intractably vast option space for innovation.

Here’s a concrete example. In my field—solar power—hordes of chemists and materials engineers tweak the chemical compositions and production processes of semiconductors, hoping to make a solar material that converts more sunlight into electricity. In Ridley’s universe, privately funded scientists would iterate and see what works, making an evolutionary series of tweaks that yield ever more efficient solar panels. Only later would university scientists tinker away, trying to figure out why what worked actually worked.

This is a monumentally foolish idea and, frankly, one of the reasons why so many solar start-ups went bust. I’ve worked in companies under pressure from investors to deliver results, and I’ve witnessed scientists taking shortcuts to improve device performance without understanding the underlying physics—in fact, I was guilty of doing this myself. We would run experiments without a clear theoretical reason, and our new devices wouldn’t behave better or worse but simply differently. Lost in an unending wilderness of data without the compass of prior scholarship, we would invariably retrace our steps and bemoan the wasted effort.[1]

By contrast, university research is obsessed with questions that start with why and only occasionally applies the test of so what? Now, this can be problematic, and I’ve written before that scientific curiosity alone is not sufficient to develop real-world clean energy technologies. For example, most performance records for emerging solar power technologies—including perovskites, quantum dots, organics, etc.—are held by publicly funded universities and research laboratories. Without the practical product development at which industry excels to complement fundamental scientific inquiry, these technologies may languish in laboratories. But eliminating university research would ensure that those solar materials never see the light of day, rejecting a necessary condition for innovation because of its insufficiency.

I worry that support for science, and clean energy research in particular, could fall victim to complacent confidence in the autonomous advances of innovation.  Speaking out forcefully in favor of expanded public research and development funding for clean energy, Bill Gates recently pronounced, “We need an energy miracle.” To get there, he advocates tripling government funding for basic energy research to $18 billion per year. Doing the opposite—cutting public funding for university research to give the private sector running room—will make any energy miracle a pipe dream.


[1] Academic scholarship from the early 20th century continues to guide innovation in solar technology today. Researchers still design experiments, craft mathematical models, and troubleshoot puzzling results by falling back on the quantum theory of solids, which Bloch, Peierls, and Wilson established by the mid-1930s in European research universities.
