“Every once in a while,” says Dean Kamen, the famed inventor whose many creations include the Segway and the stair-climbing iBot wheelchair, “a new technology, an old problem, and a big idea turn into an innovation.”
How exactly, though, do new technologies emerge? What characterizes their development? How can we foster and replicate technological progress — and get out of its way?
“Innovation is the most important fact about the modern world, but one of the least well understood,” writes Matt Ridley in How Innovation Works. “It is the reason most people today live lives of prosperity and wisdom compared with their ancestors, the overwhelming cause of the great enrichment of the past few centuries, the simple explanation of why the incidence of extreme poverty is in global freefall for the first time in history: from 50 per cent of the world population to 9 per cent in my lifetime.”
As such, a careful analysis of how, when, and why innovation blossoms is more than warranted. Many technological advances help economies flourish, improving the quality of life along the way. Understanding why requires careful attention, which Ridley rigorously applies to his subject.
At its most basic, Ridley contends, innovation “means finding new ways to apply energy to create improbable things, and see them catch on. It means much more than invention, because the word implies developing an invention to the point where it catches on because it is sufficiently practical, affordable, reliable and ubiquitous to be worth using.”
A gifted storyteller, Ridley seamlessly weaves historical examples into the wider tapestry of his argument. From the polio vaccine to anti-malarial bednets to chlorinated water to e-cigarettes, Ridley humanizes his vignettes in the best sense: rather than worshipping the innovators behind these creations as incomprehensible geniuses, he illustrates how serendipity, incrementalism, combination, unpredictability, perseverance, and old-fashioned trial and error shaped them.
Equally important, however, is what stifles innovation: overweening governmental interference, generally in the form of regulation, which often suffocates promising developments. In the current Covid-19 crisis, though, the U.S. Food and Drug Administration’s easing of certain regulations points the way forward to a healthier relationship between those who invent and those who inspect.
Weak intellectual property protections can also stifle rapid technological progress, in spite of Ridley’s flawed arguments to the contrary. Public understanding of and access to groundbreaking technologies must be balanced against strong incentives for inventors to develop them. Striking this balance presents difficult challenges, but the basic patent bargain struck by all developed countries has created an environment congenial to tremendous advances. Unlike certain regulatory hurdles, the patent system, though also cumbersome, is well worth preserving to ensure that innovators remain motivated to share their designs while also profiting from them.
A first key characteristic of innovation is its sheer unpredictability, evident in the way seemingly unrelated events can change how technologies are used and adopted. Railway engines were originally designed in the early 1800s to ferry wagons of coal up hills, but they were not widely used, as they were considered too unreliable and impractical compared with horses. But the strain on the equestrian market driven by Napoleon’s military adventurism forced owners of coal mines to change their approach. Engineers working for the mines solved design problems facing railways, building multiple-cylinder engines and making smooth-wheeled locomotives run on smooth rail lines, defying critics who said this was impossible. Eventually, in 1825, the Northumberland father-and-son team of George and Robert Stephenson built a twenty-five-mile railway, and by the 1830s the iron horse was transporting passengers and freight not just through Britain but also through France, Germany, and the United States.
Likewise, the Wright brothers encountered literal and figurative headwinds while developing their powered flying machine. “Our skepticism is only as to the utilitarian value of any present or possible achievement of the aeroplane,” wrote Engineering Magazine in the wake of Wilbur Wright’s 1909 flight over the Hudson, six years after the more famous flight at Kitty Hawk. “We do not believe it will ever be a commercial vehicle at all.” For fifty years, this pessimistic prediction actually held up — before rapidly falling apart as commercial air transportation took off in the 1960s.
Other breakthroughs provoked, and ultimately embarrassed, naysayers. Motorola’s Martin Cooper, credited by many with inventing the mobile phone, said in 1981 that “cellular phones will absolutely not replace local wire systems. Even if you project it beyond our lifetimes, it won’t be cheap enough.” Yet the litany of false prophets does not, to his credit, dissuade Ridley from making his own predictions about the future of technologies like artificial intelligence, blockchain, and autonomous vehicles.
Innovation can also occur quite literally by accident. The single-turn screw propeller supplanted the paddle wheel in the mid-1800s as the primary means of powering steamboats because of a collision. A Londoner named Francis Smith had designed an engine that powered a screw with the spirals running around a wooden shaft in two complete turns, but while he was testing his design a collision lopped an entire screw-turn off the shaft, leaving only a single turn. This seemingly unhappy event, however, turned out to increase the boat’s speed, and it begat the single screw-turn design that propelled steamship technology for decades.
A more recent example of accidental innovation comes from a Spanish scientist, who noticed peculiar patterns in the genomes of certain salt-loving microbes in a lake and in 2003, in what Ridley calls a “lucky break,” matched them up to the gene of a bacteriophage virus — a discovery that would form the basis of the contemporary gene editing technology known as CRISPR.
Even ancient innovations like cooking and animal domestication most likely resulted from accidental (and extremely gradual) developments, and their contemporary equivalents like Google Search, Post-It notes, and Teflon emerged serendipitously.
“Innovation,” wrote Joseph Schumpeter in 1939, “combines factors in a new way.” For the alimentary creations of groundbreaking chefs like René Redzepi in Copenhagen, this can mean returning to the ancient localism of hunter-gatherers by combining pork-neck with the kinds of plants that grow in the pig’s natural habitat, such as bulrushes, malt, and violets. But combination is no less important for modern inventions like the iPhone, which for the first time successfully integrated a telephone, a camera, a touchscreen, and an Internet browser in a single device. Neither Redzepi nor Steve Jobs invented any of the components, but only they managed to fuse them into far more than the sum of their parts.
So too was the computer formed as an amalgam of multiple innovations. The ENIAC, which Ridley credits as the first fully formed computer — at once digital, electronic, programmable, and general-purpose — was “not so much invented as evolved through the combination and adaptation of precursor ideas and machines.” Similarly, Waze, Facebook, and Uber innovated by mixing and matching previously existing technologies. As Ridley memorably put it in a 2010 TED talk, innovation is “when ideas have sex.”
Many, if not most, innovations accrete gradually rather than springing fully formed from a eureka moment. The absence of drama, however, does little to detract from the significance of incremental breakthroughs.
Consider the history of the now widely used, genetically modified insect-resistant Bt crops. The story begins in 1909, when a German researcher stumbled upon Bacillus thuringiensis (Bt), a bacterium infecting caterpillars and moths that had infested a flour mill in Thuringia, shortly after a Japanese biologist had identified the same bacterium among silkworms. Gardeners soon began using spores of the bacterium to control pests in greenhouses, but it was unsuitable for large-scale application by farmers. With the tools of modern biotechnology, however, scientists in the late 1980s managed to genetically manipulate crops to incorporate the Bt toxin — a development that yielded “dramatically reduced crop loss, pesticide use and environmental damage,” Ridley writes.
In the same way, “the story of the light bulb,” Ridley asserts, “far from illustrating the importance of the heroic inventor, turns out to tell the opposite story: of innovation as a gradual, incremental, collective yet inescapably inevitable process.” Thomas Edison would surely agree. The Wizard of Menlo Park himself freely acknowledged (in what is surely an overstatement) that “my so-called inventions already existed in the environment — I took them out. I’ve created nothing. Nobody does. There’s no such thing as an idea being brain-born; everything comes from the outside.”
The automobile is another classic example of incremental, mutual invention. “Who invented the motor car running on an internal-combustion engine?” Ridley asks. After all, “Ford made it ubiquitous and cheap; Maybach gave it all its familiar features; Levassor provided crucial changes; Daimler got it running properly; Benz made it run on petrol; Otto devised the engine’s cycle; Lenoir made the first crude version; and de Rivaz presaged its history.” Other key inventions, including the telephone and radio, fit these descriptions as well.
If these characteristics of innovation promote its growth, then government often hinders it. Ridley observes that inventive creativity has historically flourished amidst fragmented governance, such as that of Renaissance-era Italy, while unified, centralized regimes, such as the Ottoman and Mughal Empires, stifled technological breakthroughs. In China, “the periods of explosive innovation coincided with decentralized government,” while strong empires like the Ming impeded progress.
In the United States, too, inventiveness has typically thrived in the absence of governmental subsidy, including in such areas as air travel, telecommunications, and electricity. Economists such as Mariana Mazzucato, along with many tech policy writers, highlight the role that government-led research plays in promoting breakthroughs, with the Apollo program as the paradigmatic case. But Ridley persuasively shows how the private sector accounts for the lion’s share of innovation. Indeed, a 2003 OECD report found that R&D performed by businesses, rather than public institutes, was chiefly responsible for economic growth. The healthier approach, Ridley contends, is to fund basic science research for its own sake, not as industrial policy seeking to produce technological applications: “Science should be seen as the fruit rather than the seed.”
While governments can overreach in trying to force innovation through funding, they can also dampen it through regulation. For instance, the European Union’s foolhardy restrictions on genetically modified organisms deprive hundreds of millions of people of safe, cheap, healthy, environmentally friendly food through a campaign of “demonization and delay” motivated by the ill-considered precautionary principle. The Federal Communications Commission’s unnecessarily tight restriction of cellular broadband spectrum in the late 1940s stifled mobile telephony for decades.
Medical regulation is a particularly thorny problem. A certain degree of oversight ensures patient safety and enshrines ethical imperatives, but the FDA’s stringent requirements often go too far. According to a 2010 survey of medical technology companies, the premarket process for devices posing a low to moderate risk to patients takes an average of ten months and $24 million in FDA-related costs. A 2016 BusinessEurope report warned that the EU’s Medical Devices Directive “will inevitably slow down innovation and make [devices] more expensive, increasing the mounting challenges for public healthcare.” (The directive has since been replaced with new regulation.)
But if the current pandemic has had a silver lining, it can be found in the relaxation of otherwise extravagantly intrusive scrutiny by regulatory agencies like the FDA. Fighting the virus has provoked a fevered effort to temporarily roll back these impediments, though scholars at the Mercatus Center, proposing a “Fresh Start Initiative,” argue that we should make this newfound openness a more permanent part of the regulatory landscape. In a Wall Street Journal article in May, Ridley wrote about “the regulatory delays and hurdles that have now been hastily swept aside to help innovators in medical devices and therapies,” hoping the same spirit can profitably be applied to other regulation. “We must find a way,” he wisely urges in the book, “to reform the regulatory state so that while keeping us safe it does not prevent the simple process of trial and error on which all innovation depends.”
Intellectual property plays a pivotal role in promoting technological advances, but here Ridley’s analysis falters. In discussing the demerits of the patent system, Ridley reprises some of the arguments from his 2010 book The Rational Optimist. But these claims have gotten no stronger with age, as Ridley consistently exaggerates the costs and underestimates the benefits of a robust intellectual property regime.
Take the advent of the steam engine. Ridley blames patents for “getting in the way of improvement,” because, first, James Watt had to “develop an alternative system” to the crank-and-flywheel mechanism invented by James Pickard and, second, because Watt himself “was an enthusiastic defender of his own patents.”
But a closer examination of the history reveals the shortcomings of these arguments. First, Watt could always have developed further steam-engine improvements on the basis of Pickard’s mechanism and could have protected those inventions with patents, even if a part of any revenues derived from such an improved system would have had to flow to Pickard in recognition of his contributions.
Second, Ridley allows that “just how much Watt’s litigiousness delayed the expansion of steam as a source of power in factories is a hotly contested issue” but insists that steam engine innovation burgeoned after Watt’s patent expired in 1800. But, in their 2009 article “Watt, Again?,” George Selgin and John Turner found, on the basis of earlier research, no significant increase in steam engine efficiency in the early 1800s and a higher-than-previously-thought rate of improvement before Watt’s patent expired.
In addition, even if Watt somehow thwarted progress, a strong patent system could not have been to blame. As James Bessen and Michael J. Meurer argued in their 2008 book Patent Failure: How Judges, Bureaucrats, and Lawyers Put Innovators at Risk, at the turn of the nineteenth century in Britain, “patent litigation was costly and risky,” and British patents were issued as formalities without being subject to examination for novelty or inventiveness. Thus, pace Ridley, any infirmities in the British patent system in Watt’s time derived from its underprotectiveness.
In addition, Ridley decries the personal and financial toll patents take on the inventors themselves. “Again and again,” he laments, “I have documented in this book how innovators wrecked their lives battling to establish or defend patents on their innovations.” But this is simply another way of saying that innovators like Watt, Morse, and Marconi fought like the devil to capture the monetary benefits of their creations — a perfectly reasonable and responsible choice. Absent patents, they might not have bothered inventing in the first place.
There is another downside to doing away with patents: without patent protection, the only way innovators can commercially exploit and legally guard their ideas is by never revealing them, preserving them as trade secrets. Indeed, in many industries, including software development, some inventors opt against filing patent applications and instead carefully keep their secret algorithms under (digital) lock and key. But not every trade secret can be preserved, as some technologies are more vulnerable to reverse-engineering than others. In addition, trade secrets prevent the dissemination of the knowledge animating those innovations. By contrast, widespread disclosure of technological ingenuity is precisely the goal of the fundamental patent tradeoff: a limited period of exclusive use of an invention in exchange for explaining precisely how it functions.
Ridley argues that “the dismal failure of the pharmaceutical industry to find any effective new drugs for diseases like Alzheimer’s, or even to sustain its rate of innovation generally, hardly testifies to the effectiveness of the intellectual-property regime,” and claims, citing a “successful tech investor,” that innovation would be sustained absent patents on drugs. This argument is shockingly ungenerous, as it disregards the tremendous breakthroughs in drugs developed by the pharmaceutical industry in recent decades to help treat cancer, HIV/AIDS, and hepatitis — all of which required billions of dollars in R&D investment and none of which would have been financially viable without a robust patent system.
As an alternative, Ridley has proposed a prize-based system in which governments, industry, and philanthropists would pool resources to reward innovators with cash. But while prizes can occasionally fill a gap where the market fails, as with vaccine development for rare diseases or for widespread illnesses in developing countries, they have proven generally unwieldy, inequitable, and ineffective. A 2017 study of prizes awarded between 1750 and 1850 by the Royal Society for the Encouragement of Arts, Manufactures and Commerce concluded that the Society “itself became disillusioned with the prize system, which they ultimately recognized had done little to promote technological progress and industrialization.” Instead, the society “switched from offering prizes towards supporting patents and lobbying for reforms in the patent system.”
Ultimately, Ridley correctly and eloquently argues that innovation boils down to freedom: the liberty to “exchange, experiment, imagine, invest and fail” and to avoid “expropriation or restriction by chiefs, priests and thieves.” For all the unpredictability, incrementalism, combination, and serendipity that nourish technological advances, freedom is their bread and butter. Policymakers the world over should redouble their commitment to maximizing the autonomy of innovators and to clearing needless regulatory hurdles.
Michael M. Rosen is an attorney and writer in Israel and an adjunct fellow at the American Enterprise Institute.
Michael M. Rosen, “Promoting the Useful Arts,” The New Atlantis, Number 62, Fall 2020, pp. 120-127.
Header Image: James Eckford Lauder, James Watt and the Steam Engine: The Dawn of the Nineteenth Century (1855) via Wikimedia