Chapter 13: The Technological Imbalance

This book has been about the imbalances of American power in the next decade and their effect on the world. I've focused on economic and geopolitical imbalances and made the argument that they are transitory and can be corrected. The book would be incomplete, however, without considering two other major issues impinging on the decade ahead, outside of economics and geopolitics: demography and technology. Here too I see a decade of imbalance. Let's end by considering these.

Economic cycles of boom and bust can be driven by speculation and financial manipulation, as they were during the first decade of this century just ending. But at a deeper level, economic expansion and contraction are driven by demographic forces and by technological innovation.

During the decade to come, we will see the ebbing of the demographic tide that helped drive the prosperity of the immediate postwar period. The age cohort known as the Baby Boom, the children born during the Truman and Eisenhower administrations, will be in their sixties, beginning to retire, beginning to slow down, beginning to get old. As a result, the same demographic bulge that helped create abundance a half century ago will create an economic burden in the years ahead.
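
To see the shape of that burden in rough numbers, consider a back-of-the-envelope calculation of the dependency ratio, sketched below in Python. Every figure here is assumed purely for illustration, not drawn from census data:

```python
# Illustrative dependency-ratio arithmetic (all figures assumed, not census data).
# The same cohort that once swelled the workforce swells the retiree rolls instead.
workers_peak, retirees_peak = 120_000_000, 30_000_000  # assumed: Boomers mid-career
workers_2020, retirees_2020 = 125_000_000, 55_000_000  # assumed: Boomers retiring

print(f"Mid-career: {workers_peak / retirees_peak:.1f} workers per retiree")   # 4.0
print(f"Retirement: {workers_2020 / retirees_2020:.1f} workers per retiree")   # ~2.3
```

Even under these made-up numbers, the direction is clear: each retiree's support must be spread across far fewer workers.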

In the 1950s, the Baby Boomers helped create demand for millions of baby strollers, tract houses, station wagons, bicycles, and washer-dryers. During the 1970s, they began to seek work in an economy not yet ready for them. As they applied for jobs, married and had children, bought and borrowed, their collective behavior caused interest rates, inflation, and unemployment to rise.

As the economy absorbed these people in the 1980s, and as they matured in the 1990s, the Boomers pushed the economy to extraordinary levels of growth. But during the next ten years, the tremendous spurts of creativity and productivity that the Boomers brought to American life will draw down, and the economy will start feeling the first rumblings of the demographic crisis.

In assessing the geopolitical implications of how economics and demography intersect over the next ten years, I would call attention most urgently to the crisis in technological innovation that the passing of the Boomers brings into sharp relief. As the Boomers age, their consumption will soar even as their production disappears, and the elderly will require healthcare and end-of-life care at a level never seen before.

The 2010s will be a period in which technology lags behind needs. In some cases, existing technologies will reach the limits of how far they can be stretched, yet replacement technologies will not be in the pipeline. Which isn’t to say that there won’t be ample technological change—electric cars and new generations of cell phones will abound. What will be in short supply are breakthrough technologies to solve emerging and already pressing needs, the kinds of breakthroughs that drive real economic growth.

The first problem is financial, because the development of radically new technologies is inherently risky, both in implementing new concepts and in matching the product to the market. The financial crisis and recession of 2008–2010 have reduced the amount of capital available for technological development, along with the appetite for risk. The first few years of the next decade will be marked not only by capital shortages but by a tendency to deploy available capital in low-risk projects, with the dollars available flowing to more established technologies. This will ease up in the second half of the decade globally, and sooner in places like the United States. Nevertheless, given the lead time in technology development, the next generation of notable technological breakthroughs won't emerge until the 2020s.

The second problem affecting the rate of innovation, oddly enough, involves the military. In the 19th century, the development of the steam engine and the growth of the British navy (and imperial reach) moved hand in hand. In the 20th century, the United States was the engine of global technological development, and much of that innovation was funded and driven by military acquisitions, almost all of it with some civilian spin-off application. Aircraft and radio were both heavily subsidized by the military, subsidies that led to the birth of the airline and broadcasting industries. The interstate highway system was first conceived as a military project to facilitate the rapid movement of troops in case of Soviet attack or nuclear catastrophe. The microchip was developed for use in the small digital computers that guided both nuclear missiles and the rockets needed to put payloads in space. And of course the Internet, which entered public consciousness in the 1990s, began as a military communications project in the 1960s.

Wars are times of intense technological transformation, because societies invest, sometimes with massive borrowing, when and where matters of life and death are at stake. The U.S.-Jihadist war has driven certain developments in unmanned surveillance and attack aircraft, as well as in database technology, but the profound transformations of World War II (radar, penicillin, the jet engine, nuclear weapons) or the Cold War (computers, the Internet, fiber optics, advanced materials) are lacking. The reason is that, ultimately, the conflicts in Afghanistan and Iraq are light-infantry wars that have required extrapolations of existing technologies but few game-changing innovations.

As funding for these wars dries up, research and development budgets will take the first hits. This is a normal cycle in American defense procurement, and growth will not resume until new threats are identified over the next three to four years. With few other countries working on breakthrough military technologies, this traditional driver of innovation will not begin bearing civilian fruit until the 2020s and beyond.

The sense of "life or death" that should drive technological innovation in the coming decade is the crisis in demographics and its associated costs. The decline in population that I wrote about in The Next 100 Years will begin to make its appearance in a few places in this decade. However, its precursor, an aging populace, will become a ubiquitous fact of life. The workforce will contract, not only as a function of retirement but as increasing educational requirements keep people out of the market until their early or mid-twenties.

Compounding the economic effects of a graying population will be an increasing life expectancy coupled with an attendant increase in the incidence of degenerative diseases. As more people live longer, Alzheimer’s, Parkinson’s, debilitating heart disease, cancer, and diabetes will become an overwhelming burden on the economy as more and more people require care, including care that involves highly sophisticated technology.

Fortunately, the one area of research that is amply funded is medical research. Political coalitions keep federal funding sufficiently robust to move discoveries from basic research to technological application by the pharmaceutical and biotech industries. Still, the possibility of imbalance remains. The mapping of the genome has not provided rapid solutions to degenerative diseases, nor has anything else, so over the next ten years the focus will be on palliative measures.

Providing such care could entail labor costs that would impose a substantial drag on the economy. One alternative is robotics, but the development of effective robotics depends on scientific breakthroughs in two key areas that have not evolved in a long time: microprocessors and batteries. Robots that can provide basic care for the elderly will require massive computing power as well as enhanced mobility, yet the silicon chip is reaching the limits of miniaturization. Meanwhile, the basic programs needed to guide a robot, process its sensory data, and assign tasks can't be supported on current computer platforms. There are a number of potential solutions, from biological materials to quantum computing, but work in these areas has not moved beyond basic research.
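
To see why the limits of silicon miniaturization matter on this timescale, here is a crude projection in Python. The starting feature size, the scaling rate, and the atomic floor are all assumptions chosen only to illustrate the trend:

```python
# A crude, illustrative projection of silicon feature-size scaling.
# Assumptions: ~32 nm features circa 2010; transistor density doubling
# every two years (so linear features shrink by sqrt(2) per step); and
# a hard floor near the ~0.2 nm diameter of a silicon atom.
feature_nm = 32.0
atom_nm = 0.2
year = 2010
while feature_nm > atom_nm:
    year += 2
    feature_nm /= 2 ** 0.5
print(f"Atomic floor reached around {year}")  # ~2040 under these assumptions
```

Long before the literal atomic floor, of course, heat, leakage, and fabrication costs bite; the sketch shows only that the runway is finite.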

Two other technological strands are converging that will get bogged down in the next decade. The first is the revolution in communications that began in the 19th century. This revolution derived from a deepening understanding of the electromagnetic spectrum, a scientific development driven in part by the rise of global empires and markets. The telegraph provided near-instantaneous communications across great distances, provided that the necessary infrastructure, the telegraph lines, was in place. Analog voice communications in the form of the telephone followed, after which infrastructure-free communications developed in the form of wireless radio. This innovation subsequently divided into voice and video (television), which had a profound effect on the way the world worked. These media created new political and economic relations, allowing both two-way communications and centralized broadcast communications, a "one to many" medium that implicitly carried great power for whoever controlled the system. But the hegemony of centralized, "one to many" broadcasting has come to an end, overtaken by the expanded possibilities of the digital age. The coming decade marks the end of a sixty-year period of growth and innovation in even this most advanced and disruptive digital technology.

The digital age began with the revolution in data processing required when World War II created massive challenges in personnel management. Data on individual soldiers was entered as non-electronic binary code onto computer punch cards for sorting and identification. After the war, the defense department pressed the transformation of this primitive form of computing into electronic systems, creating a demand for massive mainframes built around vacuum tubes. These mainframes entered the civilian market largely through the IBM sales force, serving businesses in everything from billing to payroll.
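
The punch-card idea is simple enough to show in a few lines. The sketch below, in Python, is a toy illustration of the principle, not the actual 1940s card format, and every field in it is invented:

```python
# Toy illustration of the punch-card principle: each column either has a
# hole (1) or not (0), so a record becomes a fixed-width, sortable bit pattern.
# The fields are invented for illustration; this is not the real wartime format.
def encode_record(age: int, infantry: bool, overseas: bool) -> str:
    bits = f"{age:07b}"                  # seven columns encode an age from 0 to 127
    bits += "1" if infantry else "0"     # one column per yes/no attribute
    bits += "1" if overseas else "0"
    return bits

print(encode_record(age=24, infantry=True, overseas=False))  # -> 001100010
```

Sorting machines could then pull every card with a hole in a given column, which is essentially all that "sorting and identification" amounted to.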

After the development of the transistor and the silicon-based chip, which allowed for reductions in the size and cost of computers, innovation moved to the West Coast and focused on the personal computer. Where mainframes were concerned primarily with the manipulation and analysis of data, the personal computer was used primarily to create electronic analogs of functions that already existed: typewriters, spreadsheets, games, and so on. This in turn evolved into handheld computing devices and computer chips embedded in a range of appliances.

In the 1990s, the two technological tributaries, communications and data, merged into a single stream of information in electronic, binary form that could be transmitted over existing telephone circuits. The Internet, which the defense department had developed to transmit data between mainframe computers, was quickly adapted to the personal computer and the transmission of data over telephone lines using modems. The next innovation was fiber optics for transmitting large amounts of binary data, as well as extremely large graphics files.

With the advent of graphics and data permanently displayed on web sites, the transformation was complete. The world of controlled, “one to many” broadcasting of information had evolved into an infinitely diffuse system of “many-to-many” narrowcasting, and the formerly imposed sense of reality provided by 20th century news and communications technology became a cacophony of realities.
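
The structural difference between the two regimes can be captured in a line of arithmetic: a broadcast system needs one channel per listener, while in a many-to-many network every pair of participants can potentially converse, so the number of possible connections grows with the square of the population. A minimal sketch in Python, purely illustrative:

```python
# Broadcast ("one to many") versus narrowcast ("many to many") connection counts.
def broadcast_links(n: int) -> int:
    return n - 1              # one central sender reaching everyone else

def narrowcast_links(n: int) -> int:
    return n * (n - 1) // 2   # every pair of participants can talk directly

for n in (100, 10_000, 1_000_000):
    print(n, broadcast_links(n), narrowcast_links(n))
# At a million participants: ~1e6 broadcast links versus ~5e11 possible conversations.
```

No single party can control, or even monitor, a space whose connections multiply that fast, which is why the cacophony of realities follows almost mechanically from the architecture.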

The personal computer had become not only a tool for carrying out a series of traditional functions more efficiently but also a communications device, a replacement for both conventional mail and the telephone, as well as a research tool. The Internet became a system that combined information with sales and marketing, from data on astronomy to the latest collectibles on eBay. The web became the public square and the marketplace, tying mass society together and fragmenting it at the same time.

The portable computer and the analog cell phone had already brought mobility to certain applications. When they merged in the personal digital assistant, with computing capability, Internet access, voice and text messaging, and instant synchronization with larger personal computers, we achieved instantaneous, global access to data. When I land in Shanghai or Istanbul and my BlackBerry instantly downloads my emails from around the world, then allows me to read the latest news as the plane taxis to the gate, we have reached a radical new point, one that approximates what technology guru Kevin Kelly calls the "hive mind." The question has ceased to be what technology will allow me to do and has become what I will do with the technology.

All well and good, but we are now at an extrapolative and incremental stage in which the primary focus is on expanding capacity and finding new applications for technology developed years ago. This is a position similar to the plateau reached by personal computers at the end of the dot-com bubble. The basic structure was in place, from hardware to interface. Microsoft had created a comprehensive set of office applications, wireless connectivity had emerged, e-commerce was up and running at Amazon and elsewhere, and Google had launched its search engine. It is very difficult to think of a truly transformative technological breakthrough that has occurred in the past ten years. Instead of breaking new ground, the focus has been on evolving new applications, such as social networking, and on moving previous capabilities to mobile platforms. As the iPad demonstrates, this effort will continue. But ultimately this is rearranging the furniture rather than building a new structure. Microsoft, which transformed the economy in the 1980s, is now a fairly staid corporation protecting its achievements. Apple is inventing new devices that make what we already do more fun. Google and Facebook are finding new ways to sell advertising and make a profit on the Internet.

Radical technological innovation has been replaced by a battle for market share, by finding ways to make money hawking small improvements as major events. Meanwhile, the dramatic increases in productivity once driven by technology, which helped in turn to drive the economy, are declining, and that decline will have a significant impact on the challenges we face in the decade ahead. With basic research and development down, and corporate efforts focused on making incremental improvements in the last generation's core technology, the primary impetus for global growth is limited to putting existing technologies into the hands of more people. With the sale of cell phones having already reached the saturation point, and corporations reluctant to invest in unnecessary upgrades, this is a problematic prescription for growth.

This is not to say that the world of digital technology is moribund. But computing is still essentially passive, manipulating and transmitting data. The next and necessary phase is to become active, using that data to manipulate and change reality, with robotics as a primary example. Moving to that active phase is necessary for achieving the massive boost in productivity that will compensate for the economic shifts associated with the demographic change about to hit.

The U.S. Defense Department has been working on military robots for a long while, and the Japanese and South Koreans have made advances in civilian applications. However, much scientific and technological work remains to be done if this technology is to be ready when it will be urgently needed, in the 2020s.

Even so, relying on robotics to solve societal problems simply raises another vexing question: how are we to power these machines? Human labor by itself is relatively low in energy consumption. Machines emulating human labor will use large amounts of energy, and as they proliferate in the economy (much as personal computers and cell phones did), the increase in power consumption will be massive.
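
A back-of-the-envelope comparison makes the scale clear. All of the figures below are assumptions chosen for illustration, with a human worker at roughly 100 watts of sustained output and a hypothetical mobile care robot at 1,500 watts:

```python
# Illustrative energy arithmetic for robotic labor (all figures assumed).
human_watts = 100           # rough sustained metabolic output of a worker
robot_watts = 1_500         # assumed draw of a hypothetical mobile care robot
fleet = 10_000_000          # assumed robots deployed economy-wide
hours_per_day = 16          # assumed duty cycle

extra_gw = (robot_watts - human_watts) * fleet / 1e9
daily_gwh = extra_gw * hours_per_day
print(f"Added demand: {extra_gw:.0f} GW, about {daily_gwh:.0f} GWh per day")
# -> Added demand: 14 GW, about 224 GWh per day, roughly the continuous
#    output of a dozen or more large power plants.
```

The exact numbers matter far less than the order of magnitude: electrify human-scale labor and you have created a new, very large class of energy demand.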

The question of how to power technological innovation raises, in turn, the great and heated debate over whether the increased use of hydrocarbons is affecting the environment and causing climate change. While this question engages the passions, it really isn't the most salient issue. The question of climate change raises two others that demand astute presidential leadership: first, is it possible to cut energy use, and second, is it possible to continue growing the economy using hydrocarbons, particularly oil?

There is an expectation built into public policy that it is possible to address the issue of energy use through conservation. But much of the recent growth in energy consumption has come from the developing world, which makes solving the problem by cutting back wishful thinking at best.