What Happened to the ‘Innovation, Disruption, Technology’ Dividend?

 
 
SUMMING UP. Jim Heskett’s readers are divided on whether we are seeing productivity dividends from the latest round of technological innovation.
 
 
by James Heskett

How Patient Should We Be In Waiting for the Tech Productivity Dividend?

Respondents to this month’s column cited a number of factors to explain why there has been no discernible increase in the rate of improvement in US productivity, despite significant effort that has produced innovations in information and other technologies.  Some questioned the data.  Others counseled patience.

David Caulfield sees productivity improvement, but not necessarily among workers.  In questioning the way we measure productivity, he commented that “Much of the productivity increases I see in the field of analytics are focused on improved utilization of productive equipment and materials… It is hard to get much GDP growth or wage growth this way, but corporate profits are certainly going up.  Perhaps that is a better way to see the impact.  Another measure that should be useful is GDP per unit of energy consumed.”  Dan Wallace asked whether or not we have “the wrong expectations.”  As an example, he cited the sharing economy, where we achieve “better asset utilization and customer experience, but not necessarily increased ‘productivity.’”

Dave C. suggested that too much technology is being diverted to creating ever-faster variations on basic ideas rather than to “an economy based on new products/platforms/ideas.”  Donald Shaw cited “problems in finding qualified people to hire.”  He also said, “It appears to me that our systems of education and government cannot change fast enough for the U.S. to have the workers it needs in any near-term way.”  Tema Frank went even further, commenting that “… we are still reliant on the human factor… so a lot of potential productivity is wasted through bad management of human resources.”  And Murray Kenney laid the blame on the low productivity increases associated with certain industries, like hospitality, retail, and, increasingly, education and health care.  Aim questioned whether productivity increases are even the appropriate goal, commenting that “productivity increases will always come at an expense of a human being.”

Others challenged the data documenting productivity trends.  Their position, basically, was that there is no need for patience: the productivity dividend is already happening.  For example, David Wittenberg cautioned us to “Beware of snapshot analysis amid a trend! … Globally, the trend continues unabated, although some high-productivity economies may have seen their growth rates slow.”  And Bent Dalager expressed his opinion that the picture is changing rapidly for the better with the introduction of “software robotics” in such industries as “financial services and telco’s.”

Anil Gupte’s comment typified those urging patience.  As he put it, “The computer was created by Babbage in the 19th century… We only started seeing the pervasive benefits from computing in the last 10-15 years.  Today’s innovations will bring real productivity in half a century or more.”  That prompts the question: How patient should we be in waiting for the tech productivity dividend?  What do you think?

Original Column

For years we have been regaled with prospects of outsized productivity increases in the United States such as those that actually accompanied the Industrial Revolution, the Second World War, and the period from the mid-1950s to the mid-1970s. During another productivity bonanza, the one that extended from 1995 to about 2005, we speculated in one of the first of these columns on the possibility that organizations producing new information technologies would anchor a New Economy (note the capitalization), one that defied the old rules for productivity, income increases, and economic growth.

We are told by widely quoted scholars such as Erik Brynjolfsson and Andrew McAfee that innovation and new information technologies are creating a "second machine age" that holds the key to an even brighter future. Others predict that "disruption," perhaps the most overworked term in business English today, will foster competition, make long-term strategic planning a questionable management activity, and bring new ideas to market with increasing speed.

The fact is that the expected productivity increases so critical to wage and economic growth haven't occurred, at least not in the US. We've waited at least a decade for this new era of information innovation to yield dividends. Why haven't they materialized? Which of the following explanations makes the most sense to you?

First is the measurement argument, which goes like this: The way we measure productivity increases doesn't account for such things as quality of life (although greater convenience should be reflected somehow in productivity). Others argue that the economic climate isn't quite right; it's so volatile (downturn in 2000, recovery, Great Recession in 2008, recovery) that it makes trend analysis (and consistent productivity growth) impossible.
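For readers who want the arithmetic behind the measurement argument, here is a minimal sketch, using hypothetical numbers, of the standard definition of labor productivity as output per hour worked and the usual approximation of its growth rate:

\[
\text{Labor productivity} = \frac{\text{Real output}}{\text{Hours worked}}, \qquad
\%\Delta\,\text{Productivity} \approx \%\Delta\,\text{Output} - \%\Delta\,\text{Hours}
\]

If, say, real output grows 2.5 percent in a year while hours worked grow 1.5 percent, measured productivity growth is roughly 1.0 percent; gains in convenience or quality of life that never show up in measured output leave that figure unchanged, which is the nub of the measurement critique.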

A related argument is that when labor is relatively cheap, growing companies are encouraged to hire more low-productivity help rather than train the employees they have. A fourth possibility is that some sectors of the economy, particularly services, pose a drag on productivity increases, although our own analysis of productivity trends over the past three decades does not provide consistent evidence of this. Or is it that profitable organizations today are not investing very much of their profits in new technologies, instead returning profits to shareholders or holding cash? Or is too much of our innovation devoted to "cool apps" that actually reduce productivity?

Still others argue for patience. There is a learning curve for any new technology. The use of new information technologies in medicine is a favorite piece of evidence cited in support of this argument. Brynjolfsson and McAfee cite the importance of changes in the workforce alongside the introduction of new technologies if the potential they forecast is to be achieved. Give users of the new technologies time to learn, and productivity will increase.

What's really going on here? It's exciting to hear about innovation (although author Martin Ford speculates on "a jobless future" resulting from the introduction of such things as robotics). New information-based technologies seem to make a lot of sense. And if your company is still engaging in long-term strategic planning and not pursuing a strategy of disruption these days, some conclude that it has already lost the competitive race.

But how much of the impact of such things as 3D printing, robotics, and the information technology that fuels the sharing economy is hype? What happened to the "innovation, disruption, technology" dividend? What do you think?

To Read More:

Erik Brynjolfsson and Andrew McAfee, The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies (New York: W. W. Norton & Company, 2014).

Martin Ford, Rise of the Robots: Technology and the Threat of a Jobless Future (New York: Basic Books, 2015).

James L. Heskett, W. Earl Sasser, Jr., and Leonard A. Schlesinger, What Great Service Leaders Know and Do: Creating Breakthroughs in Service Firms (Oakland, CA: Berrett-Koehler Publishers, 2015).

Robert M. Wachter, "Why Health Care Tech Is Still So Bad," The New York Times, March 22, 2015, p. SR5.

Comments

    • Dan Wallace
    • Partner, Tailwind Discovery Group
    Jim, of the specific technologies you mentioned, Robotics seems like the only one that would lead directly to an increase in productivity (classically defined as revenue or GDP per labor hour). And that's not a new phenomenon. Manufacturing robots have been displacing labor for decades. I don't spend time in the world of Big Data, and so won't pretend to know what people are doing with it. It seems like it would lead to higher margins (better slicing and targeting), and perhaps occasionally to the identification of a new market.

    Among the technologies you mentioned, the most visible are those driving de-verticalization of industries and, especially, the sharing economy. These result more in the shifting of demand than in the creation of new demand, leading to better asset utilization and customer experience, but not necessarily increased "productivity." Look at Uber. Cars that would otherwise be idle are now being utilized, and the need for a specialized class of cars (taxis) is going away; thus, better asset utilization. The experience is better because customers can provide direct feedback that's used by other customers. But in terms of actual productivity (revenue/labor hour), the price is actually lower, the ride still takes just as long, and there is still a driver in every car.

    So perhaps we have the wrong expectations?
    • David Wittenberg
    • CEO, The Innovation Workgroup
    Beware of snapshot analysis amid a trend! Beware of local analysis amid a global change! Beware of generalizing from a particular example!

    Productivity equals value generated per worker hour. Technological advances have already multiplied this figure, and I expect the trend to continue. Globally, the trend continues unabated, although some high-productivity economies may have seen their growth rates slow.

    Three-D printing is in its infancy. As it becomes refined, I see it providing huge productivity increases in high-value/low-quantity manufacturing, though I'm not yet convinced that it will ever replace mass-production lines.

    To your question, Jim, 3-D is just one technology amid a broad trend. Its moment in the sun will come. The productivity-multiplying effects of software, mobile telecommunications and robotics continue to spread. The delayed effects of 3-D or any other single technology don't disprove the trend.
    • Bent Dalager
    • Managing Director, Accenture
    Actually, Robotics is now beginning to make a big difference. In practice! Within the last 12 months this has moved from a few experiments to replacing thousands of jobs, especially within business process outsourcing firms, and recently it has begun within financial services and telco's. For years the talk has been going on with no results, but it's happening now. We have replaced about 8000 FTEs just within Accenture itself in the last year, and this is spreading like wildfire now. Right after the simple software robotics are the cognitive computing tools. They are beginning to catch on as well. The first wave of companies has begun to use tools/"avatars" like Amelia for service requests. It's right here, right now.
    • Aim
    • DSV, KOC

    Professor,

    I think there is a saturation point of "quality of life" for a human being. Quality of life is also a relative term which behavioral economists have explained over various experiments.

    The bottom line is that there are about 800 million people worldwide who go to bed hungry. You might think this does not affect the U.S.; however, with the onset of globalization, the companies operating worldwide are seeking profits accordingly. The hungry people imply a bottleneck in the velocity of money. The wealth continues to search for more wealth, pressured by peers or conditions. And given that growth in the U.S. has saturated, the global markets have larger meaning in the financial statements of corporations. This might also be one reason for the Fed's delay of a rate hike: the world is coming to a grinding halt!

    Furthermore, productivity increase will always come at an expense of a human being. It is because humans are inadequate that the productivity search exists. Thus talking of increased income as a result of productivity increases is a misnomer. The advent of instrumentation/automation will always squeeze the poor and the intellectually challenged. You may be surprised, but in comparison with a microprocessor, 99.99% of the population is intellectually challenged.

    I fully agree with Elon Musk, Prof. Hawking, and others who argue that AI is against human values.

    Best regards,

    Aim

    • Dave C.
    • Market analyst

    Maybe before we decide whether these new technologies make us more productive, i.e., the dividend, we need to change the definition of productive given new technologies.

    In technology, we no longer take several years to design and introduce a new product, a mobile phone, say. Instead, we iterate older versions with some new stuff thrown in so that we have constant releases, once called point upgrades.

    So is a tech economy based on point upgrades making us more productive than an economy based on new products/platforms/ideas? It generates more income at less cost, creating more productivity. But little dividend, as I see it.

    • Donald Shaw
    • Donald E. Shaw, P.E.

    All of the things you mentioned probably play a part. There are a few things that have not been mentioned. One is problems in finding qualified people to hire. It is not possible to take a displaced industrial worker and turn them into a worker in a modern factory using high-tech machines and sophisticated software, at least not overnight. Programs to retrain workers have been suffering from the same problem since the demise of steel in the 1980s. That problem is that by the time a training program is developed and tested, the work that is the object of the program has changed or disappeared. With technical information doubling every year or two, how can such programs be developed effectively? It is like shooting at the proverbial moving target that keeps moving at an accelerating pace.

    Worker skills have changed with a significant emphasis on problem solving, decision making and collaboration. There are questions as to whether the public schools are keeping up with the pace of change or even can given regulations that are usually outdated by the time they are written. Like your mention of long-term strategic plans being out of date in an era of rapid change, our system of government may be out of date for the same reasons--too long-term in how it operates, if it operates at all.

    It appears to me that our systems of education and government cannot change fast enough for the U.S. to have the workers it needs in any near-term way.

    • Anil Gupte
    • Founder, Layer 3 Media, Inc.
    Have patience. The computer was created by Babbage in the 19th century. Even if you discount that, the first modern computer, the ENIAC, was created about 75 years ago. We only started seeing the pervasive benefits from computing in the last 10-15 years. Today's innovations will bring real productivity in half a century or more...
    • Doug Elliott
    • Intellectual Property Consultant, D. Elliott & Associates

    Successful innovation increases productivity through processes or products. Innovative technologies are creative outcomes based on the human pursuit of innovation.

    Economically valuable creations become intellectual property. Intellectual property, like all forms of property, can be owned, sold or shared.

    The savings and gains of productivity can also be shared or owned. Today, free market capitalism favors ownership of innovation, intellectual property, and the profits or gains that attach to them. And the legal definition of ownership is the right to exclude others from a property or the benefits that attach to the property.

    It was not always this way with innovation. The Western post-World War II mentality was one where 'peace begets prosperity for all'. That prosperity was predicated on a victorious nationalized economy that spawned quantum leaps in innovation, technology and productivity. But it came on the heels of horrendous military destruction which, in turn, had been spawned by a seemingly intractable global depression.

    In the 1950s, an equitable sharing between labor and capital of the peacetime fruits from innovations that arose from the past unpleasantness was socially and morally acceptable, if not a political necessity. Also, it was economically perfect: rising real wages, rising profitability, and rising standards of living. And bigger businesses and big unions.

    But it didn't last; it never does. 'Sharing the wealth' breeds entitlement, entitlement saps productivity, and low productivity erodes profits. As the rest of the world healed from the war, new competitors came into the marketplace. Think Germany, Japan, Korea and the Chinas.

    By the 1980s, privatization and competition are deemed necessary to cure the 'bloat'. Self-interest displaces the public interest as the driver of productivity and profits. This leads to increasing private ownership of innovation. What is owned accrues to the interest of the owner. Thus the new economic fruits of productivity flow to private capital to the exclusion of the labor force. So begins the current income inequality.

    Can innovation create shorter work weeks and rising living standards? Of course. There is historical proof of it. The fact that this is not happening now is the source of our social discontent. This remains one of the Big Questions for the current generation of leaders: how to equitably balance the private and public interests in productivity gains. Is it possible to create a more pluralistic self-interest in the capitalization of intellectual property? Definitely. Is it possible to pay better wages to use the new productivity of innovation? Yes, if sharing some of the profits stimulates a greater consumption of the fruits.

    In any case, let's hope so. Personally I'm not in favor of economic progress through global depression and global war.

    • Tema Frank
    • Chief Instigator, Frank reactions
    Part of the problem is that even in an era when we use computing and robotics much more, we are still reliant on the human factor. We need people to analyze and understand consumer needs, to program the machines, to teach others how to use them, to sell them, to service them, etc. Yet managing people well is very hard. Few companies actually do it well. Why is it that we can so easily name the few standouts such as Zappos, Amazon, Google and Disney? Because they are exceptional. So a lot of potential productivity is wasted through bad management of human resources.
    • Kim Johnson
    • EVP Managing Director, HERO Marketing
    I recommend you check out Autodesk University 2015. They will be addressing these topics and I think you would be very enlightened.
    • Murray Kenney
    • At home/unemployed, None
    Maybe it's a combination of 1) rapid increases in low-wage employment in industries like hotels/restaurants/retail, where the labor component has already been cut to the lowest possible level (can you get rid of anyone at McDonald's? and just try finding a worker at Home Depot); and 2) growth in health care and education, two massive sectors where the implementation of technology has been halting at best for a variety of structural reasons. The industries, like manufacturing, where technology has made the biggest impact don't employ enough people to make a difference.