U.S. productivity growth has been surprisingly sluggish in recent years, prompting economists to embark on a lengthy quest to explain the decline and its role in stubbornly low inflation and GDP growth.
Measured productivity growth has averaged 1.5 percent over the past decade, far below the long-term average of 2.25 percent. While some of that decline can be attributed to the aftermath of the 2008 financial crash and ensuing recession, there's a longer-term trend at play here too.
A quick glance at the breakdown of U.S. productivity shows that most of the recent slump can be pinned on a drop in the contribution from information technology. John Fernald and Bing Wang at the Federal Reserve Bank of San Francisco, for instance, point out that U.S. productivity growth began falling several years before the onset of the financial crisis, as the 1990s surge in technology investment came to an end around 2003.
In new research, Goldman economists led by Jan Hatzius also pin the productivity slowdown on a declining IT contribution. But they go on to argue that this falling IT contribution may be an illusion.
To the economists, the measured collapse in the contribution from information technology just feels weird. Over the same period that IT productivity has supposedly dropped, corporate profit margins have surged to an all-time high, inflation has been very low, and stock valuations, including those of tech companies, have been surging. It doesn't seem like the overall economy has been hit by a "major IT-centered productivity slowdown," as they put it. So, how to explain the fall?
Hatzius and company venture that what's happened is a major shift in the nature of the IT contribution itself. In simple terms, we've probably been making lots of productivity gains in IT, but we just haven't figured out a way to measure them yet.
Here's what they say:
One reason why the IT revolution of the 1990s showed up so clearly in the productivity numbers is that the statisticians devised ways of translating the increases in computer performance — faster processors, more memory, better graphics, and the like — into rapid quality-adjusted price declines. Exhibit 4 shows that the measured price of computer hardware has plunged by 91.5% since 1995, with most of the decline occurring in the first half of that period. Since real output is equal to nominal output divided by the price level, this meant a sharp increase in the measured contribution of computer hardware to real GDP growth.
But Exhibit 4 also illustrates that the statisticians have not found a way to capture the improvements in software and digital content in a similar manner, with measured software prices only edging down slightly over the past two decades. This is not surprising because the “performance” of software and digital content is inherently a more amorphous and subjective concept. How much better are the inventory management systems that retail companies contract out or develop for their own account compared with those of twenty years ago? How much better is Grand Theft Auto V than Grand Theft Auto IV? And how much more value do we now derive from our internet connection compared with a decade ago? It is very difficult for a statistician to know, and when we do not know, our default assumption tends to be that there is little change.
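The deflator mechanics in that excerpt are worth making concrete. Here's a minimal sketch with purely illustrative numbers (the flat nominal spending path is an assumption for clarity, not Goldman's data), showing how a falling quality-adjusted price index mechanically inflates measured real output:

```python
# Real output = nominal output / price index, so a falling quality-adjusted
# price index raises measured real output even when nominal spending is flat.
# Illustrative numbers; the 91.5% decline is the figure cited in the excerpt.

nominal_1995 = 100.0        # hypothetical nominal hardware spending, indexed
nominal_2015 = 100.0        # assume nominal spending stays flat for clarity

price_1995 = 1.000          # quality-adjusted price index, base year
price_2015 = 1.000 - 0.915  # hardware prices down 91.5% since 1995

real_1995 = nominal_1995 / price_1995
real_2015 = nominal_2015 / price_2015

print(f"measured real output, 1995: {real_1995:.1f}")  # 100.0
print(f"measured real output, 2015: {real_2015:.1f}")  # ~1176.5
# Flat nominal spending still registers as an ~11.8x rise in real output,
# which is why hardware's measured contribution to GDP growth surged.
```

That same mechanism runs in reverse for software: if the price index barely moves, measured real output barely moves, no matter how much better the software gets.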
Digital content and software now make up more than half of the output and market value of the total technology sector, Hatzius says. In fact, they probably account for an even greater share, because Apple, the world's biggest technology company by far, is categorized as 'hardware' despite selling a combination of hardware and software. Given the preponderance of software and digital content, IT prices have likely been overstated and productivity understated. If software and digital-content prices were really falling at about 5 percent a year, the 20-year average pace for the computer hardware industry, rather than staying roughly flat as measured, that would equate to a statistical understatement of the growth contribution from software and digital content of about 0.2 percentage points per year.
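A back-of-the-envelope way to see where a figure like 0.2 percentage points can come from: take the gap between measured software prices (roughly flat) and the assumed 5-percent-a-year decline, and scale it by the sector's share of GDP. The 4 percent share below is a hypothetical input chosen to reproduce the number; the note's exact inputs aren't given here:

```python
# Rough reconstruction of the ~0.2pp understatement (assumed inputs labeled).

measured_price_change = 0.00   # measured software prices roughly flat per year
assumed_price_change = -0.05   # the ~5%/yr decline posited in the research

# If true prices fall 5pp faster than measured, true real growth of the
# sector's output is understated by the same 5pp per year:
growth_gap = measured_price_change - assumed_price_change  # 0.05

gdp_share = 0.04               # assumed GDP share of software/digital content
understatement = gdp_share * growth_gap

print(f"understated GDP growth: {understatement * 100:.1f}pp per year")  # 0.2
```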
A software-induced statistical mirage probably means that GDP growth is understated too.
And that would have some major implications, according to Hatzius and co:
First, we would be skeptical of confident pronouncements that the standard of living is growing much more slowly than in the past. Second, given the uncertainty around GDP, it is better to focus on other indicators — especially employment — to gauge the cumulative progress of the recovery and the remaining amount of slack. And third, if true inflation is even lower than measured inflation — and especially if this gap is bigger than it has been historically — the case for keeping monetary policy accommodative strengthens further.