A Little More Hedonic Regression
While it's reasonable to be suspicious of changes to the way CPI is calculated based on the increasing utility of ever-cheaper goods (i.e. "digital convergence"), it's worth noting that the approach makes a lot of sense and is not without precedent.
Imagine, if you will, that you're an economist measuring CPI using a "fixed basket of goods" at the advent of the printing press. The ensuing information revolution will take place in slow motion compared to the digital one, but consider just the following:
A member of the growing middle class will belong to some trade. To learn the necessary skills, he or she becomes an apprentice, in essence providing several years of free labor in exchange for being taught the rudiments of the chosen profession. The printing press will eventually make much of this information available essentially for free.
An economist who includes "training" in the fixed basket will see food prices rise (as labor moves to the cities during the industrial revolution) while the prices of training and, say, pins (Adam Smith's famous example) plummet. But anything not in the fixed basket won't be taken into account, even as the trained workforce consumes vastly more training, at very low prices, from information that is now widely and cheaply disseminated.
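To make that blind spot concrete, here's a toy Laspeyres (fixed-basket) index calculation; every good, quantity, and price in it is invented purely for illustration:

```python
# Toy Laspeyres (fixed-basket) index; all goods, quantities, and prices invented.
# Each entry: (good, base-period quantity, base price, new price).
basket = [
    ("food",     100, 1.00, 1.50),   # food gets dearer as labor moves to the cities
    ("training",   2, 50.0, 5.00),   # apprenticeship undercut by cheap print
    ("pins",      10, 0.50, 0.05),   # Adam Smith's famous example
]

base_cost = sum(qty * p0 for _, qty, p0, _ in basket)     # 205.0
new_cost  = sum(qty * p1 for _, qty, _, p1 in basket)     # 160.5
print(f"Fixed-basket index: {new_cost / base_cost:.2f}")  # 0.78 -- measured prices fell 22%

# What the index never sees: the workforce now consumes vastly more
# training (say, 50 cheap printed manuals at 5.00 each) that wasn't in
# the base-period basket, so that gain in welfare goes unmeasured.
```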
This is exactly what's happening with computers today, and economists are struggling to accurately represent this in their calculations. As more things go digital we're finding computers more and more useful, and the fact that they're better and cheaper is a huge benefit to us all.
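For the curious, one common hedonic approach (the "time dummy" method) treats a good as a bundle of measurable attributes: regress log price on those attributes plus an indicator for the later period, and that indicator's coefficient estimates the price change at constant quality. Here's a minimal sketch in Python; the machines, specs, and prices are all invented:

```python
import numpy as np

# Hypothetical computer prices across two years. Attributes: log2(MHz), log2(MB RAM),
# plus a dummy marking the later year. All figures are invented for illustration.
specs = np.array([
    # speed  ram  year2?
    [ 5.0,  4.0,  0.0],   # year 1:   32 MHz, 16 MB
    [ 6.0,  5.0,  0.0],   # year 1:   64 MHz, 32 MB
    [ 6.5,  5.0,  0.0],   # year 1:  ~91 MHz, 32 MB
    [ 6.0,  5.0,  1.0],   # year 2:   64 MHz, 32 MB
    [ 7.0,  6.0,  1.0],   # year 2:  128 MHz, 64 MB
    [ 7.5,  6.0,  1.0],   # year 2: ~181 MHz, 64 MB
])
log_price = np.log([2000, 3000, 3400, 2500, 3600, 4200])

# Fit log(price) = a + b1*speed + b2*ram + d*year2 by least squares.
design = np.column_stack([np.ones(len(specs)), specs])
coef, *_ = np.linalg.lstsq(design, log_price, rcond=None)

# exp(d) - 1 is the price change for a machine of *constant* specification:
# the quality-adjusted inflation a hedonic CPI would record.
print(f"Quality-adjusted price change: {np.exp(coef[3]) - 1:+.1%}")
```

In this toy data, same-spec machines got cheaper from year 1 to year 2, so the fitted year coefficient comes out negative: quality-adjusted deflation, even though average sticker prices actually rose.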
And lest you think my earlier comparison of the Nintendo DS to a desktop workstation is artificial (because the DS isn't a direct replacement for the workstation), you can buy a $10-20 gadget in Walmart today which has 20 or more 1980s arcade games in it: the equivalent of $500 or more worth of consoles and cartridges. The fact that people will pay $500 for a PS3 rather than $20 for one of these gadgets is a pretty compelling argument that the PS3 really is perceived as being worth more than 25x as much as $500 worth of 1980s arcade games. Similarly, the OLPC (which currently costs about $200) is a direct workstation replacement, and it is far superior to any $5000 computer available in 1990. (Indeed, the company I worked for at the time bought a $3500 laptop in 1992 that was a sad joke compared to the OLPC.)
Finally, consider how digital technology is constantly expanding its reach. In 1974 (when I first got interested in photography) a typical SLR cost $200 or more in 1974 dollars, an enlarger cost another $200, and the other equipment you'd need to make your own pictures cost another $200. That's for black-and-white photography, with a single SLR and a fixed-focal-length lens. All these prices remained basically constant until cameras went digital. And when I buy a faster computer for less money today, it improves my photos and videos as well as my word-processing and gaming.
One of the fascinating aspects of all this is that there's an absolutely enormous opportunity for computers to deliver free "deflation" in the future. Optimization of software has almost become passé thanks to Moore's "Law": a typical web browser today uses 10-100MB to handle one web page. At some point, we assume, Moore's Law will run into a brick wall, and optimization will suddenly become more important. The staggering inefficiency* of modern software affords huge potential for future optimization, and the resulting benefits will likely be enormous, much of them free (as improvements are made to the open source software that underpins most commercial software today). CPI calculations will get even more interesting.
* Actually, it's efficient with respect to development effort and inefficient with respect to performance. That trade-off will change dramatically when Moore's Law gives out.