The best way to understand the future is to look into how chips are changing.
Two transitions are transforming Moore's Law. The original article, in 1965, described only the density of circuits on a silicon substrate.
The rule implied that chips could get better and better, faster and faster: doubling bigger numbers means bigger incremental changes in the same amount of time. Over the years chemists and electrical engineers learned to apply this exponential-improvement concept to fiber cables, to magnetic storage, to optical storage, even to radios, so that 802.11n radios can transmit data at over 100 Mbps -- roughly twice the 54 Mbps that earlier 802.11g models could deliver, a single doubling worth more than 50 Mbps.
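To make the doubling arithmetic concrete, here is a minimal sketch (the 54 Mbps starting point is the nominal 802.11g rate, used purely for illustration): each doubling adds more in absolute terms than all the doublings before it.

```python
# Each doubling adds the entire current capacity again, so the
# absolute increment grows with every generation.
capacity = 54  # Mbps, roughly the 802.11g data rate (illustrative)
for generation in range(1, 5):
    increment = capacity      # what this doubling adds in absolute terms
    capacity *= 2
    print(f"generation {generation}: {capacity} Mbps (+{increment} Mbps)")
```

The first doubling adds 54 Mbps; by the fourth, a single step adds 432 Mbps. That is the sense in which exponential growth delivers ever-bigger jumps in the same amount of time.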
The transitions have to do with what we mean by better.
The first transition is that as circuits get closer together, with distances now measured in nanometers, signals moving through one wire can easily interfere with those in a neighboring wire, and the heat generated by all that movement causes still more trouble. So Intel is now concentrating on lower-power designs. That means everything can run on batteries, ever-smaller, lower-power batteries, so intelligence can be anywhere. And by applying the same idea to RF chips, so can data transmission.
(The image to the left, by the way, is an Intel illustration of this "power spectrum," with low-power motes to the left and high-power servers to the right. From an Intel Research report on motes and RFID.)
The second transition is the application of parallel processing at the chip level. Parallel processing was first perfected at Sandia Labs in the 1980s. It has since been applied to racks of computers, at Google, and to computers simply sitting on the Internet, in distributed computing. The "dual-core" processors now coming out from AMD and Intel are just the first step in this transition. If you can put two processors on a chip, why not four, then eight, and so on? And if you need the power, why not make the chip even bigger, make it wafer-wide? Why not Google on a chip? (That's the dual-core AMD Opteron to the right, from PC World.)
Both these transitions are in their early stages, and neither has yet taken hold in the popular imagination. That is one reason results at Intel, especially, have lately not been what they could be.
The Moore Transitions require imagination for their complete application. I've written about Always-On because that is one way to apply the low-power transition of Moore's Law to daily life. Maybe you can tell me how the multi-core transition of Moore's Law will impact us.
Remember, Moore's Law applies only to semiconductors, just the front-end of the electronics industry. It took imagination to combine the idea of a TV, a tape recorder, and a typewriter to make the first PC -- Moore's original article foresaw only a box.
Bottom line. We need more futurists.
We need a Moore's Law of the Imagination.