No, not on the football field, silly.
(The original Rice seal, to the right, dates from 1911, and carries its own story, including Confederate gray "warmed into life by a tinge of lavender.")
I'm talking about the laboratory, where Rice is successfully managing the transition from Dr. Richard Smalley's foundational "Buckyball" generation with new research into the link between optics and electronics.
Professor Peter Nordlander has announced "a universal relationship between the behavior of light and electrons" which "can be exploited to create nanoscale antennae that convert light into broadband electrical signals capable of carrying approximately 1 million times more data than existing interconnects."
This is big. In many ways it's as big as the original BuckyBall discovery, and more readily exploited.
More after the break.
I know people are going to read this as good news.
Intel announced it is putting $340 million into expanding plants in Colorado and Massachusetts.
What could be wrong with that?
Richard Wingard has figured out a way to fund cutting-edge technology with angel investors, and hold them in their investments for nearly 7 years. (The picture is courtesy the University of New Hampshire alumni association.)
Wingard runs Euclid Discoveries, which is working on an object-based video compression technology he says will deliver 10 times the performance of MPEG-4, enough to "turn your iPod into a DVD player."
And he's done it all with angel investors, who are best-known for backing only early-stage companies. Wingard has rejected the entreaties of venture capital firms, saying their time frames for pay-outs are too short. Yet he has succeeded in getting angels who will wait as much as 7 years for a private auction of his technology, and a distribution.
Want to know how he did it?
Back in 1985, you would have spent big money to get an Intel 386 chip, with over 100 Megabytes of storage, and a local network that ran as fast as 1 megabit per second.
I know I didn't have one. The closest I saw to one that year was an entrepreneur 10 miles north of me who had a Digital Equipment PDP-8 minicomputer in his office.
Yet that is just what you see in the picture to the right:
Thanks to Moore's Second Law (complexity causes costs to scale exponentially) competition in the semiconductor business is held in an ever-thickening mud, which represents the cost of building new capacity.
The number of company-owned fabrication plants, or fabs, must decline over time, as their cost rises above even corporate affordability. The decision to build one must be taken with increasing care, with an eye toward a far-off future. It's the opposite of what happens in the product cycle, at the other end of the factory floor, where things are constantly accelerating.
While Intel has played its hand in Asia, AMD has chosen Europe, specifically the former East Germany. More specifically Dresden, firebombed during World War II, left for dead during the Cold War.
In 2003 AMD broke ground for its second Dresden fab, Fab 36. The plant goes into volume production next year, at a point where AMD's designs seem to be surpassing Intel's.
Market share, in other words, could make a big swing next year.
At the very same time, AMD is advancing in court, forcing Intel to defend an already-fading monopoly. A few years ago Intel had knocked AMD practically out of the ballpark. With the Dresden Fab 36 that won't be true, but AMD figures Intel still has a case to answer, because its hyper-competitive marketing department never changed tactics.
Evidence will likely show that Intel did have a near-monopoly under Craig Barrett, and that it did abuse its position in its dealing with big customers. But a court finding for AMD would still be a mistake.
Every decade of computing technology can be summarized fairly simply. (That's an Apple ad to the right.)
The 2000s are the decade of wireless.
It's now clear that wireless technology defines this decade. Mobile phones are opening up Africa as never before. WiFi is making networking truly ubiquitous.
Walk or drive down any street, practically anywhere in the world, and you will find people obsessed by the use of wireless. Behaviors that in previous decades were shocking -- walking around chatting animatedly to the air for instance -- are now commonplace.
What's amazing, as we pass the halfway point, is how far this evolution has to go, and how easy it is to see where it can go:
Who do we have to thank for this?
The question of Wi-Fi and real estate is about to come to a head, at Boston's Logan Airport. (Picture from MIT.)
Declan McCullagh reports that the Airport is trying to close Continental Air's free WiFi service, based in its Frequent Flyer lounge, in favor of a paid service on which it gets a 20% cut of revenue.
Continental has appealed to the FCC under the 1996 Telecommunications Act. Massport, which runs the airport, is making bogus arguments about security (its paid service uses the same spectrum as Continental's, so if one fails on that argument, both do).
If this thing goes to trial it will be a very important case. Here's why.
Today's politics is cultural.
Even economic and foreign policy issues are, in the end, defined in terms of social issues. This creates identification, and coalitions among people who might not otherwise find common ground -- hedonistic Wall Street investment bankers and small town Kansas preachers, for instance.
I am coming to believe the next political divide will be technological. That is, your politics will be defined by your attitude toward technology.
On one side you will find open source technophiles. On the other you will find proprietary technophobes.
It's a process that will take time to work itself out, just as millions of Southern Democrats initially resisted the pull of Nixon. There are divisions within each of today's grand coalitions on this subject.
This latter split gets most of the publicity, because more writers are in the cyber-libertarian school than anywhere else.
Initially, the proprietary, security-oriented side of this new political divide has the initiative. It has the government and, if a poll were taken, it probably has a majority on most issues.
But open source advocates have something more powerful on their side, history. You might call it the Moore's Law Dialectic.
Either my wonderful mother (who still walks among us, to my great joy) failed to check the box indicating I was a citizen on my Social Security application, or some clerk failed to do so when the data was entered because there were separate forms then for citizens and non-citizens.
The clerk who put me through this hell blamed "Homeland Security." But I think he was really responding to the reality of how this number is used.
As I've noted many times before, the Social Security Number is an index term. Everybody has one. Everyone's number is different. By indexing databases based on Social Security Numbers (SSNs), government and businesses alike can make certain there's a one-to-one correspondence between records and people.
Stories like this AP feature don't really address this need, this fact about how data is stored. Without the SSN we'd have to create one. Some companies like Acxiom do just that. Every business and individual in their database has their own unique identifier, created by the company. Which also means that the Acxiom indexing scheme is proprietary. The only way toward a non-proprietary indexing scheme, in other words, is for government to provide one. Which gets us back to the need for an SSN.
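The index-term argument above can be sketched in a few lines of Python. The names, numbers, and databases below are invented for illustration; a dictionary key simply stands in for the SSN or any other universal identifier:

```python
# Sketch: why a shared unique identifier makes record matching trivial.
# The "SSN" keys and records here are fabricated examples.

credit_bureau = {
    "123-45-6789": {"name": "J. Smith", "score": 720},
    "987-65-4321": {"name": "J. Smith", "score": 540},  # same name, different person
}

payroll = {
    "123-45-6789": {"employer": "Acme Corp"},
}

def merge_records(key):
    # With a common index, joining two databases is a dictionary lookup:
    # a guaranteed one-to-one match even when names collide.
    return {**credit_bureau.get(key, {}), **payroll.get(key, {})}

print(merge_records("123-45-6789"))
# {'name': 'J. Smith', 'score': 720, 'employer': 'Acme Corp'}
```

Without the shared key, the two "J. Smith" records could only be matched by guesswork, which is exactly the role the SSN (or a proprietary stand-in like Acxiom's) plays.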
The big trend of this decade, in technology, is a move toward openness.
It started with open frequencies like 802.11. It then moved into software, with open source operating systems and applications. Now we have open source business models. The ball keeps rolling along.
Open source has proven superior in all these areas due to simple math. The more people working a problem, the better. No single organization can out-do the multitudes.
But this simple, and rather elegant, fact, is at odds with all political trends.
But the bottom line is simpler. Our identification system is broken.
It's no longer a question of this system or some other system. There is no system.
What that means, in real terms, is that your own identity hangs by a thread, a very thin thread that can break anywhere, and leave you an un-person.
Since Mark Hurd left NCR to run the mess Carly Fiorina made of Hewlett-Packard in March, he has been fighting to turn the old boat around. The company turned in solid numbers in May, he hired away Dell's CIO, Randy Mott, and now he has the credibility with his board needed to prune the deadwood.
H-P has a lot of deadwood.
In buying Compaq, her signature move, Fiorina took on a lot of old, tired, even worthless brands, like DEC and Tandem. Compaq's latter-day strategy had been to buy these outfits for their book of business, and Fiorina's deal was the apotheosis of this old-line industrial strategy. She insisted at the time there would only be a few survivors of the PC wars, and buying Compaq was the only way to make sure H-P would be one of them.
She was wrong. What works in steel does not work in tech. A book of business is worthless, because computers are short-term capital goods. It's not what you did for me, or even what you did for me lately, but what you're going to do for me tomorrow that counts.
But enough about the past.
As regular readers here know, there is no Moore's Law of Training.
Training, learning, adaptation -- call it what you will -- must happen at its own pace. This is why the productivity boom arising from the 1990s IT spending boom didn't become apparent until this decade.
But there is a way to accelerate Moore's Law of Training (which doesn't exist) -- publicity. If a good idea, an obvious use of existing technology, is heavily publicized, it can spread very, very quickly, and provide real benefits.
ICE is just such an idea.
Given the direction of antitrust law recently I was surprised to see the recent suits by AMD and (more recently) Broadcom. They left me scratching my head.
But there is an answer to my quandary.
Antitrust has become a process. It's not a goal, but a weapon in the business war.
The idea that Qualcomm has a monopoly in the mobile phone industry is laughable. It may abuse what position it has, charging chip makers like Broadcom the equivalent of an "intellectual property tax" in areas which use CDMA (and its variants). But GSM is the major world standard. It would be like calling the Apple Macintosh a monopoly.
The Broadcom antitrust suit comes right after it filed a patent suit against Qualcomm, accusing it of violating Broadcom patents regarding delivery of content to mobile phones.
The first shot didn't open up the Qualcomm ship, maybe the second will. All lawyers on deck!
When the Comdex show closed its doors a few years ago a lot of people threw up their hands and decided it was some sort of secular turning-point, the lesson being that people didn't do trade shows anymore.
Well it was a turning point. But not of the kind they thought.
Fact is, Comdex lives. It lives in Taiwan, and it's called Computex.
The show just finished, and the reports are still dribbling in. But what's clear is that the same spirit of innovation, the same corporate social mobility, and the same establishment of distribution that marked the Comdex show in its heyday all took place in Taiwan.
This is meaningful on several levels.
AMD is the most infuriating company imaginable.
If Intel is the big dog of the chip world, AMD is the little dog jumping around it, nipping at its heels, acting like it (not Chipzilla) owns the street.
Its latest legal assault may be its dumbest move yet.
Strictly from a timing standpoint, it sucks.
This Administration does not look kindly at anti-trust claims. They settled with Microsoft, they gave the cables and Bells a duopoly (leaving America a third-world broadband country), and they seem content to let China monopolize world trade while India gains control of services. All this is in pursuit of an ideology that becomes less-and-less distinguishable from Putinism and Kleptocracy by the day.
Short form. If they had a case they should have filed it in Europe.
By the time Paul Winchell died, last weekend at 82, the BBC was only able to point out that he had done the voice of Tigger for Disney.
He was so much more. Like Hedy Lamarr, who created the technology underlying WiFi, he led a double-life, as an intellectual in the fun house.
For starters he was the first TV star I remember, one of many models for what became The Simpsons' Krusty the Clown. He had a morning show with puppets, more entertaining (I thought) than Captain Kangaroo, with more brain and heart (I thought) than even Fred Rogers. The puppets, which he made himself, were called Jerry Mahoney and Knucklehead Smiff (right).
What I didn't know at the time was he was also a polymath, with a wide range of interests and a photographic memory. One of his interests was medicine. As an entertainer he maneuvered into the worlds of famous physicians, including Dr. Henry Heimlich (then Arthur Murray's prospective son-in-law), and with his help won the first U.S. patent on an artificial heart.
There was even more to his life than that. He sought early funding for the farm-raising of tilapia. He was a skilled painter. And, of course, he was a ventriloquist and a subversive humorist who emphasized the fun of the mind.
Taken directly from his own Web site (he was working on streaming video at the time of his death) is a list of his inventions (remember he was self-taught):
Irregular readers of this blog may think Gordon Moore invented the microchip.
He didn't. Moore did have a major role. He was part of the Fairchild team, co-founder of Intel, and his Moore's Law article popularized the changes that chips would bring.
But Jack Kilby won the Nobel Prize for the microchip, in 2000. He died today.
The original invention, designing multiple devices on a single piece of substrate, was invented in two places at once. One team, which Kilby headed, worked at Texas Instruments. The other team, with Moore, Robert Noyce, and other key men, worked at Fairchild Semiconductor.
The resulting patent was shared, but it was Kilby's team that created the basic technology. The key contribution from the Noyce team involved manufacturing process.
More on Kilby after the break.
A note came in today from a friend I've known even longer than my saintly wife. "How does this fit into Moore's Law?" he asked.
The link was to a story about a holographic storage system being marketed by Optware of Japan. It's called a Holographic Virtual Card (HVC) and claims to store 30 Gigabytes on $1 worth of media. (That's the Optware reader to the right, from a 2004 GearBits article promising commercialization in 2005.)
The kicker is that readers will cost $2,000 and read-write devices may cost five times more. Optware has promised standard kit by the end of next year.
All this related to what my book calls Moore's Law of Optical Storage. Instead of storing data on a fairly flat substrate, the Optware design uses all three dimensions. Think of the storage medium as a cube rather than a circle.
There's a long way to go before this threatens the CDs we're used to. Right now, however, the high price of the readers may be an advantage, making this perfect for applications like physical security.
Imagine the depth of personal knowledge that could be input on a 30 GByte substrate for an entry badge. Connect that to a variety of biometric readers so the bad guys can't hide their identities behind, say, phony fingerprints or contact lenses. Add a human guard to the mix and your entry portal could be pretty darned secure (for a time).
But the best news here may be this, the fact that there's competition in this space from Inphase Technologies, a spin-off of Bell Labs. They're looking at issues like the speed of data transfer, issues that could make holograms an alternative for the archiving of Web data.
Writing about Microsoft earlier today got me thinking more deeply about the company. (The image is from the Pioneer Theater Co., at the University of Utah.)
A decade ago Microsoft reached a tipping point. Maybe this came with its release of Windows 95. It was obvious in its obsession over destroying Netscape.
Before 1995 Microsoft was about creating capabilities for others. Since then its mission has been embracing and extending, bringing the great ideas of others into its own operating system, destroying rather than creating niches.
It all sounds like a Jon Stewart set-up. "Aw, Bill, it used to be about the world domination." But in truth, at some point, people do come to dominate their worlds.
And then it all starts to go wrong.
The 1990s were all about the Internet. (The picture is from a great site called i-Learnt, for teachers interested in technology.)
This decade is all about gadgets.
Digital cameras, musical phones, PSPs, iPods -- these are the things that define our time. While they can be connected to networks their functions are mainly those of clients.
In some ways it's a "back to the future" time for technology. We haven't had such a client-driven decade since the 1970s, when it was all about the PC.
In some ways this was inevitable. The major network trend is wireless, so we need a new class of unwired clients.
But in some ways this was not inevitable. If we had more robust local connectivities than the present 1.5 Mbps downloads (that's the normal local speed limit) we would have many more opportunities to create networked applications.
Not only is Apple switching its chip supply contract from IBM to Intel, but it is moving to Intel processors in the bargain.
In making the announcement this morning, Steve Jobs said he didn't see how he could continue making great products beyond next year "based on the Power roadmap."
Right after his speech he had a cagey interview with CNBC's Ron Insana. "It's not as dramatic as you're characterizing it," he insisted.
"This is going to be a gradual transition. Hopefully a year from today we'll have Intel-based Macs in the market. It's going to be a two-year transition.
"As we look into the future, where we want to go is different (from IBM's product roadmap). A year or two in the future Intel's processor roadmap aligns with where we want to go.
"I think this will get us where we want to be a year or two down the road." Jobs refused repeated requests by Insana to explain what he meant by that. (Jobs is also shaving even more closely than this picture shows. He's down to tiny stubble around a still-brownish moustache. Hey, Steve, I'm 50 too.)
What I think he means, simply, is video.
Beyond this, most of what I wrote last week holds. This deal is not material to Intel, which continues to face loss of major market share to AMD among Windows and Linux users.
But there are also vital lessons here for followers of Moore's Law, lessons I need to impart.
The folks at CNN fell for the hype from a project called RepRap, a rapid prototyper from the University of Bath in England. (The picture is from the CNN story and shows a robot built with RepRap.)
The machine that can copy anything was their breathless headline.
Well, yes. And no.
The folks at RepRap would like you to think they've got something truly revolutionary. But they don't. The technology has been around for some time. You need to input a lot of files to make anything, so there's a lot of intellectual capital involved.
And here's the kicker.
Assuming Apple does switch to Intel chips tomorrow, as News.Com reports, the value of Intel stock will likely rise.
That would be a mistake.
Intel is making a big investment here to gain a very small amount of market share. Meanwhile it's losing far more market share to AMD in what used to be called the Windows world.
WinTel has been broken. That should be the real headline here.
Microsoft is perfectly happy to have AMD supply chips for Windows machines. People are very happy to buy them. And right now AMD has a price-performance advantage there.
This move toward Apple will, if anything, accelerate that shift. Intel should be spending all its time addressing its loss of share in the Windows world, and in the Linux world, instead of wasting energy with a tetchy, demanding Apple, an outfit that even IBM couldn't please.
One more point.
Despite his ponytail and his sometimes counter-cultural language, despite being what I like to call a Truly Handsome Man (it's a brighter term for bald, people) Ted Waitt was always a follower, not a leader. (The picture is from a 2002 profile in the Sioux Falls, South Dakota Argus-Leader.)
Waitt was Gimbel's to Michael Dell's Macy's. He wanted to be Pepsi to Dell's Coke.
But computing lacks the stability of the retailing or the soda business. So when Waitt announced his resignation today (at 42 it wouldn't sound right to call it a retirement) it wasn't big news.
Waitt and Gateway did well in the 1990s, following Dell into mass customization. He made his big mistake when he tried to out-think Dell, opening a chain of retail stores that caused $2.4 billion in losses, according to The New York Times.
But I personally think the mistake was more basic than that.
The secret's out.
Intel is re-interpreting Moore's Law. Not repealing it. Not rejecting it.
They're reinterpreting it. That's the significance of the change incoming CEO Paul Otellini (right) is making.
Before, Moore's Law was like Samuel Gompers' famous answer about what labor wanted: More. More circuits, more speed, more cycles, more bits. More.
As of today Intel's new direction is better. Better doesn't always mean more. In the case of microprocessors it can mean putting more computers on each chip (multi-core) or running with lower power. In terms of communications it can mean a host of attributes, from security to coverage to throughput.
The best way to understand the future is to look into how chips are changing.
Two transitions are transforming Moore's Law. The original article, in 1965, described only the density of circuits on silicon substrate.
The rule implied that chips could get better-and-better, faster-and-faster. Doubling bigger numbers means bigger incremental changes in the same time. Over the years chemists and electrical engineers learned to apply this exponential improvement concept to fiber cables, to magnetic storage, to optical storage, even to radios, so that 802.11n radios will transmit data at over 100 Mbps -- roughly twice the 54 Mbps that earlier 802.11g models could deliver.
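The compounding arithmetic behind "bigger doublings" can be sketched quickly; the starting capacity and two-year doubling period below are illustrative assumptions, not figures from any real datasheet:

```python
# Exponential improvement: each new doubling adds as much absolute
# capacity as all previous growth combined. Numbers are hypothetical.

def capacity(start, years, doubling_period=2):
    """Capacity after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

ten_years = capacity(1_000, 10)      # 32x the starting point
twelve_years = capacity(1_000, 12)   # 64x the starting point
print(twelve_years - ten_years)      # the latest doubling alone equals
print(ten_years)                     # ...everything that came before it
```

That last property is why the same percentage improvement keeps feeling more dramatic over time, whether applied to transistors, fiber, magnetic storage, or radios.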
The transitions have to do with what we mean by better.
Now that dual-core chips are a reality (to be followed in time by four-core, eight-core, etc.) software companies face a dilemma on pricing. (Picture from AMD.)
Traditionally software companies have priced per-processor. But if a single chip has multiple processors, which could be doing different things, then shouldn't you require two licenses?
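A little invented arithmetic shows why the answer matters to buyers; the license price and machine configuration below are hypothetical:

```python
# The multi-core licensing dilemma, in numbers. All figures are
# made up for illustration; the question is what counts as "a processor."

price_per_license = 10_000   # hypothetical per-processor license fee
sockets = 2                  # a two-socket server
cores_per_socket = 2         # fitted with dual-core chips

per_socket_cost = sockets * price_per_license
per_core_cost = sockets * cores_per_socket * price_per_license

print(per_socket_cost)  # counting sockets: 20000
print(per_core_cost)    # counting cores:   40000
```

The same physical box costs twice as much under per-core counting, and the gap only widens as four-core and eight-core chips arrive.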
The hole is the whole U.S.
Intel plans on mass producing WiMax chips and going into rapid deployment, offering end-user speeds far in excess of what U.S. phone outfits provide with DSL.
The problem is that DSL remains the speed limit for most backhauls. Go to most WiFi hotspots, or most home networks, and DSL is the backhaul platform. We're talking 1.5 Mbps, max.
But is this just another Marty Rimm study?
Rimm, you may or may not remember, wrote a paper at Georgetown Law in 1995 claiming 85% of Web traffic was dirty pictures. This was later disproved, but the damage was done and Congress passed the ill-fated Communications Decency Act.
Mike Godwin, the former EFF counsel who fought the Rimm study and is now senior counsel at Public Knowledge, remains skeptical, noting that the Cachelogic study hasn't gone through peer review. He also notes that, since Cachelogic sells systems to control P2P traffic, it has a natural bias.
The Cachelogic claims may have logic behind them, however. Many ISPs do report that over half their traffic is on ports commonly used by P2P applications. Brett Glass of Lariat.Net, near the University of Wyoming, says the claim seems accurate, noting that unless ISPs cut back capacity to those ports (a process called P2P mitigation), the applications quickly discover the fat pipe and divert everyone's traffic to it, filling it at a cost of thousands per month.
And that is at the heart of the problem.
Last month Intel's mobility chief Sean Maloney was in the hunt to head H-P, a job that eventually went to Mark Hurd of NCR. (Watch out. Dana is about to criticize a fellow Truly Handsome Man.)
But how well is Maloney doing his current job?
Intel's role in the development of Always On is crucial, and its strategy today seems muddled. It's not just its support for two different WiMax standards, and its delay in delivering fixed backhaul silicon while it prepares truly mobile solutions.
I'm more concerned with Maloney's failure to articulate a near-and-medium-term wireless platform story, one that tells vendors what they should sell today that will be useful tomorrow.
Intel seems more interested in desktops and today's applications than it is in the wireless networking platform and tomorrow's applications.
Incoming CEO Paul Otellini says Intel is going to sell a platforms story, not a pure technology story. Platforms are things you build on.
Here's an interesting juxtaposition of headlines. (The lovely image is from UC Berkeley.)
Hitachi Eyes 1 Terabyte Drives, writes MacWorld, noting new technology the Japanese company says lets it put 4.5 Gigabytes of data on a single square centimeter of hard drive.
I'm like, don't the first people read the second paper?
Moore's Law of Storage is rocketing along right now even faster than Moore's other "laws" (as described in The Blankenhorn Effect). Magnetic storage is eliminating the cost of physically maintaining content, any content, with profound implications for everyone.
I bought a new laptop yesterday.
And to my surprise I violated my Iron Law.
Dana's Iron Law of Laptops holds that an ounce on the desk is a pound in my hands.
My favorite laptop of all time was a 2-pound Sinclair ZX-81. It had a tiny screen (nearly non-existent) but it had a pliant membrane keyboard that let me write and send stories from a beach. I haven't seen anything so light, rugged and useful since.
Instead, laptops have been desktop analogs. When desktop power increased, so did that of laptops, and they became no lighter in the process. Even today most laptops on the market weigh 7-8 pounds.
So why did I get one?
The cost of making something good is directly proportional to the complexity of the tools needed to create it. (The picture is from Freeadvice.com.)
This blog item is quite good. The tools needed to create words are very cheap. Even if the tools were more expensive, as they were when I began writing, my cost to create this text would not go up much. And the likelihood of its being of high quality would be just as high.
If I read this on the radio it would not be as good. The tools needed to create a Podcast require knowledge of radio or music production values. Even if Podcasts were as cheap to make as blog items, the proportion of good ones would be smaller than they are for blog items.
Now that you've read my latest dismissive screed against the government, the question may have occurred to you.
What might a proper telecommunications policy consist of? (Very pretty flower, I know. Here's where I got it. The picture is called Simplicity.)
It's really quite simple.
Click below and I'll tell you.
The great financial Curse is to have money coming out of the ground.
I didn't believe this when I started in journalism. I started in Houston, whose economy was based entirely on the concept of money coming out of the ground - Black Gold, Texas Tea.
For most of history, money has mainly come out of the ground. Assets were what you could drill for, what you could mine, or what you could grow. The exceptions to this rule were those of trade. If you sat astride a trade route, if you had a deep water port, if the railroads decided that your location would work for a station, then your land had value.
Moore's Law has changed all that. The Internet has changed that for all time.
As we approach the 40th anniversary of Gordon Moore's Electronics article, the man himself (Intel co-founder and namesake of this humble blog) has appeared to join the celebration.
While the headlines spoke of Moore's skepticism on materials that might replace silicon, I was more intrigued by his views on Intel, where his foundation still holds a considerable stake.
He's pretty happy. He likes the idea of pushing platforms over performance. It makes sense to him.
Moore also gave an irascible cur whom he quit a half-century ago credit for the creation of what's now Silicon Valley.
The logjam over the next optical storage standard may be about to break, as Apple has joined the Blu-ray Group.
The announcement at Germany's CeBit today means that HD-DVD, the rival technology, has lost yet-more momentum. Dell and H-P are already on the Blu-ray side.
This news is bigger than it sounds. Read on.
It's faster, has less interference, and it's just better.
Uh-huh. Maybe that's all true. But even if it is, that will take time.
Bluetooth has taken over a half-decade to reach its present level of prominence, and many mobile phones still don't have the capability -- despite cool applications like Hypertag being written for it. (Thanks to point-n-click and Billboard for that link.)
I have headlined this Moore's Law of Market Acceptance because, again, there is none. (It's like Moore's Law of Training.) Market acceptance is a human process, involving many actors.
The rate at which a new technology is accepted and replaces an old one depends on how revolutionary it is, how nimble its sponsors, and how rapid is the replacement within the older market.
I used to like Intel chairman Craig Barrett.
Now, as he prepares for his May exit from the job he's had for seven years, I love Craig Barrett. (Image from ComputerWorld's Heroes page.)
I wish I had been able to say this:
"I believe in the Hippocratic Oath for government: first do no harm. That means sorting out spectrum allocation, fostering R&D and creating an environment to let business function," he said.
"[WiMax] is the solution to the 'last mile' broadband issue. It will get us out of the half-assed broadband situation we're in today. 1 Mbps to 2 Mbps is not broadband; 50 Mbps is."
Tell it, brother Barrett. Amen. More on what this means after the jump.
Haptics recreates touch and texture artificially. If your kid has a "force-feedback" joystick on their computer game console, they're getting a taste of haptics. Northwestern, USC and MIT are among the universities doing research in the field. (The image is from USC.)
It's vital that something like haptics comes to mobiles because, in a hands-free environment, you can't depend on just sight and sound. Bringing other senses, like touch (or smell) into the mix allows for communication to happen invisibly.
It's also vital for haptics to come to mobiles because this is a huge (in terms of installed base) platform. If the coding and messaging can be delivered in this space, we're talking about billions of users. And we're talking about a universal language.
Rivals and investment bankers say it's stupid. BellSouth must either eat or be eaten, they claim, and once SBC has finished eating AT&T it will chow down on BellSouth.
Maybe yes, maybe no. It must be admitted that rivals who've merged, and bankers who are selling deals, both have reasons to diss the company refusing to dance.
But there's another way for things to go. Because while there will soon be fewer players in the telecomm space, there will also be fewer real assets.
Podcasting is the trend of 2005.
It's driven by simple facts.
The result is millions of units and millions of hours waiting to be used by someone.
What else is the result?
Many different types of solutions go into creating an Always On world.
I've talked here often of medical applications for Always On, where you wear a monitor (or have it implanted) that connects to the network and can alert you (or others) to dangerous changes in your physical condition, thus saving your life.
I have also talked of inventory applications for Always On, in which RFID tags or bar codes give you a ready inventory of your stuff. This lets you, for instance, find your keys, or check the fridge to see what you need for tonight's dinner.
But the low-hanging fruit lies in automation applications. CABA (it stands for Continental Automated Buildings Association) is one of the trade groups involved here. They work mainly with landlords who want to save money on utilities, provide security, and keep track of what's happening in lots of space so as to minimize labor costs.
That's right, gang. The old joke from The Graduate is here again, aiming to drive silicon into the ground.
Nanomarkets, a market research outfit with a beat that looks like tons of fun from here (call me), has a $2,000 report out with a hockey stick chart for plastic semiconductors, estimating the market at $5.8 billion in 2009 and $23.5 billion three years after that.
Plastic electronics -- chips built on conductive polymers and flexible substrates -- will be cheaper, take less power, and (obviously) be more flexible than silicon circuits. This makes them perfect for, say, mobile phones.
It will also bring a bunch of new suppliers to the electronics market, names like Dow Chemical, DuPont, Kodak, and Xerox, along with the usual suspects.
What does this mean?
One of the nastiest open secrets in the Internet is the switching bottleneck.
Optical fibers move data at, well, light-speed. But electricity moves data much more slowly. Getting between the two is like trying to get onto a freeway from an old cloverleaf junction -- there's not enough of an acceleration lane.
Many companies, including Intel, have been working this problem for a long time. Photonic switching is already a reality. But linking silicon directly to optics remains elusive.
That's the heart of Intel's claimed breakthrough, announced yesterday. Intel managed to produce a full Raman effect on silicon. This should enable Intel to build lasers just as chips are built.
Right now electronic signals have to be multiplexed, and packaged, before getting into the optical net. It's a very expensive, complex process. It's one of the chief capital costs a telecommunications provider faces.
But if PCs had their own photonics, they could plug directly into fiber and, as their processing speeds increased, take full advantage of what fiber can do. You could even have photonic processing inside silicon chips. Voila -- no bottleneck.
That's the hope, anyway. As Alan Huang, a 20-year veteran of this silicon laser business points out, "it's a neat science experiment" and there's a long way to go before this shows up on your desktop.
Still, imagine the implications, as Intel is now doing. Tom's Hardware Guide reports:
The Cato Institute claims to be an advocate of free enterprise, by which we are meant to think free and open competition. (That's the logo from one of their standard online products.)
They are, in fact, huge supporters of untrammeled business power, of oligopoly. Hey, where do you think their funding comes from, rabbits?
Here's a great example. It's a blog they call Tech Liberation. It takes a few clicks to learn this is a Cato shop, but they're not really hiding it.
The piece is by Adam Thierer (left), who works full-time at Cato as "director of telecommunication studies." Its theme is the latest round of telecom mergers. Its message is don't worry, be happy.
"We can safely conclude that the communications / broadband networking business can be very competitive with 2 or 3 or even 4 major backbone providers in each region providing some mix of voice, video and data services."
Evidence for this? A Wall Street Journal piece noting that SBC wants to get into cable television. Other than that, a lot of chirping crickets. And some very nasty lies.
Want a taste?
Permanent hardware encryption isn't going to happen. (The image, by the way, is from DBC of Germany, a player in this market game.)
This does not mean we should give up on encryption as protection, or on hardware for encryption. It's just that, just as Moore's Law means today's state-of-the-art PC is tomorrow's door stop, so today's RFID lock could become tomorrow's open door.
Unfortunately this has major implications for the security industry as it is today.
If I were a rich man I'd want some of these new Oakley Bluetooth sunglasses.
Of course, I'd need the prescription version. And I really like photograys. And have you got that in a bifocal model?
As you can see there is a way to go before Motorola's Cannes fashion statement turns into a really big market. Yes, there are cool-types who will grab on to this, so they can walk down the street gabbing away, like well-dressed homeless. But how many are there? And are all these fashionistas going to be satisfied with just these Oakley wrap-arounds?
A better solution, to my mind, would mount this user interface on the frame, with the electronics hidden in one of those cool eyeglass retainers 49er coach George Seifert used to wear. (That's George, left and above, and you may be able to make out his retainers. From the Seifertsite on Earthlink.)
In a New Yorker profile of chef Mario Batali (left) there's a wonderful scene of Mario rooting around a waste pail, looking for what the author-turned-prep chef has tossed away.
Our job is to sell food for more than we paid for it, Mario lectures him. You're throwing money away.
Apple Computer is the greatest exponent today of what I call Batali's Clue. Your job, as the maker of products, is to get more for your creation than the cost of the electronic "food" that goes into it.
It's a vital Clue because components in the Moore's Law age spoil like dead fish on a wharf.
Here's an example plucked from today's headlines. (Well, the ad pages.)
Carly Fiorina's reign at Hewlett-Packard was defined by her acquisition of Compaq.
The merger didn't work.
She was fired today. The press release doesn't say "fired," of course. (It never does.) Various stories say the board "dismissed" her, that she "suddenly quit," that she's "stepping down, effective immediately."
She's out because her strategy was doomed from the start. She tried to treat computing as a traditional industry, where the pattern is that once growth slows to a modest level you get consolidation, companies merging together until just a few are left and profits are regular.
This doesn't work because Moore's Law prevents it. Moore's Law means the nature of systems is always changing. Companies rise because they know something about the market, and fall when they lose touch. No amount of consolidation can change that. The merger that created Unisys didn't save Univac and Burroughs, merger didn't save Digital Equipment and Compaq, and it didn't save Hewlett-Packard.
Fiorina's key ad campaign, "Invent," implying the company was going back to its roots in the garage, turned out to be just that -- an ad campaign. What has H-P invented under Fiorina, except financial manipulation? Anything?
So she's out, for the same reason that, say, Tony Samuel (left) is no longer coach at New Mexico State. (Go Aggies.) His color had nothing to do with his firing, and her gender had nothing to do with hers.
The digirati are in a fury today over claims by an outfit called i-Mature, which says it has solved the problem of age verification with a $25 device that checks a finger's bone density to determine just how old you are.
The image, by the way, is from Vanderbilt University, which has no affiliation with either Corante, i-Mature, or this blog. It describes x-rays of a finger taken at different power settings. Go Commodores.
RSA announced "a joint research collaboration" with the company. But there is skepticism over exactly how precisely a bone scan can measure age, and the more people investigate, the more questions they raise.
Think of it as a LAN on a chip. Not just the network itself, but the computers on the network and, to some extent, the people behind the computers as well. (The illustration is from the first section of Blatchford's report.)
Software programs on the chip, called apulets, portion work out among the computing sections, then recompile the results, the way an editor does at a newspaper desk. (Only without the coffee and the yelling and the pressure or the beer after work for a job well done.)
The result is true multi-tasking. As good as some teenagers, who will listen to music, watch TV, and gab on the phone while allegedly doing their homework, and still get As. (You know who you are.)
The best thing, though, is that this thing scales. You have 8 cells on the chip now. You can have more.
I'm no electrical engineer. I just went to school with some fine ones and picked up some of the lingo by osmosis. But it does seem to me that the "dual core" ideas Intel has committed to are merely extended here, in a way very consistent with Moore's Law.
The key point Moore missed (because it wasn't relevant to the paper, hadn't been discovered, and don't you dare criticize Mr. Moore for this) is that the exponential improvements he saw in silicon fabrication apply elsewhere. As I've written many times here, they apply to fiber, they apply to storage, to optical storage, to radios.
And now, for the first time, they may apply to chip design.
A few more points:
Now that Star Trek is officially dead (no new shows or movies, even in production) the time has come for a new idea.
It's an anthology series, built around various scientific "principles" that define the Star Trek franchise.
Think of it as Science made into Drama.
Yes, it's an excuse to make science exciting. (Just think of the educational spin-offs we can produce!) And the production costs are low enough to put this on the SciFi channel (where Enterprise should have been all along). Or might I suggest a pitch to Discovery Networks, which has got proven talent in making science fun with shows like Mythbusters?
For host, might I recommend Stephen Hawking? Playing the role Alistair Cooke made famous, he opens each show by describing the science (and the Star Trek technology) on which the show will be based. (I might recommend getting several scientists for this role, perhaps one for each specialty. But Hawking is a name. He'll do great for starters.) Or, with confidence this show will last for decades, Lance Armstrong, who's already under contract to Discovery, who knows how to read a cue card, and who owes his life to science?
More after the break.
To all those wishing to bury Moore's Law. There are more tricks left in it than are dreamt of in your philosophy.
We all know about "dual-core" chips. Intel has switched development here, AMD has them in droves. They're basically multiple chips drawn on the same piece of silicon, taking advantage of parallel processing on-the-chip. Great stuff. Makes chips faster, makes processing faster, and keeps Moore's Law going.
Now IBM (with Sony) is rolling out what it calls Cell technology . This extends the dual core philosophy, a single chip that passes instructions to as many as eight processors at once. (Think of it as an editor chip in the "slot" of a computerized editing desk.) IBM says it can handle up to 10 instructions at one time.
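The editor-desk analogy can be sketched in ordinary software. Here's a scatter-gather toy in Python -- the names and the work function are made up, and this illustrates only the pattern of portioning a job out among eight units and recompiling the results, not IBM's actual apulet interface:

```python
# One "editor" splits a job into pieces, eight "cells" work in parallel,
# and the results are recompiled. Illustrative names only -- not IBM's API.
from concurrent.futures import ThreadPoolExecutor

NUM_CELLS = 8  # the Cell design described above has eight processing elements

def cell_work(piece):
    # stand-in for whatever one processing element would compute
    return sum(x * x for x in piece)

def editor(job, num_cells=NUM_CELLS):
    # portion the job out among the cells...
    chunk = (len(job) + num_cells - 1) // num_cells
    pieces = [job[i:i + chunk] for i in range(0, len(job), chunk)]
    with ThreadPoolExecutor(max_workers=num_cells) as pool:
        partials = list(pool.map(cell_work, pieces))
    # ...then recompile the results, the way an editor assembles copy
    return sum(partials)

print(editor(list(range(1000))))  # same answer as doing the job serially
```

The point of the pattern, and of the chip, is that the editor doesn't care how many cells it has -- which is why the design scales.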
All the speculation surrounding the Cell involves where it might go, and what it might do. (They're putting it first into Sony's Playstation 3, but it's listed as a PowerPC advance.)
But that's not what you should be thinking about.
An ink-jet printer that makes gourmet food?
The printer is in Moto, a Chicago restaurant, and it's programmed by executive chef Homaro Cantu. The paper is the same stuff you see on some birthday cakes, made of soybeans and cornstarch. The ink is edible, and the flavors are powders placed on the paper after it's printed. This means he can create a 10-course "tasting menu" that won't leave you bloated -- just well-read and out several Benjamins.
Cantu is making paper sushi and menus that can be crunched into his gazpacho for "alphabet soup."
Now that we have proof of concept, what next?
The other day a colleague sent me a party invitation. The headline was "HP Plans Retirement Party for Moore's Law." (Real retirement parties, of course, feature lovely cakes like this specimen, from the Carolina Cake Co., Hilliard, Ohio.)
Moore's Law has been buried more often than Dracula, but like Elvis it keeps coming back.
As I've written, the exponential improvements Moore first revealed in silicon have been replicated in optical fiber, in hard drives, in radios, across the technological universe. And it shows no sign of ending.
In fact, the "Retirement Party" was a tongue-in-cheek reference to a new Hewlett-Packard technology that could extend the life of Moore's Law improvements many, many years.
It's called a crossbar latch and in theory it's just a circuit line crossed by two other lines. But it's capable of performing the same functions as a circuit etched in silicon, and when made on nanoscale, it's more efficient.
The key is that the size of the crossbar latch can scale down further than today's circuits. They can be made smaller, thinner, run closer together, and hence, create more circuit density, which is what Moore's Law is all about.
This is the kind of story that warms the cockles, and hopefully makes today's Friday dog blogging feature make some sense. (Dogs are colorblind.)
It goes by the name of the Eyeborg (Adam's an inventor, not a marketer). It takes a picture of the scene with a digital camera, then translates the colors to sound with a computer program.
Best of all, the first one cost under $100 (well 50 pounds) to make.
But here's the really cool part.
The significance of WiFi-cellular roaming doesn't lie in cutting voice costs. (The picture, by the way, comes from Novinky, a Czech online magazine, a story about DSL.)
The significance of WiFi-cellular roaming lies in Always On applications.
Think about it. Cellular channels are relatively low in bandwidth, WiFi channels are high in bandwidth.
Now, you're wearing an application, like a heart monitor. When you're at home, or in your office, this thing can be generating, and immediately disgorging, tons and tons of data, detailed stuff that may be fun for your doctor to analyze later.
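The logic of such a monitor is simple store-and-forward. Here's a sketch, with every name and threshold invented for illustration -- a design idea, not any real device's interface: urgent alerts go out immediately over the slim cellular channel, while the detailed log waits in a buffer for the fat WiFi pipe.

```python
# Store-and-forward sketch: buffer detailed readings, alert immediately
# on danger, disgorge the full log only when a fast link is available.
# All names and thresholds here are hypothetical.

class HeartMonitor:
    ALERT_BPM = 140  # illustrative danger threshold

    def __init__(self):
        self.buffer = []       # detailed readings awaiting a fast link
        self.alerts_sent = []  # urgent readings pushed at once over cellular

    def record(self, bpm, on_wifi=False):
        self.buffer.append(bpm)
        if bpm >= self.ALERT_BPM:
            self.alerts_sent.append(bpm)  # tiny message, fine for cellular
        if on_wifi:
            return self.flush()           # fast link: dump everything
        return []

    def flush(self):
        batch, self.buffer = self.buffer, []
        return batch

mon = HeartMonitor()
mon.record(72)
mon.record(150)                      # triggers an immediate alert
print(mon.record(75, on_wifi=True))  # -> [72, 150, 75]
```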
I have talked about this before, but now everyone else is talking, too. So we will, again. (The picture, by the way, is of a single-chip radio from two years ago, a "mote" from Cal Berkeley. The link is very worthwhile.)
What does it mean for TI to make, and Nokia to sell, a complete cellular phone on a single chip? For one thing, it means phones can be one-chip cheap.
Right, cheap as chips.
I've been re-reading the last in Harry Turtledove's Worldwar series, called Homeward Bound, and I'm once again struck by the similarities between the U.S. military in Iraq and the Lizards of the story.
The Lizards (not to give the story away) invade Earth in 1942, at the height of World War II. They have the weapons of 2000; Earth has what it had. The overall theme of the piece (which has now run into its seventh 500-page book) is human ingenuity vs. reliance on technology.
I don't know what they're thinking with this latest battle robot. (The picture, which I'm confident betrays no military secrets, is from the BBC.) But I'm pretty certain we're going to have some captured, disabled electronically and then grabbed under covering fire. The wireless link between the operator and the bot is the weak link.
And what happens then?
I have been singing the good news about Moore's Law for many years now. It spurs productivity, it spreads knowledge, it increases the rate of change across the board, etc. etc.
But there is a dark side to all this that most who write on technology don't talk about. (The image is from Youngstown State University in Ohio.)
That's what I call Moore's Inverse Law of Labor.
Simply put, Moore's Law makes large productivity gains absolutely necessary. To compete in a Moore's Law world, you have to continually replace people with technology, and move folks' time into more productive tasks, or they fall behind.
This is true for individuals, for business, for government, for nations. It has very profound implications for all of us.
Let's think about some of them:
Over the last few weeks I've read a lot of commentary about the recent mobile phone health scares.
Much of it follows the industry line. Even on blogs, the tone seems dismissive. Case not proven, nothing to see here, move on.
But that's the wrong attitude to take. (The ostrich came from a financial planning site.) It's ignorant on how easy it would be to address valid concerns, and even improve the product at the same time.
What seems to matter is the power of the wave hitting your head, the distance between sensitive tissue and high frequency waves, and the duration of exposure. Stick a high-powered microwave brick next to an ear for 10 years or more, it seems, and something's going to fry.
But Moore's Law of Radios shows we don't need that much power. We're better off without it. Frequencies are used most efficiently when you have a lot of very low-power devices -- this lets you put more traffic in less space.
As I've said before, separating the handset from the headset can also work wonders, not just from a health standpoint but from a user interface standpoint. A close friend of mine has had a Bluetooth headset on his ear for some weeks, and now he's hot to replace his phone with something that has more functionality, more expense, that's more like a PDA. This should be good news for the industry.
But by sticking our heads in the sand, by dismissing reports of health effects out of hand, rather than addressing what we can now, the industry is setting itself up for a nasty fall, and many unhappy jury returns.
But here's what is worse.
I recently wrote in high praise of Motorola for the MS1000, calling them The Kings of Always On.
The following does not detract from that call. Motorola has come closer to building an Always On platform (as I envision one) than anyone else.
But there are still a few things they could easily add:
Along with all their other implications, the mass adoption of mobile phones represents the first step in the single-chip era.
If you look inside the guts of your phone you are unlikely to find a big honking circuit board. (The circuit board illustration is from Sciencetechnologyresources.com.) Instead you will find one, two or three single chips performing major functions in an integrated way.
This is happening across-the-board in technology. We've gone from circuit boards in the 1980s to modules in the 1990s, to single chips. Just as early IBM PC add-in board producers created "multi-function cards" to assure a price worthy of retail distribution 20 years ago, so chip makers today put multiple functions on many chips, creating entire systems no bigger than a finger-nail.
The triumph of liberty in the 20th century was basically a technological triumph. It was Moore's Law that did it. Moore's Law, and all its antecedents, changed the rules of the economic game, of the power game, and the balance between rulers and the ruled.
Moore's Law, the idea that things get better-and-better faster-and-faster, means that trained minds are the key to economic growth. Willing hands, the key to economic growth in the industrial age, matter far less than they did. Chains may keep trained hands working. They don't do so well with trained minds.
In America the result, as Dr. Richard Florida (left) wrote, was the rise of a new "Creative Class" that could dominate societies and drive economic growth. These were people, accused of wealth and guilty of education, whose values were intellectual and meritocratic, and (perhaps most important) were capable of economic satiation. Creative people have, on the whole, risen through Maslow's "hierarchy of needs," and are in search of self-actualization, not food or even luxury.
For the last year I've been harping here on the subject of Always On.
The idea is that you have a wireless network based on a scalable, robust operating system that can power real, extensible applications for home automation, security, medical monitoring, home inventory, and more.
As I wrote I often came back to Motorola and its CEO, Ed Zander. They would be the perfect outfit to do this, I wrote.
Little did I know (until now) but they did. A year ago.
It's called the MS1000.
The product was introduced at last year's CES, and re-introduced at various vertical market shows during the year. It's based on Linux, responds to OSGi standards, and creates an 802.11g network on which applications can then be built.
At this year's CES show, Motorola is pushing a home security solution based on the device, with 10 new peripherals like cameras and motion sensors that can be easily set-up with the network in place, along with a service offering called ShellGenie.
Previously the company bought Premise, which has been involved in IP-based home control since 1999, and pushed a version of the same thing called the Media Station for moving entertainment around the home.
What should Motorola do now? Well, the platform is pretty dependent on having a home PC. The MS1000 could use space for slots so needed programs could be added as program modules. They need to look at medical and home inventory markets, not just entertainment and security.
But they've made an excellent start. And from here on out everyone else is playing catch-up.
Oh, and one more thing...
Reuters today discusses this in terms of New Mexico, home to two Intel plants outside Albuquerque that make Pentium chips. But the problem is industrywide and worldwide. It's baked into the system. The fact is that etching chips requires the use of caustic chemicals that pollute the air and water.
What can be done about it?
Now here's the perfect gift for the gadget freak on your list, assuming they have the right phone and Windows XP.
It's software for burning DVDs onto your mobile. Just $25, from Makayama Software, a Japanese outfit with European representatives.
Before you start thinking "what's that for," imagine yourself in an airport facing a nasty business flight. Imagine if you could turn on your phone and watch that DVD you got from someone else for Christmas.
Leave it to the folks at the BBC to turn good news into bad. (Cute kitty found at Guru International, in Oz.)
As hard drives become cheaper, faster, and capable of handling more files, we leave more files in them. I now have copies of all my music on my PC, and could quickly transfer most of it to an iPod. My son and daughter never clear their Internet cache any more, and my wife keeps e-mails going back months and months.
So what do the good people at the BBC call this? They call it "digitally obese."
The BBC has a piece today showing how the World of Always On could be invisible, worn instead of held.
We've already seen undershirts embedded with medical sensors. But Ian Pearson predicts we're going to move, over the next 10 years, to a world of devices imprinted on the skin.
For those not enjoying our online novel, The Chinese Century, here's a piece of non-fiction that may leave you even more upset. (That's a 1963 Time Magazine cover, by the way, of which prints are available for purchase.)
It's an interview with Ronald Chweng, chairman of Acer Technology Ventures. Acer is based in Taiwan which China calls a renegade province, and the U.S. once called the Republic of China. Chweng's charge is to find U.S. investments. He says there are plenty, but that the focus may be changing.
It may be moving East.
The idea, which is valid, is that through blogging ordinary communication becomes content. I know this is true because my own newsletter, a-clue.com, has been losing readers ever since I started blogging here. It's not just that readers prefer getting my thoughts through the blog instead of e-mail. It's that the one-week lag between my writing and your reading is eliminated by blogging. You're not just an audience here, you're practically reading over my shoulder as I type.
But it seems to me this is old news.
In watching how people use their devices, I have come to the conclusion that we're witnessing four separate evolutions of the user interface:
If you're to take Always-On applications into the world with you, they have to be fashionable. They have to look smart. It would be very nice if they were machine washable.
Now they are.
What would you use this stuff for?
Hey, kids! You can get this cool wallpaper of the Hollywood sign for your PC right here. We now return you to your regularly-scheduled tech blog.
Texas Instruments has a new chip, code-name Hollywood, that will deliver real TV to mobile phones.
The chip doesn't just process TV images using mobile phone frequencies. It actually connects you to TV signals, over-the-air, including digital TV standards. It includes a tuner, OFDM demodulator and channel decoder processor.
It's great. But in a way it's a stunt.
Michael Thomas launched a company some time ago to push the use of nanoelectronics in data storage. Hence its name: Colossal Storage Corp. (The image is from the company's Web site.)
Al Shugart is on his board, so you know these are serious storage folks.
For months he's been talking about 3.5 inch removable disks storing 10 Terabytes each. Blu-Ray disks, the most effective CD-type technology out there, can currently store, at most, 50 Gigabytes, so we're talking about a 200-fold improvement, well over two orders of magnitude.
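For the record, the claimed jump works out like this (round decimal units, ignoring the gigabyte-vs-gibibyte quibble):

```python
# The arithmetic on Colossal's claim versus Blu-Ray.
import math

blu_ray_gb = 50
colossal_gb = 10 * 1000  # 10 Terabytes expressed in Gigabytes

ratio = colossal_gb / blu_ray_gb
print(ratio)                         # 200.0
print(round(math.log10(ratio), 1))   # 2.3 -- a bit over two orders of magnitude
```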
But it turns out the technology he's worked on can also be applied to displays.
Everyone thinks they know Moore's Law. (At least everyone who reads this blog should.)
The number of circuits that can be drawn on a given piece of silicon doubles every 18 months or so. (And that's its author, Gordon Moore, to the left. Note the high forehead, a sure sign of fierce intelligence and handsomeness.)
Or, to put it Dana's way, things get better and better faster and faster.
But we also need to remind ourselves of Moore's Second Law, which follows directly from his first and which may (fortunately) apply only to silicon.
The cost of doubling circuit density increases in line with Moore's First Law. In other words, when you go from 1 billion circuits on a chip to 2 billion, the cost of developing the plant to produce that latter chip also doubles.
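Run the two laws side by side, as the post states them, and the squeeze becomes obvious. The starting figures below are illustrative only, not Intel's actual numbers:

```python
# Moore's First and Second Laws in lockstep: each generation doubles
# both the circuits on a chip and the cost of the plant that makes it.
circuits = 1_000_000_000   # 1 billion circuits on a chip (illustrative)
fab_cost = 3_000_000_000   # a hypothetical $3 billion fab

for generation in range(1, 4):
    circuits *= 2   # First Law: density doubles
    fab_cost *= 2   # Second Law: so does the cost of the plant
    print(f"gen {generation}: {circuits:,} circuits, ${fab_cost:,} fab")

# Cost per circuit stays flat in this toy model -- the bite comes from
# having to raise an ever-larger capital sum up front, each generation.
```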
This is now starting to bite Intel.
Every former chairman of Intel becomes, by definition, an elder statesman. Gordon Moore and Andy Grove have done us proud.
Craig Barrett (pictured, from the News.Com story) aims to do equally well. And he's off to a fast start, as he told a recent Gartner conference what interests him:
"You would like to think that public leaders are statesmen and have the country's best interests at heart," Barrett said. "We spend $25 billion on agriculture subsidies a year. Yet we spend $5 billion a year on basic research and engineering. Do you think agriculture is the industry of the future? You would like your government leaders to stand up and say something about that. I would like them to stand up and say something about it."
Amen and godspeed, Doc.
Intel's move to slash laptop chip prices in time for Christmas means two things.
The Age of Robotics is finally about to begin.
This is the conclusion of a United Nations study, whose census indicated there were 607,000 domestic robots in use at the end of 2003: 570,000 lawnmowers and 37,000 vacuum cleaners. (The illustration is from the adventures of Hubie and Bertie, two of the lesser-known Warner Brothers characters.)
But the prediction was pretty grand:
By the end of 2007, some 4.1 million domestic robots will likely be in use, the study said. Lawnmowers will still make up the majority, but sales of window-washing and pool-cleaning robots are also set to take off, it predicted.
In other words, general purpose "mechanical men" are still a long way off. We're building a host of small machines geared to specific tasks, something more of a Chuck Jones future than an Isaac Asimov one.
But that's OK too.
Here's another way in which Moore's Law is taking us to amazing places.
It's here, from a company called ZCorp. in Burlington, Massachusetts.
But wait, there's more.
The big news coming out of Intel's developer conference is they're abandoning Moore's Law.
Nothing could be further from the truth, of course. But that's how the SCCM (So-Called Computer Media) or (if you're conservative) the MCM (Mainstream Computer Media) has it. And if Intel were running for something (which it is, namely your money) it has some explaining to do.
Now you may call this nuanced, but it's just the straight facts. There's more to Moore's Law than clock speed. Truth to tell, there is more to Moore's Law than chip count. Yes, that was the focus of Moore's 1965 article, but the man was writing about the challenges of his time; he wasn't trying to be Nostradamus. (If he were, he wouldn't have written so plainly.)
After 25 years of cable I'm finally getting a dish.
Comcast has been tossing me price increases for years, but the latest back-door price increase will send my bill way north of $60/month. If I were using cable broadband I might think differently, but...
So I went to the store, and I learned something about Moore's Law. It works. Mass production and better electronics means you can buy a dish and install it yourself for just $50. A version with a DVR (like a TiVo) on it costs just $50 more.
But there's a problem...
You may think I credit Moore's Law with just about everything. Maybe I do.
But compute power makes many things possible. It accelerates the pace of all types of change. Even in materials science.
So here we have Integral Technologies and its conductive plastic ElectriPlast antenna. Plastic that conducts like metal means lighter conductors that can be molded into any shape. So we're talking about more than just antennas here.
At a recent mobile conference in the United Kingdom, David Wood (pictured), executive vice president for research at Symbian (a small-kernel OS maker for mobiles), brought two slides that really show you what Moore's Law means.
NOTE: The above paragraph originally said the conference was in England, but Chris Potts corrected me. Also, the folks at Semacode deserve credit for extracting the slides and pointing them out to us.
First was a chart tracking the cost of making a smartphone over time, going back two years and forward six. (These are PDF files.) Despite the fact they're getting a lot better, they're also getting cheaper -- the bill of materials cost could be cut in half in four years.
How is this possible?
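Before getting to the how, it's worth pinning down what "cut in half in four years" means as a steady annual decline -- ballpark arithmetic only:

```python
# What halving a bill of materials over four years implies per year.
rate = 0.5 ** (1 / 4)          # cost multiplier per year
print(round(1 - rate, 3))       # 0.159 -- roughly a 16% drop every year

cost = 100.0                    # index today's bill of materials at 100
for year in range(1, 5):
    cost *= rate
    print(year, round(cost, 1))
# year 4 lands at 50.0, half of where we started
```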
One of the great things about Moore's Law is that it's multi-dimensional. It's not just that things get better-and-better, faster-and-faster. It's that you can create new revolutions by combining existing ones. (As in this example, from XTalk.)
Here's what I'm talking about. I call these combo chips. We're talking here of chips for mobile phones and set-top boxes that include Digital Signal Processors, thus enhancing the underlying product.
OK. We've got music players, we've got mobile phones, we've got game machines, we've got organizers.
They're all small, pocket-sized. They all run different applications. They can't cross over and run applications from any other category.
Question: Why? Why can't we have just one device that does all that, and maybe more?
Russell Beattie says a concept called Podcasting may be one step toward solving the problem.
Samsung will soon debut a five megapixel camera in one of its new mobile phones.
Problem is, what do you do with the file once you've got it?
The Moore's Law improvements in mobile phone cameras are jumping far ahead of the networks used to move the files off them.
Asian operators are already "locking down the phone," preventing the 2-3 megapixel cameras in their handsets from sending pictures over their mobile networks.
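A back-of-envelope sketch shows the squeeze. The numbers here are my own illustrative assumptions (a rough JPEG yield of 0.3 bytes per pixel and a GPRS-class uplink of 40 kbit/s), not any carrier's published specs:

```python
# Back-of-envelope: how long does a camera-phone photo take to upload?
# All numbers below are illustrative assumptions, not carrier specs.
photo_megapixels = 5
bytes_per_pixel = 0.3               # rough JPEG compression yield
uplink_kbit_s = 40                  # GPRS-class uplink speed

photo_bytes = photo_megapixels * 1_000_000 * bytes_per_pixel
upload_seconds = photo_bytes * 8 / (uplink_kbit_s * 1000)

print(f"~{photo_bytes / 1e6:.1f} MB per photo")          # ~1.5 MB
print(f"~{upload_seconds / 60:.1f} minutes to send it")  # ~5.0 minutes
```

Five minutes of airtime per snapshot makes it easy to see why operators get nervous.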
This is big news. And, for consumers, it's very scary indeed. My point today is it should be even more frightening for network operators.
One thing that doesn't draw enough comment in the excitement over the iPod (right, from the Hatena Diary in Japan) (and its growing list of competitors) is what this says about the underlying technology.
The iPod is a triumph for the hard disk over optical storage.
When DVDs first came out in the late 1990s they were able to offer about 5 Gigabytes of permanent storage on a disk that could sell for $20, even including the cost of the content. At that time it was a big deal to have a 5 Gigabyte hard drive, and you paid through the nose for it.
Today drive storage prices are down to $1 per Gigabyte. And you can get this storage in any form factor you want. There are even hard drives in some mobile phones.
Not only has the price come down, but today's drives are sturdier than ever. Those dancers on the iPod commercials? Their music isn't skipping, as it might if they were holding CD players in their hands. (That's a subliminal point made in the marketing.) Remember tape back-up? Raise your hands if you still have it, or think you still need it.
The optical disk, meanwhile, has become a floppy. Sure, you can buy a blank disk for as little as 60 cents in quantity. (Fry's has a special of 50 for $30, with rebate.) But what do you get for that? A few hours of a movie, maybe a dozen albums. And it's still not as sturdy or portable as a hard drive device.
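Note that on raw price per gigabyte the optical disc still wins. A quick check, assuming a standard 4.7 GB single-layer disc (my assumption; the post doesn't give a capacity):

```python
# Raw cost per gigabyte, using the figures in the post.
# Assumes a standard 4.7 GB single-layer disc (the post doesn't say).
dvd_price, dvd_gb = 0.60, 4.7
hdd_price_per_gb = 1.00

dvd_price_per_gb = dvd_price / dvd_gb
print(f"DVD:  ${dvd_price_per_gb:.2f} per GB")   # ~$0.13
print(f"Disk: ${hdd_price_per_gb:.2f} per GB")
```

So the hard drive's victory isn't about cost per gigabyte at all; it's about sturdiness, form factor, and random access.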
The reason hard drives have become cool while optics drool has something to do with Moore's Law, but also the process by which these two technologies march forward.
Forbes, the magazine for the dumb-and-dumber executive, has another prize this week -- an article bemoaning the problem of batteries.
While it is true that metal batteries aren't advancing very fast, this is like deriding Moore's Law because improvements in copper are lagging. (The image of a lemon battery is from Hilaroad.com.)
Fact is there are new sources of power being developed all the time. And they're being adapted. When was the last time you saw a calculator or saw an emergency phone that wasn't solar-powered?
A team headed by Yang Wang at Boston College has found that an array of aligned, but randomly placed, carbon nanotubes (pictured, from Physics News) can act as an antenna for visible light. (The little scale bar on the right-side of the illustration is one micron in length.)
This could be used to create optical television or (more important I think) convert light directly into electricity. That had been one of the perceived promises of Buckyballs when Rice scientists first found them almost 20 years ago, but no one had come up with a method for making it happen until now.
This could be big. (Yes, bigger than the win over Penn State.)
"Sky Captain and the World of Tomorrow" , the first totally computer-generated "live action" film, won the week's box office with a take of $16.2 million. (The picture was downloaded from the actual "Sky Captain" site, where it's available as desktop wallpaper.)
I took my son to see it Sunday and while we both enjoyed it the film didn't draw applause.
One early scene explains it all. The heroine, played by Gwyneth Paltrow, goes to Radio City Music Hall to meet a contact. While she and the contact huddle in the foreground, the background is the Radio City screen showing "The Wizard of Oz." (The film is set in a fictional 1939.) Just look at Judy Garland's face, reacting to the effect of Billie Burke's good witch arriving on a soap bubble, then compare it to what Paltrow is doing in the foreground.
And that's the most emotion Paltrow gives through the whole performance. (I blame the director, by the way. If actors in front of a blue screen aren't given proper instruction, none of them can get it right.)
Sometimes I get ahead of myself.
When I read about speech recognition I take it as a given. I really had no idea it wasn't already chip-based.
Well, it isn't. (The big ear is from the ACM.)
Carnegie-Mellon and Cal-Berkeley are going to spend $1 million of the government's money over the next three years trying to create a general speech-recognition chip for the market.
When they succeed, and I have no doubt they will succeed, it will be a true revolution.
The University of Durham, in England, says it has perfected the first plastic magnet that works at room temperature. (The picture is from an earlier experiment at the University of Utah, published in Reactive Reports.)
The scientists were about to give up on their project when they decided to re-check their samples one more time, a month after making them. It turns out the samples needed time to settle: as the weeks passed, all their batches showed traces of magnetism.
This is important on several levels:
History buffs will recall that Sun Microsystems began in the 1980s with one goal, to put the power of an engineer's minicomputer onto his desktop. (Picture from The Register.)
As Moore's Law proceeded this ideal seemed to die away. After all, couldn't Intel chips, running Windows, provide all the power the average engineer needed? Over time, in fact, Windows-based machines did indeed reach, and then exceed, the capabilities of SPARC-based Sun designs.
But perhaps Sun just wasn't using its imagination.
A dual-core chip puts two processors on a single piece of silicon, so products built with it can run multiple tasks at the same time.
AMD is pushing the dual core idea, which it's been working on for five years, in part to extend the life of the 90 nm process technology. Dual core lets you boost performance without actually shrinking transistors, much as RISC technology did in an earlier time.
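A toy model makes the point. The task times here are made-up illustrations, not benchmarks of any real chip:

```python
# A toy model of what the second core buys you. Task times are
# made-up illustrations, not benchmarks of any real chip.
task_seconds = [4.0, 4.0]        # two independent jobs

single_core = sum(task_seconds)  # jobs run back to back
dual_core = max(task_seconds)    # jobs run side by side

print(single_core, dual_core)    # 8.0 4.0
```

Same transistors, same clock, half the wall-clock time -- provided the jobs really are independent.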
It's yet another example of how Moore's Law doesn't just apply to gate size (although that was what Moore was writing about in 1965). Moore's Law proceeds because exponential improvements can happen in many different directions. Even as we reach the limits of gate size (as we approach the atomic scale) new advances in the design of chips, in the material used to make chips, and in the way we put systems together will keep the improvements coming.
All of which makes this beat very cool indeed....
It's always comforting when the nay-sayers are once again proven wrong on Moore's Law. (This, believe it or not, is a Japanese Screensaver.)
Intel said this week it has perfected 65 nanometer technology. That means it can get 33% more transistors in the same space than with the 90 nm technology now going into production.
All the Moore's Law advantages apply. As the new chip goes into production new application spaces will be created that don't exist today. All that 90 nm stuff can indeed become cheap as chips.
Of course, this was a close-run thing.
Silicon has unique properties. It can act as a conductor of electricity or an insulator. Burn it, oxidize it, and the resulting silicon oxide becomes the insulating layer that lets you build a circuit. Magic.
But it's not perfect. Run it too hot and it breaks down. It's not perfectly efficient. You've got to be careful with voltages. It's vulnerable to radiation.
But nothing else forms such a perfect, and perfectly thin, crystal. Nothing had, until now.
In what may be the most important chemical discovery since silicon itself, Toyota researchers have found a way to create uniform crystals with silicon and carbon, silicon carbide.
Despite a regulatory regime that is impossible to obey (isolating data traffic that's to be turned into voice on a network with trillions of transactions going through it each second) hardware makers are going ahead with the production of Voice Over IP (VOIP) hardware.
Linksys and Netgear are the latest to say that voice support will become part of their residential gateways Real Soon Now. (For more on VOIP, buy O'Reilly's VOIP book, right.)
In this case, however, the Feds will be glad to know there's actually less here than meets the eye.
It's said that the creator of Pokemon got the idea from watching insects around his home and imagining complex societies for them. This sort of thing is common in Japan, where space is tight, so the eye is drawn to the small in order to imagine big things. (A lot of kids elsewhere do the same thing -- I did as a kid -- but we tend to grow out of it faster.)
Anyway, this preoccupation is leading to real advances in robotics, here in the real world. And here's the latest: a tiny helicopter, weighing less than half an ounce, that can be programmed to take short flights and send back pictures.
The media, the digirati, even some government figures are laughing today at the East Buchanan Telephone Co-op of Winthrop, Iowa.
They laugh because the co-op has threatened to cut off cellular calls from Qwest on Monday, claiming it isn't being paid for terminating them.
The town bought a device that can distinguish between cellular calls and landline calls coming in over Qwest's long distance service. Qwest has won an injunction halting the shut-off for two weeks.
Most reaction has been that the town is crazy, that it doesn't stand a chance.
But they don't know the rest of the story.
The biggest problem we have with Moore's Law is that we think of it linearly. (That's the man himself, from the BBC.)
Chips get faster and faster, faster and faster. That's the short form for Moore's Law, as Moore himself wrote it back in the 1960s.
But there are other ways for chips to get better other than by just getting faster. RISC made chips better. Low power technology makes chips better. FPGA makes chips better. Technologies like IBM's new "chip morphing" make chips better.
So I'm going to do something really big here. I'm going to re-state Moore's Law, for the 21st Century, as companies like Intel and IBM now understand it.
Chips get better and better, faster and faster.
I hope Mr. Moore will approve.
Andy Oram has a long story at O'Reilly today detailing the problems with universal service and public policy.
It's a great historical overview.
But it's missing one key ingredient. And it's a surprising ingredient for Andy to miss.
That ingredient is Moore's Law.
Intel's stock price fell after the company announced its earnings for the quarter had doubled. (That's not the real Intel symbol, by the way. It's something I found at a nifty Web company in Bristol, England.)
The reason: falling margins, rising inventories, and a prediction that growth will slow in the second half.
Am I worried? No. Here's why.
These lazy, hazy crazy days of summer are always the dog days for technology. (The fine art to the right is by Beth Carver.)
This year, however, it's worse.
Throughout my 20 years as a technology freelancer, the phone has always stopped ringing in May and not started up again until August. This year, of course, it hasn't rung at all.
The Bell not tolling for me wouldn't be so bad if it were tolling for someone else. But it's not.
The Paul Otellini era at Intel opened unofficially yesterday. (Photo from the University of California.)
In this era, Intel will try to move from being "just" a chipmaker to being a standards-setter, a la Microsoft. It will move from enabling the future to trying to define it.
The era starts with the three-chip project code-named Grantsdale. Otellini didn't talk speeds-and-feeds on introducing it. He talked about features. Features like built-in Wi-Fi and support for new fault-tolerant storage, features that will define an Always-On world.
Then give us a blogroll, tell your friends, spread the word, or drop a note.
It's lonely on this side of the typewriter. (Yes, I know it's a PC, but the interface will always be a typewriter to me.)
I'm still a Craig Barrett fan.
Barrett has a year left to run Intel before turning it over (most likely) to Paul Otellini. It's a reflective time. And in a recent talk with News.Com, he reflects on the "complacency" of America.
As Intel CEO this doesn't matter much to Barrett. The company can grow anywhere. But as an American it must upset him, especially since, before joining the company he was an assistant professor of materials science at Stanford. He's walked the walk of education.
So what should Craig Barrett do next? I have a few ideas.
But in the end it will do them no good.
SBC is taking this strike to reduce its labor costs. It thinks that will save it.
But the problem at SBC isn't labor costs. It's that the value of its installed plant, its capital, is depreciating in line with Moore's Law, while much of it was bought on 30-year assumptions.
The impact of Moore's Law accelerates with time. It doesn't decelerate.
Labor costs can only decline arithmetically, in other words, while the impact of what's happening is growing geometrically -- downward.
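A toy model with illustrative numbers (not SBC's actual financials) shows how fast that gap opens:

```python
# Toy model: linear labor cuts vs. geometric plant depreciation.
# All numbers are illustrative assumptions, not SBC's financials.
labor_cost = 100.0       # index of annual labor cost
plant_value = 100.0      # index of installed-plant value

for year in range(1, 6):
    labor_cost -= 5.0    # arithmetic: a fixed cut each year
    plant_value *= 0.7   # geometric: lose ~30% of value each year
    print(year, round(labor_cost, 1), round(plant_value, 1))
```

After five years the labor index has fallen a quarter; the plant index has lost more than four-fifths of its value. No strike settlement fixes that.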
Gordon Moore's 1965 prediction was based on the idea that we could shrink the size of components indefinitely. (Oh, and yes, the title of this is a pun.)
If you limit your look at Moore to that one point, last week's announcement by Intel that it will change the way it looks at chips is, indeed, Gordon Moore's Last Sigh. (What, you haven't bought the book yet? What's wrong with you?) That method for increasing chip speeds is, henceforth, inoperative. (Picture of Gordon Moore from CNN.)
But Moore's Law is going to keep on keeping on. Moore's article cited a method for making chips faster, but Moore's Law itself was really a challenge to the industry, to keep those improvements going. Here's how the challenge will be met:
Moore's Law is a challenge. It's not a scientific principle.
Moore's Law tells the electronics industry what it should hope to do, however it can do it, based on the idea that, in 1965, the goal of 100% improvement every 18 months looked achievable for some time to come.
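That compounding is easy to check in a line of code:

```python
# "100% improvement every 18 months," compounded over time.
def moores_factor(years, doubling_period=1.5):
    """Growth factor after `years` of doubling every 18 months."""
    return 2 ** (years / doubling_period)

print(moores_factor(3))            # 4.0x in three years
print(round(moores_factor(15)))    # 1024x in fifteen years
```

Roughly a thousand-fold improvement every fifteen years, which is why the industry keeps surprising us.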
What most people don't know is that, in many cases, and in many different areas of technology, engineers and scientists have been blowing the Moore's timetable to smithereens.
Sony's announcement of a 25 Gigabyte CD made partly of paper has to rank as the Moore's Law story of the year. (The image is from Sony's Press release, with special thanks to Lyle Clarke for pointing it out.)
This is a so-called Blu-ray disc, using a blue laser whose wavelength is so short it can read a data layer right at the disc's surface, without going into the substrate. One way to translate that 25 GByte size, by the way, is to note that it's two hours of High Definition TV. Hi-def movies need Blu-ray.
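Quick arithmetic supports that two-hour figure. Broadcast HDTV streams run on the order of 19-25 Mbit/s, so the implied bitrate is in the right neighborhood (decimal gigabytes assumed):

```python
# Sanity check: does 25 GB really hold two hours of HDTV?
# Assumes decimal gigabytes (10**9 bytes), as disc makers quote them.
disc_bytes = 25 * 10**9
hours = 2

bitrate_mbit_s = disc_bytes * 8 / (hours * 3600) / 1e6
print(f"~{bitrate_mbit_s:.0f} Mbit/s")   # roughly HDTV territory
```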
So the breakthrough here isn't just in the paper. Repeat, the breakthrough here isn't just in the paper.
A quiet revolution is about to overtake the cellular industry.
The revolution is being launched by Nvidia, best known for its graphics chips, and a new line of chips it's going to ship soon called GeForce. In the words of Phil Carmack, vice president for handheld products, who spoke at the Mobile Entertainment Summit, we're talking about a thousand-fold improvement in common cell phones, starting next year.
At the CTIA show, of course, all the talk is about enabling content that wasn't really new in the late 1970s. The most popular game on cellphones remains bowling. Yes, there's talk about ring tones (10 second bursts of sound that play when your phone rings), which should be a huge business, and the Hollywood crowd was out in force this year, pawing for revenue. (The illustration, by the way, is from the Domestickers line, number 96 if you're keeping score at home.)
But they ain't seen nothing yet.
While that's all well-and-good, and that might even be possible, I think the comments miss the point.
Yeah, you'll have to click below to get to the point.
The more alert among you may notice some changes have been made here at Moore's Lore. Among the new features:
I hope you like the new Moore's Lore. Please feel free to write me about it directly with any feedback.