No, not on the football field, silly.
(The original Rice seal, to the right, dates from 1911, and carries its own story, including Confederate gray "warmed into life by a tinge of lavender.")
I'm talking about the laboratory, where Rice is successfully managing the transition from Dr. Richard Smalley's "Buckyball" generation, with new research into the link between optics and electronics.
Professor Peter Nordlander has announced "a universal relationship between the behavior of light and electrons" which "can be exploited to create nanoscale antennae that convert light into broadband electrical signals capable of carrying approximately 1 million times more data than existing interconnects."
This is big. In many ways it's as big as the original BuckyBall discovery, and more readily exploited.
More after the break.
I know people are going to read this as good news.
Intel announced it is putting $340 million into expanding plants in Colorado and Massachusetts.
What could be wrong with that?
Back in 1985, you would have spent big money to get an Intel 386 chip, over 100 megabytes of storage, and a local network that ran as fast as 1 megabit per second.
I know I didn't have one. The closest I saw to one that year was an entrepreneur 10 miles north of me who had a Digital Equipment PDP-8 minicomputer in his office.
Yet that is just what you see in the picture to the right:
Thanks to Moore's Second Law (complexity causes costs to scale exponentially), competition in the semiconductor business is mired in ever-thickening mud, which represents the cost of building new capacity.
The number of company-owned fabrication plants, or fabs, must decline over time, as their cost rises above even corporate affordability. The decision to build one must be taken with increasing care, with an eye toward a far-off future. It's the opposite of what happens in the product cycle, at the other end of the factory floor, where things are constantly accelerating.
While Intel has played its hand in Asia, AMD has chosen Europe, specifically the former East Germany. More specifically Dresden, firebombed during World War II, left for dead during the Cold War.
In 2003 AMD broke ground for its second Dresden Fab, AMD 36. The plant goes into volume production next year, at a point where AMD's designs seem to be surpassing those of Intel.
Market share, in other words, could make a big swing next year.
At the very same time, AMD is advancing in court, forcing Intel to defend an already-fading monopoly. A few years ago Intel had knocked AMD practically out of the ballpark. With the Dresden Fab 36 that won't be true, but AMD figures Intel still has a case to answer, because its hyper-competitive marketing department never changed tactics.
Evidence will likely show that Intel did have a near-monopoly under Craig Barrett, and that it did abuse its position in its dealing with big customers. But a court finding for AMD would still be a mistake.
I was giving more thought to yesterday's rumors of Cisco buying Nokia (or part of it).
The more I thought about it, the more I realized there is a very smart M&A move Cisco could make on today's technology board, something that would give it an infusion of both technology and backbone, plus get it into the mobile markets it seems so hot for.
But what's in it for Cisco? Plenty.
Rebecca MacKinnon (left, from her blog) took a shot at Cisco's China policy recently, confirming through a spokesman that the company does indeed cooperate with the government.
This is not news. So does nearly every other U.S. tech company.
The U.S. policy is, and has been, full engagement with China. This has already hurt Cisco. Back in the 1990s one of the prices for getting into the market was to share technology. Cisco did so, and a few years later Huawei, a Chinese company, had routers and bridges very similar to Cisco's old stuff, along with most of the Asian market (thanks to lower prices).
MacKinnon's point now is that Cisco is cooperating with the worst excesses of the Chinese government, which is seeking to have both the world's best Internet technology and full control over what people do with it.
That is a good point, but I don't think you go after Cisco to make it.
When the Comdex show closed its doors a few years ago a lot of people threw up their hands and decided it was some sort of secular turning-point, the lesson being that people didn't do trade shows anymore.
Well, it was a turning point. But not of the kind they thought.
Fact is, Comdex lives. It lives in Taiwan, and it's called Computex.
The show just finished, and the reports are still dribbling in. But what's clear is that the same spirit of innovation, the same corporate social mobility, and the same establishment of distribution that marked the Comdex show in its heyday all took place in Taiwan.
This is meaningful on several levels.
AMD is the most infuriating company imaginable.
If Intel is the big dog of the chip world, AMD is the little dog jumping around it, nipping at its heels, acting like it (not Chipzilla) owns the street.
Its latest legal assault may be its dumbest move yet.
Strictly from a timing standpoint, it sucks.
This Administration does not look kindly at anti-trust claims. They settled with Microsoft, they gave the cables and Bells a duopoly (leaving America a third-world broadband country), and they seem content to let China monopolize world trade while India gains control of services. All this is in pursuit of an ideology that becomes less-and-less distinguishable from Putinism and Kleptocracy by the day.
Short form. If they had a case they should have filed it in Europe.
Irregular readers of this blog may think Gordon Moore invented the microchip.
He didn't. Moore did have a major role. He was part of the Fairchild team, co-founder of Intel, and his Moore's Law article popularized the changes that chips would bring.
But Jack Kilby won the Nobel Prize for the microchip, in 2000. He died today.
The original invention, putting multiple devices on a single piece of substrate, happened in two places at once. One team, which Kilby headed, worked at Texas Instruments. The other team, with Moore, Robert Noyce, and other key men, worked at Fairchild Semiconductor.
The resulting patent was shared, but it was Kilby's team that created the basic technology. The key contribution from the Noyce team involved manufacturing process.
More on Kilby after the break.
Not only is Apple switching its chip supply contract from IBM to Intel, but it is moving to Intel processors in the bargain.
In making the announcement this morning, Steve Jobs said he didn't see how he could continue making great products beyond next year "based on the Power roadmap."
Right after his speech he had a cagey interview with CNBC's Ron Insana. "It's not as dramatic as you're characterizing it," he insisted.
"This is going to be a gradual transition. Hopefully a year from today we'll have Intel-based Macs in the market. It's going to be a two-year transition.
"As we look into the future, where we want to go is different (from IBM's product roadmap). A year or two in the future Intel's processor roadmap aligns with where we want to go.
"I think this will get us where we want to be a year or two down the road." Jobs refused repeated requests by Insana to explain what he meant by that. (Jobs is also shaving even more closely than this picture shows. He's down to tiny stubble around a still-brownish moustache. Hey, Steve, I'm 50 too.)
What I think he means, simply, is video.
Beyond this, most of what I wrote last week holds. This deal is not material to Intel, which continues to face loss of major market share to AMD among Windows and Linux users.
But there are also vital lessons here for followers of Moore's Law, lessons I need to impart.
The secret's out.
Intel is re-interpreting Moore's Law. Not repealing it. Not rejecting it.
They're reinterpreting it. That's the significance of the change incoming CEO Paul Otellini (right) is making.
Before, Moore's Law was like Samuel Gompers's famous quote about what labor wanted. More. More circuits, more speed, more cycles, more bits. More.
As of today Intel's new direction is better. Better doesn't always mean more. In the case of microprocessors it can mean putting more computers on each chip (multi-core) or running with lower power. In terms of communications it can mean a host of attributes, from security to coverage to throughput.
The best way to understand the future is to look into how chips are changing.
Two transitions are transforming Moore's Law. The original article, in 1965, described only the density of circuits on silicon substrate.
The rule implied that chips could get better-and-better, faster-and-faster. Doubling bigger numbers means bigger incremental changes in the same time. Over the years chemists and electrical engineers learned to apply this exponential improvement concept to fiber cables, to magnetic storage, to optical storage, even to radios, so that 802.11n radios will transmit data at over 100 Mbps -- roughly twice what earlier 802.11g models could deliver.
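The doubling arithmetic behind that exponential curve is simple enough to sketch. This is a toy calculation of my own, not anything from Moore's paper; the 18-month doubling period is the commonly quoted figure:

```python
def capacity_after(years, start=1.0, doubling_period=1.5):
    """Project capacity growth under a Moore's-Law-style doubling.

    start: initial capacity in any unit (transistors, Mbps, gigabytes)
    doubling_period: years per doubling (1.5 = the classic 18 months)
    """
    return start * 2 ** (years / doubling_period)

# Ten years of 18-month doublings multiplies capacity about 100-fold:
# 2 ** (10 / 1.5) is roughly 101
```

The same function works whether you feed it circuit density, fiber throughput, or drive capacity, which is the point: the curve, not the substrate, is the law.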
The transitions have to do with what we mean by better.
China puts more people to death each year than any country in the world. (Yes, even more than Texas.) China is a brutal dictatorship that oppresses its people as no other country, the most Totalitarian regime on Earth. My mentioning this may get Corante blocked to all of China, by the state's firewall system, the most extensive Internet censorship regime on the planet.
By contrast, Emperor Hirohito and the brutal system he led are dead. Japan acknowledged its sins in the 1951 Treaty of San Francisco and has since been a functioning democracy where politicians must accommodate the views of voters. Japan's Constitution forbids it to make war on its neighbors. Japan contributes more to good causes than any other national government.
This is power politics. China is pushing Japan out of the world power picture, letting Taiwan know that resistance is futile, and successfully challenging America's status as a Great Power. Just 12 years ago we were The Hyperpower. Now we're becoming second rate, losing our status to tyrants.
The reaction in the U.S. to all this has been silence. Deafening silence.
Few U.S. outlets have covered the story. The right-wing Cybercast "News" Service actually offered a balanced perspective. The New York Times offers only a fearful editorial on possible Chinese revaluation of the Yuan -- at another time this would be called appeasement.
The reason for this silence is not subject to dispute.
Now that dual-core chips are a reality (to be followed in time by four-core, eight-core, etc.) software companies face a dilemma on pricing. (Picture from AMD.)
Traditionally software companies have priced per-processor. But if a single chip has multiple processors, which could be doing different things, then shouldn't you require two licenses?
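The stakes of that pricing question are easy to show with a toy calculation. The prices and the function here are purely hypothetical, just to illustrate how the two licensing models diverge as core counts climb:

```python
def license_cost(sockets, cores_per_chip, price_per_license,
                 per_core=False):
    """Hypothetical software licensing: bill per socket or per core."""
    units = sockets * cores_per_chip if per_core else sockets
    return units * price_per_license

# A two-socket, dual-core server at a made-up $1,000 per license:
# per-socket pricing: 2 licenses = $2,000
# per-core pricing:   4 licenses = $4,000
```

At dual-core the gap is 2x; at the eight-core chips to come, the same box costs eight times as much under per-core pricing, which is why customers care so much about the answer.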
The hole is the whole U.S.
Intel plans on mass producing WiMax chips and going into rapid deployment, offering end-user speeds far in excess of what U.S. phone outfits provide with DSL.
The problem is the backhaul speed limit. Go to most WiFi hotspots, or most home networks, and DSL is the backhaul platform. We're talking 1.5 Mbps, max.
There are two types of chips key to the Always On world.
These are sensor chips and RFID chips.
Both contain tiny radios. The two can also be combined.
A sensor chip, as its name implies, tests specific conditions and reports back with data on those conditions. A motion sensor is one example; a heart monitor is another.
An RFID chip merely identifies the item it's on. The chips that will go onto passports will be RFID chips, and RFID identification is at the heart of efforts by retailers like Wal-Mart, as well as service providers like Grantex.
I've also written, recently, about applications that combine RFID and sensor chips. Bulldog Technologies is rolling out a line of these chips that not only identify containers in transit, but monitor their condition and let shippers know the contents are safe.
Always On applications will use all these types of chips as clients on WiFi or cellular networks, with applications located on gateways that run at low power, with battery back-up, and have constant connections to the Internet.
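The client-gateway shape described above can be sketched in a few lines. Everything here is illustrative: the class names, the message format, and the `hrm-001` identity are my inventions, not any real chip's API:

```python
import json
import time

class SensorChip:
    """Toy model of an Always On sensor client: sample a condition,
    tag the reading with an RFID-style identity, hand it to a gateway."""

    def __init__(self, chip_id, read_fn):
        self.chip_id = chip_id   # RFID-style identity of the device
        self.read_fn = read_fn   # the sensing function

    def report(self):
        # Package identity + reading + timestamp for the network
        return json.dumps({
            "id": self.chip_id,
            "reading": self.read_fn(),
            "ts": time.time(),
        })

class Gateway:
    """The always-connected gateway that collects client reports."""
    def __init__(self):
        self.log = []
    def receive(self, message):
        self.log.append(json.loads(message))

# A heart monitor as a sensor client on the home network:
gw = Gateway()
monitor = SensorChip("hrm-001", lambda: 72)  # 72 beats per minute
gw.receive(monitor.report())
```

The division of labor is the point: the chip stays dumb and cheap, while the gateway holds the application and the Internet connection.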
Last month Intel's mobility chief Sean Maloney was in the hunt to head H-P, a job that eventually went to Mark Hurd of NCR. (Watch out. Dana is about to criticize a fellow Truly Handsome Man.)
But how well is Maloney doing his current job?
Intel's role in the development of Always On is crucial, and its strategy today seems muddled. It's not just its support for two different WiMax standards, and its delay in delivering fixed backhaul silicon while it prepares truly mobile solutions.
I'm more concerned with Maloney's failure to articulate a near-and-medium-term wireless platform story, one that tells vendors what they should sell today that will be useful tomorrow.
Intel seems more interested in desktops and today's applications than it is in the wireless networking platform and tomorrow's applications.
Incoming CEO Paul Otellini says Intel is going to sell a platforms story, not a pure technology story. Platforms are things you build on.
I bought a new laptop yesterday.
And to my surprise I violated my Iron Law.
Dana's Iron Law of Laptops holds that an ounce on the desk is a pound in my hands.
My favorite laptop of all time was a 2-pound Sinclair ZX-81. It had a tiny screen (nearly non-existent) but it had a pliant membrane keyboard that let me write and send stories from a beach. I haven't seen anything so light, rugged and useful since.
Instead, laptops have been desktop analogs. When desktop power increased, so did that of laptops, and they became no lighter in the process. Even today most laptops on the market weigh 7-8 pounds.
So why did I get one?
Technology moves in waves. What's passe in one place may be very cool in another. This is how you can cross the digital divide.
Here's an example. At the same time NTT DoCoMo is closing down its Personal Handyphone System, moving customers to more advanced forms of mobile telephony, it's growing like topsy in China, and Atheros is rolling out a new PHS chip.
How does this work?
As we approach the 40th anniversary of Gordon Moore's Electronics article, the man himself (Intel co-founder and namesake of this humble blog) has appeared to join the celebration.
While the headlines spoke of Moore's skepticism on materials that might replace silicon, I was more intrigued by his views on Intel, where his foundation still holds a considerable stake.
He's pretty happy. He likes the idea of pushing platforms over performance. It makes sense to him.
Moore also gave credit for the creation of what's now Silicon Valley to an irascible cur he quit a half-century ago.
As it prepares for its developer forum this week, Intel faces an audience of bankers who have not lost faith in it, but who don't understand what it means by "platform."
Credit Suisse First Boston, for instance, looks at the word "platform" and sees only desktop or server. It figures Intel is waiting for Microsoft's Longhorn to demand more processing power of computers and bail it out.
If that's the strategy Intel describes, then it is Clueless. But that's not the strategy Intel is pursuing under new CEO Paul Otellini (right).
That's right, gang. The old joke from The Graduate is here again, aiming to drive silicon into the ground.
Nanomarkets, a market research outfit with a beat that looks like tons of fun from here (call me) has a $2,000 report out with a hockey stick chart for plastic semiconductors, estimating the market at $5.8 billion in 2009 and $23.5 billion three years after that.
Plastic electronics -- chips built on conductive polymers and flexible substrates -- will be cheaper, take less power, and (obviously) be more flexible than silicon circuits. This makes them perfect for, say, mobile phones.
It will also bring a bunch of new suppliers to the electronics market, names like Dow Chemical, DuPont, Kodak, and Xerox, along with the usual suspects.
What does this mean?
One of the nastiest open secrets in the Internet is the switching bottleneck.
Optical fibers move data at, well, light-speed. But electricity moves data much more slowly. Getting between the two is like trying to get onto a freeway from an old cloverleaf junction -- there's not enough of an acceleration lane.
Many companies, including Intel, have been working this problem for a long time. Photonic switching is already a reality. But linking silicon directly to optics remains elusive.
That's the heart of Intel's claimed breakthrough, announced yesterday. Intel managed to produce a full Raman effect on silicon. This should enable Intel to build lasers just as chips are built.
Right now electronic signals have to be multiplexed, and packaged, before getting into the optical net. It's a very expensive, complex process. It's one of the chief capital costs a telecommunications provider faces.
But if PCs had their own photonics, they could plug directly into fiber and, as their processing speeds increased, take full advantage of what fiber can do. You could even have photonic processing inside silicon chips. Voila -- no bottleneck.
That's the hope, anyway. As Alan Huang, a 20-year veteran of this silicon laser business points out, "it's a neat science experiment" and there's a long way to go before this shows up on your desktop.
Still, imagine the implications, as Intel is now doing. Tom's Hardware Guide reports:
Permanent hardware encryption isn't going to happen. (The image, by the way, is from DBC of Germany, a player in this market game.)
This does not mean we should give up on encryption as protection, or on hardware for encryption. It's just that, just as Moore's Law means today's state-of-the-art PC is tomorrow's door stop, so today's RFID lock could become tomorrow's open door.
Unfortunately this has major implications for the security industry as it is today.
In a New Yorker profile of chef Mario Batali (left) there's a wonderful scene of Mario rooting around a waste pail, looking for what the author-turned-prep chef has tossed away.
"Our job is to sell food for more than we paid for it," Mario lectures him. "You're throwing money away."
Apple Computer is the greatest exponent today of what I call Batali's Clue. Your job, as the maker of products, is to get more for your creation than the cost of the electronic "food" that goes into it.
It's a vital Clue because components in the Moore's Law age spoil like dead fish on a wharf.
Here's an example plucked from today's headlines. (Well, the ad pages.)
Think of it as a LAN on a chip. Not just the network itself, but the computers on the network and, to some extent, the people behind the computers as well. (The illustration is from the first section of Blatchford's report.)
Software programs on the chip, called apulets, portion work out among the computing sections, then recompile the results, the way an editor does at a newspaper desk. (Only without the coffee and the yelling and the pressure or the beer after work for a job well done.)
The result is true multi-tasking. As good as some teenagers, who will listen to music, watch TV, and gab on the phone while allegedly doing their homework, and still get As. (You know who you are.)
The best thing, though, is that this thing scales. You have 8 cells on the chip now. You can have more.
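The editor-desk model described above is plain old scatter-gather, and it can be sketched in ordinary Python. Nothing here is the Cell's actual programming interface; it's just the shape of the idea, with eight workers mirroring the chip's eight cells:

```python
from concurrent.futures import ThreadPoolExecutor

def editor_desk(stories, rewrite, workers=8):
    """Portion work out among parallel units, then recompile the
    results in their original order -- the 'editor at the desk' model."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map farms each story to a worker and preserves order
        return list(pool.map(rewrite, stories))

# Each apulet-like task runs in parallel; the desk reassembles them:
edited = editor_desk(["story a", "story b", "story c"], str.upper)
# edited == ["STORY A", "STORY B", "STORY C"]
```

And scaling is just the `workers` parameter: more cells on the chip, more stories in flight at once, same program.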
I'm no electrical engineer. I just went to school with some fine ones and picked up some of the lingo by osmosis. But it does seem to me that the "dual core" ideas Intel has committed to are merely extended here, in a way very consistent with Moore's Law.
The key point Moore missed (because it wasn't relevant to the paper, hadn't been discovered, and don't you dare criticize Mr. Moore for this) is that the exponential improvements he saw in silicon fabrication apply elsewhere. As I've written many times here, they apply to fiber, they apply to storage, to optical storage, to radios.
And now, for the first time, they may apply to chip design.
A few more points:
To all those wishing to bury Moore's Law. There are more tricks left in it than are dreamt of in your philosophy.
We all know about "dual-core" chips. Intel has switched development here, AMD has them in droves. They're basically multiple chips drawn on the same piece of silicon, taking advantage of parallel processing on-the-chip. Great stuff. Makes chips faster, makes processing faster, and keeps Moore's Law going.
Now IBM (with Sony) is rolling out what it calls Cell technology. This extends the dual-core philosophy: a single chip that passes instructions to as many as eight processors at once. (Think of it as an editor chip in the "slot" of a computerized editing desk.) IBM says it can handle up to 10 instructions at one time.
All the speculation surrounding the Cell involves where it might go, and what it might do. (They're putting it first into Sony's Playstation 3, but it's listed as a PowerPC advance.)
But that's not what you should be thinking about.
The other day a colleague sent me a party invitation. The headline was "HP Plans Retirement Party for Moore's Law." (Real retirement parties, of course, feature lovely cakes like this specimen, from the Carolina Cake Co., Hilliard, Ohio.)
Moore's Law has been buried more often than Dracula, but like Elvis it keeps coming back.
As I've written, the exponential improvements Moore first revealed in silicon have been replicated in optical fiber, in hard drives, in radios, across the technological universe. And it shows no sign of ending.
In fact, the "Retirement Party" was a tongue-in-cheek reference to a new Hewlett-Packard technology that could extend the life of Moore's Law improvements many, many years.
It's called a crossbar latch and in theory it's just a circuit line crossed by two other lines. But it's capable of performing the same functions as a circuit etched in silicon, and when made on nanoscale, it's more efficient.
The key is that the size of the crossbar latch can scale down further than today's circuits. They can be made smaller, thinner, run closer together, and hence, create more circuit density, which is what Moore's Law is all about.
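The density arithmetic behind that claim is inverse-square geometry. This is a back-of-the-envelope sketch, not HP's numbers; the feature sizes are examples I picked:

```python
def density_gain(old_feature_nm, new_feature_nm):
    """Circuit density scales with the inverse square of feature size:
    halve the line width and roughly four times as many circuits fit
    in the same area. Illustrative geometry only."""
    return (old_feature_nm / new_feature_nm) ** 2

# Shrinking from 90 nm lines to 45 nm lines quadruples density:
# density_gain(90, 45) == 4.0
```

That quadratic payoff is why a device that can simply be drawn smaller than etched silicon keeps the Moore's Law curve alive.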
The significance of WiFi-cellular roaming doesn't lie in cutting voice costs. (The picture, by the way, comes from Novinky, a Czech online magazine, a story about DSL.)
The significance of WiFi-cellular roaming lies in Always On applications.
Think about it. Cellular channels are relatively low in bandwidth, WiFi channels are high in bandwidth.
Now, you're wearing an application, like a heart monitor. When you're at home, or in your office, this thing can be generating, and immediately disgorging, tons and tons of data, detailed stuff that may be fun for your doctor to analyze later.
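The roaming logic that makes this work is store-and-forward: hold the detail while you're on the narrow cellular link, disgorge it when you hit WiFi. A minimal sketch, with a class and method names I made up for illustration:

```python
class RoamingMonitor:
    """Toy WiFi-cellular roaming for an Always On device: buffer
    detailed samples on the low-bandwidth link, flush on WiFi."""

    def __init__(self):
        self.buffer = []   # detail held back while on cellular
        self.sent = []     # everything delivered upstream

    def record(self, sample, on_wifi):
        if on_wifi:
            # High bandwidth: flush the backlog, then this sample
            self.sent.extend(self.buffer)
            self.buffer.clear()
            self.sent.append(sample)
        else:
            # Low bandwidth: hold the detail for later
            self.buffer.append(sample)

m = RoamingMonitor()
m.record("beat-1", on_wifi=False)  # on the road: buffered
m.record("beat-2", on_wifi=False)
m.record("beat-3", on_wifi=True)   # home on WiFi: backlog flushes
# m.sent == ["beat-1", "beat-2", "beat-3"]
```

The application never stops monitoring; only the granularity of what it delivers changes with the network underneath it.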
I have talked about this before, but now everyone else is talking, too. So we will, again. (The picture, by the way, is of a single-chip radio from two years ago, a "mote" from Cal Berkeley. The link is very worthwhile.)
What does it mean for TI to make, and Nokia to sell, a complete cellular phone on a single chip? For one thing, it means phones can be one-chip cheap.
Right, cheap as chips.
Along with all their other implications, the mass adoption of mobile phones represents the first step in the single-chip era.
If you look inside the guts of your phone you are unlikely to find a big honking circuit board. (The circuit board illustration is from Sciencetechnologyresources.com.) Instead you will find one, two or three single chips performing major functions in an integrated way.
This is happening across-the-board in technology. We've gone from circuit boards in the 1980s to modules in the 1990s, to single chips. Just as early IBM PC add-in board producers created "multi-function cards" to assure a price worthy of retail distribution 20 years ago, so chip makers today put multiple functions on many chips, creating entire systems no bigger than a finger-nail.
The BBC has a piece today showing how the World of Always On could be invisible, worn instead of held.
We've already seen undershirts embedded with medical sensors. But Ian Pearson predicts we're going to move, over the next 10 years, to a world of devices imprinted on the skin.
One of my problems with most business journalism is we tend to write about companies the way we do sports teams, and it's not that simple.
But mid-way through John Markoff's latest torching of Intel I got a Clue that the company has finally figured things out and is going to turn around.
It was one word, from incoming CEO Paul Otellini.
In a previous life I did some work for Intel's mobile and wireless folks. One thing I learned is they were inherited from Motorola and are based in Chandler, Arizona, rather than in San Jose.
They're pretty easy to scam.
Rather than insisting on the Intel way, which is to define a robust, modular, scalable standard that can handle multiple generations of product, these guys follow their competitors' rules. They essentially beg manufacturers to take their products, then trumpet the announcement like it's a big deal when, in fact, it's not. Just because the maker of a mobile product decides they'll try your stuff doesn't mean they're committing to it -- they commit to what sells.
What are the true facts?
The 802.11 market is stalling.
I know because Broadcom has warned that its sales are flat.
Broadcom absolutely rocks in the Wi-Fi chip market. It is constantly ahead of the curve. It has great relationships with OEMs and product marketers. TI and Intel look good, but no one plays the inside game as well as Broadcom, trust me.
And if Broadcom is catching a cold, then everyone else has pneumonia.
Why is this?
You almost never hear from IBM anymore, except once a quarter when they announce record earnings.
It's time they got some proper respect. It's time they got their props. (Image from the BBC.)
This year IBM will bring in $4.66 in profit for each share of common stock, currently worth about $85. It has sales of almost $93 BILLION per year now, and brings $8 billion of that to the bottom line. (By contrast Microsoft, which claims it has run out of ideas, has one-third the sales, albeit nearly the same level of profit.)
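It's worth working those numbers through, since they make the case better than adjectives do. The figures are the estimates quoted above:

```python
# Working the IBM figures quoted above (fiscal-year estimates):
eps = 4.66          # dollars of profit per share of common stock
price = 85.0        # current share price
revenue = 93e9      # annual sales, roughly $93 billion
profit = 8e9        # what reaches the bottom line

pe_ratio = price / eps          # about 18.2: a modest multiple
net_margin = profit / revenue   # about 8.6 cents per sales dollar
```

A price-earnings ratio around 18 on a company that size is hardly the valuation of a has-been, which is rather the point.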
Ah, yes, you say, but what has IBM done for me lately?
Over at The Feature Mike Masnick has a little piece asking whether mobile phones are a black hole for the chip industry. (This drawing passes for Mike's picture over at The Feature.)
On the surface the charge is silly. Chip makers make chips, phones use chips. Phones are quickly replaced, which means the industry sells more chips. If by "black hole" you mean something that sucks up all the industry's capacity, that's not necessarily a bad thing.
But there is danger here, and it's based on the nature of the phones now being produced.
Intel announced what many considered a "blow-out" quarter, with sales up 20% and net income nearly doubling.
Imagine what they could do without one hand tied behind their back. (The image is from a 1995 paper on Internet payment systems by Michael Pierce of Trinity College, in Ireland.)
In Intel's case, the hand behind its back is communications. Intel dominates basic computing, although its lead in servers is shaky enough that it needs promotions. But in the chips that run cell phones, or routers, or any device based on communications, Intel is an also-ran.
While the attention of the public is focused on gay marriage, let's talk today instead about a possible divorce. (The image is of an old movie poster, taken from a schedule of shows at San Francisco's Roxie Cinema.)
The evidence is becoming overwhelming. Intel has deliberately made its 64-bit chips incompatible with those of AMD. Yet when Microsoft decided to secure its XP operating system at the chip level, it did so through an alliance with AMD.
So, I put the question to you. Has the WinTel marriage ended? Can this marriage be saved?
While that's all well-and-good, and that might even be possible, I think the comments miss the point.
Yeah, you'll have to click below to get to the point.
Back in the early 1990s personal computing faced a crisis.
The power of the computer had exceeded the needs of most tasks. The answer, it was felt, was to keep more than one task up at a time, to "multitask." (The piece to the left is called "The Multitasking Queen," by Beverly Naidus. You can find it, and several other interesting artworks by the same artist, here, and some background on the installation at Washington University in St. Louis here.)
It took time for software to catch up to this but it did happen.
And now hardware has the same need.
We're accustomed to thinking of chips as doing one thing. You have processors, memory chips, radio chips, communication processors. You have graphic chips and application chips.
But by combining several related functions onto one product, printer makers found new life building "multi-function" products that could fax, print, scan and copy. Why not define all that functionality as one chip?
Processor makers learned long ago how inefficient it was to keep memory on separate chips, and so "cache" memory holds data as it waits for processing.
By taking this lesson to heart, chip makers are reaching toward a new growth spurt. I hesitate to call it a boom, because true booms are based on doing something new, and we're not talking here about something new, more like something borrowed.
But it's an important concept to Always-On. When you can combine, say, a sensor and a radio, you have a single-chip medical device that is manufactured at the chip plant. The same thing happens when you combine communications and a camera's workings. Make every chip that lives in the world an 802.11 client, and the chip-maker has a new growth market, while you as a systems vendor have many new applications.
Suddenly there's a real motive toward making that wireless access point a platform, with a modular, scalable, PC operating system. The wireless network is no longer the end of the road. It's no longer a cul de sac. Like many modern suburbs, it too becomes the city. And it should be planned for the way suburbs should be planned, with growth assumed, and plenty of lanes for new traffic.
Moore's Law was never intended to be a scientific principle. Gordon Moore was an engineer and entrepreneur. He never claimed to be Isaac Newton or Albert Einstein.
He saw a pattern, and issued a challenge. The pattern was that, using a photographic process, he could double the number of circuits on a piece of silicon every year and a half or so. This was true in 1965, and it's still true. (That's Intel's main office above, a view taken from Intel's Web site.)
But at the heart of Moore's Law was a challenge to engineers everywhere. Keep it going. Even when we reach the molecular scale, find some new technologies, some new ways of making chips, that will keep it going.
And so it has been. So I predict it will remain. Because there are many ways of getting out of the box the original semiconductor business put itself into. There are other materials besides silicon, and other technologies for forming chips besides photographic etching.
No company on Earth understands this better than the company Moore himself co-founded, Intel.
So it's no surprise that Intel has found a way to solve one of the biggest bottlenecks in the silicon world, the transfer between electricity (which moves relatively slowly) and light (whose top speed of 186,000 miles/second is more than a good idea). The first result of this breakthrough is a "silicon modulator" running at 1 GHz, but by the end of the year Intel says the same technology will produce a modulator running at 10 GHz, which is as fast as the best gallium-arsenide modulators. (Editors at The New York Times mistakenly trumpeted this as a general chip-making breakthrough.)
The news hits just seven months after Intel bought West Bay Semiconductor, a maker of optical chips. This is the way good mergers work. Give some bright people the resources they need to make a splash faster than they can on their own.
It's a small news item, as these things go. W.J. "Jerry" Sanders is resigning as chairman of the company he founded in 1969, Advanced Micro Devices. (The picture at right is from the AMD web site.)
Of course, AMD without Jerry is a lot like Playboy without Hefner, or News Corp. without Murdoch.
Jerry Sanders was, when all is said and done, one of the most seminal figures of the computer age. For starters, just look at the picture. Look at that suit, those twinkling eyes, that smile. The man made every rival CEO look like a candidate for BBC's "What Not To Wear."
This is quite amazing, given that AMD never did, from its founding until today, get out of the shadow of its larger chip rival, Intel. AMD is Gimbel's to Intel's Macy's, Pepsi to Intel's Coke, Avis to Intel's Hertz.
And the reason for that has much to do with how the two companies are run. Intel is run based on a strong corporate culture, one that rigorously rewards intelligence, and that tries hard to fight its own internal bureaucracies. AMD, on the other hand, is a truly entrepreneurial company, one molded in the image of one man, Jerry Sanders, and whatever his plans were at the moment.
The best historical comparison I can make is to the automotive business, and Walter Chrysler. (The picture of Chrysler at left is taken from the UAW's site for Daimler-Chrysler.) Chrysler called his own autobiography, "The Life of An American Workman," and in fact his early career was blue-collar. But he was, in the end, one of those mythical entrepreneurs who create their own myths.
And his company never did catch up to General Motors. This was partly because GM's founder, Will Durant, was (like Gordon Moore) perfectly happy to give up the reins for someone with more on the ball, in GM's case Alfred P. Sloan.
Sloan was, like Intel's legendary Andy Grove, perfect for his time, but in deference to Grove (one of the great men of our time) Sloan's were different times. Sloan built a set of product lines, he built a corporate culture that could outlast him, and he stuck his more entrepreneurial rivals, Ford and Chrysler, into second-and-third place for all time.
So it was with Jerry. AMD is not a bad little outfit. Its market capitalization is $5.44 billion. You can count on one hand the number of entrepreneurs who have come from nothing to a figure like that, and hung on to their offices that long.
Through it all Jerry was always Jerry. He was notable, quotable, never boring. He was AMD, and I suspect that he always will be.
So raise a glass this weekend to the great Jerry Sanders. The semiconductor industry will never see his like again.