Source URL: http://www.itworld.com/it-management/346559/why-intel-cant-seem-retire-x86

March 04, 2013, 8:56 PM, By Andy Patrizio, ITworld
Image credit: CPUShack Museum

It's rare that a technology lasts multiple decades, but it does happen. Bob Metcalfe invented Ethernet while working at Xerox PARC in the early 1970s and it still runs the Internet; TCP/IP grew out of DARPA's ARPANET work in the same era; and sendmail, used to route SMTP email, traces its roots to 1979. So for all the modernity of technology, we're still using a lot of stuff that's middle-aged in human terms.

The x86 architecture is another aged technology, and it has survived more assassination attempts than Fidel Castro. What makes the attempts on x86 more interesting is that Intel itself is the one that keeps trying to take it out. On at least three occasions, the company had what it thought was the successor to x86, and in all three cases it failed to one degree or another.

While those chips failed, x86 only got stronger in the process. Its fight with ARM may prove the greatest challenge ever, but for now it's still playing out. Let's take a look at those three would-be successors to x86.

iAPX432

It is possible to be too far ahead of your time, as the iAPX432 showed. It was ambitious, extremely complex, and a total failure. Begun in the mid-1970s and introduced in 1981, the iAPX432 was a multi-chip, 32-bit microprocessor billed as a "MicroMainframe" or a "mainframe on a chip." Its very advanced design included garbage collection, built-in fault tolerance and support for object-oriented programming, and it promised multiprocessing in clusters of up to 63 nodes.

And it was a disaster. At the same clock frequency as a 286, the 432 ran at one quarter the speed, and it never found a real market. So what went wrong? Just about everything.

"I think they tried to do too much at the time, trying to integrate the latest and greatest out of universities that didn't lend itself to hardware at the time," says John Culver, owner of the CPUShack Museum and historian on all things CPU.

Martin Reynolds, a research fellow at Gartner, says the 432 grew out of a concept called the semantic gap: programmers had noticed that they got the best code when the chip's instructions reflected the code they were writing. So if the instructions looked like Fortran or COBOL statements, you got the best results.

"That's the idea behind the semantic gap, to make everyone speak the same language," says Reynolds. "They put in very high-level instructions so the gap between code and instructions were very short. That allowed programmers to do things very quickly." The problem is that along came the C language, which blew every other language out of the water and it ran terribly on the 432.

The iAPX432 could have been Intel's Waterloo; all of its top talent was working on the processor. Fortunately, two junior engineers named John Crawford and Pat Gelsinger were working on a side project, turning the 16-bit 80286 into a 32-bit chip. Intel had their work, the 80386, to fall back on, and a good thing, too.

But the iAPX432 wasn't a waste of engineering time. Many of its multitasking and memory management features found their way into the 386 and 486 designs, and Intel would later bring a single-chip version of the 432 to market called the i960.

The i960 found its way into embedded systems and Intel sold it for almost 20 years as an embedded controller. "Most people consider the 960 to be a failed design because you didn't see it in a PC, but it didn't go out of production for 20 years," said Culver.

i860

The i860 was Intel's first big stab at a general-purpose RISC processor (although it could be argued the embedded i960 got there first). It came out in 1989, a few years before Intel released the 486DX2, which featured an internal clock that ran twice as fast as the CPU bus (a 66MHz DX2 sat on a 33MHz bus), a revolution for the time.

(Just to show how things have changed, your CPU clock today runs, on average, 22 to 30 times faster than the bus.)

But Intel ran into a few problems. For starters, the market wasn't sure which side Intel was on. Intel put both processors out there and let the market decide, and the market chose x86, the processor with what was by then a huge library of existing software. The i860 was a whole new design with no software, and it suffered from the chicken-and-egg problem every new architecture faces.

Then there was the fact that the RISC market really heated up in the '90s, with SGI's MIPS processor, DEC's Alpha, HP's PA-RISC and eventually IBM's Power all fighting it out.

In the end, the i860 was undone because the compilers couldn't fully optimize code for it, says Culver. "It had a niche success where code could be done very specifically, code that does one thing and does it very well. It was used in things like high speed image processing, almost DSP-like tasks. That's due to its design. It almost has an on-board graphics processor," he said.

The i860, though, never caught on as the general-purpose CPU Intel intended. It had ideas, like on-chip graphics and signal-processing instructions and single instruction, multiple data (SIMD) operations, that are now standard in GPUs. But it was terrible at context switching, said Culver. It could take 2,000 cycles to switch tasks, an eternity for a processor, and it was also terrible at multitasking.

The floating point registers in the i860 that made it so popular in multimedia would find their way into the x86 in the form of MMX, or MultiMedia Extensions. They would be introduced in the Pentium line in 1996 and are still in use today.
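
To see what SIMD buys, here is a minimal sketch using today's SSE2 intrinsics (a later descendant of MMX; the function below is illustrative, not era-accurate MMX code):

    #include <emmintrin.h>  /* SSE2 intrinsics, a successor to MMX */

    /* Add two arrays of 16-bit integers eight at a time. One SIMD
     * instruction does the work of eight scalar adds, the same idea
     * MMX brought to x86 in 1996, just with wider registers.
     * Assumes n is a multiple of 8 to keep the sketch short. */
    void add_i16(short *dst, const short *a, const short *b, int n)
    {
        for (int i = 0; i < n; i += 8) {
            __m128i va = _mm_loadu_si128((const __m128i *)&a[i]);
            __m128i vb = _mm_loadu_si128((const __m128i *)&b[i]);
            _mm_storeu_si128((__m128i *)&dst[i], _mm_add_epi16(va, vb));
        }
    }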

Itanium

What started out as a "brilliant" idea on paper, as Reynolds puts it, has turned into an embarrassment. "It was going back to simple. We were going to make the internals very fast and use the compiler to set the instructions up for us so they run very quickly. To that point, hardware had to take instructions, pull them apart and reassemble them to run through the processor," says Reynolds.

Intel predicted IA-64 would replace x86, so it didn't bother working on a 64-bit version of x86. DEC, HP and SGI all gave up their RISC efforts in favor of Itanium, and HP was Intel's development partner on the project. Sun promised a port of Solaris. Only IBM stayed out, content with its Power architecture.

And then the wheels came off. Performance was poor, and no one would port their x86 apps to Itanium. DEC and SGI ceased to be effective competitors in the marketplace. Sun stayed with SPARC. The initial Itanium chips sold for as much as $2,000, and support had already begun dropping off by the time the chip finally reached the market in 2001.

"Intel went to software developers and said 'we need code for this.' Code writers said 'we need compilers because nothing we have is optimized for this architecture'," says Culver. He faulted Intel for not making optimal compilers available for developers when it shipped, which stopped the processor's momentum dead.

Reynolds also faults Intel for not providing the compilers developers needed. "Developers rely on the compiler to build the code in such a way it can use all those execution units. Without good compiler support, that's not going to happen. Clock speed doesn't matter as much because you do everything in parallel. Poor branch prediction slows the computer way down because then clock speed becomes most important," he said.
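
The compiler's burden is easy to illustrate. On an in-order, explicitly parallel design like Itanium, independent operations have to be found and scheduled at build time; a sketch in C (illustrative, not actual Itanium code) shows the kind of restructuring involved.

    /* Two ways to sum an array. In sum_serial, each add depends on
     * the previous one, so a wide in-order machine sits idle. In
     * sum_parallel, the two accumulators are independent and the
     * adds can issue together, but only if the compiler (or the
     * programmer) restructures the loop this way up front. */
    double sum_serial(const double *x, int n)
    {
        double s = 0.0;
        for (int i = 0; i < n; i++)
            s += x[i];              /* dependency chain */
        return s;
    }

    double sum_parallel(const double *x, int n)
    {
        double s0 = 0.0, s1 = 0.0;  /* independent chains */
        int i;
        for (i = 0; i + 1 < n; i += 2) {
            s0 += x[i];             /* no dependency between */
            s1 += x[i + 1];         /* these two adds        */
        }
        if (i < n)
            s0 += x[i];             /* leftover when n is odd */
        return s0 + s1;
    }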

Then AMD struck. Former DEC engineer Dirk Meyer, who had helped design the 64-bit Alpha processor, led the design of the Athlon desktop chip and eventually the Opteron server chip, a 64-bit x86 processor with the memory controller on the die that later went dual-core.

AMD went from also-ran to major competitor almost overnight. Suddenly there was a 64-bit x86 desktop chip available for under $200. Opteron was the first 64-bit x86 processor, and it became the first platform used in server consolidation efforts because it broke through the 4GB memory ceiling of 32-bit processors (a 32-bit address can reach at most 2^32 bytes, or 4GB). In the space of two years, AMD went from 0% server market share to 20%.

This forced Intel's hand on x86. After insisting x86 didn't need 64 bits because everyone would move to Itanium, Intel took x86 64-bit. Core processors for desktops and Xeons for servers began gaining more and more features found in the Itanium, such as memory error correction. The Xeon 7500, launched in 2010, added a number of RAS (Reliability, Availability, Serviceability) features previously found only in Itanium.

Then in November 2012, Intel announced plans to merge the Itanium and Xeon architectures, sharing essential on-chip features. Intel said it was doing this to reduce development costs, but with 80 percent of servers running x86 and the 14 percent on RISC/Itanium shrinking fast, according to Reynolds, the future does not look good for Itanium.

Conclusion

In all three cases, the processors were undone by two things: a lack of adequate compilers and the entrenchment of x86. With each passing year, x86 only accumulates more software. Intel has a compiler business, but the de facto standard is Microsoft's Visual Studio, and Microsoft plans Visual Studio around its own releases, not Intel's.

Despite Intel's multiple attempts to retire x86, it may end up that ARM will be its undoing. The processor in your smartphone is getting faster and more powerful with each generation. Initial tests suggest Nvidia's upcoming Tegra 4 processor will be three times faster than the Tegra 3, according to Jim McGregor, president of Tirias Research.

"That's equal to somewhere between a Core i3 and i5. Even when you don't include new chips, if you look at the rapid progression of latest processors from Qualcomm, Nvidia and Apple, you see how quickly those things are ramping up and these are still 32-bit processors," says McGregor.

The worst part for Intel is that it had an answer to ARM: the StrongARM/XScale ARM processor that it sold off to Marvell in 2006. "If they stuck with StrongARM, they'd be leaps and bounds ahead of where they are now with Atom," says McGregor.

That challenge will fall to Intel's next CEO, as current CEO Paul Otellini heads into retirement later this year. He was with Intel the entire time, through the iAPX432, the i860, the Pentium bug fiasco (he was the general manager of the Pentium group at the time) and Itanium. Otellini did a lot to turn around the mess at Intel, including responding to AMD's Athlon challenge, but in the end, he couldn't stem the ARM tide.