Yesterday was a historic day. Everybody around me in the press section was betting on a set-top box or an Intel-built PowerPC. Instead, we witnessed the shortest keynote in my memory; the one where the CEO of Apple and the CEO of Intel actually hugged onstage, if briefly; and the one where it was revealed that the Macs Steve was running the demos on had “Intel Inside”.
Well, here’s what I had said:
There are several more likely explanations than Apple ditching all of its AltiVec optimizations and switching to a plain x86 codebase…
Intel building a hybrid chip that runs both standard x86 and PowerPC/AltiVec code, either natively or with some hardware assist, even if it’s just an endianness switch bit…
About the only thing I’m sure won’t happen is Apple building, and running Mac OS X on, standard x86 motherboards.
I had also written several posts and magazine articles over the past few years arguing against that last possibility.
And nearly all of the more knowledgeable tech writers agreed with me. After all, we all knew the Pentium sucks, right? Or at least that the PowerPC architecture ran rings around it, or toasted it, or whatever other metaphor was used in the Apple commercials some years ago, when it was demonstrably true. They even showed one of those during the keynote.
Now, suddenly, the handed-down wisdom is that the Pentium still sucks – but it sucks much faster than the current G5 does. The mobile varieties suck less power while still sucking faster than the aging G4. The dual-core Pentium sucks doubly well while the dual-core G5s are still in IBM’s and Motorola’s labs.
Yes, AltiVec still beats Intel’s SSE2 easily – it has more registers, it’s faster at the same clock speed, and it has more and better instructions. But at a slower clock speed, and when the rest of the system can’t keep up, that doesn’t matter. Yes, the PowerPC instruction set is more modern, it has more registers, and so forth. Same argument.
So what can I say now? Unfortunately, I have to say it’s time to swallow the bitter pill and go with the flow. There were some dismayed faces around me, but the reactions were much more contained than the ones I got from people who weren’t at the keynote; they didn’t have the soothing Reality Distortion Field (and most important, the convincing demo) to cushion them from the shock. I’ve already been invited to join indignant petitions, to write debunking articles, and so forth.
I’ve used Macs almost exclusively since they came out in 1984. I have occasionally had to use a Windows PC; I even owned one for an interesting year, during which I could multi-boot the BeOS, the Rhapsody beta, and even Windows 98 (for accessing my bank) on the same machine. But in general, I’ve shied away from PCs, and I’ve been proved right, again and again: most of them were shoddily made, flaky, hard-to-configure beasts, and there was the continual hassle of getting stuff built by multiple manufacturers to work together. Nothing “just worked”.
And for the several OSes I ran on that PC, I had to hand-pick a motherboard, a video board, a network board and so on that were on all the support lists; in the end, it turned out there was such a combination – somewhat expensive, even – but it worked well. I even connected it to an Apple monitor. And, while running Rhapsody or BeOS, suddenly the ugly PC box didn’t matter as much.
So, what changed my mind all of a sudden? The fact is that we all discounted the weekend news of the Intel switch as a rumor. Apple switching to Intel was nonsense, and saying the transition would happen from 2006 to 2007 was even greater nonsense. The notorious Osborne effect was cited by many: Osborne pre-announced a much better computer than its current model, nobody bought the current one while waiting for the new model, and the new model never came out.
Now, what everybody was thinking, subconsciously, was that Apple couldn’t just suddenly switch to Intel: there’s no lead time to convert all the new technologies; there’s no way to build a good PowerPC emulator for the Pentium; they’ve invested too much in AltiVec; third-party developers wouldn’t be able to convert and would abandon ship. At the same time, pre-announcing a switch for the future would mean doing an Osborne and cannibalizing the existing line. It turns out that pre-announcing was obviously the better idea – but only once the “secret double life” of Mac OS X and the iApps was revealed.
I mean, Darwin has always had an x86 version, and Apple did admit that parts – or all – of Mac OS X were compiled now and then for x86, just to check whether they had any subtle architectural dependencies in them. “Just in case” they had to switch in the far future. Nobody would have thought they were building their entire software line on both architectures all of the time. Yes, “just in case” is a phrase I heard many times from many people yesterday.
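I obviously don’t know what Apple’s internal checks looked like, but here’s a minimal sketch in plain C (the read_u32 helpers are my own hypothetical names) of the kind of subtle dependency that only an actual x86 build would flush out: code that quietly assumes the CPU’s byte order when pulling a big-endian value out of a buffer.

    /* Hypothetical illustration of a "subtle architectural dependency":
     * reading a big-endian 32-bit value from a file buffer. */
    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    /* Endian-dependent: reinterprets the bytes in host order, so it
     * silently returns a different number on x86 than on PowerPC. */
    static uint32_t read_u32_naive(const unsigned char *buf)
    {
        uint32_t value;
        memcpy(&value, buf, sizeof value);
        return value;
    }

    /* Portable: assembles the value byte by byte, so it gives the same
     * answer regardless of the host CPU's byte order. */
    static uint32_t read_u32_be(const unsigned char *buf)
    {
        return ((uint32_t)buf[0] << 24) | ((uint32_t)buf[1] << 16) |
               ((uint32_t)buf[2] << 8)  |  (uint32_t)buf[3];
    }

    int main(void)
    {
        const unsigned char buf[4] = { 0x00, 0x00, 0x01, 0x00 }; /* 256, big-endian */
        printf("naive: %u  portable: %u\n",
               (unsigned)read_u32_naive(buf), (unsigned)read_u32_be(buf));
        /* On a big-endian PowerPC both print 256; on a little-endian x86
         * the naive version prints 65536. */
        return 0;
    }

Code like the naive version passes every test on PowerPC and never looks suspicious until the day you compile it for the other architecture – which is exactly why compiling “just in case” is worth something.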
Opinions are divided about how long the transition should take; some worry that it will be too long, and that Apple’s sales will fall because of the uncertainty; others worry that it will be too short, and that developers will suddenly publish Intel-only binaries that leave PowerPC owners behind. I don’t believe that will happen. Steve Jobs said PowerPC “will be supported for a long time”, and that should mean at least 5 years, perhaps more. No developer in his or her right mind would ship an Intel-only binary for a long time, if ever; it’s so easy to build a fat binary, and doing it just to save a couple of megabytes or less on program size is so second-millennium…
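For the curious, here’s a minimal sketch of just how easy “fat” is – assuming Apple’s gcc from the new Xcode tools, the same trivial C file builds for both architectures with one extra flag, and lipo will tell you what ended up inside. The file name and messages are mine, for illustration only.

    /* hello.c -- builds as a fat (universal) binary with no source changes:
     *
     *   gcc -arch ppc -arch i386 -o hello hello.c
     *   lipo -info hello      # lists the architectures inside the result
     */
    #include <stdio.h>

    int main(void)
    {
    #if defined(__ppc__)
        printf("Hello from the PowerPC slice\n");
    #elif defined(__i386__)
        printf("Hello from the Intel slice\n");
    #else
        printf("Hello from some other architecture\n");
    #endif
        return 0;
    }

The point being: the per-architecture work happens at compile time, and the only cost of keeping PowerPC users aboard is the extra slice in the file.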
The original argument for compiling stuff for two, or even more, architectures will always hold. As long as Apple keeps an internal PowerPC build, Intel will always fear that IBM could come back and take the platform away again. Or some third manufacturer (AMD?) could do it. In fact, this signals the end of CPU architecture dependence. Apple could well have multiple architectures in multiple product lines, and nobody should care which is in a particular machine. They’re even urging people to stop coding directly for AltiVec or SSE2 and to use their Accelerate framework instead – meaning a Cell version of Mac OS X may already be in their labs.
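To illustrate that advice, here’s a minimal sketch – assuming the Mac OS X 10.4 SDK is installed – of letting Accelerate’s vDSP do the vector work instead of hand-written AltiVec or SSE2 intrinsics; the same call compiles unchanged on both architectures, and the framework picks the right vector unit underneath.

    /* vadd.c -- add two float arrays with Accelerate instead of intrinsics.
     * Build (assumption: standard 10.4 toolchain):
     *   gcc -framework Accelerate -o vadd vadd.c
     */
    #include <stdio.h>
    #include <Accelerate/Accelerate.h>

    int main(void)
    {
        float a[4] = { 1.0f, 2.0f, 3.0f, 4.0f };
        float b[4] = { 10.0f, 20.0f, 30.0f, 40.0f };
        float c[4];

        /* c[i] = a[i] + b[i], stepping through each array with stride 1 */
        vDSP_vadd(a, 1, b, 1, c, 1, 4);

        printf("%g %g %g %g\n", c[0], c[1], c[2], c[3]);
        return 0;
    }

Write it once against the framework, and whether the machine underneath has AltiVec, SSE, or whatever comes next becomes Apple’s problem rather than yours.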
So, it’s a new type of freedom. Freedom of architecture. IBM underperformed, they’re out; at least for now. Intel works better now, they’re in; at least for now. Next year, some other chip may be hot, Mac OS X will be on it, and recompiling will be even easier. We’re free!
Truly these are interesting times. Next: a look at the Transition Kit – as soon as I find out which parts of it aren’t covered by the NDA.