Whew. It’s over. This year’s WWDC was certainly one of the most intense – and one of the most promising – that I can remember.

It started on a somewhat sad note: Steve Jobs was clearly unwell, moving slowly, and his voice was unusually weak. He was onstage for only a few minutes before turning things over to his VPs. Then, towards the end, when he talked about iCloud (obviously something he put a lot of energy into), his energy picked up again. Still, I have a feeling that this might have been Jobs’ last keynote; he’s clearly not going to be around for many more years. However, the general impression I had was that everybody feels Apple is now synchronized enough with his mindset to go on indefinitely without him.

This WWDC was clearly about pointing out future directions. And it was all about software. Many commenters are pointing out this or that added feature in (say) Mail, or in the Finder, or in iOS 5. Others are bemoaning the lack of hardware announcements. Of course there have been the usual comments about Apple copying this or that from Windows. Still others are gleefully pointing out that the iPhone 5 (for instance) has been “delayed” – even though it hasn’t been announced at all; apparently they believe each other’s rumors!

Let’s get the hardware issue out of the way first. As I’ve been saying for a couple of years now, Apple’s heavy investment in Clang, LLVM and connected technologies like LLDB is now paying off. This trio will very soon be Apple’s main developer-tools backend, freeing them from overweight, ancient, license-encumbered stuff like gcc and gdb, and the results are very encouraging. Without going into details (NDA, ahem), suffice it to say that fellow developers seriously agreed with me that the new tools are better – this or that detail notwithstanding – than anything else on the mobile or desktop market today.

Also, Apple is now free to make hardware details irrelevant. When Apple switched to Intel six years ago, I wrote:

Winners:

  • Apple, of course. As I commented below, they’re free (or will be, in a year) of the CPU-architecture-as-a-religion meme. They get a literally cool CPU/chipset for their PowerBooks; although I suppose they won’t use that name in the future; how about IBook? ;) They get dual-core CPUs right now, and a 64-bit version in the future.

And this is still true. At that time, too, some people saw Apple “imitating” the Wintel machines by adopting Intel CPUs as a negative thing (or even as a positive, depending on their bias). Now, with Clang/LLVM becoming Apple’s mainstream tools, they could switch CPUs anytime without users noticing; the new Intel-based Macs were still normal Macs, and normal users didn’t care which architecture they ran on. And indeed, lately rumors have abounded about ARM-based MacBooks. But, as I wrote at WWDC 2005:

…it’s a new type of freedom. Freedom of architecture. IBM underperformed, they’re out; at least for now. Intel works better now, they’re in; at least for now. Next year, some other chip may be hot, Mac OS X will be on it, and recompiling will be even easier. We’re free!
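
To make that freedom concrete, here’s a minimal sketch (assuming you have clang and the relevant SDKs installed): the same trivial C source builds unchanged for different CPU architectures; only the -arch flag passed to the compiler driver changes.

```c
/* hello.c – the same source, recompiled for different architectures.
 * A minimal sketch; the exact -arch values depend on which SDKs you have:
 *
 *   clang -arch x86_64 hello.c -o hello_intel
 *   clang -arch i386   hello.c -o hello_intel32
 *
 * Cross-compiling for ARM works the same way once an appropriate SDK
 * is pointed to via -isysroot; the source itself doesn't change.
 */
#include <stdio.h>

int main(void) {
    printf("Hello from whichever CPU this was compiled for!\n");
    return 0;
}
```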

For the normal user, hardware specs aren’t that important anymore beyond a certain threshold – if they’re sufficient for the job, the details are unimportant. As the old joke goes:

A lady stands in front of an enclosure in the London zoo and gestures towards one of the hippopotami, asking a passing zookeeper:

“Please, can you tell me if that hippopotamus is male or female?”

“I’m sorry to say that this information would be of interest only to another hippopotamus…”

As Jobs said in the keynote, now it’s all about “devices”. Desktops, laptops, iPads, iPhones – all are equal devices in the iCloud. Few people think of their iPad/iPhone as a computer; the innards are of interest only to those of us who have to develop software or hardware to (ahem) “mate” with those devices. Will the next iPhone or laptop use Apple’s A5 chip, or will there be an A6? My mom doesn’t care, and yours shouldn’t – unless she’s a fellow developer. Not even if she’s a stock analyst!

Next: software directions.