Solipsism Gradient

Rainer Brockerhoff’s blog

Browsing Posts tagged CPU

Wow, that was fast

Slightly over two years ago, shortly after the Intel switch announcement, I wrote:

…the current installed base is something over 30 million PowerPC Macs (or even more, depending on your sources). By the end of 2007, Intel Macs will be perhaps 15% of that. It will take at least 5 years, probably more, for Intel Macs to surpass the PowerPC Mac installed base.

I’m pleased to see that Mac sales were so good that this milestone – equal number of PowerPC and Intel Macs – will apparently be reached before the end of 2007. At that time, some people feared developers would start releasing Intel-only versions of their software soon; as far as I know, except for natural exceptions like Parallels and VMWare, this hasn’t happened.

And, while recent news indicates that the upcoming Leopard’s hardware requirements have been upped a little, most recent G4s and all G5s will still run it well. Older G4s will, I suppose, be disqualified more by video card restrictions than by CPU speed as such.

State of the iPhone

So, half a dozen software tools are now out there that unlock the iPhone in various ways. In the simpler cases, they allow the installation of third-party applications and/or tweak various details. In the more complex cases, they mess around with the phone/SIM settings to allow the iPhone to be used with other providers’ SIM cards.

As I wrote before, Apple has apparently allowed this to happen by not implementing strict security measures. Now that the various unlocking techniques have stabilized, Apple has announced that an upcoming software update might cause “modified” iPhones to become “permanently inoperable”. Just a few days later, the iPhone update to version 1.1.1 came out; it featured the same warning in bold on its installation screen; and it did, indeed, cause some modified iPhones to lock up – the new vernacular is “bricked”, which I think is somewhat of an exaggeration. Furthermore, the new software seems as tamper-resistant as the iPod touch software, indicating that Apple has checked out current unlocking techniques and implemented harder locks.

So far, all that was to be expected. What was, to me, unexpected was the reaction of some sectors of the press and of some users – mostly the same people who opposed the iPhone price cut, it seems.

Legally, it seems Apple is in the clear. The warranty and license agreement clearly say that any such tampering is at the phone owner’s risk. Surprisingly, some people seem to feel “entitled” to get warranty support even if they completely disobeyed the license! (Just as they felt “entitled” to have the price kept constant for a long period after they bought it, I suppose.)

The core of the argument seems to be “I paid for the machine, therefore I have the right to do whatever I want with it…” (I completely agree so far) “…and Apple has the obligation to give me full support, warranty and updates no matter how I mess with it!” Now here is where we part company. Sure, I suppose current consumer protection legislation may sometimes be interpreted that way (note I’m not a lawyer and am less familiar with US legislation than with Brazilian law); but you surely can’t pretend that Apple is a public utility or a non-profit charity.

Even from the technical standpoint, these expectations are unreasonable; allow me to explain this in more detail. The problem is one of “state”, in this case defined as “a unique configuration of information in a program or machine”.

In the first computers, the state of the computer was completely predictable when it was turned on: if it had Core memory, it was in essentially the same state it had when it was last turned off, and if the computer had reasonable power supply sequencing, you could just press the start button and continue. For more complex machines this was too hard to do, and the manufacturers declared that the machine was in an undefined/unreliable state after power-on, and that therefore you had to reload the software. For newer machines with semiconductor memory, everything was of course lost during power-off, and software reloading became equally necessary. To do so, you had to enter a short program in machine language using the front-panel toggle switches; this program, in turn, would read the actual software you wanted to run from a peripheral. This was the direct consequence of the machine coming up in a “null state”.

It wasn’t long before people thought of several ingenious ways to make this process more convenient. On the IBM 1130, for instance, the hardware was set to read a special card from the punched-card reader, interpret the 80 columns (12 holes each) as 80 16-bit instructions, and execute them. The most commonly used of these cards simply repeated the process with the built-in cartridge disk drive, reading the first sector on the cartridge and executing it. Later on, the falling cost of ROM led to the boot software simply being built into the machine – the Apple II had a complete BASIC interpreter built-in, for instance. The apex of this evolution was the original Mac 128, where most of the system software was in the boot ROM – the system disk simply contained additions and patches. (The QI900 microcomputer I helped design in the ’80s had all system software, with windowing, multitasking and debugging, built into its ROM.) Here we had a well-defined “state” when the machine came up – it would execute a well-known program, and do predictable things, until external software came into play.

In the ’90s the limitations of this became apparent. OSes grew to a size beyond what could be stored in ROM, and no single Boot ROM could do justice to all models and peripherals (*cough* BIOS). Flash memory came up, the built-in software was renamed to “firmware”, and updates to that became commonplace. It was easy to “brick” a system if power went out or if you otherwise interrupted a firmware update before it was complete. In that event, a motherboard swap was usually the only solution, because the interruption left the firmware in a partial, nonworking “state”.

Consider now the iPhone. Its entire system (OS X 1.x) is built into firmware, mostly in a compressed state. This is expanded and run by the main ARM processor, obeying a built-in boot ROM. Supposedly, there are at least two more processors, taking care of network communications and of the cellular radio; each of these has its own boot ROM, and the radio processor has separate flash memory to hold state information regarding the SIM card, cellular system activation and so forth. One of these processors no doubt controls the USB interface to allow the main processor’s flash memory to be reloaded externally. Furthermore, every SIM card also has flash memory on it, containing the IMSI number, network identity, encryption keys and so forth, bringing one more source of complexity to the process.

In other words, you have a complex system of at least 3 processors interacting, each one with a boot ROM, two with flash memory containing state information. Powering up such a beast is a complex dance: each one wakes up, tests its peripherals, checks its own state, then tries to talk to the others, and then they communicate to bring the entire system into a working state. Furthermore, the necessities of the cellphone system and of testing out such a complex piece of hardware mean that the iPhone must decide, on each power-up, which of several states it’s in: factory testing, just out of the box, activated, reloading the main firmware, working, “plane” mode, and so forth. This is usually done by writing special values to reserved sections of the various flash memories, and by making sure they are always consistent with each other through checksumming and other technical arcana. Should they be found inconsistent, the system will probably try to regress to a simpler state and start over there, in the extreme throwing up its metaphorical hands and pleading to be returned to the factory. Ideally, firmware writers strive to make it impossible to “brick” a device unless an actual hardware defect occurs; in practice, it’s rarely possible to envision all possible combinations of what could happen, and too few designers assume that a malicious agency is trying to trip them up at all times.
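
To make the “state block plus checksum” idea concrete, here’s a minimal sketch in C of how a reserved flash section might be validated at power-up. Everything in it – the field layout, the state values, the magic number and the trivial additive checksum – is an invented illustration, not Apple’s actual format.

    #include <stdio.h>
    #include <stdint.h>
    #include <stdbool.h>

    /* Hypothetical layout of a reserved "state block" in flash.
       Field names and values are illustrative only. */
    typedef struct {
        uint32_t magic;        /* identifies a valid block          */
        uint32_t deviceState;  /* factory test, activated, etc.     */
        uint32_t bootCount;    /* incremented on every power-up     */
        uint32_t checksum;     /* simple sum over the fields above  */
    } StateBlock;

    enum { kStateFactoryTest = 1, kStateOutOfBox, kStateActivated };

    static uint32_t ComputeChecksum(const StateBlock *b)
    {
        /* Sum every field except the checksum itself. */
        return b->magic + b->deviceState + b->bootCount;
    }

    /* True if the block looks internally consistent; otherwise the
       firmware would regress to a simpler state and start over. */
    static bool StateBlockIsValid(const StateBlock *b)
    {
        return b->magic == 0x53544154u            /* 'STAT' */
            && b->checksum == ComputeChecksum(b)
            && b->deviceState >= kStateFactoryTest
            && b->deviceState <= kStateActivated;
    }

    int main(void)
    {
        StateBlock block = { 0x53544154u, kStateActivated, 42, 0 };
        block.checksum = ComputeChecksum(&block);
        printf("state block valid: %s\n",
               StateBlockIsValid(&block) ? "yes" : "no");
        return 0;
    }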

So, what do these various hacks do to unlock the iPhone? They rely upon bugs in the communications software, firstly, to make the system fall back into a state where it pleads for an external agency to reload its main firmware; cleverly substituted instructions then make it do new things. After several, progressively more complex, phases of this, new applications can be installed. Up to this point, only the main flash memory has been affected and installing a new software update will just bring the system back to the standard state. Now, one of the new applications may try to mess with the radio firmware; it will clear or set regions of it to bring the radio processor’s state out of step with reality, or even write bogus activation data into it.

Now, of course, the system’s state has been moved completely out of the state space envisioned by its designers. When it powers up, the state is sufficiently consistent – the various checksums check out OK, for instance – for the various processors to confidently start working. However, a few actual values are different from the intended ones – enough to let a different SIM card work, say. Now, if the hackers had the actual source code and documentation available, all this could be done in a reliable way. But this not being the case, they had to work by testing changes in various places and observing what happened, clearly not an optimal process.

Consider, now, the software update process. It assumes that the iPhone’s various processors and firmware(s) are in one of the known states – indeed, this is required for the complex cooperation required for uploading new software. If this cooperation is disrupted, the update may not begin – leading to an error message – or, worse, it may begin but not conclude properly. At this point, one or more of the iPhone’s processors may try to enter a recovery routine, either wiping the flash memories or reinitializing them to a known state. No doubt this will be successful in most cases, and the new update will then be installable on a second attempt. However, the recovery may fail – since the exact circumstances couldn’t be foreseen – or it may be assuming false preconditions (like a valid AT&T SIM card being present). The system will probably try to recover at successively lower states until falling back to the “can’t think of anything more, take me back to the factory” mode; or it may even lock up and “brick”.

Should Apple’s firmware programmers have tried to prevent this from happening? Well, up to a point they certainly did, as many problems other than hackers can cause such errors – electrical noise, a badly seated or marginally defective SIM card, or a low battery, for instance. The system has to fail gracefully. However, it’s certainly not reasonable to expect them to specifically recognize and work around (or even tolerate!) the various hacks; after all, Apple’s contract with AT&T certainly requires them to show due diligence in preventing that.

Firmware for such a complex system evolves continuously. The new 1.1.1 iPhone software seems to do many things differently from the original version, even though much of the UI is the same; the same goes for the iPod touch software. Neither has been hacked as I write this. Did they now put TrustZone into operation? No idea; time will tell. My hunch is that Apple will eventually come out with an SDK for third-party applications; the question is when. Perhaps after Leopard, perhaps at the 2008 WWDC. Does Apple need AT&T, or any partner carrier, at all? Maybe for now they do, and the unlocking wrangle will continue. In the long run, Apple will be better off with a universal phone that will work anywhere; possibly we’ll have to wait for the current generation of carriers to die before this happens. Interesting times.

One of the salient points repeated at the WWDC keynote was Leopard’s support for “64 bits top to bottom”. However, a close peek at the slide shown this year showed a subtle difference from last year’s – the word “Carbon” was missing. Of course a storm of confusion soon ensued, with the usual wailing and gnashing of teeth from some quarters and polite shrugging from others. Apple stock fell and rose again, some developers professed bliss while others threatened to leave the platform, non-developers wrote learned analyses about obscure technical points, not to speak of reports of raining frogs or even an unconfirmed Elvis sighting in a Moscone restroom. Allow me to try to explain it all (well, Elvis excepted).

First of all, there are a few implications in moving an operating system to 64 bits. I hear that Windows Vista comes in distinct 32-bit and 64-bit versions and that the latter is able to run 32-bit applications (with some restrictions) inside a compatibility box. In contrast, Leopard uses Apple’s experience with architectural migrations to support 32-bit and 64-bit applications natively on both PowerPC and x86 architectures – not so easy in the second case, but necessary since nearly all currently shipping Macs use Intel’s Core 2 Duo, which is 64-bit capable.

For this, Apple took advantage of Mach-O’s support for “fat binaries” – in this instance called “obese”. Obese binaries contain four different executables: PowerPC 32, PowerPC 64, x86 32 and x86 64. When running one of these applications, the system selects the best supported architecture and links the application to the corresponding (and equally obese) system libraries.
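
For the curious, here’s a rough idea in C of what such a file looks like from the outside: a big-endian fat header followed by one record per architecture slice, which is what tools like lipo (and the loader, in its own way) walk to find the best match for the running machine. This is a minimal sketch that just lists the slices; it makes no claim to be how dyld actually chooses.

    /* Minimal sketch: list the architecture slices inside a fat
       ("obese") Mach-O file. The fat header is stored big-endian. */
    #include <stdio.h>
    #include <mach-o/fat.h>
    #include <libkern/OSByteOrder.h>

    int main(int argc, char *argv[])
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <binary>\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        struct fat_header fh;
        if (fread(&fh, sizeof fh, 1, f) != 1 ||
            OSSwapBigToHostInt32(fh.magic) != FAT_MAGIC) {
            printf("not a fat binary\n");
            fclose(f);
            return 0;
        }
        uint32_t count = OSSwapBigToHostInt32(fh.nfat_arch);
        for (uint32_t i = 0; i < count; i++) {
            struct fat_arch arch;
            if (fread(&arch, sizeof arch, 1, f) != 1) break;
            cpu_type_t cpu = (cpu_type_t)OSSwapBigToHostInt32((uint32_t)arch.cputype);
            const char *name = "unknown";
            if (cpu == CPU_TYPE_POWERPC)        name = "ppc";
            else if (cpu == CPU_TYPE_POWERPC64) name = "ppc64";
            else if (cpu == CPU_TYPE_I386)      name = "i386";
            else if (cpu == CPU_TYPE_X86_64)    name = "x86_64";
            printf("slice %u: %-7s %u bytes\n", i, name,
                   OSSwapBigToHostInt32(arch.size));
        }
        fclose(f);
        return 0;
    }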

Enter the Carbon vs. Cocoa question. Cocoa APIs are derived from NeXT’s software and are usually called from Objective-C. Carbon APIs, callable equally from C, C++ or Objective-C, were first introduced in Mac OS 8.5 or thereabouts and were, themselves, a much-needed simplification of the “Classic” Mac APIs. Carbon was thereafter positioned as the way to port existing applications to Mac OS X, while Cocoa was supposed to be the right way to write new applications for the new system. No doubt the old NeXTies inside Apple pressed for Carbon to be excluded from the start, but Microsoft, Adobe and Macromedia (to quote just the big companies) didn’t want to recode everything on short notice.

A necessary sidenote: the exact definition of “Carbon” is surprisingly hard to pin down, even among experienced developers. Here’s my own (although I’ve never written a Carbon app myself). There are Carbon APIs and Carbon applications. A Carbon application, for me, uses the Carbon Event Model – calling Carbon APIs to get events from the system. Until recently, a Carbon application would also, necessarily, use Carbon windows and the GUI widgets for those, mostly contained in the HIToolbox framework. Starting with Tiger it’s possible for Carbon applications to use Cocoa windows containing Cocoa GUI widgets, with some contortions of course. Other Carbon APIs – like the File Manager, or QuickTime – can be called equally well from Carbon or Cocoa applications.
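
To make the “Carbon Event Model” part less abstract, here’s roughly what the skeleton of such an application looks like: plain C callbacks registered with the HIToolbox, and an explicit call into the Carbon event loop (a Cocoa application, by contrast, gets its events dispatched to Objective-C objects by NSApplication). Just a bare-bones sketch, not a complete application:

    #include <Carbon/Carbon.h>

    /* Carbon-style event callback: plain C, registered explicitly. */
    static OSStatus MyCommandHandler(EventHandlerCallRef nextHandler,
                                     EventRef event, void *userData)
    {
        HICommand command;
        GetEventParameter(event, kEventParamDirectObject, typeHICommand,
                          NULL, sizeof(command), NULL, &command);
        if (command.commandID == kHICommandQuit) {
            QuitApplicationEventLoop();
            return noErr;
        }
        return eventNotHandledErr;   /* let the default handler have it */
    }

    int main(void)
    {
        /* Ask to be called for "process command" events (menus, buttons...). */
        EventTypeSpec spec = { kEventClassCommand, kEventCommandProcess };
        InstallApplicationEventHandler(NewEventHandlerUPP(MyCommandHandler),
                                       1, &spec, NULL, NULL);

        /* Window and menu setup would go here in a real application. */

        RunApplicationEventLoop();   /* the Carbon event loop */
        return 0;
    }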

Here’s where things started going awry, from the standpoint of established or multiplatform developers. Apple has always been of several minds about Carbon policy – it was often dismissed as a temporary “transition” technology, while people who interfaced with those developers had to reassure them that Carbon was not going away anytime soon and was not a second-class citizen. Porting software from the Classic Mac OS to Carbon wasn’t always easy; some larger applications took over a year. At the same time, it was seen as being much easier than tossing the whole codebase and recoding in Objective-C/Cocoa.

Now, a few years after Mac OS X was introduced, Microsoft, Adobe and so forth had a substantial investment in maintaining parallel codebases for their Carbon applications and, understandably, began dragging their feet about converting to Cocoa any time soon, or even at all. Due to pressure from these developers the Carbon GUI APIs began to incorporate new elements until then present only in Cocoa, and to all appearances Carbon and Cocoa were now positioned as equal and parallel APIs. In secret, of course, Apple hoped that “those people” would sooner or later see the light and begin doing their next x.0 version in Cocoa. In turn, “those people” harbored serious doubts about Objective-C (seeing it as a dead language with an unreadable syntax) and secretly hoped Apple would “recode Cocoa in C++”. Here’s a significant e-mail from an Apple engineer to the carbon-dev list:

No one reading this list should be under any illusions about Apple’s use of Objective C. Apple really likes Objective C. There are a lot of third-party developers who are using Objective C to program for Mac OS X and who really like it. Apple is not going to stop using Objective C. I’m not making a value judgement here, just stating a simple reality that everyone needs to understand. Do not think that someday Apple will “wake up” and realize that it would be better to recast all of our APIs in C++. That’s not going to happen.

So then came the PowerPC/Intel transition. Cocoa developers already were using Xcode, while many Carbon developers still were using the defunct Metrowerks CodeWarrior; transitioning large codebases to Xcode proved to be cumbersome. Still, people threw in more person-years to bring their apps up to the new standard. Then, at last year’s WWDC, Apple announced the migration to 64 bits, taking the opportunity to remove all legacy, obsolete or deprecated APIs from the new frameworks. Some Cocoa APIs were removed but, again, Carbon developers had more work to do. So once again, more person-years of work were invested.

It now seems that someone in Apple engineering management decided that they couldn’t afford to keep supporting two separate-but-equal APIs anymore, and the “transition” policy was revived regarding 64-bit Carbon applications. From what transpired during WWDC I deduce that some more of the Carbon APIs were taken off the “supported for 64-bit” list, most notably the part of the HIToolbox that concerns Carbon windows and GUI widgets. Therefore, 64-bit Carbon applications would seem to be either not supported at all, or supported only in a transition mode that used Cocoa windows and GUI widgets.
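
In source-code terms, that means a Carbon codebase aiming at 64 bits would have to fence off its HIToolbox window code and route the 64-bit build through Cocoa windows instead. Here’s a hypothetical fragment of what that guard might look like (the Cocoa side is omitted; the window geometry and function name are made up for the example):

    #include <Carbon/Carbon.h>

    /* Hypothetical: show a window, Carbon-style, only in 32-bit builds. */
    static void ShowInfoWindow(void)
    {
    #if !__LP64__
        /* 32-bit build: the classic HIToolbox path still applies. */
        WindowRef window;
        Rect bounds = { 100, 100, 300, 500 };  /* top, left, bottom, right */
        CreateNewWindow(kDocumentWindowClass,
                        kWindowStandardDocumentAttributes,
                        &bounds, &window);
        ShowWindow(window);
    #else
        /* 64-bit build: create an NSWindow via Cocoa instead (omitted),
           since Carbon windows are off the 64-bit list. */
    #endif
    }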

Naturally, Carbon developers were very bitter about this, while some Cocoa developers were asking if their 64-bit Cocoa apps would be able to call normal Carbon APIs (the answer is yes). So far, the most complete explanation I could find is this one (from the same engineer):

Fundamentally, Apple engineering is focused on Cocoa much more than Carbon, and Apple’s engineering management made the decision to un-support 64-bit Carbon to emphasize that fact.

So there you have it. Summary: 32-bit Carbon stays where it is and works fine until further notice – I don’t think it’ll be “deprecated” any time soon; the Leopard Finder itself is still a 32-bit Carbon application! That probably won’t happen until Mac OS X 10.6 (LOLCAT, or whatever they’ll call it) comes out, which may take 3-4 years at least, and probably not even then. But 64-bit pure-Carbon apps may be unsupported, or even not run properly, when Leopard comes out in October. Cocoa isn’t going away, and is the future – has been the future since Mac OS X 10.0 came out, in fact. There is a migration path, though: use the Cocoa GUI, then later convert to a Cocoa app. People who have invested a lot of time in Carbon feel really bad about this, and I agree Apple mishandled this badly from a PR standpoint. On the other hand, investing a lot of time in Carbon is now revealed to have been a throw-good-money-after-bad move; some people say “I told you so”.

The final question is, why aren’t Microsoft and Adobe screaming their heads off about this? While I was wondering about this, I realized that, for normal Mac users, Microsoft Office really doesn’t handle data sets big enough to need 64 bits; they can stay on 32 bits as long as that option exists. As for Adobe, at first glance, Photoshop at the very least is just begging for 64 bits… really? Here’s what one Adobe engineer says:

I could have spent this whole cycle moving us to 64 bit rather than working on startup time, but would that give you more of what you want? Add 20 seconds to the startup time you are seeing for the beta for all versions/platforms of Photoshop and compare the value of that version to one where the histogram would be 10% faster on 64 bit machines (and most of the rest of Photoshop being 5% slower). It is true, there are some things, like histogram, that would be 10% faster, I wrote the code to verify this. But, the rest of the product would have been slower without a few people spending the whole cycle going over all of the slow parts and bringing them back to where they were on 32 bit. Most operations on a 64 bit application like Photoshop are actually slower by a small amount if time isn’t spent optimizing them.

Read the excellent comments on that post, especially the more recent ones, for much more discussion of the details on the Photoshop side – I suppose many of those would apply equally to other large Adobe/Macromedia apps.

So there you have it: the big guys don’t need to move up for now. The small guys are mostly in Cocoa already. Unfortunately, the intermediate cases have fallen through the cracks for now – think multiplatform CAD software, for instance. It’d be very sad to see them leave the platform in a huff about this; I sincerely hope Apple will contact all of them privately and smooth things over somehow, though I can’t really imagine how. Maybe they’ll even re-add support in October, now that the point has been made.

Update: fixed a misconception about the PowerPC->Intel migration; see explanation above.

Re: iPhone updates

An Italian newspaper article quotes Dario Bucci of Intel Italia as saying that the iPhone uses Marvell‘s Xscale-derived CPUs. (Curiously enough, the current version of this article doesn’t show this part of the interview anymore…)

Well, who cares? Indeed, by now I agree with HiveLogic that the CPU is irrelevant. Marvell bought Xscale from Intel about 6 months ago. Xscale, in turn, uses some of the ubiquitous ARM cores that were rumored to be the iPhone’s CPU (as well as, probably, powering some other chips in there). But as the HiveLogic article says:

And now the iPhone uses yet another CPU, and we should still expect OS X to feel like OS X. Apple seems to be pushing the idea that the CPU shouldn’t matter to the user of an Apple product. And I think that’s why Apple isn’t talking about the iPhone’s CPU.

Right. With Leopard, Apple’s development tools support building apps for any combination of 32-bit, 64-bit, PowerPC (big-endian) or Intel (little-endian) CPUs. Since the gcc compiler supports ARMs and many other architectures, and the major stumbling block (the endian issue) has been solved, by now OS X can be safely assumed to run on nearly any modern CPU.
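
The “endian issue”, in a nutshell, is that byte order must never leak from memory into file formats or network packets; as long as code swaps explicitly at that boundary, the same source behaves identically on big-endian PowerPC and little-endian Intel or ARM. Here’s a tiny sketch using CoreFoundation’s byte-swapping calls – the two-field “file header” is invented just for the example:

    #include <stdio.h>
    #include <stdint.h>
    #include <CoreFoundation/CFByteOrder.h>

    /* Hypothetical on-disk record, defined to be big-endian in the file
       no matter which CPU wrote it. */
    typedef struct {
        uint32_t version;
        uint32_t length;
    } FileHeader;

    int main(void)
    {
        /* Writing: convert host order to big-endian before it hits the disk. */
        FileHeader out;
        out.version = CFSwapInt32HostToBig(2);
        out.length  = CFSwapInt32HostToBig(128);

        /* Reading: convert big-endian back to host order after loading it.
           On PowerPC both swaps are no-ops; on Intel they reverse the bytes. */
        uint32_t version = CFSwapInt32BigToHost(out.version);
        uint32_t length  = CFSwapInt32BigToHost(out.length);

        printf("version %u, length %u\n",
               (unsigned)version, (unsigned)length);
        return 0;
    }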

Now, in the past, I’ve been as prone as any to argue endlessly about the superiority of the PowerPC architecture (or of the 68K architecture for that matter) over x86, but for most practical purposes I have to admit all that has become a non-issue – especially for a device like the iPhone. I for one welcome our new <your architecture here> overlords, and that’s that.

Happenings

Things seem to be moving along well. Here are some random observations.

As I thought, Time Machine is prompting manufacturers to begin offering affordable RAID backup systems. First out there was the D-Link DNS-323, which has 2 SATA drives and a gigabit network interface, and now there’s the Iomega UltraMax (no URL yet), with 2 SATA drives and a USB/FireWire interface. At WWDC I actually looked for such a product but couldn’t find anything suitable; they were all too large, too expensive or both. Time Machine without RAID means putting all your backups into one basket, so expect lots of better and less expensive backup drives to show up before or at the next Macworld Expo in January 2007.

The 64-bit iMacs are just out, as well as speed-bumped Mac minis. The timing on this is significant. There’s the mysterious “showtime” event announced for Sept. 12, the initial day of Apple Expo Paris – and also the final day of IBC Amsterdam, the “content creation” conference. At the end of the month Apple will be present at Photokina. Of course this means that the upgrades are not significant enough to be presented at these events; rumors are flying about what media-related products will be announced. I suppose that movie sales over the iTunes Music Store are pretty much a given, although that’d be a pretty unexciting, US-centric thing by itself.

I suppose that the putative iPhone might be counted under “media”, as everybody seems to expect a phone-capable iPod instead of a music-capable cellphone under that name. While I’m a happy owner of a 3rd-gen iPod – by coincidence bought in Paris shortly after Steve Jobs’ last Apple Expo keynote 3 years ago – I can’t see why I would want a cellphone built into it. Or a PDA; I bought the original Palm Pilot when it came out and couldn’t get used to that either.

Should Apple bring out a product that might be classified as a “phone”, as a stockholder I seriously hope it’s not a me-too cellphone/music player/PDA. Just look at the restrictions that have so far hampered world-wide deployment of the iTMS. Combine that with the hundreds of technical and regulatory circumstances that govern cellphones in the various countries, and it’s a recipe for disaster; just check out what happened to that Motorola/Apple phone. So, hopefully, Apple will bring out something pioneering and generally usable – perhaps involving new wireless and VoIP technologies.

The iMac announcement also has deeper meaning. With the new 64-bit chips supposedly running faster at the same price point, it’s mostly a question of chip availability to convert the whole line. I seriously expect all Macs to be 64-bit capable in January. Converting the iMacs at this time also means that developers will have extra time to port their apps, if necessary. When Leopard comes out sometime between January and March, a surprising number of applications will be ready for it.

Update: Apple has patented a “multi-functional hand-held device” that purports to:

… include two or more of the following device functionalities: PDA, cell phone, music player, video player, game player, digital camera, handtop, Internet terminal [and/or] GPS or remote control.

The patent covers:

Touch Screen, Touch Sensitive Housing, Display Actuator, Multi-Functionality, Form Factor, One-Handed vs. Two-Handed Operation, Footprint/Size, Full Screen Display, Limited Number of Mechanical Actuators, Adaptability, GUI Based on Functionality, Switching Between Devices (GUI), Operating at Least Two Functionalities Simultaneously, Configurable GUI (User Preferences), Input Devices, Pressure or Force Sensing Devices, Force Sensitive Housing, Motion Actuated Input Device, Mechanical Actuators, Microphone, Image Sensor, Touch Gestures, 3-D Spatial Gestures, Perform Action Based on Multiple Inputs, Differentiating Between Light and Hard Touches, Example of a New Touch Vocabulary, Speaker, Audio/Tactile Feedback Devices, Communication Devices (wired & wireless) and Change UI Based on Received Communication Signals.

…all that’s missing is a biological signal sensor and a recreational pharmaceutical dispensing device, to make this the functional equivalent of the “Joymaker” Frederik Pohl wrote about in his 1965 book The Age of the Pussyfoot. I wonder if that counts as “prior art”…?

WWDC thoughts

WWDC 2006 had two main topics: the new Mac Pros and Leopard. Regarding the Mac Pros, they seem quite competitive and well-built. I’m very content with my current iMac G5 (which I bought at last year’s WWDC), so I haven’t looked at them closely; desk space is important to me. By the way, it seems that Apple Brazil has just released the local price of the standard configuration, and it comes out to US$5400. Ouch.

Regarding Leopard, there’s a nice write-up at Wikipedia, so I won’t try to enumerate everything here.

All in all, I can say this was one of my best WWDCs yet. As I’ve said before, the timing was excellent. Apple has obviously made the most of the (unusual) June-to-August delay and from the developer standpoint all important stuff was in place. Most of the Leopard APIs seem to be well-defined, reasonably stable, and of course the tools are all in place. Xcode 3.0 and the new developer tools “just work”. In one Q&A session Chris Espinosa, asked about the stability of the tools – whether developers should rely on them for new products or wait for the GM release – answered “we all use them daily for building Leopard, so they have to work well!”

So, this is important news for developers. Before, existing tools were used to build a new system and the next generation of tools came out with (or after) the GM release. Now, Apple has obviously been working on the new stuff iteratively; first versions of the new frameworks were used to build first versions of the new tools, then those were in turn re-used to work on the new frameworks. Certainly this has always happened to some extent, but I believe that this synergy between tools and frameworks has now hit an important inflection point.

Apple has clearly been working towards this for years. Mac OS X’s frameworks are now 4-way universal, containing binaries for PowerPC 32, PowerPC 64, Intel 32 and Intel 64 bits. Therefore, applications can now be built for all 4 environments, and all are fully supported by the new developer tools. This is a well-known capability of the Mach-O executable format by the way, not a new thing; NeXT applications were also distributed for several architectures, and the Virtual PC 7 executable has binaries optimized for 5 (!) different PowerPC variants.

Framework synergy is a marvelous thing. Apple has just released a short movie showing off CoreAnimation. The “city towers” effect in the second half can now be rendered in real time on a Mac Pro – something that even two years ago would have been impossible. Looking very closely you can see that some of those squares are actually playing movies! But only a developer can fully appreciate the other important aspect: the code to generate this movie has shrunk down to less than 10% of what would have been necessary in Tiger.

Let’s talk somewhat vaguely (NDA mumble mumble) about how synergy might have made this happen in the CoreAnimation case. The Mac Pro has a faster, 64-bit CPU architecture, as well as faster video cards. 64-bit processes can use extra CPU registers and the new vector operations, which seem to, finally, have equivalent power to the PowerPC G5’s AltiVec. The LLVM compiler is now used in the OpenGL stack to add a significant speedup (and this even for low-end machines!). Add in Quartz optimizations. Add in easy Cocoa support for all of these. Add in runtime efficiencies introduced by Objective-C 2.0. Add in new debugging and optimization made possible by implementing DTrace. Add in the ease of programming all this with less and more reliable code. Add in some extra stuff I can’t talk about… it adds up! And comparable gains are visible all through the new system.

At the risk of repeating myself, all this ho-hum talk about Leopard just being Spaces, Time Machine and some UI tweaks to iChat and Mail is so wrong. Granted, Steve Jobs had to show some non-developer news at the keynote, given the wildly unrealistic expectations. But it really was a Developer Preview. It was released at the right time and to the right people in order to make sure that, whenever the Leopard GM comes out (my guess would be in January at Macworld Expo), some hundred cool new applications will be available on the same day. And they’ll necessarily be Leopard-only; expect Leopard to be adopted by a significant portion of the user base in a very short timeframe.

Now to my own projects. I’d really love for the upcoming XRay II to be Leopard-only, but that would delay the release too much, and it doesn’t really need 64-bit capabilities. However, I’ll really need some fixes introduced in the last Tiger updates, so 10.4.7 will be the minimum supported version, which should be no hardship, as I can’t imagine anyone voluntarily staying with older Tiger versions. That said, some of the stuff I’ve seen at WWDC has completely changed my plans regarding certain features and capabilities, so I’ll opt for implementing things in a way that might be a little constrained under Tiger but will really be much better under Leopard.

RBSplitView has been a marvelous experience for me. It’s been very widely adopted and even the Cocoa team has promised to take a look at it (no promises, of course). And it was very gratifying to be instantly recognized by many famous developers – of course my XRay II/RBSplitView t-shirt was intended to make this very easy! I’ve received lots of positive feedback and I’m working hard on implementing my own fixes and all suggestions. Hopefully I’ll have version 1.1.4 out in a very short time. This should be universal and fully compatible with Xcode 2.4. A 64-bit version compatible with the new Interface Builder will, unfortunately, have to wait until a more widely available Leopard beta comes out – I’m waiting for word from Apple about when it’ll be kosher, as I’ll necessarily have to include some new Leopard headers and APIs.

I have updated Nudge to be universal, and it seems to still be useful on Tiger for mounted network volumes. Expect the update to be available in a few days – I’m still doing some last-minute checking. Zingg! is, unfortunately, much harder to update at the moment. I haven’t touched it since 1.4.1 came out over 2 years ago, and the source code has been pushed out to some CD-R backups – and the two I’ve found are, sadly, unreadable. I still have some hope of finding a copy someday (there’s a ton of stuff stashed away from my move), but don’t count on it. Recoding it from the ground up will have to wait for Leopard, where it’ll be much easier to do – there were some awful undocumented things I had to do at the time. Sorry about that. The USInternational keyboard layout will soon be repackaged in a way that (hopefully) will work around the “custom keyboard layout is randomly deselected” bug in Tiger. I’ll need to wait for a Leopard beta to come out to check if it’ll be upwardly compatible, though. I’m trying (again) to go through Apple channels to have it included with the standard system; perhaps this time it’ll work out?

So, I’m just back from getting my WWDC badge. I’ve seen the famous banner and all icons on it are known – the only one I had doubts about (above the Spotlight icon) is supposedly from a Mac OS X Server utility. Even the 64-Bit icon was previously used when the G5 came out. Ah right, we now know what the Leopard “Big X” looks like – black with a white border. Drat, I need to change the XRay II icon to reflect that…

The relative sizes and positions give no hints. There are a few hardware icons. One iPod Nano. 3 iMacs, 2 laptops and one desktop – the latter one from the side, so the front may be different. Or the banner might just be there as a misdirection and may be changed on Thursday… the Xcode icon is very large – so large that one can read the small print on it, but then it’s a developer’s conference. On the other hand, people “in the know” did tell me to make sure not to miss the developer tools sessions.

Certainly a major release of Xcode is in the works. 2.5 or 3.0, it doesn’t matter, but my personal hunch is that the superannuated Interface Builder application will be phased out and integrated into Xcode. Let’s hope that connections like outlets and bindings will be easier to visualize and debug, and that the IBPalette interface is finally made official so that we can write non-trivial palettes.

I’ll be under NDA for details – things announced at the keynote excepted – so these will be my final pre-WWDC speculations. On the hardware front, 64 bits is of course guaranteed, with one of the new “Core 2 Duo” chips. A Mac Pro will certainly be out, although the name may not be exact, and the casing will probably be a minor variation on the current one. There’s a good Ars Technica writeup about the new Intel CPUs, and expectations are that the whole new range will fit nicely into the spectrum from MacBook Pros to the Mac Pros – possibly with a dual-core, dual-CPU configuration at the top, although it might also be that Intel has been reserving their quad-core chip for Apple to announce. Intel Xserves might also appear.

I don’t expect a new iPod to be announced in a big way, except as a footnote to the usual summing-up of past sales; at a developer’s conference, it’ll be big news only if it has an official API for developers to extend its functionality, which might actually be a neat way for Apple to start a new iPod generation in a privileged position; stranger things have happened.

I’m reasonably certain that we’ll each get a Leopard preview DVD. I’ve seen rumors of changes to applications, which I consider less interesting as they’re not really a part of the OS itself, at least from my developer’s standpoint. I use relatively few of the iApps every day – Safari and iChat are the ones I leave open, and my wishlist for those is small.

Real Leopard features I expect to see:

– RBSplitView adopted! Well, not likely, but it’d be nice… I’ve told Apple I’d gladly give them the code, anyway.

– A new UI theme, or at least a migration of the default window theme to the new “cool gradient/smooth metal” look.

– Some new Cocoa widgets, especially the more successful ones from the Tiger iApps. I hope to see them do Brent Simmons’ “big time tabs control”; I need it badly for XRay II.

– A new Finder. I’ve mostly gotten used to the old one, but still…

– Resolution independence. We need to get away from the pre-rendered bitmap widgets. People are already starting to use object-based PDF files for that, but they’re a pain to make and don’t look good at all resolutions. My ideal solution here would be a new NSImageRep and corresponding file format that would do for images what the TrueType format did for fonts: resolution-independence with special hinting for small sizes.

– More extensions to Objective-C. Garbage collection should be a given. Unloading NSBundles is supposed to be in the works. Frameworks included inside applications can’t be easily updated and versioning is pretty much useless for practical purposes.

– Hopefully we’ll see expanded metadata capabilities and a more usable Spotlight. I hardly use it in Tiger because it’s so slow and limited. The ability to have additional named forks should go hand-in-hand with full NTFS support. Other file systems would also be nice (ZFS, anyone?).

– Virtualization. I’ve written about this several times. My personal opinion is that Apple should write a fully trusted hypervisor into the EFI (using the TPM) and run everything inside virtual machines, including Mac OS X for Intel itself. Booting some version of Windows into a second VM would be easy, then, and there wouldn’t be a full version of Mac OS X for Intel for people to run on standard PCs either. I don’t think dual-booting is a good solution, I believe Apple was just testing the waters with BootCamp. No idea what would happen to Parallels in this scenario; they might be bought out by Apple, or by Microsoft, I suppose. Here are more thoughts on virtualization from Daniel Jalkut and Paul Kafasis.

– 64-bit “cleanness”. Meaning, Carbon and Cocoa and everything else running in 64-bit apps. And very probably, also, on the G5s. However, I’m not sure (and no time to research at this moment) how mixing 32 and 64 bits works on the Intel CPUs. I remember reading somewhere that it’s not as easy as it is on the G5, where you can have 32-bit processes co-existing with 64-bit processes.

Unlikely or even impossible:

– A new kernel.

– iPhone, iPDA, iGame, iTablet. iAnything in fact. There are rumors about VoIP support and there might be some sort of hardware for that, but I can’t see Apple doing a me-too cellphone.

– Some goodie under the seat (like when the iSight was introduced, which I missed out on, argh!).

In the meantime, I’d better get back to my coding… more after the keynote!

Some posts ago, I mentioned a white paper on FW800; now James Wiebe has written an update:

If you are making storage decisions based on rollouts of FireWire 800 technology, your purchasing priorities are sadly out of order. Apple was the only champion of FireWire 800; a task it seemed to take reluctantly. Now, Apple is making marketplace moves that are absolving itself of FireWire 800.

Worth a read.

In other news, there’s no news on progress to make the new Intel iMacs boot anything else. There’s even a reward posted; I’ll be very surprised if someone collects it anytime soon (or even at all). As the Apple/Intel FAQ notes, some people have succeeded in rendering their iMac unbootable by trying to change the EFI parameters. As that page also notes, it appears – despite some reports to the contrary – that the Core Duos Apple is using do report the VMX flag, which stands for Intel’s virtualization technology.
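
Checking for that flag is straightforward: CPUID leaf 1 reports VMX support in bit 5 of the ECX register. Here’s a small sketch using the cpuid.h helper shipped with GCC and compatible compilers; the bit position comes from Intel’s documentation, the rest is just illustration:

    #include <stdio.h>
    #include <cpuid.h>   /* __get_cpuid() helper for the CPUID instruction */

    int main(void)
    {
        unsigned int eax, ebx, ecx, edx;

        /* CPUID leaf 1 returns the feature flags; VMX is ECX bit 5. */
        if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            printf("CPUID leaf 1 not supported\n");
            return 1;
        }
        printf("VMX (hardware virtualization): %s\n",
               (ecx & (1u << 5)) ? "reported" : "not reported");
        return 0;
    }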

There’s a somewhat alarmist article out regarding the Core Duo’s current 34 errata, only one of which is slated to be fixed in subsequent production runs. Of course the number of such errata is proportional to a chip’s complexity; both PowerPCs and older Intel chips have comparable numbers. I’ve looked over the list and couldn’t find anything immediately alarming, as nearly all of them have a software workaround and/or are not relevant to user code. Also, some of these seem to be inherited (and never fixed) issues from older Intel designs, meaning they’re considered harmless. All in all, you shouldn’t worry about this.
