Solipsism Gradient

Rainer Brockerhoff’s blog

One of the salient points repeated at the WWDC keynote was Leopard’s support for “64 bits top to bottom”. However, a close peek at the slide shown this year revealed a subtle difference from last year’s – the word “Carbon” was missing. Of course a storm of confusion soon ensued, with the usual wailing and gnashing of teeth from some quarters and polite shrugging from others. Apple stock fell and rose again, some developers professed bliss while others threatened to leave the platform, non-developers wrote learned analyses about obscure technical points, not to speak of reports of raining frogs or even an unconfirmed Elvis sighting in a Moscone restroom. Allow me to try to explain all (well, Elvis excepted).

First of all, there are a few implications in moving an operating system to 64 bits. I hear that Windows Vista comes in distinct 32-bit and 64-bit versions and that the latter is able to run 32-bit applications (with some restrictions) inside a compatibility box. In contrast, Leopard uses Apple’s experience with architectural migrations to support 32- and 64-bit applications natively on both PowerPC and x86 architectures – not so easy in the second case, but necessary since nearly all currently shipping Macs use Intel’s Core 2 Duo, which is 64-bit capable.

For this, Apple took advantage of Mach-O’s support for “fat binaries” – in this instance called “obese”. Obese binaries contain four different executables: PowerPC 32, PowerPC 64, x86 32 and x86 64. When running one of these applications, the system selects the best supported architecture and links the application to the corresponding (and equally obese) system libraries.
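
To make the idea concrete, here’s a minimal illustrative sketch (mine, not from Apple’s tools or documentation) that reads the header of such a fat binary and lists the architecture slices it contains – essentially what lipo -info reports from the command line. It assumes only the public structures declared in <mach-o/fat.h> and the fact that fat-header fields are stored big-endian regardless of the host CPU.

```c
/* Illustrative sketch only: list the architecture slices in a fat ("obese")
   Mach-O binary by reading its fat header. Assumes the public structures
   from <mach-o/fat.h>; all fat-header fields are stored big-endian. */
#include <stdio.h>
#include <stdint.h>
#include <arpa/inet.h>      /* ntohl() */
#include <mach-o/fat.h>     /* struct fat_header, struct fat_arch, FAT_MAGIC */

int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <binary>\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    struct fat_header fh;
    if (fread(&fh, sizeof fh, 1, f) != 1) { perror("fread"); return 1; }
    if (ntohl(fh.magic) != FAT_MAGIC) {
        printf("not a fat binary (probably a single-architecture Mach-O)\n");
        return 0;
    }

    uint32_t n = ntohl(fh.nfat_arch);
    printf("%u architecture slices:\n", n);
    for (uint32_t i = 0; i < n; i++) {
        struct fat_arch fa;
        if (fread(&fa, sizeof fa, 1, f) != 1) { perror("fread"); return 1; }
        printf("  cputype %d, offset %u, size %u bytes\n",
               (int)ntohl(fa.cputype), ntohl(fa.offset), ntohl(fa.size));
    }
    fclose(f);
    return 0;
}
```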

Enter the Carbon vs. Cocoa question. Cocoa APIs are derived from NeXT’s software and are usually called from Objective-C. Carbon APIs, callable equally well from C, C++ or Objective-C, were first introduced in Mac OS 8.5 or thereabouts and were, themselves, a much-needed simplification of the “Classic” Mac APIs. Carbon was thereafter positioned as the way to port existing applications to Mac OS X, while Cocoa was supposed to be the right way to write new applications for the new system. No doubt the old NeXTies inside Apple pressed for Carbon to be excluded from the start, but Microsoft, Adobe and Macromedia (to quote just the big companies) didn’t want to recode everything on short notice.

A necessary sidenote: the exact definition of “Carbon” is surprisingly hard to pin down, even among experienced developers. Here’s my own (although I’ve never written a Carbon app myself). There are Carbon APIs and Carbon applications. A Carbon application, for me, uses the Carbon Event Model – calling Carbon APIs to get events from the system. Until recently, a Carbon application would also, necessarily, use Carbon windows and the GUI widgets for those, mostly contained in the HIToolbox framework. Starting with Tiger it’s possible for Carbon applications to use Cocoa windows containing Cocoa GUI widgets, with some contortions of course. Other Carbon APIs – like the File Manager, or QuickTime – can be called equally well from Carbon or Cocoa applications.
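
For readers who’ve never seen the difference in practice, here’s a rough, heavily simplified skeleton (mine, not from the post) of what “using the Carbon Event Model” means: the application installs handlers with the Carbon Event Manager and hands control to RunApplicationEventLoop(), instead of going through Cocoa’s NSApplication. The calls shown are real Carbon APIs, but window creation and error checking are omitted.

```c
/* Rough skeleton, heavily simplified: a "Carbon application" in the sense
   above asks the Carbon Event Manager for events and runs the Carbon event
   loop, instead of using Cocoa's NSApplication. Real apps would also create
   windows, menus, etc., and check errors. */
#include <Carbon/Carbon.h>

static OSStatus MyCommandHandler(EventHandlerCallRef callRef,
                                 EventRef event, void *userData)
{
    HICommand command;
    GetEventParameter(event, kEventParamDirectObject, typeHICommand,
                      NULL, sizeof(command), NULL, &command);
    if (command.commandID == kHICommandQuit) {
        QuitApplicationEventLoop();      /* leave the Carbon event loop */
        return noErr;
    }
    return eventNotHandledErr;           /* let the toolbox handle the rest */
}

int main(void)
{
    const EventTypeSpec kEvents[] = {
        { kEventClassCommand, kEventCommandProcess }
    };
    InstallApplicationEventHandler(NewEventHandlerUPP(MyCommandHandler),
                                   GetEventTypeCount(kEvents), kEvents,
                                   NULL, NULL);
    RunApplicationEventLoop();           /* Carbon, not Cocoa, runs the show */
    return 0;
}
```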

Here’s where things started going awry, from the standpoint of established or multiplatform developers. Apple has always been of several minds about Carbon policy – it was often dismissed as a temporary “transition” technology, while people who interfaced with those developers had to reassure them that Carbon was not going away anytime soon and was not a second-class citizen. Porting software from the Classic Mac OS to Carbon wasn’t always easy; some larger applications took over a year. At the same time, it was seen as being much easier than tossing the whole codebase and recoding in Objective-C/Cocoa.

Now, a few years after Mac OS X was introduced, Microsoft, Adobe and so forth had a substantial investment in maintaining parallel codebases for their Carbon applications and, understandably, began dragging their feet about converting to Cocoa anytime soon, or even at all. Due to pressure from these developers, the Carbon GUI APIs began to incorporate new elements until then present only in Cocoa, and to all appearances Carbon and Cocoa were now positioned as equal and parallel APIs. In secret, of course, Apple hoped that “those people” would sooner or later see the light and begin doing their next x.0 version in Cocoa. In turn, “those people” harbored serious doubts about Objective-C (seeing it as a dead language with an unreadable syntax) and secretly hoped Apple would “recode Cocoa in C++”. Here’s a significant e-mail from an Apple engineer to the carbon-dev list:

No one reading this list should be under any illusions about Apple’s use of Objective C. Apple really likes Objective C. There are a lot of third-party developers who are using Objective C to program for Mac OS X and who really like it. Apple is not going to stop using Objective C. I’m not making a value judgement here, just stating a simple reality that everyone needs to understand. Do not think that someday Apple will “wake up” and realize that it would be better to recast all of our APIs in C++. That’s not going to happen.

So then came the PowerPC/Intel transition. Cocoa developers were already using Xcode, while many Carbon developers were still using the now-defunct Metrowerks CodeWarrior; transitioning large codebases to Xcode proved to be cumbersome. Still, people threw in more person-years to bring their apps up to the new standard. Then, at last year’s WWDC, Apple announced the migration to 64 bits, taking the opportunity to remove all legacy, obsolete or deprecated APIs from the new frameworks. Some Cocoa APIs were removed but, again, Carbon developers had more work to do. So once again, more person-years of work were invested.

It now seems that someone in Apple engineering management decided that they couldn’t afford to keep supporting two separate-but-equal APIs anymore, and the “transition” policy was revived regarding 64-bit Carbon applications. From what transpired during WWDC I deduce that some more of the Carbon APIs were taken off the “supported for 64-bit” list, most notably the part of the HIToolbox that concerns Carbon windows and GUI widgets. Therefore, 64-bit Carbon applications would seem to be either not supported at all, or supported only in a transition mode that used Cocoa windows and GUI widgets.

Naturally, Carbon developers were very bitter about this, while some Cocoa developers were asking if their 64-bit Cocoa apps would be able to call normal Carbon APIs (the answer is yes). So far, the most complete explanation I could find is this one (from the same engineer):

Fundamentally, Apple engineering is focused on Cocoa much more than Carbon, and Apple’s engineering management made the decision to un-support 64-bit Carbon to emphasize that fact.

So there you have it. Summary: 32-bit Carbon stays where it is and works fine until further notice – I don’t think they’ll be “deprecated” any time soon. The Leopard Finder itself is still a 32-bit Carbon application! Not until Mac OS 10.6 (LOLCAT, or whatever they’ll call it) comes out, which may take 3-4 years at least, and probably not even then. But 64-bit pure-Carbon apps may be unsupported, or even not run properly, when Leopard comes out in October. Cocoa isn’t going away, and is the future. Has been the future since Mac OS X 10.0 came out, in fact. On the other hand, there’s a migration path – use the Cocoa GUI, then later convert to a Cocoa app. People who have invested a lot of time in Carbon feel really bad about this, and I agree Apple mishandled this badly from a PR standpoint. On the other hand, investing a lot of time in Carbon is now revealed to have been a throw-good-money-after-bad move; some people say “I told you so”.

The final question is, why aren’t Microsoft and Adobe screaming their heads off about this? While I was wondering about this, I realized that, for normal Mac users, Microsoft Office really doesn’t handle data sets big enough to need 64 bits; it can stay 32-bit for as long as 32-bit support exists. As for Adobe, at first glance, Photoshop at the very least is just begging for 64 bits… really? Here’s what one Adobe engineer says:

I could have spent this whole cycle moving us to 64 bit rather than working on startup time, but would that give you more of what you want? Add 20 seconds to the startup time you are seeing for the beta for all versions/platforms of Photoshop and compare the value of that version to one where the histogram would be 10% faster on 64 bit machines (and most of the rest of Photoshop being 5% slower). It is true, there are some things, like histogram, that would be 10% faster, I wrote the code to verify this. But, the rest of the product would have been slower without a few people spending the whole cycle going over all of the slow parts and bringing them back to where they were on 32 bit. Most operations on a 64 bit application like Photoshop are actually slower by a small amount if time isn’t spent optimizing them.

Read the excellent comments on that post, especially the more recent ones, for much more discussion of the details on the Photoshop side – I suppose many of those would apply equally to other large Adobe/Macromedia apps.

So there you have it… the big guys don’t need to move up for now. The small guys are mostly in Cocoa already. Unfortunately, the intermediate cases have fallen through the cracks for now – think multiplatform CAD software, for instance. It’d be very sad to see them leave the platform in a huff over this; I sincerely hope Apple will contact all of them privately and smooth things over for now, somehow, though I can’t really imagine how. Maybe they’ll even re-add support in October, now that the point has been made.

Update: fixed a misconception about the PowerPC->Intel migration, see explanation above.

Wow, 15 days without a post. It’s been a slow couple of weeks, news-wise, and I’ve been distracted by off-line problems; sorry about that.

Of course the Apple TV has finally shipped, there’s been tons of reports about it, and Apple’s stock price even got a good boost from that. Still, it’s a device I find hard to comment upon, either positively or negatively. I rarely watch TV or even DVDs, our TV is an old model that has none of these new-fangled inputs or features (I think), and even if the device were available here I’m not in the target market. What does seem slightly interesting is that it apparently runs Mac OS X (not the “lite” OS X many expected), and therefore some people have already twiddled it to install additional video codecs.

Other than that, I’ve just read an excellent piece by former Apple manager John Martellaro, essentially arguing that Apple has first-class engineers and designers and doesn’t (at least not nowadays) do anything dumb, although it may look like it from an outside standpoint:

What I’ve noticed is that there is hardly a single writer, including myself, who has complete insight into Apple’s reasoning and design decision for a product.

…when you get a lot of smart people together in an Apple conference room, and let them fight it out, good things happen. One person will invariably have insight and hindsight that’s lacking in the others. By the time the dust clears, and a lot of scribbling has been done on the white board, a pretty good solution will have been worked out. Gotchas will be discovered and diagnosed. Experience with the customer, intimate knowledge of Mac OS X internals, and next generation technologies coming down the road will lead to sound engineering judgment from the group.

…Just remember, no matter how experienced any one writer is, they can seldom out-think a corporation as good as Apple.

Indeed. There are many young pundits, journalists and developers out there who are way too eager to jump on the “Apple is obviously brain-dead” bandwagon – of course “young”, nowadays, describes almost everyone from my viewpoint. In contrast, I think that, today, most questionable decisions from Apple can be blamed on limited human resources. Doing insanely great stuff takes time and needs first-class people.

Another never-ending discussion is the Leopard shipping date. I still agree with Ars Technica’s Jacqui Cheng that Leopard should ship at WWDC. However, people have been picking up a rumor that Apple is delaying Leopard by several months to (supposedly) get Macs to boot Vista. Huh? This completely illogical reasoning is aptly skewered by Daniel Eran at RoughlyDrafted:

Apple didn’t exactly scramble to get iTunes working on Vista, and iTunes is an important part of Apple’s business. That being the case, will Apple hold up the release of Leopard for months in order to support Vista in Boot Camp, a product that Apple makes no money in providing?

The story is so absurd on so many levels that it’s hard to find a place to start pointing out why it’s so stupid.

It really is very strange. Apple says they will ship in spring (these local seasonal references are really obsolete in a global context, but that’s another rant). Spring in Cupertino goes until a week or so after WWDC, people tell me. Even so, people who have not seen anything of Leopard beyond some leaked screenshots wrote excitedly about a MacWorld release, then about a March release, and then, when their wild predictions weren’t confirmed, started to moan that “Apple’s been having trouble getting Leopard out” and now, even, that “Leopard had reportedly been delayed until October”. I really hope that Apple will show more details before WWDC, but I won’t be too surprised if they don’t.

Musings on Apple

Now that the waves around the iPhone have mostly died down, and we’re in a “silent period” between announcements, some further musings.

My earlier ideas about OS X in other products and a second-generation “tablet” device have percolated into the punditosphere. The trigger seems to have been the recent surge in larger or faster solid-state memory devices, as well as shipment of the first hybrid flash/disk drive. See, for instance, Jason D. O’Grady commenting about another analyst’s write-up:

There are numerous reasons why a diskless MacBook (or nanoBook) is the next logical progression of the notebook computer…

What’s interesting about the Reuters piece is that [it] claims that the nanoBook would run the stripped down, multi-touch version of Mac OS X that will ship with iPhone as opposed to the full-blown version…

In an included poll, however, 76% of voters said they’d want such a sub-notebook to run the full version of Mac OS X, and only 10% said they’d accept OS X (Lite). Others are skeptical of wider use of flash memory, even for larger iPods:

There is one brutally limiting factor to flash, though: cost. Flash is almost ten times more expensive than hard-disk memory. Although significant adoption of flash over the last 12 months has seen prices drop enormously, it’s still too costly to buy in the quantity needed for video iPods. Apple has a good relationship with its flash manufacturers though, and may secure a helpful price reduction it can pass on to consumers. But will that be enough to justify vanquishing the hard disk completely?

Still, I agree that prices are falling fast and that such a device may well be pre-announced at WWDC in June for shipment before the end of 2007. On the other hand, when so many financial analysts agree that such a device is in the works, it makes me suspect that they must be wrong…

Speaking of WWDC, only some radical holdouts (and a few financial analysts) still believe in an end-of-March launch of Leopard. I can’t say much about it because of NDAs; but to put Leopard on the market by the end of this month – meaning that, because of manufacturing and shipping times, it would have to be ready about today – is impossible. Yes, some of the aforementioned radicals say that Apple has secret advanced builds in their labs and all the seed versions they’ve sent out were just a cover. Hah. I’ll believe that when I see it; maybe not even then.

Re: Sony Reader

Taking up my old thread of e-book readers and electronic paper, I just found an interesting write-up of such technologies and of the latest variation: reusable paper. Worth a read.

Too hot

Since the beginning of the year (or, perhaps, more germanely, the beginning of the hot season) the hard drive in my iMac G5 has been having little clicking fits. This is usually the first sign of impending drive failure, but as it would usually stop and get back to work in a few seconds I did nothing except resolve to back up more often than I usually do. Just FYI, it’s a 20″ iMac G5, the last series before the iSight model, with a 250GB Maxtor SATA drive.

The last few days it’s been hotter than usual – often around 33C during the day – and around the middle of last week the clicking started to happen more often, and it would sometimes take minutes to recover… this only happened when I was booted from a certain partition, and never when booted from the other partition, so I tended to spend more time in the latter situation. I also installed Marcel Bresink’s excellent freeware Temperature Monitor, which told me that the iMac’s built-in hard drive temperature sensor showed 54C, while the drive’s own SMART sensor said it was at 70C. Which of course is somewhat beyond the usual rated operating temperature of 60C…

I finally got some free time to actually do something about it and proceeded to do a full backup of my home folder and of selected other folders to an external hard drive. I then tried to do an erase-and-zero-data operation on the internal drive, which (after 10 hours!) failed with an I/O error. And the drive temperature went up to 72C while the external sensor still said 54C! Something was very wrong.

Well, clearly this meant the drive was no longer reliable and I proceeded to find a replacement. Only a few months earlier I’d phoned around to find a larger backup drive, finding out that nobody had anything larger than a 160GB IDE drive in stock, and that at an exorbitant price. SATA drives were “about to come in”. This time, too, the first stores I tried had no large drives available, until the nice people at TecMania pointed me at WAZ, where I promptly found a 320GB SATA drive for about US$210.00, not too bad for someone in a hurry. So on Saturday I was the proud owner of a new Western Digital WD3200KS, and gained several dozen GB of extra space in the bargain.

The new drive’s power consumption specs were about 20% lower than the old Maxtor’s, so I was reasonably confident that it wouldn’t overheat as badly. Still, after installing it, I looked closely at the way the temperature sensor was mounted on the drive bracket. It turns out that the bracket on that side is a thin metal strip fixed to the drive with two mounting screws, and the sensor is glued on near the middle. However, even with the screws properly tightened, the metal strip arches out a little in the middle, so that there was a small air gap between the sensor and the drive itself – clearly not a thermally optimal solution, and this might explain the huge 18C difference between the internal and external temperature readings.

I googled around and some people had indeed run into the same problem. A few had mounted external fans onto the air inlet and/or outlets, and some had even cut into the iMac cover to do so! This seemed a little radical to me, especially as it would drastically cut resale value. Another user recommended cutting off the sensor and re-gluing it onto the drive body itself, something which I actually considered doing, but I found the sensor cable would be too strained if I did so.

The actual solution I implemented is shown here:

I added the round-headed Phillips screw in the middle of the mounting bracket, which goes into the center (previously unused) hole on that side of the drive. I also spread a thin layer of thermal heatsink paste onto the mounting bracket, in the space between the two holes on each side of the sensor. The air gap was completely eliminated, and indeed after I fired the system up and restored my backups, the temperature gap between internal and external sensors was reduced to a much more reasonable 4C.

This means that the drive peaks at about 58C; still within the nominal operating range of 60C max, but uncomfortably close to the upper limit. By coincidence while I was doing this, I became aware of a Google paper (pdf) about disk failures. Very interesting; they investigated an awful lot of drives, and concluded that elevated temperature wasn’t necessarily a factor; then again, their operating temperatures were below 50C.

Meanwhile, I’m monitoring the drive closely and thinking of alternative methods to make the sensor’s temperature track the drive’s temperature more closely (which would make the cooling fan kick in a little earlier). My first attempt, putting a piece of tape over the sensor to take it out of the fan’s airstream, didn’t make any appreciable difference.

Update: Another paper on disk failures just came out. Also very interesting.

I’ve been getting some positive feedback lately about my “Interesting Times” articles, so I thought I’d repost some pointers to them. The column itself is, sadly, now defunct, but new material crops up now and then; I’ve decided to post it here instead. In retrospect, the way this blog/forum is organized could use a few revisions, but that’s not likely to happen very soon.

So, “Highly Advanced But Obsolete” talks about the QI-900, which was an 8-bit CP/M-based computer I helped design in the mid-80s:

…the Z80 was too slow for a fully graphical interface, and we hadn’t the mechanical know-how to build a mouse.

…here’s the final result: the QI-900 had menus…

…and moveable windows…

…and, even better than the original Macintosh, it had preemptive multitasking – or rather, multithreading inside the same application.

I promised a follow-up article with more details, but never had the time to do the necessary research. Maybe later in the year.

Everybody’s favorite seems to be, however, “This Internet isn’t worth anything…“, where I tell some stories about setting up a commercial ISP in the early 90s:

(At Embratel – that was the government’s telecomm monopoly)

Me: “I want an Internet connection.”

Embratel Salesman: “OK. I suggest a 2400 or 9600 link, the price will be X cents per packet. That’s 20% of what it costs to send a TELEX. Isn’t that revolutionary?”

Me: “A packet means how many Kbytes?”

Embratel Salesman: “What? It’s 64 bytes per packet!”

Me: “And if a user decides to download a larger file, say, 500 Kbytes? It’ll cost hundreds of dollars!”

Embratel Salesman: “Don’t worry, that will never happen!”

Re: iPhone updates

An Italian newspaper article quotes Dario Bucci of Intel Italia as saying that the iPhone uses Marvell‘s XScale-derived CPUs. (Curiously enough, the current version of this article doesn’t show this part of the interview anymore…)

Well, who cares? Indeed, by now I agree with HiveLogic that the CPU is irrelevant. Marvell bought XScale from Intel about 6 months ago. XScale, in turn, uses some of the ubiquitous ARM cores that were rumored to be the iPhone’s CPU (as well as, probably, powering some other chips in there). But as the HiveLogic article says:

And now the iPhone uses yet another CPU, and we should still expect OS X to feel like OS X. Apple seems to be pushing the idea that the CPU shouldn’t matter to the user of an Apple product. And I think that’s why Apple isn’t talking about the iPhone’s CPU.

Right. With Leopard, Apple’s development tools support building apps for any combination of 32-bit, 64-bit, PowerPC (big-endian) or Intel (little-endian) CPUs. Since the gcc compiler supports ARMs and many other architectures, and the major stumbling block (the endian issue) has been solved, by now OS X can be safely assumed to run on nearly any modern CPU.
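
The endian issue, in a nutshell: byte order must never leak from memory into files or network packets. Here’s a small sketch of my own (not a quote from anyone, just the standard fix) showing the usual approach – convert to a fixed byte order at the boundary – so the same code behaves identically on a big-endian PowerPC, a little-endian x86, or whatever ARM core ends up in a device like the iPhone.

```c
/* Illustrative sketch: store a 32-bit value in a fixed (big-endian, a.k.a.
   "network") byte order, so the same file reads back correctly whether the
   code runs on a big-endian PowerPC or a little-endian x86 or ARM. */
#include <stdio.h>
#include <stdint.h>
#include <arpa/inet.h>   /* htonl() / ntohl() */

/* Write a 32-bit count in big-endian order, whatever the host CPU is. */
static int write_count(FILE *f, uint32_t count)
{
    uint32_t be = htonl(count);               /* host -> big-endian */
    return fwrite(&be, sizeof be, 1, f) == 1 ? 0 : -1;
}

/* Read it back and convert to host order. */
static int read_count(FILE *f, uint32_t *count)
{
    uint32_t be;
    if (fread(&be, sizeof be, 1, f) != 1) return -1;
    *count = ntohl(be);                       /* big-endian -> host */
    return 0;
}

int main(void)
{
    FILE *f = fopen("counts.bin", "w+b");
    if (!f) { perror("fopen"); return 1; }
    write_count(f, 123456);
    rewind(f);
    uint32_t n = 0;
    read_count(f, &n);
    printf("read back: %u\n", n);             /* 123456 on any CPU */
    fclose(f);
    return 0;
}
```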

Now, in the past, I’ve been as prone as any to argue endlessly about the superiority of the PowerPC architecture (or of the 68K architecture for that matter) over x86, but for most practical purposes I have to admit all that has become a non-issue – especially for a device like the iPhone. I for one welcome our new <your architecture here> overlords, and that’s that.

iPhone updates

No significant iPhone news for a few days, so I’m stopping the “Your Subject Here” subject – if I’d known there’d be so many updates on that, I’d really have put in a non-joke phrase, so the Googlebots would have an easier time. Sorry about that.

A few interesting updates have come up in the meantime, however. For instance, a preliminary bill of materials estimate by iSuppli details expected manufacturing costs of $245.83 and $280.83 for the 4GB and 8GB iPhones. I have a little experience in embedded design and the hardware numbers look reasonable to me; no idea how they arrived at the software costs. $7 for OS X is a weird figure, probably calculated by analogy with Windows CE OEM pricing, which would be totally inapplicable to an Apple product. Is the iPhone expected to pay for the total OS X development costs? In any event, against the announced $499 and $599 retail prices these figures imply a gross margin of roughly 50%, very good even for an Apple product; Apple’s average margin for the last quarter was 31.2%.

This report and some other articles have confused me even more about the “Cingular subsidies”. Are we really to believe, as one Cingular VP said, that Steve Jobs humbly agreed to all their usual business practices and “bent a lot” to get their contract? The iPhone judo theory is a little more convincing. Anyway, some people are saying that in June Cingular will subsidize perhaps $150 off the iPhone’s prices, making them retail for $350 or $450 net; others are saying that the subsidy is already built-in and the “unlocked” prices would be $800 to $1000. Or perhaps those prices are the real prices and Cingular will be neither subsidizing nor penalizing iPhone users, being content to charge just for their service and basking in the “halo effect” of being next to the Apple radiance…

Meanwhile UI guru Bruce Tognazzini has posted a long article about his impressions of the iPhone. Worth a read, but here are some nice quotes:

The origins of these bits and pieces, however, is not what’s important about the iPhone. What’s important is that, for the first time, so many great ideas and processes have been assembled in one device, iterated until they squeak, and made accessible to normal human beings. That’s the genius of Steve Jobs; that’s the genius of Apple…

…I have yet to get my hands on an iPhone – frustrating! (You can imagine Bill Gates’s frustration. He probably has a cadre of engineers ready to take it apart, put it back together with a couple of screws missing, and paint it brown.)…

…email echoes the voicemail interface. It is clean and simple. What is startling is the apparent hard separation of email, SMS, and voicemail. What I would want is a single list, defaulting to the newest and unread/unheard first. I don’t care about the medium, and neither should iPhone.

Of all the iPhone features, this is the one that seems to have completely missed the target. It would be like Blackberry having three lists: One for mail with more than 100 characters, one for mail with fewer than 100 characters, and one for mail sent from more than 3000 miles away.

That last suggestion is marvelous, at least as an option for less technical users; Apple should really try to do this.

Meanwhile, we’re back to business-as-usual, it seems. Apple posted absolute record numbers for the last quarter and the stock went down afterwards; as I write this it’s 10% off the peak of a few days ago… with no justification except some vague complaints about Apple’s conservative guidance for the next quarters. Analysts just aren’t getting used to conservative guidance followed by better numbers, it seems. Some others complain about slower growth in Mac sales, but given that Adobe’s software suite isn’t out yet, the posted figures look quite good to me.

John Gruber and several others are coming to agree with me about the general philosophy of OS X as a generic OS family for Apple devices. Good.

Finally, I’ve often commented on the excessive price of Apple products in Brazil, but for the first time this has been reported by international sites:

…the survey prices the 2GB nano in US dollars and found that Brazilians pay the most for an iPod, shelling out $327.71, well above second-place India at $222.27.

Canada was the cheapest place to buy a nano, at $144.20…
