Solipsism Gradient

Rainer Brockerhoff’s blog


OK, the XRay II file browser now works sufficiently well that I can leave it alone for a week or two, and get to (re!)doing the basic plugin stuff. It still lacks keyboard navigation and scrollwheel support, both of which should be easy to do. It still slows down when accessing a larger folder – or /Network – and I’ll probably have to make this multithreaded soon.

Other UI stuff that works now is dragging & dropping a single file onto the main window or the dock icon. Yes, there’s now a single main window and I’ll do tabs. Real Soon Now. While the idea of supporting multiple windows – like in the current XRay – was attractive at first, experience showed that people were liable to drop a hundred files at once, making the system freeze in an iterative spasm of ever-more-slowly appearing little windows. And very few people ever asked to be able to compare two files side-by-side, anyway. So I’m going for a large main window which will be able to show more stuff at once.

The latter change in direction is also in line with moving away from a (small) “better Get Info” window, towards showing file internals. Finally, having one browser for each window would also become heavy after 2 or 3 windows… not to speak of synchronization issues when the whole thing becomes multithreaded.

The plugin interface is still in flux, but I’m making headway on the basic simple ones: file attributes and metadata (well, some to start with), as well as hex dumps for data and resource forks. The problem is that all those have to converge asymptotically to a nice implementation over multiple iterations, and I won’t be sure of my basic design choices until they’ve all been done at least twice…

Still, I’m confident that I’ll be able to show an alpha version at WWDC. Very probably at Buzz’s bloggers dinner next Monday.

Since yesterday, we’re safely ensconced at a nice hotel in downtown San Francisco, where we’ll stay for 4 weeks – leaving just after WWDC.

The trip up to Corvallis (OR) was marvellous, especially the visit to Crater Lake. It’s the prettiest lake I’ve ever seen, and I’ve seen quite a lot. All this took a little more time and energy than we’d budgeted for, and after visiting with some friends we passed up a chance to see the famous da Vinci Days Festival. Instead, we opted to just walk around the beautiful Oregon State University campus, which was well worth the visit.

Going south the next day we opted for the more scenic 101 route (instead of the 5, which we’d driven up on), and it also was great. The Pacific shore is always beautiful and we also had an opportunity to revisit Ferndale, where we’d been on a previous Los Angeles-Vancouver trip.

So, everything’s settling down and for the next 3 weeks I’ll try to do some extra work on XRay II. Hopefully I’ll have a working alpha to show off at WWDC. Or at least a T-shirt… ;)

So, it’s 6-6-6 in whatever date ordering you prefer, and this is supposedly the Number of the Beast. Or perhaps not. Do I care? Not really…

…it’s my birthday, however. This specific birthday is my 37th (in hexadecimal, of course ;)) and it’s a rare one, in that I’m not away on a trip. Last year my birthday present was a surprise, all right: I was at the WWDC keynote and listened to Steve Jobs announce the Mac-Intel switch. Later in the day I tripped on a San Francisco sidewalk and, fortunately, suffered no serious harm.

Hopefully, today will bring no serious surprises either way, and I’m looking forward to the positive ones that this year’s WWDC will bring. More details in a month or so…

Intel debugging


WWDC was a great opportunity to talk to friends in the flesh and to make contact with Apple people, all of whom were extremely patient and polite.

One of the very last talks I had was with a senior ADC Labs manager (sorry, the name isn’t in my cache just now), who assured me that a couple of months down the road Intel machines will be available at the ADC Compatibility Labs for debugging.

I have just received an e-mail from ADC Labs stating that their equipment is also available for remote debugging over either ssh or Timbuktu, at no charge to Select or Premier developers. So I’ll just have to watch for the new machines to pop up on the equipment list and presto; no need to spring for the Transition Kit.

WWDC: winding up


It’s been a memorable, tiring, interesting, productive week. My stay at The Mosser Hotel has been enjoyable; the service is excellent, the rooms are OK though a little cramped, and the location can’t be beat. Only a block from Moscone and even nearer to Market Street and the SFO Apple Store. Many thanks also to Hyperjeff, my courteous and forgiving roommate.

I’m now back at the South Airport Travelodge, spent the morning getting packed and lugging my brand new iMac G5 around; a heavy beast, but worth it! And in the afternoon I visited my old haunts at Berkeley. So now that that’s done, I’ll have Sunday to rest up, organize my writings, and get set for the trip back to Brazil around Monday noon. More anon.

The dust is slowly settling, Apple stock is behaving normally, and everybody and their dog have emitted opinions about the MacIntel story. So who may win, and who may lose in the next 12 months?

Winners:

  • Apple, of course. As I commented below, they’re free (or will be, in a year) of the CPU-architecture-as-a-religion meme. They get a literally cool CPU/chipset for their PowerBooks; although I suppose they won’t use that name in the future; how about IBook ;)? They get dual-core CPUs right now, and a 64-bit version in the future. Even the stock analysts are liking this, though for mostly the wrong reasons. Also, switching processors established a precedent for Apple: Intel knows they’re not captive clients, and they’ll have to treat Steve Jobs with kid gloves lest he switch away again. Finally, Mac OS X 10.5 (Leopard) will come out simultaneously with, or even some weeks before, Longhorn, and with smaller minimum requirements. The new low-end Intel Macs may even take a goodly part of the low-end market away from Microsoft.
  • Intel. They were certainly getting tired of being perceived as just the evil tail end of the evil Wintel dragon. Intel’s not very pleased with Microsoft these days, and they were being pressed on other fronts. Getting Apple’s business is a glamorous endorsement which has far more weight than Apple’s smaller marketshare leads outsiders to believe. They’ll certainly be pleased to have a partner which will actually insist on getting the latest and greatest stuff, without being concerned about backwards-compatibility issues like, say, legacy BIOS support. By the way, Intel will now be free to cut away, say, the 50% of the Pentium die that still supports all those legacy modes and compatibility instructions, and supply Apple with an optimized-for-Mac OS X chip. With all the silicon saved, they could double the number of cores, put in more cache, support Altivec instructions or whatever they fancy; after all, it won’t really have to boot Windows anyway, right? Finally, Intel’s attempts to produce new PC designs were, let’s be charitable, not much good. They now have the best design team in the industry to showcase their new technology.
  • Developers. At least the Cocoa developers, the Open Source developers, and the Carbon developers that already were using Xcode. For most vanilla apps, it’s just a recompile and some tweaking. The added discipline will be good for people, and the market will grow a lot. This is a great time to be a Cocoa developer, and I for one intend to take advantage of it.
  • Stockholders. Both Apple and Intel stock will benefit from the new synergy between the two companies. I’m an Apple stockholder, and I’ll be looking at Intel stock very carefully soon.
  • Gamers. Let’s face it, most full-screen games push aside the underlying OS when they come in, have their own user interface, talk directly to the graphics card, and deign to let the OS do something only for mundane stuff like saving score files. So many developers didn’t even bother to port to the Mac. When Virtual PC or a similar product comes out, gamers will have access to all Windows games at full speed; and it’s almost certain that the Intel Macs will have some virtualization facility built in, but won’t dual-boot. As long as Mac OS X is whatever the new machines boot into, Apple will certainly allow other OSes to run under its control; that way, the user will always have the Mac OS X GUI visible somewhere. The effect of this on game developers is debatable. Some will be relieved not to have to do dual versions anymore. Mac-only developers will lose the Altivec advantage, so this may have some impact.

Losers:

  • Metrowerks. CodeWarrior has, unfortunately, been going downhill since Metrowerks was acquired by Motorola, and is now officially dead. Apparently their Intel compilers have been bought by Nokia, a move so far outside my field of expertise that I’m not going to comment further on it, but they’re out of the Mac picture now. They seem to be concentrating on the embedded market, where they may still do well.
  • Microsoft. Or at least partially. Windows is the new Classic; Windows apps will run in a “Red Box” and will look quaint and old-fashioned. As I said above, Leopard may well eat into Microsoft’s low-end marketshare. And the Wintel meme is dead; people now know that there are alternatives to Microsoft, and are actively looking for them. Defecting to the Mac will now appear easier and more natural for non-techies. Yes, Microsoft will keep a finger in the pie; their sales of Office won’t go down, as any Windows sale lost will be compensated by a Mac sale, and they’ll certainly be selling many more copies of Virtual PC, if they can bring the cost down. Still, this is a philosophical defeat for Microsoft in several aspects.
  • Adobe et al. Adobe are publicly committed to porting their stuff to Xcode and Intel. From what I heard from inside sources, the Xcode transition will take several times as much effort and time as the Intel transition per se. Adobe and other companies with huge codebases that used CodeWarrior have their own software workflow, and converting this may take up much, maybe all, of the next 12 months. What this will do to the apps they bought from Macromedia is anyone’s guess; I’ll say that some of them may not survive. From past experience, the move may be altogether too much for Quark, who’ve taken years to do a not-so-good Carbon port.
  • AMD. The consensus seems to be that AMD, even though they might seem at first glance a better fit to Apple than Intel, apparently didn’t have the necessary product line depth to fit the new Apple. Still, nothing says they couldn’t supply chips for future high-end Apple products.
  • AV developers. Apple’s Pro AV products already had the market pretty well sewn up, and now they’ll be running well-optimized on the new Macs from day one. Competitors will be at least a year behind; not an enviable position.
  • Cluster users. The whole G5/Altivec hype was really justified for these guys; Xserve clusters have been building a well-deserved reputation for very high-end scientific computing. I don’t see a comparable Intel-based machine coming out from Apple before 2007. The same applies to 64-bit computing. Steve Jobs hinted that new PowerPC machines are still in the pipeline, so this may be moot, but the folks I’ve talked to here are quite nervous.

I don’t really want to conduct a long discussion here, but…

Ibis Itiberê S Luzia wrote:

“The soul of the Mac is the CPU”. What is the meaning of the term “Mac”? If I’m not wrong, a “Mac” is a computer, not software. The software is called the “Operating System”, which in this case can be System 7, 8, 9 or X. And what differentiated a Mac from an ordinary PC? It was the CPU, wasn’t it? We were able to get experiences that ordinary PC users didn’t. We were able to run programs that they couldn’t. The great difference was that Apple had a CPU that it helped to develop together with IBM and Motorola. They had the “difference”, and this made Apple so different.

I think that may have been more applicable in the past. In 1984 I bought my first Mac. The Macintosh was the user experience, the Mac operating system, the 68K CPU, the SCSI interface, the NuBus boards, the ADB Keyboard and mouse, the 3.5″ floppies. All these components enabled something extra in the user experience.

This is quantum physics in that it really needs someone operating the computer to have the “experience”. All of the components I’ve listed above have been changed: the operating system is now Unix and NeXT based, the CPU migrated to the PowerPC, SCSI, NuBus, ADB and floppies were replaced by new technologies. But people agree, when they sit down at an iMac G5, that it’s still a Mac – although a completely different Mac from the 1984 Mac 128K.

So, I’m actually writing this at an Intel Mac. It’s still a Mac. Everybody here at WWDC agrees with me, as far as I can tell. The user experience has evolved, but the essence has remained. It’s faster for some things, it’s slower for other things. This is irrelevant; it’s a different model, that’s all. It uses other chips inside. That’s irrelevant too.

Let’s move on. There’s tons of new stuff to do and write about.

Yesterday I spent some time at one of the prototype Macs with a Pentium inside. There was one of them with the cover removed. The most remarkable thing was – and, now that photos have leaked, I don’t think I violate any NDA by saying this – that there was nothing remarkable about the box (a standard cheese-grater PowerMac) or about the motherboard (a standard Intel motherboard). There was nothing remarkable about it usage-wise either; unless you looked at the “About This Mac” window, or at the System Profiler report, or at the Processor preference panel, there was no way of telling what CPU was inside. It ran some unreleased build of Tiger, and there was this huge conspicuous security cable on it for some reason :)

But it walked like a Mac, it quacked like a Mac; it was a Mac to all intents and purposes. I downloaded a dozen random software packages off the Internet, and they all just worked – under the Rosetta translator, which I had to see working to really believe in. The perceptual speed was, perhaps, a little faster than my 1GHz PowerBook; quite usable. I suppose this will get faster after a year of tweaking; another word that’s been on everybody’s lips these past days.

Ah, and a bit of news which also leaked out today: Steve Jobs’ machine was not a souped-up quad-processor monster, just the same box I had been using, but with some extra RAM. So, Rosetta is one cool app. I talked to one of the guys on the Rosetta team and he confirmed what I saw happening at the keynote: when you launch an app the second time, it uses the cached translated binary, so it launches much faster.

I looked at the installed libraries, drivers and applications: of course they were all “fat” binaries. The entire system is fat. Oops, sorry, “universal binary” is the politically correct version now. It’s not quite double the size of the standard Tiger installation, but who cares in these days of 100+GB disk drives?

I checked out some of my own projects on the new Xcode 2.1. Nearly all standard Cocoa stuff just compiled and ran with no modifications, no matter what combination of architectures I compiled it for. You can even step-debug PowerPC binaries on this thing; they somehow made gdb Rosetta-aware, so that the translated executable is back-linked to your PowerPC source code; very cool. This is probably a bonus of the Mach-O executable format, like the universal binary format itself. The old Carbon CFM format will run under Rosetta, but not natively; CFM is the new Classic, it seems. The old Classic appears to be dead at last; I suppose getting the Classic compatibility layer running under Rosetta would be a huge pain.

I don’t have any straight Carbon projects to test. I do have one new project that twiddles bytes that flow to and from the disk at a lot of places, because it uses a legacy format (one dating from 1984, by the way). I got most of it converted in less than an hour by scanning for certain source code patterns and putting byte swap function calls around the pertinent expressions. Now, the publicly available Guidelines list dozens of exceptions, where porting takes some extra work: if you use custom resource formats or Apple Events, if you use bitfields, if you want to divide by zero, and so forth. I think the biggest headache will be for whoever has invested time in writing great gobs of code in PowerPC assembly or Altivec; fortunately I never did this myself.

One place where I later lost another hour of work was in a somewhat obscure open source module which made unwarranted assumptions about the order in which local variables are allocated on the stack. Now, this is something which may well work for one-off applications, but to actually publish such a thing without calling attention to it is somewhat foolhardy. This is where many of the conversion failures will come from, I believe: sloppy coding and unwarranted assumptions.

Unless otherwise noted, content © 2002-2023 by Rainer Brockerhoff.