Solipsism Gradient

Rainer Brockerhoff’s blog


So, I’m just back from getting my WWDC badge. I’ve seen the famous banner, and all the icons on it are known – the only one I had doubts about (above the Spotlight icon) is supposedly from a Mac OS X Server utility. Even the 64-bit icon was previously used when the G5 came out. Ah right, we now know what the Leopard “Big X” looks like – black with a white border. Drat, I need to change the XRay II icon to reflect that…

The relative sizes and positions give no hints. There are a few hardware icons: one iPod nano, three iMacs, two laptops, and one desktop – the latter shown from the side, so the front may be different. Or the banner might just be there as misdirection and may be changed on Thursday… The Xcode icon is very large – so large that one can read the small print on it – but then, it’s a developers’ conference. On the other hand, people “in the know” did tell me to make sure not to miss the developer tools sessions.

Certainly a major release of Xcode is in the works. Whether it’s 2.5 or 3.0 doesn’t matter, but my personal hunch is that the superannuated Interface Builder application will be phased out and integrated into Xcode. Let’s hope that connections like outlets and bindings become easier to visualize and debug, and that the IBPalette interface is finally made official so that we can write non-trivial palettes.

I’ll be under NDA for details – things announced at the keynote excepted – so these will be my final pre-WWDC speculations. On the hardware front, 64 bits is of course guaranteed, with one of the new “Core 2 Duo” chips. A Mac Pro will certainly be out, although the name may not be exact, and the casing will probably be a minor variation on the current one. There’s a good Ars Technica writeup about the new Intel CPUs, and expectations are that the whole new range will fit nicely into the spectrum from the MacBook Pros to the Mac Pros – possibly with a dual-core, dual-CPU machine at the top, although it might also be that Intel has been reserving their quad-core chip for Apple to announce. Intel Xserves might also appear.

I don’t expect a new iPod to be announced in a big way, except as a footnote to the usual summing-up of past sales; at a developers’ conference, it’ll be big news only if it has an official API for developers to extend its functionality – which might actually be a neat way for Apple to start a new iPod generation in a privileged position. Stranger things have happened.

I’m reasonably certain that we’ll each get a Leopard preview DVD. I’ve seen rumors of changes to applications, which I consider less interesting as they’re not really a part of the OS itself, at least from my developer’s standpoint. I use relatively few of the iApps every day – Safari and iChat are the ones I leave open, and my wishlist for those is small.

Real Leopard features I expect to see:

– RBSplitView adopted! Well, not likely, but it’d be nice… I’ve told Apple I’d gladly give them the code, anyway.

– A new UI theme, or at least a migration of the default windows theme to the new “cool gradient/smooth metal” look.

– Some new Cocoa widgets, especially the more successful ones from the Tiger iApps. I hope to see them do Brent Simmons’s “big time tabs control”; I need it badly for XRay II.

– A new Finder. I’ve mostly gotten used to the old one, but still…

– Resolution independence. We need to get away from the pre-rendered bitmap widgets. People are already starting to use object-based PDF files for that, but they’re a pain to make and don’t look good at all resolutions. My ideal solution here would be a new NSImageRep and corresponding file format that would do for images what the TrueType format did for fonts: resolution-independence with special hinting for small sizes.

– More extensions to Objective-C. Garbage collection should be a given, and unloading NSBundles is supposed to be in the works. Also, frameworks included inside applications can’t easily be updated, and framework versioning is pretty much useless for practical purposes; both could stand fixing.

– Hopefully we’ll see expanded metadata capabilities and a more usable Spotlight. I hardly use it in Tiger because it’s so slow and limited. The ability to have additional named forks should go hand-in-hand with full NTFS support. Other file systems would also be nice (ZFS, anyone?).

– Virtualization. I’ve written about this several times. My personal opinion is that Apple should write a fully trusted hypervisor into the EFI (using the TPM) and run everything inside virtual machines, including Mac OS X for Intel itself. Booting some version of Windows into a second VM would then be easy, and there wouldn’t be a full version of Mac OS X for Intel for people to run on standard PCs either. I don’t think dual-booting is a good solution; I believe Apple was just testing the waters with Boot Camp. No idea what would happen to Parallels in this scenario; they might be bought out by Apple, or by Microsoft, I suppose. Here are more thoughts on virtualization from Daniel Jalkut and Paul Kafasis.

– 64-bit “cleanness”. Meaning Carbon, Cocoa, and everything else running in 64-bit apps – and very probably also on the G5s. However, I’m not sure (and have no time to research at the moment) how mixing 32 and 64 bits works on the Intel CPUs. I remember reading somewhere that it’s not as easy as on the G5, where 32-bit processes can co-exist with 64-bit processes.

Unlikely or even impossible:

– A new kernel.

– iPhone, iPDA, iGame, iTablet. iAnything in fact. There are rumors about VoIP support and there might be some sort of hardware for that, but I can’t see Apple doing a me-too cellphone.

– Some goodie under the seat (like when the iSight was introduced, which I missed out on, argh!).

In the meantime, I’d better get back to my coding… more after the keynote!

Fast update


So, my new Intel Mac mini is in and working. I bought the basic version; Core Solo at 1.5GHz, 512MB. It fits nicely under my iSub woofer. My iMac G5 controls it over Ethernet for remote debugging, and after some initial setup it doesn’t need mouse, keyboard or display.

I’ve already restarted debugging XRay II as a Universal Binary and universal versions of some other stuff will be out soon. Stay tuned…

OK, so here are the details on remote debugging; I’ve finished this phase of XRay 2 development and in the next few weeks will be fully free to press on with it.

The basic idea is that I have only PowerPC Macs, and since nobody I know in Brazil has received an Intel Mac (except for a couple of DTKs, which I didn’t want to use), the solution was to use Xcode’s remote debugging capabilities, running my executable on a machine in the ADC Compatibility Labs. These are open at no extra charge to paying developers, but most of what I’ll detail would apply to any other machine.

Most of it is explained in the Xcode User Guide. Note that I used Xcode 2.2.1, but I think this facility has been available at least since 2.0. Click on the “Remote Debugging in Xcode” section in the left frame. First, however, send e-mail to adc.labs(at)mail.apple.com and ask for machine time, explaining how long you’ll need the machine; I asked for 3 days (thanks, Joe Holmes!). Basically, they’ll set up a newly formatted Mac with everything standard installed, including the latest developer tools. You should check that you have the same versions, I suppose. They’ll have ssh and Apple Remote Desktop activated, and will send you the IP address, username and password. For illustration, let’s say the IP number is 10.1.1.1 (NOT an actual IP!), the username is “adclabs” and the password is “stevejobs”; substitute the actual values as appropriate.

In other words, all you’ll do there will be inside the default home folder “adclabs”. This user is also an administrator, so you’ll be able to use the password whenever needed for that. If you have a second Mac handy, you could rehearse first with that, of course; it’s what I did, as I’m normally not that handy with the Terminal. (Thanks, by the way, to John C. Randolph, Mike Ash and several others for helping me with details.)

First step is to generate a public and private key pair; you can skip this if you already have done so in the past. Open Terminal and type:

ls ~/.ssh/

If it lists a few files, among them one called “id_rsa.pub”, you already have a key pair. If not, type:

ssh-keygen -b 2048 -t rsa

This will take up to a minute and then prompt you for a file path; press Return to use the default path. It will then prompt you for a passphrase, twice. Don’t leave this empty, and don’t use too short a phrase. You should now have the id_rsa.pub file in the ~/.ssh directory.

Second step is to open Terminal and type:

ssh adclabs@10.1.1.1

wait for the Password: prompt and type in “stevejobs”, or whatever password they sent you. You should see the normal Terminal prompt now, with a name like “CE-Lab-ADC-Compatibility-Labs-Intel-iMac” at the start of the line.
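While you’re logged in, this is a good time for the version check I mentioned above. A minimal sanity check – sw_vers, gcc and gdb are all present on a standard Mac OS X install with the developer tools:

sw_vers
gcc --version
gdb --version

Run the same three commands on your local machine and compare; any important difference is worth resolving before you start debugging.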

Now you’d better change the password to the same passphrase you used for the RSA key – yes, usually it’s recommended to use different passwords here, but that way you won’t have to remember which one to use where; it’s just for a couple of days, anyway. Type

passwd

and you’ll be prompted for the original password, then twice for the new password. Create a .ssh directory with

mkdir ~/.ssh

and log out by typing

exit

Next step is to transfer the public key to the remote Mac. To do this, at your local prompt, type

scp ~/.ssh/id_rsa.pub adclabs@10.1.1.1:~/.ssh/authorized_keys

it will ask for your password again, and transfer the file over. Now log in again with:

ssh adclabs@10.1.1.1

and if all is well, it should log you in without asking for the remote password – at most it will ask for your key’s passphrase (more on that below). Now restrict permissions on your key by typing

chmod go-rwx ~/.ssh/authorized_keys
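A side note on that passphrase: by default, ssh will ask for it on every connection. If that gets tiresome, ssh-agent can cache it for the current Terminal session. This is just a sketch – note that on Mac OS X 10.4 the agent isn’t started automatically, so you launch it yourself:

eval `ssh-agent`
ssh-add ~/.ssh/id_rsa

ssh-add prompts for the passphrase once and remembers it until the agent exits.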

Now you need to set up a local build folder. The trick here is that both machines should see your build folder at the same absolute path. There are several ways to achieve that; on a local network, you could have one of the machines serve the entire folder to the other, then use a symbolic link to map the same path (see the sketch below). However, I found that over a long distance it’s most efficient to have mirrored folders on both machines, and copy the contents over with an extra build phase.
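For the record, here’s what the local-network variant might look like – purely illustrative, assuming the other Mac’s home folder is mounted via AFP at /Volumes/adclabs (a hypothetical mount point) and that no local user named “adclabs” exists:

sudo ln -s /Volumes/adclabs /Users/adclabs

After that, /Users/adclabs/build resolves to the same folder on both machines. Anyway, here’s what I did with the mirrored-folders approach.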

At the remote machine, type

mkdir ~/build

which will create an empty build folder in the Home folder. Log out and close Terminal.

Now, on your local machine, you need to prep Xcode for what you’ll do. Double-click on your main project group and go to the “General” tab. Click “Place Build Products In: Custom location” and type in “/Users/adclabs/build” as the location. (Supposing, of course, that you don’t have a user called “adclabs”…)

Also check “Place Intermediate Build Files In: Default intermediates location”, which probably will already be checked. Now click on your target and, from the Project menu, select “New Run Script Build Phase”. Make sure the new build phase is the last one, and enter this line as the script:

rsync -rz /Users/adclabs/build adclabs@10.1.1.1:/Users/adclabs
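If you rebuild often, a slightly stricter version of that script keeps the remote mirror exact by pruning files that no longer exist locally – same illustrative paths as before, and be careful, as --delete is unforgiving if you mistype the destination:

rsync -rz --delete /Users/adclabs/build adclabs@10.1.1.1:/Users/adclabs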

Finally, double-click on your executable in Xcode, and in the “Debugging” tab, select “Use Pipe for standard input/output”, check “Debug executable remotely via SSH”, and in the “Connect to:” field, type

adclabs@10.1.1.1

Now you’re ready. You’ll notice a delay of a few minutes while the last build phase transfers the files over, and on the start of a debugging run there’ll be several errors logged to the debug console, but eventually you’ll be debugging and single-stepping as usual, albeit more slowly. For GUI debugging, of course, you’ll have to use Apple Remote Desktop; I wish Apple would include a 1-user license for this in the Select package, as it’s rather expensive…
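By the way, if the debugger misbehaves, it can help to first verify by hand that the transferred executable actually launches on the remote machine – something like the line below, where MyApp and the Release folder are hypothetical names; substitute your own product and build configuration:

ssh adclabs@10.1.1.1 /Users/adclabs/build/Release/MyApp.app/Contents/MacOS/MyApp

If that complains about missing frameworks or the like, fix the build before blaming gdb.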

Have fun! I’ve tried to double-check most of this as I typed it in, please tell me if something didn’t work.

Update: fixed a typing error.

hmafra wrote:

He did it again! One of his takes now is the kernel thing. Speed, he says.

What he writes makes some sense, like the part on the cross-licensing agreement. I still don’t buy it, though.

I was about to comment on that.

I checked with some friends who know more about the kernel, and they say he’s completely wrong. In fact, there are two myths at work here. The first one says that Mac OS X uses a Mach microkernel, which is wrong. XNU, the Mac OS X kernel, is effectively monolithic: the whole BSD part runs right alongside the Mach part in the same context. The Mach code takes care of memory allocation and thread scheduling, and the BSD code does most of the rest, so there’s none of the context switching that makes a pure microkernel inefficient. Granted, some kernel functions are slower than the equivalent calls in, say, Linux; but this just means that Mac OS X isn’t currently suited to huge server farms, and that Apple can tinker with this if necessary without switching kernels at all. In fact, they’re probably already doing so for the Intel versions of Leopard.

The second myth is that Avie Tevanian alone was holding Mach in place by sheer orneriness, and that now that he’s gone, everybody will heave a sigh of relief, throw Darwin out, and shoehorn Linux (or even XP) into its place. That too is completely wrong. Bertrand Serlet has been in charge of OS policy for at least two years now. And consider that XNU, because of the Mach component, is well-suited to scaling to a larger number of processors. And consider that Intel is coming out with new chips that supposedly will scale well to 4, 8 or even more cores…

The idea of Leopard implementing the Windows API is, at first glance, interesting. (Let’s discard the misinformation about “Microsoft saving Apple”, and the idea that the cross-licensing included the Windows API.)

After all, Mac OS X already has several APIs for writing applications: BSD with X11, Carbon, Cocoa, Java, and so forth. Why not an additional one? Well, it’s possible in theory; in fact, the WINE people are working on such a thing. However, why should Apple make it too easy to move applications to Mac OS X? Such apps would never be first-class citizens: the appearance would be awkward, drag&drop would probably be impossible… no, virtualization is the way to go. Running Windows inside a secondary window would also be a constant reminder of which environment is the native one, which is more in Apple’s interest.

Tempora Mutantur


Yes, the times sure are changing. Today I even found myself largely agreeing with a Paul Thurrott article:

Since the euphoria of PDC 2003, Microsoft’s handling of Windows Vista has been abysmal. Promises have been made and dismissed, again and again. Features have come and gone. Heck, the entire project was literally restarted from scratch after it became obvious that the initial code base was a teetering, technological house of cards. Windows Vista, in other words, has been an utter disaster. And it’s not even out yet.

Doesn’t that sound a lot like the ill-fated Copland project?

Sadly, Gates, too, is part of the Bad Microsoft, a vestige of the past who should have had the class to either formally step down from the company or at least play just an honorary role, not step up his involvement and get his hands dirty with the next Windows version. If blame is to be assessed, we must start with Gates. He has guided – or, through lack of leadership – failed to guide the development of Microsoft’s most prized asset.

Perhaps Microsoft’s most serious mistake, retrospectively, was that Gates and Ballmer were too compatible. Ballmer should have driven Gates out of the company in the 80s, then Gates should have matured elsewhere, only to return triumphantly in the 90s with new, cool technology, in the nick of time to save the company that was going broke after Ballmer in turn had been pushed out… sounds familiar, too? :-)

Now here’s another interesting part:

Here’s what you have to go through to actually delete those files in Windows Vista. First, you get a File Access Denied dialog (Figure) explaining that you don’t, in fact, have permission to delete a … shortcut?? To an application you just installed??? Seriously?

…What if you’re doing something a bit more complicated? Well, lucky you, the dialogs stack right up, one after the other, in a seemingly never-ending display of stupidity. Indeed, sometimes you’ll find yourself unable to do certain things for no good reason, and you click Allow buttons until you’re blue in the face. It will never stop bothering you, unless you agree to stop your silliness and leave that file on the desktop where it belongs. Mark my words, this will happen to you. And you will hate it.

This is exactly what happened to me a few months ago, when I had to install Windows XP for my wife’s business (to run a proprietary vertical app, if you must know). I tried to set up an admin account for myself and a normal user account for the receptionist. This being the first time I’d ever seen XP, I did them in the wrong order… and then tried to organize the desktop and taskbars. In the end I had to wipe and reinstall everything. It seems Vista won’t be any better, sadly.

Thurrott goes on to complain about glass windows and the Media Center UI, which I can’t comment on myself. But, here’s a thought:

    One of the “stealth” features of Apple products is that more and more people are being subconsciously educated as to what constitutes good design.

We certainly aren’t that used to columnists criticizing details of the Windows UI; specialists like Don Norman, sure, but not mainstream columnists. Personally, I’d about given up commenting on bad UI to Windows users… they either just emit a blank “huh?” or say somewhat ruefully “well, that’s what computers are like, you know”. Not that the Mac UI is itself perfect – it’s still a work in progress – but at least we developers, and many people inside Apple, deeply care about producing good UI. (Here’s one example among hundreds.) If that attitude is now leaking out to the general public, so much the better.

Thanks to John C. Randolph for pointing out that article.

Rainer Brockerhoff wrote:

…Indeed, the usual hardware developer notes which used to come out a few weeks after a new PowerPC Mac was released are still absent for all Intel Macs – not that these notes ever went into chip-level detail of any Apple hardware.

The developer note for the Intel iMacs just came out (along with a few others). There are some interesting tidbits – for instance, EFI Flash memory size is 2 megabytes – but no mention of the TPM chip, as expected. Also, all the recent notes are very terse and go into less detail, even for PowerPC machines…

Rainer Brockerhoff wrote:

…But the main advantage is that the OSes for the virtual machines can be simplified. All the tricky little kexts and drivers you see on current PowerPC Macs will be substituted by one or two “generic” versions which will interface to the virtual peripherals simulated by the hypervisor, and the actual machine’s peripheral drivers will be in EFI or on the cards themselves.

One variation of this idea would be for Apple to sit down with game developers and define a basic Intel VM; just enough to see a drive partition, input devices and the video card. This would make porting very easy, as most games try to take over as much of the machine as possible anyway, and you’d have optimum performance.

Well, people sure have short memories.

I’ve commented several times before (for instance, here [8 months ago!], here, here, or here) that Apple’s Intel Macs contain an Infineon TPM chip – from the very first developer transition kit up to the latest released machine.

See my first link for details, and John Gruber‘s excellent analysis.

Today I was surprised to find several indignant articles pointing out that:

It looks like Intel has embedded “Trusted Computing” DRM protection in its Infineon chip and forgot to tell people.

and

…nobody wants to admit that the Intel Macs currently on sale have a TPM chip.

This is not only old news, but has been extensively photographed and discussed. It’s well known that Apple uses the TPM chip to an increasing degree in Mac OS X 10.4.x to prevent people from installing it on generic PCs, and it’s certain that Mac OS X 10.5 will do so as well.

Does Apple come right out and say so? Admittedly not. Indeed, the usual hardware developer notes which used to come out a few weeks after a new PowerPC Mac was released are still absent for all Intel Macs – not that these notes ever went into chip-level detail of any Apple hardware. At the same time, Apple withdrew publication of a few kernel source files for Darwin, the open-source base for Mac OS X. Both facts demonstrate that Apple’s security locks are still in flux and may change extensively in the near future. Will all these things be documented in the future? Hard to say. If the TPM chip’s encryption is sufficiently strong, they could be documented without defeating Apple’s purpose; but keeping details hidden always helps.

Is this evil? Well, depends on your definition of course. As Gruber points out, people who are incensed about this should also boycott Linux for its support of several TPM chips, including Infineon’s. Certainly, Apple has a right to enforce its current license terms which state that Mac OS X should run only on Apple hardware.

But what else will the chip be used for in the future? As I’ve repeatedly written here before, using it for DRM protection of media – which is what most of the critics claim to fear – isn’t likely. Mostly because, if you do the math, Intel Macs will be a minority for years, and any such protected media would either not work at all, or have to be unprotected, on PowerPC Macs, of which there are several tens of millions still in operation.

What’s far more likely – and we’ll know for sure in August – is that the TPM chip will be used to boot a trusted hypervisor at the EFI level. Apple has even patented a scheme to run tamper-resistant code and more than one OS at once. From the wording it’s obvious that the TPM chip is used for that:

In one embodiment the system comprises a processor and a memory unit coupled with the processor. In the system, the memory unit includes a translator unit to translate at runtime blocks of a first object code program into a blocks of a second object code program, wherein the blocks of the second object code program are to be obfuscated as a result of the translation, and wherein the blocks of the second object code program include system calls. The memory unit also includes a runtime support unit to provide service for some of the system calls, wherein the runtime support unit is to deny service for others of the system calls, and wherein service is denied based on a tamper resistance policy.

So, what I think likely is that the machine will boot into the trusted hypervisor. It will be stored encrypted in firmware, then decrypted and checked against tampering by the TPM chip. Once it’s running, it will show a screen like the Boot Camp boot selector, with one important difference: you’ll be able to select more than one OS to boot. All of them, including Mac OS X itself, will run inside virtual machines.

What’s the advantage? Of course all OSes will run at near-native speeds if nothing else is running at the same time – the hypervisor’s overhead will be negligible. In fact, this scheme has been used and refined on mainframes for decades, where it is assisted by hardware; now that Intel’s Core processors have hardware virtualization support, it should be easy to do likewise.

But the main advantage is that the OSes for the virtual machines can be simplified. All the tricky little kexts and drivers you see on current PowerPC Macs will be substituted by one or two “generic” versions which will interface to the virtual peripherals simulated by the hypervisor, and the actual machine’s peripheral drivers will be in EFI or on the cards themselves. This reduces disk and RAM usage at the expense of performance, although this shouldn’t be a problem except for games – but then, as I said below, hardcore gamers will prefer to boot directly into “the most popular game loader” anyway.

Another extremely desirable gain for Apple will be that they’ll only have a version of Intel Mac OS X that runs on this trusted virtual machine. To get this running on a generic PC, people would have to reimplement the entire Apple hypervisor too, write drivers etc., and even this would be easily defeatable by the TPM chip. Still, it’s a major architectural change and for that reason we’ll only see this in Leopard.

What boots it?


OK, people have asked me to comment on Boot Camp Public Beta.

If you’ve been away for the last few weeks: the $13K+ prize to make Windows XP boot on an Intel Mac has been won by two puzzle addicts. Granted, their solution is complex to implement and runs slowly due to the lack of proper video drivers (among others), but it’s still impressive. My Intel Mac mini hasn’t arrived yet, so I can’t speak from firsthand experience, but it seems it overlays just enough legacy BIOS responses on the Mac’s EFI to interact with a complementarily modified Windows XP.

Well, Wil Shipley and others donated money to that effort, and this seems to have convinced Apple, about a week later, to make “Boot Camp” public. It consists of three parts: a firmware upgrade that puts the (optional) legacy BIOS support module into the firmware, a small utility that allows nondestructive repartitioning of an Intel Mac’s hard drive, and a CD containing XP drivers for most (though not all) Intel Mac peripherals. It’s a beta, and some things don’t work yet, but it’s much smoother than the hacked-together version. In effect, the Intel Macs can now be dual-booted with Windows XP; also, people report progress in booting some Linux variants, and Vista support may not be impossible anymore. Ah yes, Apple has also stated that something like this will be a part of Leopard aka Mac OS X 10.5, which will be demoed at the upcoming WWDC and may be out around the end of the year. And AAPL stock shot up nearly 10% over the next two days…

So much for the facts. Interpretations are diverse; in fact, I haven’t seen so many divergent comments since Intel Macs were announced last June.

As usual, after a couple of days, Gruber, Siracusa and a few others posted excellent analyses of the situation. However, much of the immediate commentary was – let’s charitably say – weird. Immediate doom has been predicted for Apple first and foremost, as well as for Microsoft, for Dell, and for software developers. Let’s look at that last idea first.

Most non-developers are saying that, obviously, Mac developers will now fold up and die, or migrate to become Windows developers in droves, or (if they support both platforms) discontinue Mac versions of their products. After all, all Mac support questions can now be answered by “boot into XP”. And Windows is where the money is, right?

Wrong. Let’s check each type of developer separately. There are the two big ones: Microsoft and Adobe. Microsoft obviously won’t close the Macintosh Business Unit (MBU); I hear it’s their top division in terms of income per employee. Obviously, most Mac users want Mac versions of their applications, even if they have to be from Microsoft. The same goes for Adobe products; most of them were, originally, ported from the Mac to Windows anyway. And even if Adobe is having a hard time porting their stuff from CodeWarrior to Xcode, eventually they’ll do so.

At the other end of the spectrum are small developers like myself, up to small 3- or 5-person shops. Very few of those are multiplatform. I can safely say that an overwhelming percentage are Mac-only because developing on the Mac, for the Mac, is enjoyable and lucrative. Read Wil Shipley’s interview and his WWDC Student Talk and see what I mean. Here’s a pertinent part:

I love the Mac user base because they tend to be people who are into trying out new software and recommending it to each other and giving the little guy a chance. Windows users have demonstrated, ipso facto, that they do not believe in the little guy.

The two types of Windows users I’ve identified at my café are:

a) I use Windows to run Word and Excel and browse the web (and read e-mail in my web browser), and

b) I’m a programmer and I spend all my time in a Windows IDE or hacking around with my system.

The problem is that market (a) already has all the software they think they’ll ever need, and clearly isn’t into looking beyond what they already have or they’d have noticed they could do all that they currently do, and more, but much easier, on a Mac. And market (b) is too small for me to aim any software at it.

No doubt most non-developers (and Windows developers like (b) above) believe that developers mostly hate their jobs and just do whatever distasteful thing is necessary to maximize their income. Well, it’s not really that way; granted that many of us have to work to pay for the groceries, and Mac-related jobs are not really plentiful (yet!), but many .NET slaves spend extra hours at their home Macs to write really cool software.

In other words, we write for the Mac because it’s satisfying and would do it even for free, all day, every day (assuming the grocery problem to be solved somehow). Would I migrate XRay to Windows? No way. The tools aren’t there, the APIs are uncool, and the Windows community – well, as far as I can tell, there’s no Windows community at all. And regarding the market size, better a small fish in a small pond, and all that.

So what about the middle-sized software companies? Here the situation may not be as clear-cut. It depends a lot on company culture, I suppose. Are the people in charge active Mac users who also target Windows just because, well, they might sell a lot of copies over there? Or are they primarily Windows developers whose Mac version is championed by a couple of vocal believers among their programmers? It could be either way, and only time will tell. But should some of the latter type drop their Mac support, they might have done it anyway sooner or later.

Now, game developers are a special case. Discounting for the moment some diehard Mac-only game developers, reactions among the multiplatform game developers have been very cautious. After all, a gamer is the person most likely to dual-boot into Windows just to run the very latest game at full speed – though such a fanatic is still more likely to have a dedicated, souped-up PC just for that purpose. So widespread availability of Boot Camp might, really, lead some game companies to neglect Mac versions, purely for economic reasons.

Update: Ouch, I forgot to put in John C. Randolph’s comment on this:

Apple now lets you use the most popular game loader!

…and he’s sooo right! :-D

Stay tuned for more comments on this…
