Posted by camilotelles:
funny
Well, people sure have short memories.
I’ve commented several times before (for instance, here [8 months ago!], here, here, or here) that Apple’s Intel Macs contain an Infineon TPM chip. This has been true from the very first developer transition kit up to the latest released machine.
See my first link for details, and John Gruber’s excellent analysis.
Today I was surprised to find several indignant articles pointing out that:
It looks like Intel has embedded “Trusted Computing” DRM protection in its Infineon chip and forgot to tell people.
and
…nobody wants to admit that the Intel Macs currently on sale have a TPM chip.
This is not only old news, but it has been extensively photographed and discussed. It’s well known that Apple uses the TPM chip to an increasing degree in Mac OS X 10.4.x to prevent people from installing it on generic PCs, and it’s certain that Mac OS X 10.5 will do so as well.
Does Apple come right out and say so? Admittedly not. Indeed, the usual hardware developer notes, which used to come out a few weeks after a new PowerPC Mac was released, are still absent for all Intel Macs – not that these notes ever went into chip-level detail of any Apple hardware. At the same time, Apple withdrew publication of a few kernel source files for Darwin, the open-source base of Mac OS X. Both facts suggest that Apple’s security locks are still in flux and may change extensively in the near future. Will all these things be documented eventually? Hard to say. If the TPM chip’s encryption is sufficiently strong, they could be documented without defeating Apple’s purpose; but keeping the details hidden always helps.
Is this evil? Well, depends on your definition of course. As Gruber points out, people who are incensed about this should also boycott Linux for its support of several TPM chips, including Infineon’s. Certainly, Apple has a right to enforce its current license terms which state that Mac OS X should run only on Apple hardware.
But what else will the chip be used for in the future? As I’ve repeatedly written here before, using it for DRM protection of media – which is what most of the critics claim to fear – isn’t likely. Mostly because, if you do the math, Intel Macs will be a minority for years, and any such protected media would either not work at all, or would have to remain unprotected, on PowerPC Macs, of which there are several tens of millions still in operation.
What’s far more likely – and we’ll know for sure in August – is that the TPM chip will be used to boot a trusted hypervisor at the EFI level. Apple has even patented a scheme to run tamper-resistant code and more than one OS at once. From the wording it’s obvious that the TPM chip is used for that:
In one embodiment the system comprises a processor and a memory unit coupled with the processor. In the system, the memory unit includes a translator unit to translate at runtime blocks of a first object code program into blocks of a second object code program, wherein the blocks of the second object code program are to be obfuscated as a result of the translation, and wherein the blocks of the second object code program include system calls. The memory unit also includes a runtime support unit to provide service for some of the system calls, wherein the runtime support unit is to deny service for others of the system calls, and wherein service is denied based on a tamper resistance policy.
So, what I think likely is that the machine will boot into the trusted hypervisor. This will be stored encrypted in firmware, and will be decrypted and checked for tampering by the TPM chip. Once it’s running, it will show a screen like the Boot Camp boot selector, with one important difference: you’ll be able to select more than one OS to boot up. All of them, including Mac OS X itself, will run inside a virtual machine.
What’s the advantage? Of course all OSes will run at near-native speeds if nothing else is running at the same time – the hypervisor’s overhead will be negligible. In fact, this scheme has been used and refined on mainframes for decades, where it is assisted by hardware; now that Intel’s Core processors have hardware virtualization support, it should be easy to do likewise.
But the main advantage is that the OSes in the virtual machines can be simplified. All the tricky little kexts and drivers you see on current PowerPC Macs will be replaced by one or two “generic” versions which interface with the virtual peripherals simulated by the hypervisor, while the actual machine’s peripheral drivers live in EFI or on the cards themselves. This reduces disk and RAM usage at the expense of performance, although this shouldn’t be a problem except for games – but then, as I said below, hardcore gamers will prefer to boot directly into “the most popular game loader” anyway.
Another extremely desirable gain for Apple will be that the only version of Intel Mac OS X will be one that runs on this trusted virtual machine. To get it running on a generic PC, people would have to reimplement the entire Apple hypervisor too, write drivers, etc., and even that could easily be defeated by the TPM chip. Still, it’s a major architectural change, and for that reason we’ll only see this in Leopard.
OK, people have asked me to comment on Boot Camp Public Beta.
If you’ve been away for the last few weeks, the $13K+ prize to make Windows XP boot on an Intel Mac has been won by two puzzle addicts. Granted, their solution is complex to implement and runs slowly due to the lack of proper video drivers (among others), but it’s still impressive. My Intel Mac mini hasn’t arrived yet, so I can’t speak from firsthand experience, but it seems it overlays just enough legacy BIOS responses on the Mac’s EFI to interact with a complementarily modified Windows XP.
Well, Wil Shipley and others donated money to that effort, and this seems to have convinced Apple, about a week later, to make “Boot Camp” public. It consists of three parts: a firmware upgrade that puts the (optional) legacy BIOS support module into the firmware, a small utility that allows nondestructive repartitioning of an Intel Mac’s hard drive, and a CD containing XP drivers for most (though not all) Intel Mac peripherals. It’s a beta, and some things don’t work yet, but it’s much smoother than the hacked-together version. In effect, the Intel Macs can now be dual-booted with Windows XP; also, people report progress in booting some Linux variants, and Vista support may not be impossible anymore. Ah yes, Apple has also stated that something like this will be a part of Leopard aka Mac OS X 10.5, which will be demoed at the upcoming WWDC and may be out around the end of the year. And AAPL stock shot up nearly 10% over the next two days…
So much for the facts. Interpretations are diverse; in fact, I haven’t seen so many divergent comments since Intel Macs were announced last June.
As usual, after a couple of days, Gruber, Siracusa and a few others posted excellent analyses of the situation. However, much of the immediate commentary was – let’s charitably say – weird. Immediate doom has been predicted for Apple first and foremost, as well as for Microsoft, for Dell, and for software developers. Let’s look at that last idea first.
Most non-developers are saying that, obviously, Mac developers will now fold up and die, or migrate to become Windows developers in droves, or (if they support both platforms) discontinue Mac versions of their products. After all, all Mac support questions can now be answered by “boot into XP”. And Windows is where the money is, right?
Wrong. Let’s check each type of developer separately. There are the two big ones: Microsoft and Adobe. Microsoft obviously won’t close the Macintosh Business Unit (MBU); I hear it’s their top division in terms of income per employee. Obviously, most Mac users want Mac versions of their applications, even if they have to be from Microsoft. The same goes for Adobe products; most of them were, originally, ported from the Mac to Windows anyway. And even if Adobe is having a hard time porting their stuff from CodeWarrior to Xcode, eventually they’ll do so.
At the other end of the spectrum are small developers like myself, up to small 3- or 5-person shops. Very few of those are multiplatform. I can safely say that an overwhelming percentage are Mac-only because developing on the Mac, for the Mac, is enjoyable and lucrative. Read Wil Shipley’s interview and his WWDC Student Talk and see what I mean. Here’s a pertinent part:
I love the Mac user base because they tend to be people who are into trying out new software and recommending it to each other and giving the little guy a chance. Windows users have demonstrated, ipso facto, that they do not believe in the little guy.
The two types of Windows users I’ve identified at my café are:
a) I use Windows to run Word and Excel and browse the web (and read e-mail in my web browser), and
b) I’m a programmer and I spend all my time in a Windows IDE or hacking around with my system.
The problem is that market (a) already has all the software they think they’ll ever need, and clearly isn’t into looking beyond what they already have or they’d have noticed they could do all that they currently do, and more, but much easier, on a Mac. And market (b) is too small for me to aim any software at it.
No doubt most non-developers (and Windows developers like (b) above) believe that developers mostly hate their jobs and just do whatever distasteful thing is necessary to maximize their income. Well, it’s not really that way; granted that many of us have to work to pay for the groceries, and Mac-related jobs are not really plentiful (yet!), but many .NET slaves spend extra hours at their home Macs to write really cool software.
In other words, we write for the Mac because it’s satisfying, and we would do it even for free, all day, every day (assuming the grocery problem to be solved somehow). Would I migrate XRay to Windows? No way. The tools aren’t there, the APIs are uncool, and the Windows community – well, as far as I can tell, there’s no Windows community at all. And regarding market size, better a big fish in a small pond, and all that.
So what about the middle-sized software companies? Here the situation may not be as clear-cut. It depends a lot on company culture, I suppose. Are the people in charge active Mac users who also target Windows just because, well, they might sell a lot of copies over there? Or are they primarily Windows developers whose Mac version is championed by a couple of vocal believers among their programmers? It could be either way, and only time will tell. But should some of the latter type drop their Mac support, they would probably have done so anyway sooner or later.
Now, game developers are a special case. Discounting for the moment some diehard Mac-only game developers, reactions among the multiplatform game developers have been very cautious. After all, a gamer is the person most likely to dual-boot into Windows just to run the very latest game at full speed – though such a fanatic is still more likely to have a dedicated, souped-up PC just for that purpose. So, widespread availability of Boot Camp might indeed lead some game companies to neglect Mac versions, purely for economic reasons.
Update: Ouch, I forgot to put in John C. Randolph’s comment on this:
Apple now lets you use the most popular game loader!
…and he’s sooo right!
Stay tuned for more comments on this…
…see also Top 100 April Fool’s Day Hoaxes. Personally, I think the BBC’s Spaghetti Harvest is unbeatable.
Update: Chuq von Rospach reveals the truth about the RDF.
Now and then I read complaints about Xcode on blogs and mailing lists. It’s come a long way but some parts are still slow and cumbersome, granted. One of the complaints – which usually comes from Java or Windows C++ migrants – is that Xcode has no refactoring aids. Some people even publish workarounds.
So what is this refactoring thing anyway? According to Wikipedia:
Refactoring is the process of rewriting a computer program or other material to improve its structure or readability, while explicitly keeping its meaning or behavior…
Refactoring does not fix bugs or add new functionality. Rather it is designed to improve the understandability of the code or change its structure and design, and remove dead code, to make it easier for human maintenance in the future. In particular, adding new behavior to a program might be difficult with the program’s given structure, so a developer might refactor it first to make it easy, and then add the new behavior.
I’d tend to agree with that, up to a point. I usually refactor when I reach a dead end in the software’s structure, that is, when the current structure won’t allow me to proceed implementing what I want to implement. Or – probably the same thing, essentially – when I find myself implementing things I don’t want to implement anymore.
But my tendency (see fractal programming) is to do it in the reverse order; I write some code that does new stuff in a new way. Then I migrate lots of old code into the new scheme, often rewriting it radically if necessary, or throwing entire blocks away. (Well, not literally at first; I prefer to comment such blocks out or move them into a “dead code” file for later reference.)
Now, the aforementioned migrants usually don’t see it that way. Rather, they want some automation to make the process easier:
An automated tool such as a SCID, meant to help you do this, might work like this:
– I have a method which has some code that I would like to pull out into its own method.
– I highlight the offending code.
– I select Extract Method from a popup menu.
– The RefactoringBrowser asks me to name the method, then automatically creates it and inserts the highlighted code.
– In the current method, the highlighted code is replaced by an invocation of the newly created method.
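For concreteness, here’s roughly what that Extract Method transformation amounts to in Objective-C – a minimal, hypothetical sketch (the class and method names are mine, not from any real tool):

```objc
#import <Foundation/Foundation.h>

@interface ReportBuilder : NSObject {
}
- (NSString *)reportLineForName:(NSString *)name size:(unsigned long long)size;
@end

@implementation ReportBuilder

// The extracted method: the formerly highlighted code, now with a name.
- (NSString *)formattedSize:(unsigned long long)size {
    return [NSString stringWithFormat:@"%llu bytes", size];
}

// The original method: the highlighted code is now a single invocation.
- (NSString *)reportLineForName:(NSString *)name size:(unsigned long long)size {
    return [NSString stringWithFormat:@"%@ (%@)",
            name, [self formattedSize:size]];
}

@end

int main(void) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    ReportBuilder *rb = [[[ReportBuilder alloc] init] autorelease];
    NSLog(@"%@", [rb reportLineForName:@"readme.txt" size:1024ULL]);
    // prints: readme.txt (1024 bytes)
    [pool release];
    return 0;
}
```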
All very nice, but it presumes several things which I don’t see coming to Xcode (at least not to the Objective-C parts):
– You have a very regular, structured style of coding that conforms to standards the “RefactoringBrowser” understands.
– You always use the standard refactoring methods, such as expanding, collapsing, pulling out, pushing in, whatever.
– All source code in your project has been previously parsed and stored in the SCID (source code in database), so the browser and refactoring software have a perfect understanding of your code.
This is perfectly possible (or at least I’m told it is) in Java and perhaps in C++ – though I’m skeptical about the latter. I was astounded when a friend, who was studying for some Java certification or other, asked me to have a look at his source code. A quite trivial program had been expanded into several dozen source files, consisting of literally hundreds of small methods that differed from each other only by name and a few characters or lines. No doubt everything was set up very logically and hierarchically and according to whatever standards a certified Java programmer must obey, but… it was completely illegible by my (admittedly eccentric) standards. It was code only a SCID could love.
So, suppose my friend decided to refactor his code. Just renaming a few classes must necessarily entail profound changes in all source and project files. Not only must the filenames themselves be changed, but all mentions of this class must also be changed. Wait, don’t we have global search & replace for that…?
But, of course, renaming classes or methods is trivial. Maybe it’s suddenly obvious that you need to push some methods off into a subclass, or pull them up into their superclass. Wouldn’t it be nice to have this done automatically?
Well, not really. First off, inheritance is much less used in Objective-C – or at least, I use it less than I used to in my C++ days (I refuse to learn Java). Runtime binding, introspection and categories mean you usually don’t have to subclass more than one or two levels deep from the standard Cocoa classes. In fact, I believe that just today I went to a third subclass level for the first time. So, automating such a superfluous process makes little sense.
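To illustrate what categories buy you – a minimal sketch (the category and method names are mine): instead of subclassing NSString, you graft a method onto it, and every NSString instance in the program, including ones handed to you by Cocoa itself, gains that method.

```objc
#import <Foundation/Foundation.h>

// A category adds behavior to an existing class without subclassing it.
@interface NSString (XRAdditions)
- (NSString *)xr_abbreviatedPath;
@end

@implementation NSString (XRAdditions)
- (NSString *)xr_abbreviatedPath {
    // Keep only the last two path components:
    // "/Users/me/Documents/file.txt" becomes "Documents/file.txt".
    NSArray *parts = [self pathComponents];
    if ([parts count] < 2) return self;
    NSRange lastTwo = NSMakeRange([parts count] - 2, 2);
    return [NSString pathWithComponents:[parts subarrayWithRange:lastTwo]];
}
@end

int main(void) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSLog(@"%@", [@"/Users/me/Documents/file.txt" xr_abbreviatedPath]);
    // prints: Documents/file.txt
    [pool release];
    return 0;
}
```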
Second, remember that Objective-C is a small superset of plain C (unlike Java and C++, which are merely C-like languages). And Mac OS X is Unix-based, with many headers pulled in from heterogeneous sources. This means, of course, that all the crufty things of the old C days are still there… pointers, pointer casting, weird #defines, other tricks – you name it. And all of this is liable to be #included into your poor unsuspecting code if you do any work at all with the Carbon or BSD APIs, as most non-trivial applications need to.
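Here’s a contrived sketch of the sort of thing I mean (not from any real header): once type information hides behind macros and casts like these, a refactoring browser that wants to, say, rename or retype the payload field has very little to hold on to.

```c
#include <string.h>

/* Type information erased behind a void pointer. */
#define ZERO(p, n)   memset((void *)(p), 0, (n))
/* A pointer cast hidden inside a macro. */
#define PAYLOAD(r)   (((Record *)(r))->payload)

typedef struct { long tag; char payload[32]; } Record;

int main(void) {
    char buffer[sizeof(Record)];   /* raw bytes, never declared as a Record... */
    ZERO(buffer, sizeof buffer);
    PAYLOAD(buffer)[0] = 'x';      /* ...yet accessed as one anyway */
    return 0;
}
```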
In other words, I don’t believe it’s possible to feed all of this into a SCID and expect it to behave rationally; of course the gcc compiler has to make sense of all of it eventually, but I seriously doubt it could easily be made to go back and forth between its internal representation and the source code while the latter is being edited. It’s still, essentially, a batch processor.
Suppose someone pulled this transformation off. Suppose all the old C headers were tamed to make them compatible. Suppose we had everything in a hierarchical, intelligent, refactoring browser/editor. Now what?
It may be some congenital deficiency in my own neural wiring, but I can’t recall ever refactoring my code twice the same way (except for that trivial class/method renaming). So again, not much for an automated “RefactoringBrowser” to do.
Well. All this to, finally, say that I’ve been stuck refactoring some of my code – specifically, the XRay II file system browser back-end… and of course, no automation would have helped me. Nor would I have trusted it to do anything like what I want.
This is perhaps the third or fourth version of it, and it’s easily the most complex refactoring I’ve ever done. Unfortunately there are no intermediate steps. Twenty days ago everything was compiling and running nicely (except, of course, for the problems that led me to this refactoring attempt). Then suddenly it’s like open-heart surgery. Nothing compiles, and the number of error messages is so great that gcc throws its metaphorical hands up and goes off to sulk in a corner. I can’t close the patient up again until everything has been put back into place – even if it’s not the same place. And it’s a lot of information to hold in one’s head at the same time. I suppose I should get a second monitor, but that’s not practical at the moment.
And the availability of powerful time-sinks like the Internet means that it’s almost impossible to summon the necessary concentration to do the surgery in a single run. I’ve made serious progress over this weekend by the simple expedient of staying overnight with some friends who don’t have an Internet connection (or, even, a phone line). Still, sometimes it’s necessary to read and write e-mails, chat, even write long posts about refactoring…
Even so, I hope to get past this obstacle during the next few days and write a little about the actual results. Turns out I learned a lot in the process. More as soon as possible, then.
I was about to write a long, carefully reasoned post, explaining why Apple should not sell Mac OS X as a separate item for generic Intel boxes…
…when the CodePoet pre-empted me with even better arguments. So please go read “Should Apple sell Mac OS X for beige Boxes?”…
Nando wrote:
Well, this wouldn’t work on our actual system, I guess, but if you prepared the system to have an ID or other metadata for each icon, I really think it could be done. But all I want for now is to keep my day job…
Well, I sympathize with your last sentence.
Still, there’s no way to make an ID for each icon that reliably tells whether some custom icon is visually similar to a document icon already on the system – at least not with the current state of the art in image recognition software.