Solipsism Gradient

Rainer Brockerhoff’s blog


When I began programming many years ago (now 55 and counting!), computing was in its infancy. We wrote programs on blue coding sheets, had them converted into decks of punchcards, and queued them on a shelf for “batch processing”. Usually the reward was a program listing with some obscure error messages like IEC107D, and I would mentally step through the program, repeating the process until being rewarded with a working “run”. I soon found employment at the local university, where more often than not I could run my program deck through myself and figure out things faster — and later on, in the wee hours, even use the huge mainframe computer as a primitive but enthralling personal device.

After some years the first video terminals came out, where you could view all lines of the program in glowing green rows, edit them directly, and enter the program into the batch queue. Was this the future? Well, there still was much to come. Very soon I realized I could buy one of those new Apple IIe computers along with a small TV and program/debug in the comfort of my home. And, miraculously, people actually paid me to sit at a computer keyboard and program.

And then the future arrived. When I saw the reports of the first Macintosh — a full graphics screen and a mouse — I knew that was it! The point-and-click user interface was the best. No more command lines. No more moving a clunky cursor around with the keyboard. It was heaven! And it only got better: color, larger screens, huge amounts of RAM and storage, networking! And then the internet came in, wonder of wonders.

It took years to evolve the expected standard behaviors of mouse gestures and UI conventions; application programmers had a library of standard items to use, so that pretty soon everything worked as expected, and programs that flouted those conventions were downrated.

I was in the audience, in 2007, when Steve Jobs introed the first iPhone as a combination cellphone, iPod, and internet device. I wasn’t very impressed; I already had an iPod which I used little, had no plans of getting a cellphone (and indeed, held out until 2016 to buy one!), and the internet part was cool but the screen was small, the browser was limited and my laptop Mac had a much better feature set. OK, it wasn’t as portable but I always had it with me…

The real revolution here was the multitouch interface, an evolution of the point-and-click interface but with your fingers standing in for the mouse. It took years to evolve, and as with the Mac, Apple offered standard UI items to developers, who could then concentrate on functionality instead of reinventing the wheel.

But then in 2010 the first iPad came out and I bought one right away. Now here was a portable computer good enough to use as a personal device, and later versions became more and more impressive. My iPad today is my main device for reading, listening to music and casual browsing, although I still fall back on the Mac for developing and writing longer texts. With my recent eye troubles I tend to use my 77″ TV for watching movies, though, and programming has been curtailed.

Now the future has arrived again. I have no doubts that Apple’s Vision Pro — and, of course, “spatial computing” — is the Next Big New Thing.

Oddly enough, one of my arguments here is the sheer volume of vitriol about the device that one can find on the social networks: it’s unwieldy, it’s expensive, the external battery sucks, it’s been done already by others, it’s isolating and dystopian, it’s one more dragon in Apple’s evil ecosystem… Apple is doooomed, I tell you! Sound familiar? (And I’m not even on most of those networks!)

Hey, all those negatives were also rolled out in the past — and before we had Apple’s evil ecosystem, we had IBM’s evil ecosystem, remember? Every time such a futuristic device is sighted, it is clunky, expensive, power-hungry, and so forth. But also, if conditions are just right… the next version appears just in time, and it’s lighter, easier to use, and we can’t live without it anymore.

The real futuristic paradigm is now look-and-click, evolved from the old point-and-click way, and many other gestures are possible; standardisation is no doubt in progress. Why hold a mouse, or a control, or lift your hands unnecessarily, when you can convey all with small, subtle gestures? And our brains are evolved for a 3D environment — all our language and thinking is constructed around 3D metaphors.

Now, finally, we have a minimum viable system to explore 3D user interfaces. Discussing whether the Vision Pro is AR, VR or XR is beside the point; those are just implementation details and will evolve along with the UI.

Details on the hardware are scarce as I write this, and some may even be uninteresting — this thing is a self-contained computer on your face and the only relevant spec will probably be how much SSD space is left for user data. Everything else will be just good enough to be effectively transparent. And that’s the major point about the Vision Pro hardware: it’s a 3D input device for your brain, the first with sufficient quality to be transparent. Who needs holograms?

But, people will ask, what is the use case for this thing for the normal user/gamer/TV watcher/whatever? My answer: we don’t really know! Apple has, of course, selected some classic cases for the previous tech: FaceTime, games, 3D photos/videos, widescreen movies, Excel spreadsheets hanging in space (ugh), etc.; they had to do it, to get people’s attention. But between now and whenever this comes on the market in early 2024, developers will burn the midnight oil to build compelling use cases, most of which nobody (not even Apple!) had thought of before.

And this is where the Vision turns Pro. I believe that, rather than just designating a higher-capacity device compared to a non-Pro version, the Vision’s Pro name indicates that, at least until the 2nd or even 3rd generation comes out, this is a device for professionals. This is not (yet!) for casual gamers, zoomers or moviewatchers. Here’s a partial list a few friends and I came up with in a few minutes:

  • Architects, engineers, designers (as usual)
  • Doctors, dentists, psychologists, therapists and researchers in general
  • Educators and grad/postgrad students
  • Technicians in high-tech fields like power generation, aircraft maintenance
  • Industrial applications are boundless
  • Astronomers, archaeologists, artists

And most of these can afford (or their companies can) the current $3499 price.

Come to think of it, I’ll probably buy two; it’ll still come in cheaper than paying for a huge 3D TV, multiple big screens/projectors and a couple of M2 Macs.

More as details emerge.

WWDC 2020 opens next June 22nd and all indications are that the highest-impact announcement will be the Mac’s migration from Intel to the ARM architecture.

While CPU architecture migrations are infrequent — they happen every decade or so — Apple has a good track record of pulling them off successfully.

The first major migration was the move from Motorola 68K to PowerPC chips around 1994, followed by moving from the Classic Mac System 9 to Mac OS X around 2000. Relevant here was that for some time Mac OS X ran older applications in the “Classic Environment”: a compatibility sandbox that emulated the APIs of System 9 and the instruction set of the 68K.

This worked reasonably well as PowerPC CPUs were several times faster than the old 68K ones. It also introduced the concept of “fat binaries”: the same application file contained code for both old and new environments.

A better historical precedent is the move from PowerPC to Intel processors in 2006. This was more traumatic for developers, as PowerPCs were “big-endian” and Intel CPUs were “little-endian”. This meant that, except for strings, values stored in memory, files or transmitted over networks had a different byte sequence ordering. To have the same program source code work on both systems you could no longer assume it would just work, but had to bracket your instructions with macros or function calls that would do nothing on one platform but swap bytes around on the other.
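
To make that pattern concrete, here’s a minimal sketch (my own example, not from any particular app) using CoreFoundation’s swap helpers, which compile to nothing on a big-endian CPU and to a byte swap on a little-endian one:

#include <CoreFoundation/CFByteOrder.h>
#include <stdint.h>
#include <string.h>

// Read a 32-bit value that some hypothetical file format stores in
// big-endian order. CFSwapInt32BigToHost() is a no-op on PowerPC and
// swaps bytes on Intel, so the same source works in both slices of a
// fat binary.
static uint32_t read_big_endian_u32(const uint8_t *bytes)
{
    uint32_t raw;
    memcpy(&raw, bytes, sizeof raw); // copy to avoid unaligned access
    return CFSwapInt32BigToHost(raw);
}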

If you’re not an oldtimer like myself you probably never had to think about this — every Mac, iPhone, iPad, Apple Watch or Apple TV uses little-endian values, and I even had to dig into documentation to make sure of it. ARM CPUs can run in big-endian mode by setting a special bit at boot time, but this is not the default, and no Apple device uses that mode.

Now, this meant that in 2006 developers could not just migrate their apps to Intel by recompiling; we had to look through every line to check that it was endian-neutral, and where it wasn’t, those special macros had to be used. For people who had very CPU-specifically optimized code — perhaps even in (shudder) assembly language — separate code sections were necessary.
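
A schematic example of such sections (mine, for illustration): the compiler predefines architecture macros, so each branch is compiled only into the matching slice of the fat binary.

#include <stdio.h>

// Each branch below is seen only by the compiler targeting that CPU.
static const char *arch_name(void)
{
#if defined(__ppc__)
    return "PowerPC (32-bit, big-endian)"; // hand-tuned AltiVec/assembly went here
#elif defined(__i386__)
    return "Intel (32-bit, little-endian)"; // SSE intrinsics/assembly went here
#elif defined(__x86_64__)
    return "Intel (64-bit, little-endian)";
#else
    return "unknown architecture";
#endif
}

int main(void)
{
    printf("compiled for: %s\n", arch_name());
    return 0;
}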

Having done all this, you recompiled your app twice, once for PowerPC and once for Intel, and the magic of fat binaries allowed you to ship it all in one app. Later on, some apps even needed 3 or 4 different code sections, depending not only on endianness but also on whether they would run on a 32- or 64-bit CPU!

Another — today mostly forgotten — aspect was that Apple prepared for the Intel migration by gradually modernizing and building their developer toolchain in-house. LLVM, Clang, LLDB etc. allowed Apple to ensure that, for whatever CPU they wanted to support, compilers were ready beforehand and could be optimized continuously later on, without depending on outsiders.

Still, in 2006 Apple had to ship special hardware, “Developer Transition Kits”, to select developers for testing. For software that couldn’t be converted to the new architecture, Apple introduced a limited compatibility box: Rosetta. If I recall correctly, it did on-the-fly translation of PowerPC code into Intel code, which was then cached. Because of its limitations it didn’t work for many larger applications and was soon phased out.

Moving in parallel to the PowerPC to Intel migration was a slower-motion shift in operating system APIs. Most notably, this involved Carbon and Cocoa.

Carbon was a C-based API introduced in 2000 to ease migration from Classic System 9 to Mac OS X. Cocoa, introduced around the same time, was an Objective-C based API for modern object-oriented programming, itself an evolution of NeXT’s OpenStep system. Underneath both APIs, in the now well-known layer model, was Core Foundation, which could be used from both types of apps; and some apps (like my own) could mix calls to both APIs with some care.

Not too long after the Intel migration, Apple announced that 64-bit was the future, and that Carbon would not be migrated to that environment. This process was stretched over several years and involved redefining which APIs were really considered “Carbon”; some, like the File Manager, were “de-carbonized” and lived on until macOS 10.15 (Catalina) came out.

Cocoa, on the other hand, continues to be used everywhere in macOS. The Finder, the Dock, Xcode, and Safari are all Cocoa apps. Even when Swift came out a few years ago most of it was built on top of Cocoa and Objective-C objects; the notable exception is the Swift toolchain itself.

So, after all this, here we’re looking at Yet Another Hardware Migration for Macs. Let’s look at the implications.

Economically, it makes sense for Apple, as many others have already commented. They’ll no longer be bound to a foreign evolution roadmap on which they have little influence. They have extensive experience in producing high-performance, low-power CPUs for their mobile devices, and the latest versions already outperform Intel in some situations.

Technically, it’s a huge win. Switching to ARM64 — and not just the standard ARMv8.x architecture licensed from ARM, but with their own, extensive modifications — will allow them to have unified GPUs, Neural Engines, memory controllers and so forth across their whole line, with more uniform device drivers and low-level programming.

For 99% of developers, I think nothing will change. The new chips are little-endian also, so a simple recompile will have Xcode produce a fat binary for the new Macs which should run outright. Of course, if you have assembly language sections in your program and/or write kernel extensions/device drivers, time to learn a new architecture…

Snags will come for people who dislike, or can’t use, Xcode. Some have to use Intel’s compilers, for instance; I know too little about such cases to have an informed opinion, sorry.

Some pundits seem to expect a sudden concurrent change in macOS; something like Objective-C and/or Cocoa being obsoleted in favor of Swift and SwiftUI. Or even the Mac going away entirely, some sort of huge desktop iPad taking its place. In my view this won’t happen. For one, what would developers or even most Apple engineers use for development?

A big question is: will Apple be able to provide an Intel compatibility box on the ARM Macs? Certainly Boot Camp will not be available. Running a virtualizer like VMware Fusion or Parallels seems almost as difficult, unless the new CPUs have some sort of hardware assist to decode x86-64 instructions. This may not be as outlandish as it sounds; current Intel/AMD processors already break x86 CISC instructions into RISC micro-operations which are then cached and executed by the “inner” CPU. This is a gross oversimplification but in theory nothing — except silicon space — bars Apple from breaking x86 instructions into ARM instructions.

A Rosetta-like box seems more feasible for running individual Intel applications, but who needs that? Game users? Performance will be limited. Most virtualizer app users want the complete OS running and with native speed. Linux/BSD might be available soon; perhaps Windows for ARM.

But what about Catalyst, some of you may ask? Here I can only shrug. In its present form it certainly is not an important future technology for macOS. While simple apps can be done with it — perhaps purely for the benefit of developers unfamiliar with AppKit — can you envision a Catalyst Finder? SwiftUI is still very new and primitive, and will continue to be layered on top of AppKit/UIKit for some time. They may merge in the future, or be renamed gradually like Carbon was, but that’s a long time out.

Finally, hardware. I don’t think the existing A13 SoCs would be suitable for any Mac. Some version of the Mac mini would be the obvious candidate to be the first to get the all-new CPU. It would then percolate up through the laptop line and the iMac. In these cases, reduced power usage would be a bonus — even for the iMac, it would mean a smaller power supply, less heat and a thinner enclosure.

The Mac Pro should be the last Intel redoubt. Multiple CPUs, OEM graphic cards, generic PCIe cards — Apple will have to address a huge range of problems there and this will take years.

Enough handwaving for now; the usual disclaimers apply and I’m really looking forward to the keynotes next week.

Update:
— corrected date for the 68K->PowerPC migration. Thanks to Chris Adamson for catching the error.
— fixed some awkward language about virtualization. Thanks to Maurício Sadicoff.

A Tale of Two Certs


I’m keeping this post updated as details develop…

About ten days ago, something strange happened on my Mac: I was debugging the next version of my RB App Checker Lite app and suddenly I saw the dreaded “damaged” dialog box.

Completely abnormal, especially as I was debugging using the Developer ID version (not the Mac App Store version!) from inside Xcode. When I opened Terminal, the same dialog; when I opened Safari, same thing! No new process was allowed to run. Of course I had to reboot to be able to do anything, everything worked fine afterwards, and I couldn’t reproduce the problem, so…

OK, a couple of days ago I concluded all was ready and I uploaded my app for review. A few hours after I announced it on Twitter, the reports began to appear: the sky is falling! Major Mac App Store meltdown, everybody was getting the “damaged” dialog, Apple’s certificates were the culprit. I started testing my local apps from the MAS and, sure enough, the MAS leaf cert had expired; no problems otherwise, though some of them asked anew for the AppleID password and some didn’t. RB App Checker Lite showed the expiration but no other issues, but I pulled it from review just in case.

Two days of confusion and frantic coding later, I had submitted (and pulled!) 4 more builds until I was reasonably sure that everything was working correctly. Thanks to several fellow developers on Twitter, the upcoming version seems to show everything correctly; it turned out that my receipt checks were somewhat obsolete. I usually publish the direct download version only after the MAS version has passed review, but decided to release version 1.1.4, build 351 immediately: you can get it here. It has a long list of improvements and fixes.

Meanwhile, the consensus is that rebooting and re-entering the AppleID and passwords (or even deleting and reinstalling) the affected apps solves 99% of the problems.

There are actually several different unfortunate problems here. First, the “damaged” dialog seems to be caused by some sort of cache or memory corruption in the system processes that coordinate to implement Gatekeeper and the app store updates; some reports say killing the “storeagentd” process solves this problem without rebooting. (My system doesn’t seem to run this, FWIW.) What not everyone knows is that this dialog appears before the app is allowed to run; that is, it’s not affected by any checking done inside the app itself!

Second, asking for a new AppleID password. This is caused by the app itself checking the store receipt, something strongly recommended by Apple since otherwise it’s easy to copy a downloaded app to another computer and have it run there; I remember some early games not doing this and being widely pirated.

When an app is downloaded from the MAS, a proper receipt for that AppleID and that computer is already inside. A missing or corrupted receipt is the only normal circumstance in which the “damaged” dialog should appear. But if you copy the app to another computer, this will be noticed by the app itself.
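
How can the app notice? Apple’s receipt documentation has the receipt carry a hash field that must equal the SHA-1 of the machine’s GUID (derived from the primary network interface’s MAC address), the receipt’s opaque value, and the bundle identifier, concatenated in that order. Here’s a sketch of that comparison using CommonCrypto; the function and parameter names are mine:

#include <CommonCrypto/CommonDigest.h>
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

// Returns true if the receipt's hash field matches this machine.
static bool receipt_matches_this_mac(const uint8_t *guid, size_t guid_len,
                                     const uint8_t *opaque, size_t opaque_len,
                                     const uint8_t *bundle_id, size_t bundle_id_len,
                                     const uint8_t expected[CC_SHA1_DIGEST_LENGTH])
{
    uint8_t digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1_CTX ctx;
    CC_SHA1_Init(&ctx);
    CC_SHA1_Update(&ctx, guid, (CC_LONG)guid_len);           // machine GUID
    CC_SHA1_Update(&ctx, opaque, (CC_LONG)opaque_len);       // opaque value from receipt
    CC_SHA1_Update(&ctx, bundle_id, (CC_LONG)bundle_id_len); // bundle identifier
    CC_SHA1_Final(digest, &ctx);
    return memcmp(digest, expected, sizeof digest) == 0;
}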

Once a MAS app starts up, the first thing it should do is check the receipt. It’s a complex process and not everybody implements it the same way. At first, checking the receipt’s cert chain would cause the receipt to be rejected in the case of expiration; the app then exits with a special numeric code (exit 173), which signals the system to put up the dialog asking to confirm the purchaser’s AppleID and password. This, in turn, causes a new receipt to be downloaded, and the app can then run with no problems. Update: reports indicate that, in at least some cases, the system doesn’t respond properly to exit 173.
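
In outline, the startup logic looks something like this (receipt_is_valid() and run_app() are hypothetical stand-ins for the app’s real routines):

#include <stdbool.h>
#include <stdlib.h>

bool receipt_is_valid(void); // the app's receipt validation
void run_app(void);          // the app's normal startup

int main(void)
{
    if (!receipt_is_valid())
        exit(173); // special exit code: the system should fetch a fresh
                   // receipt (confirming the AppleID) and relaunch us
    run_app();
    return 0;
}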

A few years ago receipts began to include a new field containing the receipt’s creation date, and developers now had to check the certs against that date (and not against the current date), thereby obviating the need to reenter the password. Unfortunately this was not widely publicized, and Apple’s own sample code hasn’t yet been updated accordingly; I confess I hadn’t seen this myself!

As is usual in disasters, several things have to go wrong at the same time: some bug corrupts a critical system cache, certificates expire normally, some apps incorrectly test for expiration, receipts are corrupted or the AppleID validation servers become slow or unreachable (because of the huge number of simultaneous requests), and… boom.

Many articles, unfortunately, published factual errors or wrong assumptions. Let’s try to counter a few:

  • Apple “allowed” their Mac App Store certificate to expire. Wrong on several levels. First, there’s not one but 5 (!) certificates involved in any app from the store: Apple’s root certificate and 4 others, two intermediate and two leaf certificates.
    The way these certs work is by so-called certificate chains; every cert vouches for the lower-level ones. At the top is Apple’s Root certificate, which is one of a hundred or so in the System Keychain. There are two different certificate chains in every MAS app: the first is used in the code signature, and the second is used to sign the store receipt. The expired certificate was in the receipt chain, and it is a leaf certificate. These usually have a short life (one or two years), while the intermediate certificates usually last a little longer.
    So, when a cert expires, is that a serious problem? No – unless it is the root cert, which is why they all expire sometime in the 2030s — hopefully, by that time, they’ll have figured out something better, Apple will have updated the cert via Software Update, or the horse will have learned to sing.
    The root cert can be updated via Software Update because it’s stored in System Keychain — but it’s impractical to push cert updates to each and every signed app, bundle or library; there are many thousands of them! So an expired cert in the code signature doesn’t affect the app at all. What’s important is that the certs were valid when the app was signed. When and if you get a new version of the app, all certs will probably be new ones. So there’s no “allowing” a leaf cert to expire — they do so naturally.
  • Apple “pushed” a new certificate that expires in 2035. This is probably just looking in the wrong place — not knowing which certificate had expired, someone glanced at the root certificate and noticed the “new” 2035 date. Nothing new to see, of course; that cert was created in 2006! Even more confusingly, someone else deduced from that that Apple let their original root cert expire; also wrong.
  • The system hasn’t been updated to check SHA-2 (256-bit) certificates. Wrong; it’s true that older systems used a version of the OpenSSL library that understood only SHA-1 certs, but that actually means Mac OS X 10.5 or so. Newer systems understand SHA-2, and in any event, since the MAS went up, Apple has always recommended that developers not use the system’s OpenSSL library (I think it’s not even included anymore), so only very old apps would be affected by that.
    Update: Glenn Fleishman has informed me about the SSL situation: there’s the new 1.0.x library branch and the older 0.9.x branch. Both apparently got SHA2 support in 2010, when 1.0.0 and 0.9.8o came out, but some developers seem to have kept older versions, no doubt for valid reasons; space precludes, etc.
  • Apple is blaming developers. Apparently this can be traced to a single report of misinformation from an anonymous Apple Support person. As I write this, Apple hasn’t yet said anything; I doubt they’ll say anything over the weekend.
  • This is a serious security/cryptography failure. Nope. This confusion arises from the fact that digital certificates (and libraries like OpenSSL) are used both for secure, encrypted communications and for app/receipt signing. In the latter case, an expired cert doesn’t expose any information or make the system or apps easier to hack.
  • Developers are better off not doing any, or little, receipt checking. Not really. True, apps which don’t do full receipt checking might not have been affected in this single instance, but under usual circumstances they’re more vulnerable to hacking or piracy.
  • Apple’s store/system infrastructure is brittle and can’t be trusted. True, it’s a very complex system that depends on many twisted little interlocking parts to work properly. And, as we’ve seen, this particular instance of failure is as self-amplifying as electrical grid failures: once it starts, the demands on the working parts grow so huge that those fail, too. In Apple’s defense, it’s very hard to test for or simulate. Let’s hope that all involved have learned something from this incident; I certainly have learned a lot.

Update: forgot to comment on this particular post:

But when I tried to convince my Mac to run this app as an unsigned app, I encountered what is extremely likely to be the store DRM: I initially got the “your app was bought on another machine” message, so I tried deleting the receipt, but then I got the dreaded “app damaged” message, at which point I removed the signature.

…the only way I can see is to create a new root CA which I install on the machine as a trusted root, and redo the signing chain, and even that might not work if the DRM is somehow tied to the signature chain.

While I can understand the frustration implicit in not being able to run purchased apps “forever”, I think this is a fundamentally wrong approach. Let’s educate developers to check receipts properly, as I mentioned above. Figuring out a way to run store apps (or even developer-ID purchased apps) without “DRM” means that anyone else can use the same method to install pirated copies; we wouldn’t be able to trust users anymore.

Much worse, re-signing someone else’s app and expecting it to run is an even greater violation of trust. The days when you could hack someone’s app with ResEdit, having fun making it look different or do unexpected things, are long gone. I implement very strict checks that my complete app bundle has not been altered in any way and that it’s running with my original signature; otherwise any user could freely alter files, hack the code, change graphic resources or even — and such cases have happened! — repost the app somewhere else as being their own. No, flawed as the current approach may be in implementation, I see no better alternative.

Update: reports are in from some helpful fellow developers, confirming my suspicions of cache corruption — RB App Checker Lite says the app bundle and receipt contents are OK, yet the apps will not run. I use the same APIs (hopefully) that the system processes use — but those APIs can take a long time to run, so the results are cached somewhere.

Update: Apple sent email to all developers:

In anticipation of the expiration of the old Mac App Store certificate, we issued a new certificate in September.

As I said — no “let[ting] certificates expire”. They all do.

We are addressing this caching issue in an upcoming OS X update.

Confirms this is a caching issue, as I suspected.

…some apps are running receipt validation code using very old versions of OpenSSL that don’t support SHA-2. We addressed this by replacing the new SHA-2 certificate with a new SHA-1 certificate last Thursday night.

I’m a little surprised that the number of apps using “very old versions” justifies going back to SHA-1; but, OK.

Please ensure your code adheres to the Receipt Validation Programming Guide and check that all receipt validation issues are resolved.

Good, but:

  • the link goes to the page detailing the online receipt validation. Very few apps use that IMHO — you have to be online every time the app runs, you have to have a reasonably fast connection, and app launch will be significantly slower. Linking to this page (Validating Receipts Locally) would’ve been better;
  • it would have been more helpful to call out specifically the certificate expiration check and update the sample code to properly use the receipt creation date.

Update: I’ve now submitted rdar://23611335 — a bug report to call attention to this documentation problem.

Update: Fixed the “Validating Receipts Locally” link, which was also pointing to the wrong page. Sorry. Also, here’s one way to do the correct date checking (copied from Matt Stevens’ code):

X509_STORE *store;
// set up the store (add Apple's root certificate, etc.)
X509_VERIFY_PARAM *param = X509_VERIFY_PARAM_new();
// verify as of the receipt's creation date, not the current date
X509_VERIFY_PARAM_set_time(param, time_from_receipt);
X509_STORE_set1_param(store, param); // the store copies the params
X509_VERIFY_PARAM_free(param);
// now call PKCS7_verify() using the configured store; certs that have
// expired since the receipt was created no longer cause a failure

The direct download version of RB App Checker Lite 1.1 (281) is out, and RB App Quarantine also has been updated to build# 281 (its version is still 1.1 since nothing has changed for the user). Check out the release notes!

Most important: this version of RB App Checker Lite considers the new signing rules in TN2206 for OS X 10.9.5 and 10.10. In particular, for applications, code signatures are now checked recursively, both version 1 and version 2 resource rules are shown if present, and the spctl utility is called to check the Gatekeeper assessment.
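
For the curious, one plausible way for a non-sandboxed app to obtain that assessment is to shell out to spctl and capture its output; this is a sketch of the approach, not necessarily what RB App Checker Lite actually does:

#include <stdio.h>

// Run spctl's Gatekeeper assessment on an app bundle and print the
// result ("accepted" or "rejected", plus the rule that matched).
static int print_gatekeeper_assessment(const char *app_path)
{
    char cmd[1024];
    snprintf(cmd, sizeof cmd,
             "/usr/sbin/spctl --assess --type execute --verbose \"%s\" 2>&1",
             app_path);
    FILE *out = popen(cmd, "r");
    if (!out)
        return -1;
    char line[256];
    while (fgets(line, sizeof line, out))
        fputs(line, stdout);
    return pclose(out); // non-zero exit status means "rejected"
}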

However, this signals a new development: direct download and Mac App Store versions of RB Utilities will now, unfortunately, have different functionality. In particular, the MAS version of RB App Checker Lite (which is currently in review) will not be allowed to call spctl, as this utility requires special entitlements to work; and the MAS version of RB App Quarantine (also in review), for the same reason, will not clear the quarantine flag. So for now you may want to download directly from the product pages.

Apple’s rules for calling attention to such differences are a little tricky to navigate but hopefully the MAS versions will be approved and in a future release I’ll develop a way to bring this functionality back through an optional download. Research is underway!

RB App Quarantine 1.1 (273) is out. It’s the second app in the RB Utilities software suite — RB App Checker Lite was the first one.

As the name implies, it’s a utility that checks or changes the “quarantine” attribute of other applications. This attribute is set whenever an application is directly or indirectly downloaded by the user from anywhere except the Mac App Store. (Applications unpacked from downloaded installer packages, disk images or compressed files inherit the attribute automatically.)

When a quarantined application is first opened or executed, OS X’s Gatekeeper function will check the application’s code signature and several other details and either reject it or throw up the well-known dialog, confirming that you want to execute a downloaded application. If you agree, the quarantine is cleared and Gatekeeper will not check the application again.
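
Under the hood, the quarantine flag is an extended attribute named com.apple.quarantine on the application bundle, so checking and clearing it boils down to the BSD xattr calls; here’s a minimal sketch (not necessarily RB App Quarantine’s actual implementation):

#include <stdbool.h>
#include <sys/xattr.h>

#define QUARANTINE_ATTR "com.apple.quarantine"

// True if the bundle at 'path' carries the quarantine attribute;
// getxattr() returns the attribute's size, or -1 if it's absent.
static bool is_quarantined(const char *path)
{
    return getxattr(path, QUARANTINE_ATTR, NULL, 0, 0, 0) >= 0;
}

// Remove the attribute, so Gatekeeper won't check the app on launch.
static int clear_quarantine(const char *path)
{
    return removexattr(path, QUARANTINE_ATTR, 0);
}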

Using RB App Quarantine to clear some just-downloaded application’s quarantine attribute is not really recommended: you’ll be bypassing Gatekeeper and — unless you’re a developer yourself and/or have already used RB App Checker Lite to check that application’s bona fides — may be opening your system to a potentially untrusted application.

If you are a developer yourself, using RB App Quarantine to set quarantine on your own application will allow you to check its Gatekeeper status without using Terminal commands or (perish forbid) uploading it to some server and downloading it again.

It took me just 5 days to write this little application since all the UI and other logic common to all RB Utilities is contained in a prepackaged framework and I just had to write the app-specific file/folder handling. Setting up a new project with everything in place and drawing the new icon took a single day. Unfortunately clearing the quarantine attribute takes a special sandbox entitlement which would certainly be frowned upon by the Mac App Store reviewers, so I didn’t even try submitting it.

In other news, I submitted a new version of RB App Checker Lite to the Mac App Store and, if everything goes well, it should be out soon. This new version fixes some bugs and — most requested by users — shows some details pertaining to the latest version of Apple’s Technote 2206, namely showing version 1 and 2 resource rules and showing the Gatekeeper (spctl) assessment results. Stay tuned…

[continued from part II]

This, my first Mac, consisted of:

  • a system unit with 128K of RAM, 64K of ROM containing the system toolbox and boot software, a 9″ black&white display (512×342 pixels), a small speaker, a single-sided 400K floppy disk drive, two serial ports using DB9 connectors, a ball-based mouse also connected via DB9, and an integrated power supply;
  • a small keyboard with no cursor keys or numeric keypad, connecting to the front of the system over a 4-pin phone connector;
  • a second 400K floppy drive, which connected to the back of the system;
  • an 80-column dot matrix Imagewriter printer, connecting to one of the serial ports;
  • System 1 (though it wasn’t called that yet) on floppy disks, with MacPaint on one and MacWrite on the other;
  • a third-party 512K RAM expansion board which fit somewhat precariously over the motherboard but worked well enough (this RAM upgrade board, from Beck-Tech, was actually 1024K, and I now remember buying it a year later);
  • a boxy carrying case where everything but the printer would fit — I didn’t buy Apple’s version, though. I went to Berkeley and bought it together with a BMUG membership and a box of user group software;
  • a poster with the detailed schematics of both Mac boards (motherboard and power supply);
  • a special tool which had a long Torx-15 hex key on one end and a spreading tool on the other end (the Mac’s rather soft plastic was easily marred by anything else);
  • the very first version of Steve Jasik’s MacNosy disassembler software.

All this cost almost $4000 but it was worth every cent. (Also see the wonderful teardown by iFixit.)

Taking it back to Brazil proved to be quite an ordeal, however. We had made arrangements to get my suitcase unopened through customs, but at the last minute I was advised to skip my scheduled flight and come in the next day. We hadn’t considered the fact that the 1984 Olympics were happening in LA that month, and getting onto the next flight in front of a huge waiting list of people was, of course, “impossible”! As they say, necessity is the mother of invention, and I promptly told one of the nice VARIG attendants that I would miss my wedding if she didn’t do something — anything! She promised to try her utmost, and early the next morning she slipped me a boarding pass in the best undercover-agent manner. And her colleagues on board made quite a fuss about getting the best snacks for “the bridegroom”… 😉

Anyway, after that everything went well and I arrived safe and sound with my system. More on what we did with it in part IV.

[continued from part I]

In 1983 I’d started working at a Brazilian microcomputer company, Quartzil. They already had the QI800 on the market, a simple CP/M-80 computer (using the Z80 CPU and 8″, 243K diskettes), and wanted to expand their market share by doing something innovative. I was responsible for the system software and was asked for my opinion about what a new system should do and look like. We had all already read about the Apple Lisa and about the very recent IBM PC, which used an Intel 8088 CPU.

After some wild ideas about making a modular system with interchangeable CPUs, with optional Z80, 8088 and 68000 CPU boards, we realized that it would be too expensive; not least because it would have needed a large bus connector that was not available in Brazil and would be hard to import. (The previous QI800 used the S100 bus, so called because of its 100-pin connector; since by a happy coincidence the middle 12 pins were unused, they had put in two 44-pin connectors, which were much cheaper.)

Just after the Mac came out in early 1984 we began considering the idea of cloning it. We ultimately decided the project would be too expensive, and soon we learned that another company — Unitron — was trying that angle already.

Cloning issues in Brazil at that time are mostly forgotten and misunderstood today, and merit a full book! Briefly, the government tried to “protect” Brazilian computer companies by not allowing anything containing a microprocessor chip to be imported; the hope was that the local industry would invest and build their own chips, development machines and, ultimately, a strong local market. What legislators didn’t understand was that it was a very difficult and high-capital undertaking. To make things more complex, the same companies they were trying to protect were hampered by regulations and had to resort to all sorts of tricks; for instance, our request to import an HP logic analyzer to debug the boards turned out to take 3 years (!) to process; by the time the response arrived, we already had bought one on the gray market.

Since, theoretically, the Brazilian market was entirely separate from the rest of the world, and the concept of international intellectual property was in its infancy, cloning was completely legal. In fact, there were already over a dozen clones of the Apple II on the market and selling quite well! This was, of course, helped by Apple publishing their schematics. A few others were trying their hands at cloning the PC and found it harder to do; this was before the first independent BIOS was developed.

To get back to the topic, it was decided to send me to the NCC/84 computer conference in Las Vegas to see what was coming on the market in the US and to buy a Mac to, if nothing else, help us in the development process. (In fact, it turned out to be extremely useful — I used it to write all documentation and also to write some auxiliary development software for our new system.)

It was a wonderful deal for me. The company paid my plane tickets and hotel, I paid for the Mac, we all learned a lot. I also took advantage of the trip to polish my English, as up to that point I’d never had occasion to speak it.

The NCC was a huge conference and, frankly, I don’t remember many details. I do remember seeing from afar an absurdly young-looking Steve Jobs, in suit and tie, meeting with some bigwigs inside the big, glassed-in Apple booth. I collected a lot of swag, brochures and technical material; together with a huge weight of books and magazines, that meant that I had to divide it into boxes and ship all but the most pertinent stuff back home separately. I think it all amounted to about 120Kg of paper, meaning several painful trips to the nearest post office.

The most important space in my suitcase was, of course, reserved for the complete Mac 128 system and peripherals. More about that in the upcoming part III.

30 years ago, when the first issue of MacWorld Magazine came out – the classic cover with Steve Jobs and 3 Macs on the front – I could already look back at some years as an Apple user. In the early days of personal computers, the mid-1970s, the first computer magazines appeared: Byte, Creative Computing, and several others. I read the debates about the first machines: the Altair and, later, the Apple II; the TRS-80; the Commodore PET; and so forth.

It was immediately clear to me that I would need one of those early machines. I’d already been working with mainframes like the IBM/360 and Burroughs B6700, but, just 8 years later, those new microcomputers already had as much capacity as the first IBMs I’d programmed for.

So as soon as possible I asked someone who knew someone who could bring in electronics from the USA. Importing these things was prohibited, but there was a lively gray market and customs officials might conveniently look the other way at certain times. Anyway, sometime in 1979 I was the proud owner of an Apple II+ with 48K of RAM, a Philips cassette recorder, and a small color TV with a hacked-together video input. (The TV didn’t really like having its inputs externally exposed and ultimately needed an isolating power transformer.)

The Apple II+ later grew to accommodate several accessory boards, dual floppy drives, a Z80 CPU board to run CP/M-80, and a switchable character generator ROM to show lower-case ASCII as well as accents and the special characters used by Gutenberg, one of the first word processors that used SGML markup – a predecessor of today’s XML and HTML. I also became a member of several local computer clubs and, together, we amassed a huge library of Apple II software; quite a feat, since you couldn’t directly import software or even send money to the USA for payment!

Hacking the Apple II’s hardware and software was fun and educational. There were few compilers and the OS was primitive compared to the mainframe software I’d learned, but it was obvious that here was the future of computing.

There were two influential developments in the early 1980s: first, the Smalltalk issue of Byte Magazine in 1981; and then the introduction of the Apple Lisa in early 1983. Common to both was the black-on-white pixel-oriented display, which I later learned came from the Xerox Star, together with the use of a mouse, pull-down menus, and the flexible typography now familiar to everybody.

Needless to say, I read both of those magazines (and their follow-ups) uncounted times and analysed the screen pictures with great care. (I also bought as many of the classic Smalltalk books as I could get, though I never actually succeeded in getting a workable Smalltalk system running.)

So I can say I was thoroughly prepared when the first Mac 128K came out in early 1984. I practically memorized all the articles written about it, and in May 1984 I was in a store in Los Angeles – my first trip to the US! – buying a Mac 128K with all the optional extras: external floppy, 3 boxes of 3.5″ 400K Sony diskettes and an 80-column Imagewriter printer. (The 132-column model wouldn’t fit into my suitcase.) Thanks to my reading I was able to operate it immediately, to the amazement of the store salesman.

More about this in the soon-to-follow second part of this post. Stay tuned.
