Features

net.wars: a chip off the old Apple

by Wendy M Grossman | posted on 10 June 2005


It can be very difficult to keep your eye on the important stuff when faith is at stake.

Wendy M Grossman

This has been a good week for examples of this minor principle. Take the case of British hacker Gary McKinnon. McKinnon is fighting extradition to the US; he allegedly hacked into NASA and Pentagon networks because he believed they were hiding evidence of the existence of UFOs.

(Brief note: UFOs exist. We know this because "UFO" stands for Unidentified Flying Object. There are definitely objects in the sky that appear to move, and whose nature we are unsure of. What all the fuss is about is whether they can be Identified as alien spacecraft.)

In this case, the important stuff would have been avoiding breaking international laws in ways that would put you in the crosshairs of large military organisations bereft of a sense of humour. But if aliens are sufficiently important to you and you are enough of a conspiracy theorist, it would be easy to lose sight of that. It's also notable that none of the stories I've seen about the case mention any new evidence for alien visitations that might have come out of the alleged hacking.

The much bigger example is, of course, the announcement that Apple is switching to Intel chips. There are, of course, some people to whom this does matter a great deal. Journalists, who have space to fill arguing about who had the story first or what it means for their particular pet subject. Technology companies, who need to know what to make and sell and how to design it. People who depend on an ageing but obscure bit of Mac software, who desperately need it to remain compatible with their hardware and are now facing life in the discarded lane. Chipspotters who investigate the depths of their computers' circuitry and who are in the position of Betamax owners in the early 1990s – they know, for what they can explain to you are good and sound reasons, that the market is settling on an inferior technology and it genuinely hurts them that this should be so.

These last two categories are the folks who probably will either buy up the remaining stock of IBM/Freescale Macs or face scavenging them at a premium on eBay after the Great Change.

But why would an average person who loves the Mac interface remotely care what's inside his machine? Ordinary computer users care about two things when they buy a computer: that it lets them do whatever tasks they're buying it to do and that they can afford it. Sometimes they buy computers even if they can't afford them, and sometimes they buy computers that make them wait while a process finishes or whose user interface is less than wonderful. But the chip is not the selling point, any more than people choose their church based on the quality of the sacramental wine.

Times change and what's worth arguing about changes with the times.

It's certainly true that at one time which chip was inside mattered; those days are largely gone, and this week's announcement is the messenger. It's a long time since the early days of the industry when Apple, Commodore, Osborne, Radio Shack, Amiga, and Acorn all squabbled about who had the best architecture and people like Acorn founder (now turned venture capitalist) Hermann Hauser thought the best technology had to win. Who argues about RISC versus CISC any more other than people whose job it is to design and specify processors for specific applications?

What differentiates a Mac is two things: the outer hardware design and the interface. Putting an Intel chip in an iMac won't stop anyone from delivering the machine in turquoise. As for the user interface, when Windows XP came out there was a flurry of stories claiming that Mac was near-dead because the Windows and Mac interfaces were no longer different enough to justify the price premium. But the user interface is more than what appears on the screen; the ease of use of Macs is as much about being able to plug everything together easily and not spending hours fiddling with buggy drivers as it is about icon design.

We know now that when Betamax and VHS were competing for market share, what really mattered was the MPAA's suit against Sony, which threatened to end the industry altogether. By analogy, what matters most now, if most or all personal computers run on chips designed to a single architecture by one or two companies, is whether that will make it easier for, say, the MPAA to force changes to that architecture to limit what we can do with our machines. Will it make machines more vulnerable to the many kinds of attacks we complain about daily? Will our computers continue to be programmable, general-purpose tools? Will Macs now become part of Trusted Computing? All of these are policy issues, matters that are resolved at a much higher level than chip architectures.

But I do sympathise with those Mac owners who feel their Appleness has been betrayed. "Think sort of different" isn't much of a slogan, is it?


Wendy M. Grossman’s Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, follow on Twitter or send email to netwars(at)skeptic.demon.co.uk (but please turn off HTML).