ALL BLOG POSTS AND COMMENTS COPYRIGHT (C) 2003-2020 VOX DAY. ALL RIGHTS RESERVED. REPRODUCTION WITHOUT WRITTEN PERMISSION IS EXPRESSLY PROHIBITED.

Saturday, January 25, 2014

30 years of Macintosh

Stephen Fry commemorates the 30th anniversary of the introduction of the Macintosh by lamenting one of the great mischances of history:
In one of the world’s most extraordinary missed meetings in industrial, commercial or any other kind of human history, a Henry Morton Stanley failed to encounter a Dr Livingston in the most dramatic and comical fashion.

In the early 90s a young British computer scientist, Tim Berners-Lee, had been tasked by CERN (Centre Européen pour la Recherche Nucléaire, the now famous large hardon collider that found the Higgs Boson or a tiny thing pretending to be it) to go in and see if he could find a way of getting the Tower of Babel of different computing platforms used by the hundreds of physicists at the plant to talk to each other. He came up with something that made use of metatextual techniques that he called The Information Mine. Being a very, very modest man he realised that those initials spelled out his name, TIM, so he changed it at the last minute to the World Wide Web. He wrote a language, HTML (Hypertext Markup Language), a set of communication protocols (chiefly http — the hypertext transfer protocol) and an application, as we would now say, on which all these could run, which he called a browser.

He planned, devised, programmed and completed this most revolutionary code in Geneva on one of Steve Jobs's black cube NeXT computers. Hugging it close to him he took the train to Paris where Jobs was going to be present at a NeXT developers' conference. Clutching the optical disc that contained the most important computer code in history he sat at a desk while Steve marched up and down looking at hopeful programs and applications. As in all of Steve's judgments they either sucked or were insanely great. Like a Duchess inspecting a flower show he continued along the rows sniffing and frowning until he got two away from the man who had created the code which would change everything, everything in our world. "Sorry Steve, we need to be out of here if we're going to catch that plane," whispered an aide into Jobs's ear. So, with an encouraging wave Steve left, two footsteps away from being the first man outside CERN to see the World Wide Web. The two men never met and now, since Steve's death, never can.
Those who only know me as an inveterate Apple-hater probably don't realize that I started out as an Apple guy. While my father built his fortune on the IBM PC, first on its need for memory cards, then on its need for high-resolution graphics (he created and sold the first 1024x768 board for it, the ARTIST card), my pride and joy and constant companion was an Apple //e. It was stacked, with two disk drives, a color monitor, and a 300 baud modem. I loved that machine, but I gave it up reluctantly when I went off to college and it became apparent that I was going to need something better suited to writing papers.

So, my parents gave me a Macintosh Plus, which gave me a huge advantage over other students, who had to wait their turn in the computer labs when they needed to write their papers. I had a particularly nice setup, since I lived in the only dorm with its own computer lab, complete with Macintosh computers and printers, so I could write my papers, then walk the disk down to the computer lab at 4 AM and print them out without delay. I remember, in particular, one paper on Alfred the Great that blew my professor away because it included a map of England on which I'd drawn the various extents of the Danelaw.

Not that he was unfamiliar with the Danelaw, but it was the first time he'd ever seen a printed graphic in a student paper. That was the power of the Macintosh. I don't think I ever turned in a paper again without some visual example. In fact, looking at the two college papers I still have with me today, one on the economic development of Japan and the Soviet Union, the other on the Italian condottieri, I can see crosshatched maps of Italy and several charts very similar to those that regularly litter my economics posts. That Macintosh Plus created a habit of readily resorting to bar charts that apparently persists even today.

Where the Macintosh ultimately fell down was not in its failure to penetrate the business market. That's the conventional wisdom, but it is wrong. Apple was never going to dislodge IBM and Microsoft there and was wise not to kill itself trying. The opportunity that Steve Jobs mysteriously missed, long before the World Wide Web, was the games market. Despite its GUI, Apple's failure to adopt color for three years after the PS/2 introduced VGA/MCGA, along with its reluctance to embrace a non-serious market, meant that it conceded the games market to DOS.

Papers be damned. The first time I saw Wing Commander, I switched immediately over to DOS and picked up a Compaq 386/25. I haven't looked back since.

I admire the late Steve Jobs. He was an amazingly innovative corporate genius. It is deeply lamentable that his chief legacy as a technologist appears likely to be the walled garden of Apple.

37 Comments:

Anonymous Idle Spectator January 25, 2014 4:45 AM  

The opportunity that Steve Jobs mysteriously missed, long before the World Wide Web, was the games market.

Which I always found strange, since Jobs originally worked for Atari.

Anonymous zen0 January 25, 2014 5:48 AM  

So Tim Berners-Lee is the Johannes Gutenberg of the 20th century, but he is just a footnote in the epic STEVE JOBS SAGA?

Anonymous benny January 25, 2014 5:56 AM  

OT: HSBC prepping for bank runs?
http://m.bbc.co.uk/news/business-25861717

Blogger Expendable Faceless Minion January 25, 2014 6:13 AM  

@benny and UK banks 'declining' 'large' cash withdrawals without good reason. Every bank asked pretty much agreed with trying to prevent the withdrawals, but quickly said they wouldn't stop anyone, even without an explanation.

This is exactly how bank runs get started.

No problem in the US, as the treasury is happy to print all the money anyone could possibly want.

Blogger Bob Loblaw January 25, 2014 6:43 AM  

I don't get the Berners-Lee worship. The web could have been based on any number of formats, and HTML's lack of features meant you couldn't do anything but share documents in the beginning. nroff with one additional command (for links) would have been fine for that.

And HTTP is awful.
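The point about early HTML adding little beyond hyperlinks to plain documents can be illustrated with a short sketch. The markup fragment and URLs below are invented for illustration, written in the style of a 1991-era page:

```python
import re

# A hypothetical fragment in the style of an early web page: mostly plain
# marked-up text, plus anchor tags. The sample content is invented.
page = """<TITLE>Example Page</TITLE>
<H1>Links</H1>
See <A HREF="http://info.cern.ch/hypertext/WWW/TheProject.html">the project</A>
and <A HREF="news.html">the news</A>.
"""

# A single pattern recovers every hyperlink target: the anchor tag was the
# one construct that turned isolated documents into a web.
links = re.findall(r'<A\s+HREF="([^"]+)"', page, flags=re.IGNORECASE)
print(links)
```

Everything else in such a document could indeed have been expressed in any plain document format; the `<A HREF=...>` anchor was the addition that mattered.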

Anonymous the abe January 25, 2014 6:46 AM  

Vox hits the nail on the head, there. The most bald-faced display of contempt from Apple for its consumers resides in the paltry selection of native video games available for the machines. You would think a guy who obviously saw the potential in Pixar would realize the boundless potential in providing unique entertainment content in a market that was inevitably going to become saturated with competitors.

Anonymous DT January 25, 2014 6:56 AM  

No problem in the US, as the treasury is happy to print all the money anyone could possibly want.

Well...as long as they're happy!

Anonymous The Great Martini January 25, 2014 7:05 AM  

If it's any consolation, Bill Gates also largely missed the Internet tidal wave. Microsoft didn't really even begin gearing up for the Internet until Windows 98.

The irony is of course that Jobs really put Apple on the map when he recognized the technical significance of something everyone else seemed to be allowing to languish: the graphical user interface at Xerox PARC. I guess it just kind of indicates that these radical insights probably have more to do with being in the right place at the right time, or dumb luck. Or perhaps, had he taken a few more steps, NeXT would have been at the forefront of the Internet revolution and not just a salvaged software company.

Anonymous Stilicho January 25, 2014 7:11 AM  

It's just as well they didn't meet. Had they done so, there's a fair chance the internet would be known as "MacWeb" or the "Orchard" and could only be accessed via Apple products.

Anonymous Rex Little January 25, 2014 7:14 AM  

The funny thing is, other now-defunct computers were years ahead of both Apple and the PC when it came to graphics. I bought an Amiga in 1986 that blew them away; the PC couldn't match it until at least 1992.

Anonymous Kool Moe Dee January 25, 2014 7:36 AM  

Agree 100% about DOS gaining market share just because of the game market. I admit game availability drove my choice.

Blogger FALPhil January 25, 2014 7:43 AM  

Tim Berners-Lee did not invent HTML. This is a case of Stephen Fry's ignorance posing as fact. All Berners-Lee did was coin the name HTML, nothing more. XML, for extensible markup language predated HTML by at least 10 years and was sold by IBM as a commercial product called Script/VS. The tags are virtually identical, and XML will run in HTML processors. To say that Berners-Lee "invented" HTML is a prevarication of the highest order. David Singer, now living in San Jose, probably had more to do with the advancement of HTML than Berners-Lee ever thought about.

Anonymous Anonymous January 25, 2014 7:46 AM  

So much great computer engineering went to waste because companies had no business or marketing sense. Commodore was another company that had chips and systems years ahead of their PC equivalents, but bad choices at the executive level made them footnotes in history.

Blogger dienw January 25, 2014 8:01 AM  

O.T., but this is jaw dropping arrogance on the part of the Global Warming cultists:
Polar Bears Hunt on Land as Ice Shrinks

Anonymous Jimmy January 25, 2014 8:32 AM  

Apple also maintained extremely high prices for the Macintosh and offered few form factors that people wanted. I recall buying an outdated all-in-one Mac that had a tiny screen and was black and white. This went on too long before Windows finally matched the Mac in software and hardware.

Today, Apple continues the tradition of high prices. Thus adoption is impeded. However, the situation is more complicated. Some products are more reasonably priced, like the iPads, while the iPhones are ridiculous. The entry-level iPhone 4s should not even be sold, but that's the only thing offered if you want to get anything Apple at that price point.

Anonymous Bob Ramar January 25, 2014 8:37 AM  

It is fascinating how things like this occur. A similar case is HP and Xerox creating a 'personal computer' in the late '60s that had all of the elements of a PC that we take for granted today.

I learned to program in Fortran in college. I bought my first computer, a Commodore VIC-20, a year after graduation and taught myself how to program in BASIC. A friend of mine and I wrote programs for the VIC and his Radio Shack TRS-80, like SAT (an orbital simulation) and Quadaffi's Tent (a game where you tried to lob a bomb and hit Muammar Qaddafi's tent... which moved between attempts). I refused to buy an MS-DOS machine because I knew about GUI interfaces and was hanged if I was going to learn another programming language because Bill Gates said I had to. I purchased a Mac in 1993 when I started grad school, and also because I had helped design and build the first 'modern' computer lab at the college where I worked. We used Macs because you could daisy-chain them together with an AppleTalk network. It also had a neat printer, and I needed it for grad school work. However, that was the only Apple product I ever owned. I refused to knuckle under to Apple's exclusivity, and there is something about an 80%+ market share for software developers vs. 8%.

Blogger El Borak January 25, 2014 8:43 AM  

the now famous large hardon collider...
What? No penis jokes? I realize Apple is serious business, but really. I am disappoint.

Anonymous MrGreenMan January 25, 2014 8:52 AM  

I remember gaming on an SE then SE/30, but, with an IBM, I could play Syndicate...in color. It's funny how much they lost the gaming space because the //e and //c were great for gaming and, of course, ran Wasteland.

Blogger tz January 25, 2014 9:03 AM  

This was while you had the 8-bit game consoles, and even the Atari and Amiga. The Mac was a Lisa derivative.

When Jobs said "no", it locked things out. Could you do games on an Apple TV? No, only on the phone. (Or get a $40 Android HDMI stick.) No, you can't connect Bluetooth to iOS without our magic pixel dust.

Blogger Jeff Burton January 25, 2014 10:07 AM  

The hagiography never ends. Jobs was a quirky guy. He may have hated it because it had too many buttons or something.

Anonymous Pseudo-Nate January 25, 2014 10:21 AM  

"O.T., but this is jaw dropping arrogance on the part of the Global Warming cultists:
Polar Bears Hunt on Land as Ice Shrinks"

I guess those Climate Scientists got their boat stuck in rapidly expanding landmasses. I hope the polar bears don't come to eat them!

Blogger Kallmunz January 25, 2014 10:24 AM  

Back in those days I hated the WinTel platform, but I too couldn't live without Wing Commander (especially Privateer).
BTW, Chris Roberts is now preparing Star Citizen; by all means check it out.
In Jobs's defense, he was booted out of Apple before the Mac market was really developed. Would he have made it into a game-friendly machine? Who knows? That said, he seemed to have learned the lesson when iOS was introduced.
When the iMac was introduced it had no floppy and only connected with USB. Today that seems obvious, but then it seemed insane; Jobs was always ahead of his time.
These days I've built a Hackintosh that boots into Windows 7 and Linux. Why limit yourself? Life is good.

Blogger IM2L844 January 25, 2014 10:26 AM  

the now famous large hardon collider...
What? No penis jokes?


Yeah, there's got to be a "Large Genevese cock blocker discovers size does matter" joke in there somewhere.

Anonymous Jack Amok January 25, 2014 10:49 AM  

So much great computer engineering went to waste because companies had no business or marketing sense. Commodore was another company that had chips and systems years ahead of their PC equivalents, but bad choices at the executive level made them footnotes in history.

Windows owes its success to the massive effort Microsoft put into making it easy for people to write software for Windows. The late-80s, early-90s developer support MSFT offered was incredible. What Gates recognized was that tens of thousands of independent developers writing software for a platform would make that platform far more valuable than anything the platform itself could do to stand out. Jobs never really understood that; he wanted the OS to be the thing. Gates wanted the apps to be the thing. The folks running all the other companies weren't able to get either.

For games, it was the closed hardware platform that killed Apple. Again, part of controlling and defining the platform, insisting that it be "insanely great" and not trusting anyone else to contribute to it. PC hardware outpaced the Mac in price and performance, and games chased the performance.

Anonymous SkepticalCynical January 25, 2014 11:23 AM  

@FALPhil: "All Berners-Lee did was coin the name HTML, nothing more. XML, for extensible markup language predated HTML by at least 10 years"

Perhaps you want to check the history? SGML emerged in the 80s, and HTML did borrow from it. But you'll note that SGML was a complete and utter marketplace failure and HTML has done rather well for itself. So maybe Berners-Lee deserves a little credit, eh?

Your history of XML is flat out wrong. It didn't appear until 1998, was a direct derivative of SGML, and was explicitly trying to remake SGML into something that could be useful in the HTML ecosystem of the internet.

Anonymous VD January 25, 2014 11:24 AM  

It's just as well they didn't meet. Had they done so, there's a fair chance the internet would be known as "MacWeb" or the "Orchard" and could only be accessed via Apple products.

That thought struck me too. Look how they resolutely refused to explore the possibilities intrinsic in Hypercard. It may have been the most fortuitous non-meeting in technological history.

Anonymous Josh January 25, 2014 12:49 PM  

It may have been the most fortuitous non-meeting in technological history.

On the plus side, Comic Sans might have never taken over the web in the late 90s

Blogger mmaier2112 January 25, 2014 3:22 PM  

I remember how I became enamoured of Hypercard when I figured out what it was on my high school's Macs.

I really wanted to make a "Wiki" for my AD&D fantasy world, over a decade before "Wiki" would become common usage.

Too bad a Mac was out of my budget for years. I was doing some graphics design work my first job out of HS.

Funny... I might have been a badass web designer before anyone knew they would want to hire one.

Anonymous Will Best January 25, 2014 3:43 PM  

You would think a guy that obviously saw the potential in Pixar would realize the boundless potential in providing unique entertainment content in a market that was inevitably going to become saturated with competitors.

We call this learning from one's mistakes. Jobs did a lot of great things, but he was far from perfect.

Anonymous Anonymous January 25, 2014 3:51 PM  

> The entry level iPhone 4s should not even be sold,

Point of information: The 4S and iPad 2 are holding down the bottom of their lines only so that there's a 30-pin connector option available for the various accessories needing it, most notably secure credit card readers and the like that haven't got Lightning-connector replacements through all the various layers of regulatory approval yet.

Anonymous Anonymous January 25, 2014 4:06 PM  

> Look how they resolutely refused to explore the possibilities intrinsic in Hypercard.

The best technology Apple never released was Hypercard 3.0 built on top of Interactive QuickTime. Google it up a bit if you care; you'll find that it could do things that still are not practical to accomplish with 2014's tools.

Steve made the call that, awesome as it was, there was no significant chance of it attaining critical mass on Windows, so it didn't make the 'absolutely necessary for immediate survival' cut in the scorched-earth approach he took to all the various directions Apple was wandering in at the NeXT acquisition, along with Newton, Dylan, and all those other awesome half-to-three-quarter-baked ideas people were playing around with in 1990s Apple. As someone who won several Apple Design Awards, MacWorld Best of Shows, yadayadayada, with my various applications built on the public releases that were baby first steps towards full-on Interactive QuickTime, it took me a good decade to get over the idea that this was Steve's worst decision ever. Technically, I still think so, but I've grudgingly accepted that given Apple's near fatality at the time, interactive media standards was a war not worth fighting with Windows. Ah well.

Anonymous rycamor January 25, 2014 7:57 PM  

Eric January 25, 2014 6:43 AM

I don't get the Berners-Lee worship. The web could have been based on any number of formats, and HTML's lack of features meant you couldn't do anything but share documents in the beginning. nroff with one additional command (for links) would have been fine for that.

And HTTP is awful.


It's easy to be critical in retrospect, but give the man credit: 1) he actually solved a problem in a fairly trivial amount of time; 2) he sketched out a robust technology based on a vision that had only existed in sci-fi up until that point, and it had the simplicity and flexibility to change the world. Many a more elegant technology, or more "correct" application of principles, has been stillborn due to aspie fussiness and overwrought design. It worked, and it continues to work.

I get the same sense of annoyance when conceptual-fetishist programmers go on about how horrible and poorly-thought-out JavaScript is, and how LISP should have been the language of the browser. Right. Sometimes you just need to get a job done. Brendan Eich did the same thing Tim Berners-Lee did: he had to solve a problem quickly and effectively, and by doing so he managed to cast aside the usual analysis paralysis or design-by-committee problems and make something useful.

Imperfect? Yes. Useful? Hell yes. A loaded gun in the wrong hands? There's always a trade-off if you want to get shit done.

Anonymous Dr. Kenneth Noisewater January 25, 2014 10:16 PM  

Apples of various vintages have run some of my favorite games of all time: Karateka, Castle Wolfenstein, Elite, Marathon, Myst, Bolo. Jobs may have persisted in gimping Mac graphics after the advent of hardware-accelerated 3D, but given the total domination of Win32 (followed by DirectX) and the lack of interest in porting to not just a different API but a different instruction set and endianness, it's not a surprise.

Games are upstream of the computing market as the Entertainment industry is upstream of culture.

Incidentally, many DOS-era games are available for OS X via GOG, with their supported version of DOSBox (or at least it looks like DOSBox). Master of Orion 2 works pretty well most of the time. Steam also has some games available, and there's some open source with good ports as well (particularly OpenTTD.org). Heck, even Netrek has an OS X client now, I believe.

Anonymous Daniel January 26, 2014 12:35 AM  

Apple IIe could game. Mac could not. Sure, it was the heyday of text games and 8-bit monochrome, but there was no reason whatsoever for the Mac to faceplant on the game revolution, which is what all my friends complained about for several years during the Mac's ascendancy. It was clear that Mac wanted to gear things to a walled garden as far back as then, and it was measurable: in '85, the gamers were evenly split or cross-platform. In '90, I didn't know a single gamer who even considered Apple to be a player anymore... not even boutique.

Apple might have had an uphill battle on games, but the fact that they didn't even bother to fire a shot was all on them...which is really preposterous: they could have carved out a boutique alternative on 3-man development teams through at least the mid-nineties, had they cared to invest a pittance.

They didn't, which was ridiculous, considering how deeply entangled you could get in Apple IIe games in '85...when there was no DOS game that I seriously coveted vs. Apple offerings, and some Apple versions were actually superior to DOS (My recall could be off, but I remember Art of War on Apple IIe was the least buggy version of the two or three I played regularly).

When did the first good flight simulator come out? I remember that being the first DOS thing that I saw that didn't have an Apple equivalent...but that could just be my perception. I just remember thinking that Apple needed to do something circa '87...and they never did to my satisfaction.

Blogger FALPhil January 26, 2014 7:30 AM  

@SkepticalCynical - "Your history of XML is flat out wrong. It didn't appear until 1998, was a direct derivative of SGML, and was explicitly trying to remake SGML into something that could be useful in the HTML ecosystem of the internet."
You are absolutely correct. It was SGML, not XML. I had a brain fart.

But regardless of commercial success, the rendering algorithms, formats, and lexicon were already invented by the time Berners-Lee got around to addressing his problem. All he did was adapt it to a new environment. That is all he should get credit for.

Anonymous Daniel January 26, 2014 7:59 PM  

http://pando.com/2014/01/23/the-techtopus-how-silicon-valleys-most-celebrated-ceos-conspired-to-drive-down-100000-tech-engineers-wages/

The sainted Jobs et al... Stunning accusations. The arrogance. The contempt for law. The accusations have been made before, but this article sums them up nicely. Now they are going to trial. OPEC and De Beers have never been this cynical or criminal. And, yes, you, my friends, are the victims. If you work anywhere close to software in or around Silicon Valley, these bastards robbed you. They will probably beat the case, though. Money and hypocrisy usually win out.
