In my Wednesday October 5th memorial article I said a few words for Steve Jobs:
Steve Jobs and Steve Wozniak brought personal computing out of the science labs and back offices and into average American homes in the 1970’s. The Macintosh, a brilliant synergy of great hardware and a user-friendly software interface, created a sea change in home computing which still raises our expectations today.
As I predicted in that article, only days after the passing of Apple’s Steve Jobs, even hardball political commentary broadcasts like Inside Washington were rediscovering how many ways Steve Jobs will continue to influence how we conduct our daily lives. One commentator said that people who used to read real newspapers and real magazines now read the online edition on their iPad. I’d like to take that a step further and say I know people who never used to read real newspapers or real magazines, who’ve started devouring serious professional news resources on their laptops and iPads.
It really doesn’t matter if one does or doesn’t “like” Apple. Some of us have a contrarian distrust of anything that becomes too iconic, too popular, or attracts anything that smells like a cult. Some people may feel all the credit given to Jobs somehow diminishes the real innovations of the many others in other competitive industries. And of course most homes, and the entire business community, still run on the Windows platform.
My point is simple. It doesn’t matter whether we like Mac or PC. It doesn’t matter whether we’re a techno-geek who only speaks Linux, or a baby boomer still trying to figure out how to copy and paste and send those e-mail things. It doesn’t matter whether we carry a laptop, tablet, e-reader or iPod-type device.
Whatever we do and however we do it today, in 1984 Jobs changed the course of how the world does things. Every device you see, touch, listen to or consult uses graphic metaphors introduced to the mass consumer in a neat little package called “the Mac.” Perhaps you like Windows? I use both platforms. Windows isn’t a copy of the first Mac OS; it’s a direct descendant of every concept introduced by the Macintosh.
Today I’d like to illustrate the debt of gratitude we really do owe Steve Jobs and his Apple team. Some of us are too young to remember those early Mac days. Many of us, including oldsters like me, feel free to simply take for granted the incredible palette of intuitive tools we use in interacting with our Macs, PCs, iPhones, iPads, iPods and all such similar devices. That’s as it should be.
As I wrote in my earlier article, one of those things we absolutely take for granted today, as we should, is our “user-friendly interface.” We expect our devices to lead us logically and intuitively into the tasks we need to accomplish, and we expect to discover new things we can do with our devices on the fly, as we use them – not after hours of poring through volumes of old-fashioned user manuals.
Prior to the Apple II, the pitifully few “home computers” that existed were hand-wired electronics cases, usually kits, with glowing lights to indicate states and results. Despite my college background in FORTRAN, I had no curiosity about them. Those were workbench toys for true geeks. The idea of a “practical” home computer was then an oxymoron.
The Altair and its early clones were relatively difficult to use. The machines contained no operating system in ROM, so starting one up required entering a machine-language program by hand via front-panel switches, one memory location at a time. That program was typically a small driver for an attached paper-tape reader, which would then be used to read in another, “real” program. Later systems added bootstrapping code in ROM to improve the process, and the machines became almost universally associated with the CP/M operating system, loaded from floppy disk.
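To appreciate how tedious that ritual was, here is a minimal Python sketch of the idea: every front-panel operation deposits exactly one byte at one address, and only after the whole loader has been keyed in by hand can the machine run it. The “opcodes” below are invented placeholders for a toy machine, not real Intel 8080 code.

```python
# Sketch of front-panel program entry, Altair-style.
# Each deposit() call stands in for setting the address/data switches
# by hand and pressing DEPOSIT. Opcodes here are illustrative only.

MEMORY_SIZE = 256
memory = [0] * MEMORY_SIZE

def deposit(address, byte):
    """One front-panel operation: store a single byte at a single address."""
    memory[address] = byte

# A hypothetical three-byte bootstrap loader, keyed in one location at a time.
# Toy instruction set: 0x01 = read a byte from the paper-tape reader,
#                      0x02 = store it at the next free address,
#                      0x03 = jump back and repeat.
bootstrap = [0x01, 0x02, 0x03]
for addr, byte in enumerate(bootstrap):
    deposit(addr, byte)   # flip eight switches, press DEPOSIT, repeat...

print(memory[:3])  # prints [1, 2, 3]: the loader now sits in memory
```

A real loader ran to dozens of bytes, and one mis-flipped switch meant starting over, which is exactly why ROM bootstrap code was such a welcome improvement.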
In 1977 the Apple II married a QWERTY keyboard, color TV output, 5¼″ floppy disk and printer capability to a modest 6502 processor chip and 16K of RAM. Wozniak designed the hardware and wrote much of its built-in software. This spawned the first wave of mass-market personal software, such as Dan Bricklin’s VisiCalc spreadsheet and basic word processing applications we could recognize and use today.
This in turn primed homes and the workplace for devices like Radio Shack’s TRS-80, Atari machines for music and gaming, and, in 1981, the great and iconic IBM PC. Although there were ways to load programs and data from floppy disk, these machines generally required users to type commands and data through a series of command-line questions and responses in early versions of DOS, which owed its heritage to CP/M (and later borrowed features from Unix). The Apple II had an improved command-line interface, but we still typed a response, a menu selection number or our data into the Apple, just like its imitators.
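To picture that pre-GUI style of interaction, here is a hypothetical sketch of the sort of question-and-answer loop those command-line programs used. The menu items and prompts are invented for illustration; the canned “user” replies stand in for what someone would have typed at the keyboard.

```python
# A hypothetical command-line session in the early-DOS style:
# the program asks a question, the user types an answer, repeat.

def menu_session(answers):
    """Run a canned menu dialog; `answers` simulates what the user types."""
    replies = iter(answers)
    transcript = []

    def ask(prompt):
        answer = next(replies).strip().upper()
        transcript.append(prompt + " " + answer)
        return answer

    choice = ask("SELECT: 1) NEW FILE 2) OPEN FILE 3) QUIT")
    if choice == "2":
        ask("ENTER FILENAME:")
    ask("CONTINUE (Y) OR CANCEL (N)?")
    return transcript

# Simulated user: open a file called BUDGET, then confirm.
for line in menu_session(["2", "budget", "y"]):
    print(line)
```

Every branch of the program had to be reached by typing the right answer at the right prompt, which is precisely the rigidity the Mac’s point-and-click interface swept away.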
Macintosh changed all that forever in 1984 with its GUI, or “graphical user interface.” Jobs saw “concept” prototypes of all this at Xerox PARC. His genius was recognizing that (1) anyone could learn to use this, and (2) he could improve upon it, mass-produce it elegantly, and deliver it affordably to anyone.
What the hell is a “graphical user interface?” As obvious as the question seems, what we see on the screen is only the tip of the iceberg. The real power is underneath.
Starting with the Mac, it’s the iconic graphic representation of tasks we previously performed by typing commands into a black-and-white command-line interface. Its hallmark is intuitive ease of use, enforced by consistent conventions across all applications that run on a given operating system. And it’s no accident that most of those conventions were commercially introduced or invented at Apple. The very idea of a rule book imposing standards on how the interface is presented to the user came from Apple, which enforced it both in-house and with outside software shops. The built-in code toolkit that made it easy to program under Apple’s house rules was awesome. Apple called the rule book its “human interface guidelines,” and it changed the industry, and how we use our devices, forever.
It also changed how we communicate forever. What follows is old-hat gospel to veteran Mac and PC users, but I mention it here because we have a whole new generation of users fortunate enough to be able to take all this for granted, like electricity and running water in my parents’ generation.
Whether you scroll through iTunes with a swipe of a finger, enjoying that free-flowing inertial scrolling, or use two fingers to zoom in on a selection, or click “Yes” on a Mac or PC, that’s part of an interface. In no particular order, here, from Macintosh, are some of the software tools we’ve come to expect everywhere:
- Mouse (invented by Douglas Engelbart at SRI and refined at Xerox PARC), and the idea of coordinate-mapping the whole screen: everything you can click, drag, select, zoom, expand, resize, minimize, draw, fill, color, cut, copy, paste, insert, open, save, find, search, or jump to with a mouse. Touchpads also first appeared on Macs.
- Click-able shortcuts or aliases
- Fonts and typography: Jobs revitalized the whole typography industry, starting with the Geneva typeface, designed by Susan Kare for Apple.
- Menus: yes, hierarchical drop-down, collapsing and pop-up menus started with Mac.
- Dialog boxes, “OK” and “Cancel” buttons, text fields, radio buttons and checkboxes: prior to Mac, users typed answers to a series of command-line questions, such as “Continue (Y) or Cancel (N).”
- Color (Macintosh II, 1987)
- Icons, to represent files and directories instead of listing file paths like C:\WINDOWS or /var/www/cgi-bin
- “Folders” to represent directories and nested subdirectories of files
- Task bars and palettes for task display and selection, using these icons
- Computer art and graphics design, with products such as MacPaint and MacDraw, which spawned commercial Adobe graphics programs.
- Apple did not invent e-mail. But America Online’s first graphical online service debuted on Apple machines in the late 1980s; a Windows version of AOL was offered only years later.
- “Finder,” later known to Windows users as “Windows Explorer”
- Control Panels
- Music. I bought my first Mac in 1985 or 1986 and I’d heard claims it played tunes. I expected something crude. When I found the software that played Beethoven’s “Ode to Joy” so beautifully, I was overwhelmed by unexpected surprise.
- Consumer laser printers
In fact, the entire “desktop” metaphor came from Mac. Everything we needed to perform a task or access our files was visible or accessible from the desktop. Our modern web browsers, whether Internet Explorer, Firefox, Safari or Chrome, all inherit the lineage of the original 1994 Netscape interface, which in turn met user expectations by adapting the “web object” concepts from the Macintosh inventory of standardized user tools and controls.
Windows 95 came out in August 1995, and there were any number of well-deserved crude jokes about Microsoft’s first real answer to the Mac OS. Anyone who’d ever used a Mac knew (1) where the Windows features came from and (2) how to use the Windows desktop and user interface. We Mac-heads taught a lot of users how to use their PCs. One of the steepest teaching hurdles was trying to convince PC users not to be afraid of their machines!
How did this uniformity come about? Was it simply plagiarism? Absolutely not. It was common sense. First of all, the Gates organization had already thoroughly integrated the Mac metaphors into Microsoft Word and Microsoft Excel for Macintosh; Word and Excel for Windows only came along later, in the late 1980s.
This could have turned into a huge crisis for both Apple and Microsoft, but it didn’t. Business users would have revolted if Microsoft had been forced to “reinvent” all the graphic metaphors and keyboard sequences; that would have been like reversing the clutch and brake pedals on an automobile. Most of the complaints I hear today from people switching among iPhone, DROID and Blackberry are that things aren’t where they expected to find them. It could have been much worse. On your car’s Garmin or other GPS device, what if the roadmap symbols and graphics were different for each of the 50 states?
Happily, desktops are still always desktops, recycle bins are still the Trash, and shortcuts still jump you to that file or app buried six layers deep in file managers. When you swipe that high-resolution display on your handheld device, that’s just an improved metaphor for the old-fashioned scroll bar (also Mac). Just “doing it differently,” to avoid lawsuits, would have brought incredible levels of chaos to personal computing. That would have crippled the whole industry and retarded its remarkable transformation of home and office. We should be grateful that such a desktop polyglot never happened.
The whole point of such innovation, in one sense, is that in today’s world we all get to feel entirely free to take it for granted – as we should. You won’t find user group meetings in rented halls trying to teach each other how to use their iPods and iPads. I’m just lucky I’ve been around long enough to enjoy watching it all happen.
And it’s happening all over.
Nature [#2505H] 10/9/2011: The Desert Lions (PBS): “In the Namib Desert on Africa’s wild and forbidding Skeleton Coast, Philip Stander, a Namibian carnivore specialist, first spotted these desert lions …”
At the end of the show, we watch Stander trying to coax two marauding lionesses away from a Namibian village. If those desert lions get too close to the village’s herd, they will be shot. He hooks up a loudspeaker on his Land Rover so he can divert them away from the village. He’ll use the recorded sound of a male lion roar. Into his vehicle’s PA system, he plugs his iPod. The lionesses hear the recorded roaring, and leave the area.
Thank you, Steve Jobs.
copyright 2011 by Alex Forbes