The Apple Watch, Tog’s Feb 2013 Prediction

Two years plus before the release of the Apple Watch, I set out my expectations of its capabilities and features. This was at a time when  “smartwatches” ran little more than recycled feature-phone software tied together with cumbersome interfaces. The article is just as it was in February, 2013. All I have done since then is to change the title from, “The Apple iWatch,” to its release name, “Apple Watch,” to make searching for this article easier. If you wonder why my prediction was as accurate as it turned out to be, it was because I applied the same design methodology that we developed at Apple decades ago. I had no inside information whatsoever.

[box]

Main sections & select features

Overcoming smartwatch drawbacks

  • Wireless charging, so you never remove the watch from your arm
  • Smooth Apple design with no clunk-factor
  • Siri and your iPhone take the place of buttons and menus on your iWatch

The iWatch as facilitator/coordinator

The Killer Applications

  • Your iWatch vouches for you, so you’ll never have to type another passcode or password again.
  • Walk away from your iPhone and your iWatch will warn you.
  • Your NFC chip for making payments is in your watch, instead of in an easily-grabbed $800 phone. Just wave your hand over the sensor and you’re good to go.

Other Cool Capabilities

  • When your iPhone rings, your watch says who’s calling, and you can handle your response by touching the watch.
  • Sensors enable the watch to monitor you in sickness and in health, tracking calories burned, miles walked, steps climbed, restlessness of sleep, even advent of tremor and other early warnings of serious health conditions.
  • Your music may be on your iPhone or iPod, the sound may come from your Bluetooth headset, but your controller is on your wrist with the iWatch.

The Apps

  • Unexpected apps will afford unexpected capabilities, like KidCode
  • Expected apps like using the watch to pause, mute, or change the channel on your TV or alter your room temperature
  • Apple Maps fix. Crowdsourced pressure data from the watch could enable Apple to fix the 3D view in its Maps app.
  • “What’s that thing?” Point your finger to a distant object, and Siri will tell you what it is.

Postscript

The Forum

  • Two-way conversation between readers and me, with a surprising number of good ideas for both features and applications.

[/box]

Introduction

The iWatch will fill a gaping hole in the Apple ecosystem. It will facilitate and coordinate not only the activities of all the other computers and devices we use, but a wide array of devices to come. Like other breakthrough Apple products, its value will be underestimated at launch, then grow to have a profound impact on our lives and Apple’s fortunes.

[box]Steve Jobs’s true legacy lies not with his products, but his method, the way he would forge revolutionary products from cold blocks of creativity. I know. I was one of his earliest recruits and watched him develop the method. Steve applied it one project at a time.  My hope is that Apple now has teams applying it across many projects, shortening the historic six years between breakthrough products.[/box]

What will follow is not based on insider information but on a solid understanding of Apple, its products, the problem, and the opportunity. The Apple iWatch development team that I expect exists is likely already well ahead of the ideas I’m suggesting here. (Should they draw any new ideas from what follows, they are free to use them. I’ve already reached my lifetime goal of as many patents as Heinz has varieties.)

[box] Who’s talking?

Bruce Tognazzini was hired at Apple by Steve Jobs and Jef Raskin in 1978, where he remained for 14 years, founding the Apple Human Interface Group and designing Apple’s first standard human interface. He is named inventor on 57 US patents ranging from an intelligent wristwatch to an aircraft radar system to, along with Jakob Nielsen, an eye-track-driven browser.

[/box]
Before delving into what an Apple smartwatch might look like, we need to understand why, right now, people not only think they don’t need a smartwatch, they flat-out don’t want a smartwatch.

The Smartwatch

[quote]I’ve found a traditional smartwatch’s extra functions neatly divide into those I don’t need and those I can’t find.[/quote]

Traditional smartwatches are big and clunky.  They require charging. (I haven’t had to remove my “dumb” watch from my wrist in four years.) I can’t read a smartwatch at night without using my other hand to turn on the light.  I can’t read a digital watch at any time without the use of reading glasses, nor can most people over 45, which is why the big hand and the small hand continue to go around together on so many watches.  What’s worse, I’ve found a traditional smartwatch’s extra functions neatly divide into those I don’t need and those I can’t find. I can live without a smartwatch.

Recently, some startups have addressed a few of the smartwatch’s disadvantages. They noticed that people are now carrying around a decent-sized screen with a whole bunch of virtual buttons—their smartphones—so smartwatches no longer need to display everything and offer access to every option within the watch interface itself. Bluetooth 4.0 enables low-power communication without draining the watch’s battery, making smaller size and longer running times possible.

The Cookoo watch, for example, will last for a year between battery changes. It doesn’t do a great deal, but what it does do is quite useful.

The Cookoo Watch

The Pebble, while it offers much more than the Cookoo in terms of functionality, lasts about a week before demanding removal for charging. That’s longer than smartwatches used to go, but hardly compares to what people expect in a modern watch.

The Pebble Watch

Martian has retained the large, somewhat clunky styling of the traditional smartwatch (albeit in a great many color variations), but offers the greatest pass-through power from the smartphone. The result is Dick Tracy’s two-way wrist radio: Ask Siri to call someone, and you can talk with them through the speaker and microphone in your watch, all handled via Bluetooth by your phone.

The Martian Watch

The Martian sports two hours of talk time, although the watch itself will keep running after that. You’ll certainly need to get in the habit of charging it every night.

These and others of the new generation of smartwatches are certainly very attractive to early adopters, but don’t expect them to smash the market open.  That’s going to require an entirely different level of both functionality and perfection, just the sort of thing for which Apple is famous.

Overcoming Smartwatch Drawbacks

The first thing Apple has to do is address traditional drawbacks in smartwatch design, something they are qualified to do.

Charging. If you think about it, there isn’t actually a charging problem at all. Never has been. Instead, there’s a having-to-remove-the-watch-from-your-arm problem. What if you held a patent on a charger that could wirelessly charge an object several feet away, right through the air? Apple holds such a patent.

The usual drawback to remote charging is that it is not efficient, but if the watch doesn’t require all that much power to begin with and will shut down the charger when it is full, the process can be relatively inefficient and still not cost you much money or the nation’s infrastructure much energy. (We spend lots of money/resources on inefficient power sources all the time: One AAA cell for your TV’s remote control costs around fifty cents. It holds around 1.4 watt-hours of energy. Not kilowatt-hours, watt-hours. You would have to spend $25 to $50 on AAA cells to equal a penny’s worth of the power you get out of the wall.)
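For the curious, here is the back-of-the-envelope arithmetic behind that comparison, sketched in Python. The figures—fifty cents per cell, 1.4 watt-hours per cell, twelve cents per kilowatt-hour from the wall—are my ballpark assumptions, not measured values, but they land squarely in that $25-to-$50 range:

```python
# Rough arithmetic behind the AAA-versus-wall-power comparison.
# All figures are ballpark assumptions, not measured values.

aaa_price_usd = 0.50        # typical retail price of one AAA cell
aaa_energy_wh = 1.4         # usable energy in one alkaline AAA cell (watt-hours)
grid_price_per_kwh = 0.12   # typical residential electricity price (USD per kWh)

# Cost per watt-hour from each source
aaa_cost_per_wh = aaa_price_usd / aaa_energy_wh     # roughly $0.36 per Wh
grid_cost_per_wh = grid_price_per_kwh / 1000.0      # roughly $0.00012 per Wh

# How many dollars of AAA cells buy the same energy as one cent of wall power?
wh_per_penny_from_grid = 0.01 / grid_cost_per_wh    # roughly 83 Wh
aaa_dollars_for_same_energy = wh_per_penny_from_grid * aaa_cost_per_wh

print(f"AAA energy costs roughly {aaa_cost_per_wh / grid_cost_per_wh:,.0f}x wall power")
print(f"Matching one penny of wall power takes about ${aaa_dollars_for_same_energy:.0f} of AAA cells")
```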

Clunky design.  Two reasons clunky design wouldn’t be a problem for Apple.  The first and foremost: Jonathan Ive.  Second:  Apple’s recent patent on a low-cost method for creating curved glass for screens. Apple can create a smartwatch with revolutionary functionality that is drop-dead gorgeous.  Is there any doubt they will do so?

Buttons & menu trees.  Won’t be any.  Why?  One good reason: Siri.  Whatever the watch can do, you’ll be able to put in place by commanding it (with your iPhone and the Siri back-end handling the actual mechanics, of course): “Set timer for 22 minutes.” “Wake me at 6:15,” etc. Whatever the watch can display, you’ll be able to bring up just by asking: “How long before my plane takes off?” “What’s the temperature right now in Dubai?”

Siri will be accompanied by touch, of course, with touch handling the lighter tasks, Siri the more complex. There will be overlap, so you can use more complex touch maneuvers when you can’t speak to your watch, during a meeting perhaps or when there’s a lot of ambient noise. Many people will never learn the more complex maneuvers, nor will they need to as the iPhone, iPad, and Mac will offer simple alternative interfaces to the more complex tasks.

The iWatch as Facilitator/Coordinator

The iWatch will have a few functions it performs entirely on its own, chief among them being telling you the time. Its chief role will be that of office manager, facilitating and coordinating your use of your other iDevices and the Internet by gathering data, delivering messages, storing and forwarding, coordinating tasks, and carrying out functions that extend the capabilities of your other devices. The iPhone or other primary device will be the executive in charge, making the decisions, setting the strategy, and apportioning tasks. The watch will have the least energy resources available, so the watch will be used sparingly. Still, as time goes on, more uses will be found for it, and it will receive increasing amounts of traffic.

The Killer Applications

The iWatch can and should neatly fix the two most serious problems we have with our current mobile devices, ones we may not even realize we have. Only Apple holds the necessary keys to address the first of these, so only Apple will.

[box]The paradox of the “huge problem”: A problem that feels sufficiently insurmountable will appear the product of natural law, to be accepted rather than challenged.[/box]

The first two killer applications are neither sexy nor fun, but they will make our lives so much more pleasant.

Passcodes & Passwords. The watch can and should, for most of us, eliminate passcodes and passwords altogether on iPhones, and Macs and, if Apple’s smart, PCs: As long as my watch is in range, let me in! That, to me, would be the single most compelling feature a smartwatch could offer: If the watch did nothing but release me from having to enter my passcode/password 10 to 20 times a day, I would buy it. If the watch would just free me from having to enter passcodes, I would buy it even if it couldn’t tell the right time! I would happily strap it to my opposite wrist! This one is a must. Yes, Apple is working on adding fingerprint reading for iDevices, and that’s just wonderful, but it will still take time and trouble for the device to get an accurate read from the user. I want in now! Instantly! Let me in, let me in, let me in!

Apple must ensure, however, that if you remove the watch, you must re-authenticate before it will vouch for you again. (Reauthorizing would be an excellent place for biometrics.) Otherwise, we’ll have a spate of violent “watchjackings” replacing the non-violent iPhone-grabs going on today.
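To make the policy concrete, here is a minimal sketch, in Python, of the watch-as-key rule I’m describing: the device unlocks while an authenticated watch is on the wrist and in range, and removing the watch voids that trust until the wearer re-authenticates. The class and function names are my own inventions for illustration, not any real Apple API:

```python
# Minimal sketch of the watch-as-key policy described above.
# All class and method names are hypothetical, not a real Apple API.

class TrustedWatch:
    def __init__(self):
        self.on_wrist = False
        self.authenticated = False   # wearer proved identity since last removal

    def put_on_and_authenticate(self, credential_ok: bool):
        self.on_wrist = True
        self.authenticated = credential_ok

    def remove_from_wrist(self):
        # Removal invalidates trust; a thief gets a watch that vouches for no one.
        self.on_wrist = False
        self.authenticated = False


def device_should_unlock(watch: TrustedWatch, in_bluetooth_range: bool,
                         high_security_site: bool, passcode_entered: bool) -> bool:
    """Unlock when an authenticated, on-wrist watch is nearby.

    High-security installations can additionally demand a passcode
    (two-factor: the watch plus something you know)."""
    watch_vouches = watch.on_wrist and watch.authenticated and in_bluetooth_range
    if not watch_vouches:
        return False
    return passcode_entered if high_security_site else True


# Example: a watch-jacking yields nothing useful.
watch = TrustedWatch()
watch.put_on_and_authenticate(credential_ok=True)
assert device_should_unlock(watch, True, False, False)      # owner walks up: unlocked
watch.remove_from_wrist()
assert not device_should_unlock(watch, True, False, False)  # stolen watch: locked
```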

[quote]If the watch would do nothing but free me from having to enter passcodes, I would buy it even if it couldn’t tell the right time![/quote]

Individuals or companies that demand a higher level of security can require both the presence of the watch and a passcode, aka, two-factor authentication. Even that could be made a lot less onerous, again optionally, if, when at work or within your own house, the security software would be allowed to lift the requirement for the separate passcode, only applying it when you are out and about.

Find iPhone. The current “Find iPhone” is a well-implemented solution wherein you can find your iDevice no matter where it has wandered on the globe, as long as it is turned on and no one has messed with it. However, it is not exactly a simple procedure:

  1. Find yourself another iDevice or computer
  2. Log in
  3. Open Find iPhone or point a browser to www.icloud.com
  4. Wait while signals are sent through the ether
  5. Select the device you want from the map or list
  6. Click “Play sound”
  7. Find the device you’re looking for & dismiss alert
  8. Delete the follow-up email

That’s a lot of steps! Better that your iDevices never get all that lost to begin with. Two additional capabilities, facilitated by the iWatch, can help ensure you never need that long-distance capability.

Local Find: As long as your device is close by, just scrawl a question mark on the top of your iWatch or perhaps ask Siri, “Where’s my phone?” and your phone will light up and start chiming. Of the eight steps above, you need perform only step seven. (You would find your iPod or iPad the same way, of course.)

Automatic Find: By the time you realize you have left your top-secret prototype iPhone sitting on the bar, some on-line tech blog will have probably already published an article on it. However, with the iWatch on your wrist, as soon as you move out of range, it will tell you that you’ve forgotten your phone, then help you locate it, as needed.  That’s a lot more useful than waking up the next morning to discover you seem to be missing something, only to then press Find iPhone into service. (The Cookoo watch already has at least the reminder part of this feature.)

Extending the range: Bluetooth Low Energy is supposed to have a range of 50 meters or 160 feet.  Presumably, that’s in an open field with a tailwind.  In your home or work place, your watch could end up driving you nuts if Apple doesn’t provide an intelligent means of expanding the virtual bubble so the alarm doesn’t go off anywhere in your safe environment. The system will need to “know” you’re in one of your secure areas, warning you only if you start to drive away without one of your devices. This could be handled, perhaps, by repeaters embedded in devices such as Apple Airports.  In homes and businesses with multiple repeaters, your watch could then also give you a local “read” on what repeater your device is near.
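Here’s a rough sketch of that safe-zone logic, assuming the watch can recognize a handful of registered repeaters; the repeater names and the decision rule are hypothetical, just to show the shape of the thing:

```python
# Sketch of the "don't nag me at home" logic, assuming the watch can see
# which known repeaters (e.g., home or office base stations) are nearby.
# Identifiers and the rule itself are made up for illustration.

SAFE_REPEATERS = {"home-airport", "office-airport-2"}

def should_warn(phone_in_bluetooth_range: bool,
                visible_repeaters: set[str],
                user_is_moving_away: bool) -> bool:
    """Warn only when the phone is out of range, we are not inside a
    registered safe zone, and the wearer appears to be leaving."""
    if phone_in_bluetooth_range:
        return False
    inside_safe_zone = bool(visible_repeaters & SAFE_REPEATERS)
    if inside_safe_zone:
        return False            # the phone is merely in another room
    return user_is_moving_away  # e.g., walking to the car without it

print(should_warn(False, {"home-airport"}, True))   # False: still at home
print(should_warn(False, set(), True))              # True: you left it at the bar
```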

Near Field Communications for Payment.  The conventional, collective “vision” is that, soon, we will all pay our bills by simply reaching for our phone, a phone that, for around half of us, is lost somewhere deep in the recesses of a purse, retrievable in around one minute and thirty seconds. With luck. Think of the time those folks will save over paying with their wallet, a much bigger and more obvious object that they actually had to move out of the way in their effort to find their completely invisible black phone!

Oh, yeah, they won’t save any time at all.

Of course, we guys are a lot more clever. We’ll slide our phone right into our breast pocket where, heh, heh, we can get at it instantly. Or could have if we hadn’t then put on a turtleneck sweater before putting on and zipping up our jacket.

Next time, we’ll just pay cash.

And then there’s getting on the subway:  Instead of having to slide that paper card we buy once a month into the slot, all we’ll have to do is wave our $800 iPhone over the little sensor, except that nice gentleman we hadn’t noticed standing just to our side just grabbed our $800 iPhone and is now hot-footing it out of the station with us trapped on the wrong side of the turnstile.  Huh!  That didn’t work out so well!

Just last week, our kid had to struggle to get his phone out of his backpack to pay his bus fare using his marvelous NFC chip, only to have it stolen the same way! If only there were a better solution! Oh, yeah. There is.

The NFC chip belongs in the iWatch, not in the iPhone! That way we’ll know exactly where it is at all times, strapped to the end of an appendage expressly designed to be waved around at things.  How handy! Reach. Touch. Done.

Meanwhile, our iPhone, handling any necessary communication, will stay hidden safely away, and, if someone does manage to get ahold of our watch, it will require reauthorization, having been removed from our arm.  Net value to the thief: Zilch. Net loss to us: A whole lot less than an iPhone, with word on the street quickly making it clear there’s no point in stealing an iWatch.

Of course, not every merchant will accept NFC right away, so the watch, linked to Passbook, will also display QR codes, etc.

Other Cool Capabilities

Phone call facilitator. Your iWatch vibrates. You glance at the watch and see who’s calling. You swipe up twice, indicating you want to answer (or some other standardized gesture). Your caller is asked to, “Wait one moment, please” while your watch instructs your phone to light up and start ringing to help you find it (or just lights up—your choice).

Many of us, of course, would like more; however, the iWatch as speakerphone peripheral for our iPhone is much less likely to happen. Of course, it would be cool: Let’s face it, Dick Tracy had a two-way wrist radio, and we want one, too! Imagine asking your imaginary friend, Siri, to call one of your real friends, Bill, then having a conversation, all without actually reaching into your pocket for your phone. However, the iWatch is going to be all about energy management. The Martian watch, for all its bulk, can squeeze only two hours of talk-time out of a charge. Martian will likely be left to pursue that market on its own.

Sensors. The iWatch will incorporate a variety of sensors. Certainly one thrust of these sensors will be sports/health data capture, inferring walking based on arm swing, detecting climbing or diving based on a pressure sensor, etc., etc. The more sensors, the better. A temperature and pressure sensor pressed against the skin could prove useful for medicine. A proximity sensor will let software “know” whether the watch is hidden in a sleeve or under a blanket. Whatever combination of sensors ultimately makes its way into the product will inevitably lead to some very interesting new applications that people may have yet to consider. Other iDevices will combine the iWatch sensor data with data from their own sensors and from the outside world, such as weather data, to form a complete and complex picture.

Music. The Pebble is already handling music functions, which, of course, an iWatch would likewise be expected to do, just as the earlier generation iPod mini would do when embedded in an after-market watch-like case. The Pebble, however, is acting solely as a controller to—facilitator for—the user’s iPod or iPhone, rather than acting as a music device on its own, saving its battery life. The iWatch would be expected to follow this same path.

Telling the time. Yes, it will tell the time, likely offering a familiar Swiss Railroad watch face as an option, and it will tell the right time, too:  By communicating with the iPhone, it will update to changing time zones, etc., as the phone updates, eliminating—or at least reducing—the need for manual intervention, a major bother with current watches.

When Apple really gets serious about integrating Passbook, your watch will “know” when you’ve boarded that plane to London:  You were scheduled to board, the phone’s GPS locates you at the airport, and you just now turned off your phone.  Yesterday, the watch will have offered you an easy way to switch to split local/London time and, now that you’re aboard the plane, will be prepared for you to flip to just London time with a single touch.

The Apps

Most wearables to date have been dedicated devices. The iWatch will be in the vanguard of devices that can work with 3rd-party apps. There will be tens or hundreds of thousands of apps, few of which either the designers of the iWatch or I will have anticipated. Almost all will actually run on the larger iDevices, extracting data from the iWatch, displaying data on the iWatch, or making use of the iWatch as facilitator.

Consider the iPhone, released on day-one with its handful of built-in apps.  Yes, it was exciting, but it was not nearly the tool that exact same phone had become three years later, as the breadth and depth of applications mounted and the system software matured.  We can expect the same curve to occur with the iWatch.

The Unexpected Apps

At least one or two evil apps will slip past the Apple watchdogs, launching a feeding frenzy in the press. Apple will have already limited how much data a given app can access plus given us the power to offer and withdraw permissions. More steps will be taken once the breach occurs, and we’ll all soon get over it because the benefits we’re receiving will so far exceed the risks.

Then will come a different kind of unexpected app. Consider SMS on cell phones. It’s a hack, a simple message system slipped into an underutilized space reserved for cell phones and towers to communicate with each other. It cost the cell phone companies nothing to offer it, and has made them billions of dollars, with total revenue expected to reach around one trillion dollars before the technology finally declines. Grown-ups wouldn’t use it because you had to learn a secret code and phones are supposed to be talked into. Kids took to it like ducks to water. (Only after Apple and its imitators made SMS accessible did the demographics creep upward.)

The iWatch, like every other Apple product, will have an interface made as simple as humanly possible.  However, human nature is such that, unless the designers work tirelessly to keep ahead or at least abreast of the users, it won’t stay that way forever.  Consider the following possibility:

KidCode. It might start out as an app designed with the best of intentions, to let people communicate via a brand-new gestural language in, Morse-code vibration out, aimed, perhaps, at a few aging amateur radio operators. It is suddenly and unexpectedly taken over by school kids, sweeping the nation. No more being busted by teacher while intently tapping out text on phones. Instead, kids will be just innocently rubbing their watch faces. No more glancing at text screens, just feeling silent vibrations. Tabloids and the evening news will simultaneously condemn it and propagate it. PTAs (Parent-Teacher Associations) will decry it. Civic leaders will condemn it. Ultimately, teachers will learn to notice the trademark casually drooping arms of the senders, right hand over left wrist, along with the far-away stares of the recipients, and order will be restored. However, by then, we’ll have an entire generation of kids that knows Morse code, just as an earlier generation learned that pressing the 5 button on a phone twice would get them a “K”.
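For the skeptical grown-ups, the output half of such an app is almost trivially simple. Here is a sketch, with illustrative timing constants (nothing here is a real spec), of turning a short text into a silent dot-and-dash vibration schedule:

```python
# Illustrative sketch of KidCode's output side: text in, a silent vibration
# schedule out. Timing constants are arbitrary choices, not a real spec.

MORSE = {"a": ".-", "b": "-...", "c": "-.-.", "d": "-..", "e": ".",
         "h": "....", "i": "..", "k": "-.-", "l": ".-..", "n": "-.",
         "o": "---", "s": "...", "t": "-"}

DOT_MS = 120                       # one "unit" of vibration

def vibration_schedule(text: str) -> list[tuple[str, int]]:
    """Return (state, duration_ms) pairs: 'buzz' or 'pause'."""
    schedule = []
    for word in text.lower().split():
        for letter in word:
            for symbol in MORSE.get(letter, ""):
                schedule.append(("buzz", DOT_MS if symbol == "." else 3 * DOT_MS))
                schedule.append(("pause", DOT_MS))          # gap between symbols
            schedule.append(("pause", 2 * DOT_MS))          # gap between letters
        schedule.append(("pause", 4 * DOT_MS))              # gap between words
    return schedule

print(vibration_schedule("hi"))
```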

YoungEmployeeCode. Kids grow up.  The young people you may be supervising in a few years will sit in your staff meeting strategizing against you in KidCode on their iWatches while looking at you with the most innocent of young, fresh faces.  You’ll learn to ply them with Krispy Kreme Doughnuts and coffee to force their hands above the tabletop, omitting napkins to ensure that, should they subsequently decide to engage in skullduggery, they’ll end up sliming their watches with syrupy glaze. (No, it won’t hurt the watch, but it will make you feel good anyway.)

This kind of utterly silent messaging will have benefit as well. Consider:

TheaterCode. Young people will be able to communicate in crowded theaters to their heart’s content without disturbing anyone. No talking, whispering, ringing, buzzing, illuminated screens, no nothin’. If you are neither sender nor recipient, you will remain completely undisturbed except for the occasional seemingly random guffaw (a short explosion of laughter).

SalesCode. ExecCode. LawyerCode. A wide variety of people will communicate with colleagues using KidCode in meetings and even open court, sending cues, cautions, etc., without fear of eavesdropping or censure, giving them a clear advantage over their less communicative opposition.

[box type=”info”]If you grew up knowing that pressing the 6 button twice will generate an “N” and pressing the 7 button four times will produce an “S”, but the very thought of having to learn KidCode sent a chill through you, I regret to inform you that you have officially just turned old. Welcome. The good news is that you will be old for a long, long time.[/box]

SilentMessage. Having learned the code, users will be able to receive notification of people calling, appointment reminders, etc., all in complete silence without even glancing at their phones. Gestures can start, stop, pause, and replay messages, as well as set up replies, with coded responses offering the user feedback that the system understands. SilentMessage, as with most apps, would be primarily handled by the phone, with the watch accepting input and providing output, vibration in this case. SilentMessage would also be optional: everything it could do could be done using either the iWatch display or the iPhone itself.

The Expected Apps

Many apps just belong out there. In some cases, they’re already being done by other companies in other forms, like the fitbit, or even in other watches, as with the companies mentioned above. In other cases, the iWatch will enable them for the first time.

Golf. Baseball. Bowling. Tennis. Critique your form based on data gathered from the accelerometers in the watch. Get distance to the hole in golf and pertinent data for other sports delivered to the watch, rather than having to glance at your phone all the time.

Running/walking. Store and forward data on jogging/walking time and distance based on arm swings, altitude changes based on the pressure sensor, etc., to your phone or computer for the appropriate app to compute and display your running achievements. Lots of competition there already, but with the iWatch, it’s all built-in so you need not carry any additional hardware.

Swimming. Time your swimming laps retroactively. Your “swim coach” app has instructed the watch to store and forward repetitive arm movement times and intervals when the watch is in a wet or high-pressure (underwater) environment, so when your arm starts flailing for an extended period of time, that data gets stored and forwarded to the cloud via your phone. Nothing for you to set beforehand. The app simply has that data available to it to display the workout you did earlier today or a week ago Thursday if and when you become interested.
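A sketch of how that hands-off trigger might look, with invented thresholds standing in for whatever tuning real sensor data would demand: log stroke timing only when the pressure sensor says “water” and the arm motion is rhythmic:

```python
# Sketch of the "nothing to set beforehand" swim logger: record stroke
# timestamps only when the environment looks like water and the arm motion
# is rhythmic. Thresholds are invented for illustration.

from statistics import pstdev

SURFACE_PRESSURE_KPA = 101.3

def looks_like_swimming(pressure_kpa: float, stroke_intervals_s: list[float]) -> bool:
    underwater = pressure_kpa > SURFACE_PRESSURE_KPA + 2.0   # a few kPa above ambient
    if not underwater or len(stroke_intervals_s) < 8:
        return False
    rhythmic = pstdev(stroke_intervals_s) < 0.25             # strokes evenly spaced
    return rhythmic

# When this returns True, the watch would store the stroke times and forward
# them to the phone later; the "swim coach" app reconstructs laps on demand.
print(looks_like_swimming(104.0, [1.1, 1.0, 1.2, 1.1, 1.0, 1.1, 1.2, 1.0]))
```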

Health. Having the watch facilitate a basic test like blood pressure monitoring would be a godsend, but probably at prohibitive cost in dollars, size, and energy. However, people will write apps that will carry out other medical tests that will end up surprising us, such as tests for early detection of tremor, etc. The watch could also act as a store-and-forward data collector for other more specialized devices, cutting back the cost of specialized sensors that would then need to be little more than a sensor, a Bluetooth chip, and a battery. Because the watch is always with us, it will be able to deliver a long-term data stream, rather than a limited snapshot, providing insight often missing from tests administered in a doctor’s office.

Find other stuff. Finding doesn’t have to be limited to only Apple products. The watch could also tell you that your car keys just went out of range, that your hand-carry luggage is no longer with you, etc., by communicating with simple Bluetooth-plus-battery transceivers designed as key fobs or luggage tags. They would then light up and/or emit chimes upon command to aid retrieval. These would likely not be Apple products, but would fit well into the Apple ecosystem.

Watching TV.  The iWatch will empower TV watching in at least two ways.  First, it can serve as the remote control:  Whisper to Siri what channel you want or what recorded show you want to watch. That information is then handled by a non-hobby version of AppleTV. Just double-tap to pause the screen.  Double-tap again to continue. (It could be some other gesture. They will choose one that you won’t perform by accident, but one that is much more lightweight than required, say, to unlock an iPhone.)

Second, because the iWatch eliminates the need for a passcode, iOS can be changed to enable your iPod/iPhone/iPad, in the presence of both iWatch and a nearby, running AppleTV, to turn on and default to the Remote app as soon as you pick it up, for the very first time making the Remote app practical to use on a passcode-protected iDevice.

The More Ambitious

Temperature Control. It wouldn’t take all that much to let the watch interface with a room’s thermostat. Local Bluetooth repeater information would determine what room you are in and provide the communications link, enabling you to raise or lower the current temperature from your wrist. However, if the watch can, through its array of sensors, accurately determine local ambient temperature where you are in the room, an HVAC system with an intelligent controller could provide a microclimate that would follow you around the building, making appropriate accommodation when two or more individuals with different thermal tastes occupy the same space.

[box type=”note”]The same localization information could be used by an evil employer to track employee whereabouts and, by inference, activities. In the case above, the HVAC system only needs to know that a human wants a temperature of 72 F/22 C, not that Bruce Tognazzini, employee #66, wants that temperature and spent 22 minutes and 17 seconds in that room. Apple will need to ensure that it is inherent in the system that data is anonymized to as great an extent as practical at every step.  The press will need to ensure that Apple maintains such an architecture and practice.[/box]

Correcting Apple Maps. This is a good example of what could come about through crowdsourcing using iWatch data.

[quote]Google Maps has had a roadway literally running right through the middle of my living room since 2005[/quote]

Contrary to press reports, Apple’s 2D roadmaps, supplied by TomTom, are pretty darned accurate.  However, because the initial Apple Maps presentation misled the world into believing that Apple Maps was the perfect app on its first day of release, it instantly became popular sport to point out every error anyone could find. Meanwhile, Google Maps has had a roadway literally running right through the middle of my living room since 2005, and no one has felt the need to send headlines screaming around the world about it. (Apple Maps, on Day One, moved that roadway off to the side of our property where it belongs.  I can’t tell you what a relief it has been to my wife and myself having reduced traffic passing between us and the telly these last months, with only Android users continuing to rumble past.)

What is less than stellar is Apple’s “3D View.” Not “Flyover,” which is quite wonderful; I’m talking about “3D View.” However, let’s start with “Flyover.”

“Flyover” is limited to the central portions of metropolitan areas within free and democratic countries.

Apple Maps Flyover View

This is not a photograph, but a texture-mapped model of San Francisco. The Flyover view, the envy of the computer world, covers far less than 1% of the globe and, because of its super-high cost, will never cover that much more.

Today’s “3D View,” seen below, superimposes a satellite photograph of the earth on a topographical map of the world. While the heights of mountains, valleys, and lakes are accurately depicted, finer features, such as buildings and roadways, have no independent altitude information associated with them, resulting in buildings being uniformly flat and roadways being, at all times, assumed to hug the landscape, something that becomes quite comical when the “landscape” is a chasm dropping several hundred feet and the roadway is actually a bridge:

Apple Maps 3D View: the Highway 93 bridge “melting” into the river below

Note that both the actual bridge and virtual bridge, the semi-transparent broken segments of paving seen slightly lower and to the left of the bridge, are shown as melted into the river.

The Fix: Using pressure data from millions of watches, Apple could build a precision altitude map of the world. This map would indicate true altitudes everywhere that iWatch wearers travel. The granularity would be several orders of magnitude greater than ever before attempted for a wide-area map at a cost several orders of magnitude less than Flyover.

Because most of the time, most of the people’s arms will be within four feet of known roadways (or rail beds), one can, over time, correct for both local barometric pressure and current GPS error (the GPS, of course, being in the phone, not the iWatch—GPS requires significant power). Given that data, one can then look for where current map data and people’s actual locations consistently vary, specifically where people appear to be either diving below or floating above the surface of the earth. If everyone is dropping below nominal ground level, they must be in a cut.

The more interesting data will arise from where people appear to be floating. Consider the real results that would be detected on Highway 93 above: Motorists’ watches will consistently show no pressure change as they cross the river, ergo, they are staying at the same altitude, ergo there is a bridge. Apply that correction and the roadways, both real and virtual, will no longer melt into the river.
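Here is a sketch of that correction in Python, with made-up numbers standing in for real crowdsourced data: compare pressure-derived altitudes against the terrain model along a road segment and label it a bridge, a cut, or on-grade from the typical offset:

```python
# Sketch of the crowdsourced correction described above: compare altitudes
# inferred from watch pressure readings against the terrain model along a
# road segment. Numbers and thresholds are illustrative only.

from statistics import median

def classify_segment(watch_altitudes_m: list[float],
                     terrain_altitudes_m: list[float],
                     threshold_m: float = 5.0) -> str:
    """Label a road segment from the typical watch-minus-terrain offset."""
    residuals = [w - t for w, t in zip(watch_altitudes_m, terrain_altitudes_m)]
    offset = median(residuals)           # median resists GPS/barometric noise
    if offset > threshold_m:
        return "bridge"                  # drivers float above the terrain model
    if offset < -threshold_m:
        return "cut"                     # drivers dip below nominal ground level
    return "on-grade"

# Highway 93 example: the terrain model dives into the river gorge, but the
# crowd's altitude stays level, so the segment is flagged as a bridge.
terrain = [620, 600, 540, 480, 540, 600, 620]
crowd   = [621, 619, 620, 618, 620, 621, 619]
print(classify_segment(crowd, terrain))   # -> "bridge"
```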

The building-height problem would likewise be solved:  Data collected day-after-day might report four different pressure levels, spaced 12 feet apart at one given location, indicating that particular building has four occupied stories.

Would the resulting map look as good as Flyover?  No.  The image textures would be missing, perhaps to be applied through local effort.  The buildings would typically be rendered as extruded solids, based on their roof shapes, i. e., primarily clusters of rectangular solids. Would it be ahead of what’s there and way ahead of the competition?  Definitely. Such a world-wide micro-altitude map, applied to Apple’s current 3D View, would instantly correct millions of errors, turning Apple Maps into the map with the most finely-detailed vertical information ever.

Weather prediction. Sure, the watch will tell you the temperature outside and whether you’re going to get rained on, but I’m talking about another crowdsourcing application, one that can save lives. Once a true altitude map has been established, meteorologists will be able to gather barometric data at a granularity never before even considered.  That data, fed into supercomputers, has the potential to enable them to detect and correlate initial conditions very early in the process, predicting storm paths, strengths, and timing with considerably higher precision than today.

Turn-by-turn walking directions. The face of a smartwatch would be a poor place to display maps, but it can display an arrow just fine. As you approach an intersection, the arrow will become bent, etc., indicating a right or left turn, just as we’re used to with the arrows in our GPS. Except there’s one problem: As you rotate your arm, the arrow, fixed as it is on the display, rotates right with you. Or at least it would if you didn’t have a compass embedded in your watch.

Here’s how a compass-equipped iWatch would work: You start by asking Siri to guide you someplace in the city, and the Maps app on your iPhone works out the route.  The iPhone issues its first command to the watch:  “iWatch: Display a straight arrow pointing toward 22 degrees.” (Actual syntax more complex.) The iWatch “knows” which way is North from its compass, so it adds 22 degrees to that and displays the arrow pointing toward 22 degrees.  Then, it updates that image, say, 15 times a second, as necessary.  You can rotate your arm all you want, but the iWatch will always display that arrow just floating there, always pointing toward 22 degrees magnetic.
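Stripped of the hand-waving, the watch’s half of that job is one line of arithmetic, sketched below; the function name and the little demo loop are mine, purely for illustration:

```python
# Sketch of the arrow-drawing rule: the phone commands a world-fixed bearing,
# and the watch redraws the arrow relative to its own compass heading each
# refresh, so the arrow appears to float while the wrist turns.

def arrow_angle_on_screen(target_bearing_deg: float,
                          watch_heading_deg: float) -> float:
    """Angle to draw the arrow, measured clockwise from 'up' on the watch face."""
    return (target_bearing_deg - watch_heading_deg) % 360.0

# Phone: "iWatch: display a straight arrow pointing toward 22 degrees."
for heading in (0.0, 90.0, 200.0):       # the wearer rotates their arm
    print(heading, arrow_angle_on_screen(22.0, heading))
# The drawn angle changes with the wrist, so the arrow keeps pointing
# at 22 degrees magnetic in the real world.
```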

The watch might also display the remaining minutes until the bus you’re hoping to catch will arrive, along with an indicator letting you know if your pace is sufficient.

With people no longer needing to stare at their iPhones as they walk down the street, there will be fewer people run over and fewer people subjected to having their iPhones snatched from their hands.

“What’s That [thing]?” You’re standing in a forest clearing and a waterfall high on the mountain catches your eye.  You raise your hand, point your finger, and say, “What’s that waterfall?”  Your iPhone’s speaker responds, “That’s the upper level of Yosemite Falls.” Simple: The GPS (in the phone) establishes your position, the iWatch compass reports the direction your arm is pointing, its accelerometer reports declination, and triangulation in the app on the phone corrects for the offset between your eyes and shoulder joint. (Yes, finer resolution could be achieved by having the user start out by running a setup routine to determine each user’s dominant eye. A bit beyond the scope of this article, no?)
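A sketch of that lookup, with a toy landmark list standing in for a real database: cast a ray from the user’s position along the reported azimuth and elevation and return the first named feature lying close to it. Names, coordinates, and tolerance are all invented for illustration:

```python
# Sketch of the "What's that thing?" lookup: cast a ray from the user's GPS
# position along the arm's compass bearing and elevation, and return the
# landmark closest to that ray. The landmark list is a stand-in for a real
# database; all coordinates are made up.

import math

LANDMARKS = [
    # (name, east_m, north_m, up_m) relative to some local origin
    ("Upper Yosemite Fall", 1200.0, 2500.0, 430.0),
    ("Lost Arrow Spire",    1500.0, 2300.0, 520.0),
]

def whats_that(user_xyz, azimuth_deg, elevation_deg, tolerance_m=150.0):
    """Return the landmark closest to the pointing ray, if any is within tolerance."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    ray = (math.sin(az) * math.cos(el), math.cos(az) * math.cos(el), math.sin(el))
    best = None
    for name, *pos in LANDMARKS:
        v = [p - u for p, u in zip(pos, user_xyz)]
        along = sum(r * c for r, c in zip(ray, v))                     # projection onto ray
        if along <= 0:
            continue                                                   # behind the pointer
        miss = math.sqrt(max(0.0, sum(c * c for c in v) - along ** 2)) # distance off the ray
        if miss < tolerance_m and (best is None or miss < best[1]):
            best = (name, miss)
    return best[0] if best else "nothing I recognize"

print(whats_that((0.0, 0.0, 1200.0), azimuth_deg=26.0, elevation_deg=-14.0))
```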

For just these last two apps alone, having a compass would be very cool, and I hope they’ll incorporate one in the first release.  If they don’t, then these last two apps will fall into the category of…

Future Releases

With subsequent product generations, the iWatch will take on more and more of a central role in your iLife.

Important papers. You know that sinking feeling when you realize you left your wallet at home?  It would be nice if having your NFC chip with you in the watch would, from day-one, remove most of that, enabling you to buy lunch, gas, and food for dinner, but how about if it also stored electronic copies of your driver’s license, your passport, etc., along with an access pathway to your medical records for emergency personnel?

Ubiquitous access. Approach any Apple device, mobile or not, when wearing your iWatch. Armed with that device’s owner’s approval and your passcode, make it temporarily yours. If it’s a Mac, you will see your account just as you last left it. If it’s a phone, it will, for as long as you’re holding it, be your phone, being billed to your account, showing your address book, etc. (This is a concept we showed in the opening scene of my 1993 film, Starfire.) Securing that kind of access will require two-factor authentication, and, with the iWatch, that authentication will finally become available and simple.

First Release

So when will the iWatch come out? I need mine no later than a week from Tuesday, but Apple, when you look back, is never actually the first. They let a few others, sometimes many others, experiment first. (Tablets had been out for more than a decade.) Then, they bring out the killer product. We may have to wait until next year, or around 7500 passcode/password entries from now. Please, Apple, get a move on!

Postscript – One Week Later

It may seem like this watch has every bell and whistle imaginable, but if you carefully examine what I’ve proposed, I’ve really outlined proven technology that is here today, found in other wearable products.  It is packaged differently, to be sure, but that has always been Apple’s hallmark.  In fact, the iWatch I have outlined uses much simpler technology than products already out there.  It does not have a speaker, an earphone jack, or a camera. I do not anticipate that it will be a two-way wrist radio nor a two-way wrist videophone, at least not for a long, long time.

The reason that some reviewers have seen the article as extravagant is that it projects the iWatch into a mature future. Consider back in 2007 when you first heard that Apple was about to release a line of phones. At that time, sophisticated phones held perhaps a dozen apps, most of them simple games, all of them relatively difficult to use. Suddenly, you read that this new phone would not only make calls, but that users would soon be able to see geosynchronous satellites in orbit simply by raising the phone in the air, to deposit a check in their bank accounts just by aiming their phone at it, and to do not another dozen things, but another 800,000 things that might interest them. People might have imagined the phone would have to be the size of a house and the complexity of an NSA supercomputer.

Visioneering is about looking at the way products will appear at maturity in order to design in the necessary elements that will enable that maturity to take place. What sounds extravagant in this case arises from a conservative hardware design coupled with an open architecture heavily dependent on the existing Apple infrastructure. It is the openness of the architecture and the ability of Apple to leverage its infrastructure that will offer Apple the advantage and make this vision possible. Don’t expect every feature and certainly not every app to be in circulation on day one, but they and many more will be there in short order, much faster than with previous products.

Below, you will find extensive reader comments that include many good ideas for some of those future apps as well as follow-on designs.

The Forum

The lively discussion that followed first publication of this article produced a number of excellent ideas both for software that could make use of an initial release as well as follow-on products. (The discussion is now closed.)

First Principles of Interaction Design (Revised & Expanded)

The following principles are fundamental to the design and implementation of effective interfaces, whether for traditional GUI environments, the web, mobile devices, wearables, or Internet-connected smart devices.

Help!

This is a huge revision. I expect I have made mistakes. Please leave corrections and suggestions in the Comments at the end. If you have better examples than I’m using, please include them as well, but give me enough information about them, including links or cites, that I can make use of them.

 

This revision features new examples and discussion involving mobile, wearables, and Internet-connected smart devices. However, the naming and organization remain the same except for three changes: I have shortened the name of one principle to extend its reach: “Color Blindness” is now simply Color and includes more than just color blindness. I’ve added one new principle, Aesthetics, and brought back two old principles, Discoverability and Simplicity. I dropped them from the list more than a decade ago when they had ceased to be a problem. Problems with Discoverability, in particular, have come roaring back.

What has changed greatly is the level of detail: You will find many new sub-principles within each category, along with far more explanation, case studies, and examples.


Previous Version & Its Translations. I’m continuing access to the original version of First Principles because it is cited in many scientific papers.


Introduction

Effective interfaces are visually apparent and forgiving, instilling in their users a sense of control. Users quickly see the breadth of their options, grasp how to achieve their goals, and can settle down to do their work. Effective interfaces do not concern the user with the inner workings of the system. Work is carefully and continuously saved, with full option for the user to undo any activity at any time. Effective applications and services perform a maximum of work, while requiring a minimum of information from users.

The fact that an application or service appears on the web or on a mobile device does not change the principles. If anything, applying these principles—all these principles—becomes even more important.

I Love Apple, But It’s Not Perfect

I’ve used many examples drawn from Apple products here, often as examples of bad interface practices. Apple has made many revolutionary breakthroughs in interaction technology, a trend I fully expect will continue. They also make mistakes, fewer than most, but because I use Apple products almost exclusively, I suffer from them daily. While composing this document on my various Apple devices, it’s only natural that I extract examples from what I see here and now.

Please do not take from this document that I am somehow an Apple hater. In 1978, I designed Apple’s first human interface after being recruited by Steve Jobs. I spent more than 14 years with the company. I buy most every new major product Apple releases on Day One and have much of my retirement invested in Apple stock. I love Apple, support Apple, but would like it to do even better.

 


Aesthetics

  • Principle: Aesthetic design should be left to those schooled and skilled in its application: Graphic/visual designers
  • Principle: Fashion should never trump usability

Generating artificial obsolescence through fashion is a time-honored and effective way to sell everything from clothing to cars. A new fashion should not and need not detract from user-performance: Enormous visual and even behavioral changes can be carried out that either do not hurt productivity or markedly increase it.

  • Principle: User test the visual design as thoroughly as the behavioral design

User test after aesthetic changes have been made, benchmarking, where applicable, the new design against the old. Ensure that learnability, satisfaction, and productivity have been improved or at least have stayed the same. If not, newly-added aesthetics that are causing a problem need to be rethought.



Anticipation

  • Principle: Bring to the user all the information and tools needed for each step of the process

Software and hardware systems should attempt to anticipate the user’s wants and needs. Do not expect users to leave the current screen to search for and collect necessary information. Information must be in place and necessary tools present and visible.

Anticipation requires that designers have a deep understanding of both the task domain and the users in order to predict what will be needed. It also requires sufficient usability testing to ensure the goal has been met: If a tool or source for information is there on the screen, but users can’t find it, it may as well not even be present.

The penalty for failing to anticipate is often swift and permanent, particularly if you do not have a captive user, as is the case with public websites and apps, for example. Those users will probably never return from their search. Even if you do have a captive user, you probably don’t have a captive client, and if the client’s employees are wasting time trying to find required resources, your competitors will have a good story to tell when it is time to make their next pitch.



Autonomy

  • Principle: The computer, interface, and task environment all “belong” to the user, but user-autonomy doesn’t mean we abandon rules

Give users some breathing room. Users learn quickly and gain a fast sense of mastery when they are placed “in charge.” Paradoxically, however, people do not feel free in the absence of all boundaries (Yalom, 1980). A little child will cry equally when confined in too small a space or left to wander in a large and empty warehouse. Adults, too, feel most comfortable in an environment that is neither confining nor infinite, an environment explorable, but not hazardous.

  • Principle: Enable users to make their own decisions, even ones aesthetically poor or behaviorally less efficient

Autonomy means users get to decide what keyboard they want to use, how they want their desktops to look (even if they like clutter), and what kind of apps they want to run. When developers take that kind of control away, users can be left frustrated and angry.

  • Principle: Exercise responsible control

Allowing users latitude does not mean developers should abandon all control. On the contrary, developers must exercise necessary control. Users should not be given so much rope they hang themselves. However, some developers today are not only taking excessive control, but making huge HCI errors in the process, like restricting text to fonts and sizes that people with ordinary eyesight can’t read. They offer editing schemes that require the user to use their fat finger to place the text cursor with pixel-precision accuracy just to avoid adding the necessary arrow keys to their aesthetically perfect, but functionally crippled, keyboard.

They also set an arbitrary timing and movement threshold for determining whether a user is or is not pressing a link on purpose, rather than her just pausing for an instant at the start of an upward swipe for scroll, for example. They then offer the user no way to alter that threshold, so many users find themselves triggering links to unwanted pages many, many times per day. That is an irresponsible application of control. We learned 30 years ago that users needed access to a slider for mouse double-clicking. Touch users need the same thing for link timing.

Perfect Link Triggering, Every Time

In thinking about solutions to problems like accidental link triggering, you have to consider the difference between a user who is accidentally triggering a link and a user who intends to trigger it.

The difference is easy when you think about it: I look at a link when I’m trying to trigger it. I’m not looking at the ones that I trigger by accident. Turn on the camera or use a built-in dedicated eye-tracker to look at the user’s eyes. If he looks right at the link for long enough for it to register in his mind that he’s touching it, he’s trying to follow the link. If he’s not looking at it, he’s accidentally touching it. When you have determined purposeful touching, trigger the link. If you determine the user is not purposely touching, ignore the fact that the user is touching it.

To save energy, the camera or eye tracker need not be turned on until the user is hovering over or pressing a link. The method and algorithm may require minor tweaks in timing, but it should prove quite accurate.

This may have already been invented, but, if not, it’s called, “Accidental Link Triggering Error-Reduction Method Using Eye Tracking,” and I hereby put it in the public domain.
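A sketch of the gating rule, with placeholder thresholds that would obviously need tuning against real gaze data: a touch fires the link only when the eyes have dwelt on (or very near) that link long enough to register it:

```python
# Sketch of the gaze-gated link rule described above: a touch on a link only
# fires if the eye tracker says the user has been looking at that link long
# enough to know they are touching it. Thresholds are placeholders to be tuned.

MIN_DWELL_MS = 250          # long enough to register "I'm touching this link"
MAX_GAZE_OFFSET_PX = 60     # gaze must land on or very near the link

def should_trigger_link(gaze_dwell_ms: float, gaze_offset_px: float) -> bool:
    """True only for a purposeful touch; incidental brushes are ignored."""
    return gaze_dwell_ms >= MIN_DWELL_MS and gaze_offset_px <= MAX_GAZE_OFFSET_PX

print(should_trigger_link(400, 12))   # looking right at it: follow the link
print(should_trigger_link(0, 900))    # thumb brushed a link while scrolling: ignore
```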

 

  • Principle: Use status mechanisms to keep users aware and informed

No autonomy can exist in the absence of control, and control cannot be exerted in the absence of sufficient information. Status mechanisms are vital to supplying the information necessary for users to respond appropriately to changing conditions.

  • Principle: Keep status information up to date and within easy view

Users should not have to seek out status information. Rather, they should be able to glance at their work environment and be able to gather at least a first approximation of state and workload.

  • Principle: Ensure status information is accurate

Status information can be up to date, yet inaccurate. At the time of this writing, when a user updated an iPhone or iPad to a new generation of system software, a progress indicator would appear showing that it would take approximately five minutes to complete the task. Actually, it typically takes an hour or more. (The new system itself would update in five minutes, but then all the other tens or hundreds of megabytes of information on the phone had to be reloaded.) The user, having been lied to, was left with no way to predict when she might actually get her device back. Such a user is not feeling autonomous.



Color

Color blindness

  • Principle: Any time you use color to convey information in the interface, you should also use clear, secondary cues to convey the information to those who cannot see the colors presented.

Most people have color displays nowadays. However, approximately 10% of human males, along with fewer than 1% of females, have some form of color blindness.

  • Principle: Test your site to see what color-blind individuals see
Search Google for simulation tools. For example, for websites, you might try http://enably.com/chrometric/. For images, http://www.colblindor.com/coblis-color-blindness-simulator/.

Color as a vital interface element

  • Principle: Do not avoid color in the interface just because not every user can see every color.

Color is a vital dimension of our limited communication abilities. Stripping away colors that a person who is color blind can’t see does no more for that person than turning off the entire picture does for a person who is completely without sight. It’s the presence of an alternate set of cues for that person that is important.

  • Principle: Do not strip away or overwhelm color cues in the interface because of a passing graphic-design fad.

Generating artificial obsolescence through fashion is a time-honored and effective way to move products, from clothing to cars. A new fashion should not and need not, however, detract from user-performance. User test after making aesthetic changes, benchmarking the new design against the old. Ensure that learnability, satisfaction, and productivity have improved or at least stayed the same. If not, newly added aesthetics that are causing a problem need to be rethought.



Consistency

The following four consistency principles, taken together, offer the interaction designer tremendous latitude in the evolution of a product without seriously disrupting those areas of consistency most important to the user.

1) Levels of Consistency

  • Principle: The importance of maintaining strict consistency varies by level.

The following list is ordered from those interface elements demanding the least faithful consistency effort to those demanding the most. (Many people assume that the order of items one through six should be exactly the reverse. This can lead to real confusion as users confront pages that look familiar, but act completely different.)

1. Top level consistency

Platform consistency: Be generally consistent with de jure standards (those dictated by guidelines and standards) and de facto standards (the unwritten rules to which the community adheres).

In-house consistency: Maintain a general look & feel across your products/services

Communicates brand and makes adoption of your other products and services easier and faster

2. Consistency across a suite of products, e. g., Microsoft Office

General look & feel communicates family

3. The overall look & feel of a single app, application, or service: splash screens, design elements, etc.

A visual designer should establish a purposeful & well thought-through visual language, shaped by usability testing. User behaviors should be fully transferable throughout the product.

4. Small visible structures, such as icons, symbols, buttons, scroll bars, etc.

The appearance of such objects needs to be strictly controlled if people are not to spend half their time trying to figure out how to scroll or print. Their location is only slightly less important than their appearance. Where it makes sense to standardize their location, do so.

5. Invisible structures

Invisible structures refers to such invisible objects as Microsoft Word’s clever little left border that has all kinds of magical properties, if you ever discover it is there. It may or may not appear in your version of Word. And if it doesn’t, you’ll never know for sure that it isn’t really there, on account of it’s invisible. That is exactly what is wrong with invisible objects and why, if you insist on using them, rigid consistency becomes so important.

Apple apparently thought this was a good idea and started copying Microsoft by adding invisible controls from scroll bars to buttons everywhere. The situation on the Mac got so bad that, by the early 2010s, the only way a user could discover how to use many of the most fundamental features of the computer was to use Google to search for help. (For more, see: Discoverability)

Some objects, while strictly speaking visible, do not appear to be controls, so users, left to their own devices, might never discover their ability to be manipulated. If you absolutely insist on disguising a control, the secret rule should be crisp and clean, for example, “you can click and drag the edges of current Macintosh windows to resize them,” not, “You can click and drag various things sometimes, but not other things other times, so just try a lot of stuff and see what happens.”

Objects that convey information, rather than being used to generate information, should rarely, if ever, be made invisible. Apple has violated this in making the scroll bars on the Macintosh invisible until a user passes over them.

6. Interpretation of user behavior

Changing your interpretation of a user’s habitual action is one of the worst things you can do to a user. Shortcut keys must maintain their meanings. A learned gesture must be interpreted in the standard way. If the button that carries the user to the next page or screen has been located at the bottom right for the last 30 years, don’t move it to the top right. Changes that require a user to unlearn a subconscious action and learn a new one are extremely frustrating to users. Users may not even realize what has happened and assume that something has failed in their hardware or software.

If you want to attract existing users of someone else’s product to your product, you should try to interpret your new user’s commands in the same way by, for example, allowing them to reuse the same shortcut keys they’ve grown used to.

 

Case Study: Apple “Command” Modifier Key
It was years before Apple finally gave Windows users a simple way to continue to use the Control key, rather than the Command key, for their keyboard shortcuts. Windows users new to the Mac faced great difficulty in unlearning/relearning such an ingrained habit. Users having to switch between the two operating systems as they moved between office and home had to unlearn/relearn twice a day and would end up constantly making errors, as well as having to set aside their task to consciously consider which modifier key to press each and every time they wanted to make use of “shortcut” keys that were shortcuts no more. A large percentage of the difficulty in switching or using dual operating systems was due to this one missing capability, and it was a completely unnecessary hindrance from the start.

 

 

2) Induced Inconsistency


  • Principle: It is just as important to be visually inconsistent when things act differently as it is to be visually consistent when things act the same

Make objects that act differently look different. For example, a trash can is an object into which a user may place trash and later pull it back out. If you want to skip the “and pull it back out” functionality, that’s fine. Just make it look like an incinerator or shredder or anything other than a trash can.

Make pages that have changed look changed. If someone encounters an unfamiliar page on an updated website or in a revised app, they know to look around and figure out what’s different. In the absence of such a cue, they will attempt to use the page exactly as they have always done, and it won’t work.

3) Continuity

  • Principle: Over time, strive for continuity, not consistency

If you come out with a completely re-worked area of your product or even a completely new product, it is important that people instantly recognize that something big has changed. Otherwise, they will jump into trying to use it exactly the way they always have and it just isn’t going to work. “Uniformity” would mean that your next product would be identical to your last, clearly wrong, but “consistency” is little better in a field where so much growth will continue to take place. Our goal is continuity, where there is a thread that weaves through our various products and releases, guiding our users, but not tying us to the past.

4) Consistency with User Expectation

  • Principle: “The most important consistency is consistency with user expectations”—William Buxton

It doesn’t matter how fine a logical argument you can put together for how something should work. If users expect it to work a different way, you will be facing an uphill and often unwinnable battle to change those expectations. If your way offers no clear advantage, go with what your users expect.

 

Case Study: The Xerox Star Drag Rule

The rule proposed for dragging icons in the Xerox Star Finder was a paragon of elegance:

  • Proposed Rule: Dragging a document icon from one object (e.g., a folder or disk) to another on the desktop will move the document

Easy to learn. Easy to understand. Logical. Teachable. Terrible. The rule, to be fair, worked well most of the time. It even worked better in some circumstances than the far more complex rule we use today. For example, if you dragged a document from the folder on your desktop to a floppy disk, it moved the document, rather than making a copy on the floppy. If you then made changes to the document at home, using the document now on the floppy, when you slid the document back onto your desktop at work the next morning, you didn’t have the new version as well as an obsolete version left there the day before. You just had the one true version.

Everything worked well until it was time to print on the Xerox Star. At that point, you would grab the document and drag it onto the printer icon. The document was transferred to the printer and erased forever from the desktop. A two-week war erupted between the engineers and the designers. The designers won, putting in place the rule we have today:

  • Final Rule: Dragging a document within a logical volume will move it. Dragging it from one logical volume to another will copy it

99+% of our users could not possibly tell you what a “logical volume” is, yet they understand the rule without our having to explicitly teach them. Why? Because it is consistent with user expectations, part of which is a very strong expectation that carrying out a routine activity will not result in the destruction of their work.

 

 



Defaults

  • Principle: Defaults within fields should be easy to “blow away”

When a user activates a field, the current entry should be auto-selected so that pressing Backspace/Delete or starting to type will eliminate the current entry. Users can click within the field to deselect the whole entry, dropping the text pointer exactly where they have clicked. The select-on-entry rule is generally followed today. (Sloppy coding, however, has resulted in the text cursor dropping at various unpredictable locations.)
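Here is a minimal sketch of that select-on-entry behavior in a web form, assuming a standard browser DOM; the helper name attachSelectOnEntry is mine, not from any toolkit. Tabbing into the field selects the whole default, while a direct click still drops the caret where the user clicked:

    // Sketch only: select a field's contents when it gains focus from the keyboard,
    // so typing or Backspace/Delete blows the default away, while a plain click
    // still places the caret exactly where the user clicked.
    function attachSelectOnEntry(input: HTMLInputElement): void {
      let focusedByPointer = false;

      input.addEventListener("pointerdown", () => {
        focusedByPointer = true;            // the user is clicking into the field
      });
      input.addEventListener("pointerup", () => {
        focusedByPointer = false;           // clear the flag once the click completes
      });
      input.addEventListener("focus", () => {
        if (!focusedByPointer) input.select();  // keyboard entry (Tab): select the whole default
      });
    }

    // Usage: document.querySelectorAll("input[type=text]").forEach(el =>
    //   attachSelectOnEntry(el as HTMLInputElement));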

  • Principle: Defaults should be “intelligent” and responsive

Not everything should have a default. If there isn’t a predictable winner, consider not offering any default. It takes precious cognitive cycles to look at a default that covers maybe 25% of the cases and make the decision not to use it. That same time could be spent entering the choice actually desired.

  • Principle: Replace the word “default” with a more meaningful and responsive term

Users rarely have any idea of what “Default” means in a given situation. (They do know its literal meaning, of course, which is that the bank is going to take away the user’s house. That always cheers them up.) Replace “Default” with “Revert to Standard Settings,” “Use Customary Settings,” “Restore Initial Settings,” or some other more specific terminology describing what will actually happen. User test to find out what terms enable your users to accurately predict what your software will actually do.

  • Principle: Both your vocabulary and visual design must communicate the scope of a reversion

Make sure, through user testing, that users understand the extent of the restoration: Are they signing up to a benign restoration of just a few recent and localized items, or are they about to spend the next four days re-entering usernames and passwords in every app they own?

User-test your restoration options to find out what users think the result of pressing the button will be. If you are going to do something benign, but they interpret it as potentially destructive, they won’t use the option, leaving them with the same broken or partially broken system that made them consider using it in the first place. Likewise, if you wipe out hours of careful customization without properly preparing them, they may not be nearly as grateful as you might expect. (I once had a young chap in India help me out with a minor problem on my DVR. When he was finished, he had led me to reinitialize the hard disk, erasing every single program on the machine. That was a bit more restoration than I was looking for. After that, I was able to carry out the rest of my conversation with him without even using the phone. I had no idea I could yell that loud.)

When designing tabbed objects, such as properties and preference windows, ensure that the visual design makes the scope of a restoration button clear. Individual tabbed “cards” should be visually separated from the surrounding window so that buttons may be placed either within the individual card or in the surrounding area, indicating whether the button action will apply only to the current tab or all tabs. There is never an excuse for leaving such a scope ambiguous. This is not a fashion decision.



Discoverability

  • Principle: Any attempt to hide complexity will serve to increase it

Functional software does not have to look like a tractor; it can look like a Porsche. It cannot, however, look like a Porsche that’s missing its steering wheel, brake, and accelerator pedal. Yet many tech companies in the late 1990s began purposely hiding their most basic controls, often to the serious detriment of their users. Why? Because they found it more important to generate the Illusion of Simplicity for potential buyers than to reveal the extent of complexity to their actual users.

Businesses are driven to hide complexity from buyers because it can pay off in the short term: Most consumers who are potential buyers make judgements as to whether they can conquer a new machine not by sitting down and spending a day trying to learn it, but by gazing at the screen for ten minutes while the salesperson gives a demo. Stripping away scroll bars, hiding buttons, doing all the things that this section tells you not to do can all lead to increased profits, at least in the short term.

 

Case Study: The Invisible Mac Scroll Bars

Scroll bars are used to generate information, as a user clicks or drags within them to inform the software that the user wishes to move to a different position within the page or document. However, just as often, users will glance at the scroll bar just to see where they are in a page. (Users try to maintain a sense of place on two different levels: first, their location right now within the visible part of the page; second, their location within the entire document.) By forcing the user to move the mouse away from their current center of interest in order to scrub the scroll bar to make it appear, they lose a primary cue as to their current position within the visible page, namely, the current location of the mouse pointer.

Making a complex control like the scroll bar invisible likewise slows the user attempting to actually scroll: With the scroll bar invisible, the user can’t predict where in the blank space hiding the control she should scrub in order to find herself over the “elevator” she will use to do the scrolling. Instead, she first has to scrub somewhere in the scroll bar (target one) and then slide up or down to the elevator (target two). (See Fitts’s Law for why this is bad.) Let’s assume that extra step would only require one extra second, a very conservative estimate. One second lost × ten scrolls per day per person × 66 million Mac users (at the time of this writing) = 21 person-years of wasted life and productivity per day, all to make a screen in a store look simpler.
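For those who like to check the arithmetic, here is the same back-of-the-envelope calculation, using only the assumptions stated in the case study:

    // Back-of-the-envelope cost of the extra acquisition step (assumptions from the case study).
    const secondsLostPerScroll = 1;          // conservative estimate
    const scrollsPerPersonPerDay = 10;
    const macUsers = 66_000_000;             // at the time of this writing

    const secondsLostPerDay = secondsLostPerScroll * scrollsPerPersonPerDay * macUsers;
    const personYearsPerDay = secondsLostPerDay / (60 * 60 * 24 * 365);

    console.log(personYearsPerDay.toFixed(1)); // "20.9" person-years wasted per day, i.e., about 21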

 

 

  • Principle: If you choose to hide complexity, do so in the showroom only

You need never decide whether to support potential buyers or eventual users. We are not working with fixed pieces of hardware. We work with either pure software or hardware driven by software. A designer can easily create a system that will fully support both buyer and user, switching appearance depending on current need. You can design the software for an operating system, for example, that will present itself in a very simple form in the store only to slowly open up like a flower, offering the user more and more accessibility and functionality as the user becomes progressively more skilled and more comfortable.

Crippling an interface might help make the initial sale, but in the long run, it can lead to having your most important “sales force,” your existing customer base, not only leave you, but tell your potential buyers to stay away as well.

  • Principle: If the user cannot find it, it does not exist

Not all buyers are naive. Even those that are don’t stay that way long. Only the most persistent buyers/users will travel the web searching for a treasure map to features that you choose to hide from them. Most will simply turn to your competitors, taking you at your word that you just don’t offer whatever they were after.

 

Case Study: Safari for the Mac
I abandoned the Safari browser for Firefox after I found that Safari for the Mac had started corrupting PDF files when doing a Save As. Two years later, I tried Safari again, assuming Apple had by now fixed the bug. The bug persisted, but I persisted this time, too, spending 20 minutes Googling for a work-around. The solution I discovered? Never use Save As to save a PDF file. Instead, mouse away from the top of the display, where 100% of the file-manipulation controls are, toward the bottom of the window, where nothing but content exists. Suddenly, a gray box with a floppy-disk icon will start fading into view inside the content region. Click the icon, and Safari will save your PDF.

 

 

  • Use Active Discovery to guide people to more advanced features

With Active Discovery, you cease waiting for people to find something and, instead, offer it to them. In its ideal form, your system “realizes” they now need it and offers it to them. In most instances, we are far from being able to do that. A workable compromise:

  1. Mention to a user that a feature exists at about the earliest time he might need it
  2. Repeat the message at intelligently spaced intervals. Not over and over again.
  3. Stop mentioning it once either explored or adopted

The messaging might take the form of a “Did you know…” hint that you show during startup. (If you see that a large percentage of your users are turning off these hints, it reflects that you are mentioning features prematurely, giving them too many hints too often, or continuing to tell them about features they have already adopted.) It is not necessary to give a helpful hint each and every time the user starts up your app. Hints are more likely to be read and appreciated if they appear only on occasion.
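A rough sketch of the three-step compromise above; the names, fields, and intervals are illustrative only, and the real triggers and spacing should come from your own usage data and testing:

    // Rough sketch of Active Discovery scheduling; names and intervals are illustrative.
    interface FeatureHint {
      featureId: string;
      firstRelevantAt: number;     // when the user could first benefit (ms since epoch)
      remindEveryMs: number;       // intelligently spaced interval, not "every launch"
      lastShownAt?: number;
      adopted: boolean;            // set true once the feature is explored or used
    }

    function shouldShowHint(hint: FeatureHint, now: number): boolean {
      if (hint.adopted) return false;                 // step 3: stop once explored or adopted
      if (now < hint.firstRelevantAt) return false;   // step 1: not before it is useful
      if (hint.lastShownAt === undefined) return true;
      return now - hint.lastShownAt >= hint.remindEveryMs;  // step 2: spaced, not constant
    }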

 

Case Study: GroceryIQ app for the iPhone

GroceryIQ enabled the user to scan the bar codes of rapidly-emptying containers in the user’s kitchen or to type in a few leading words of any product they might find in a grocery store. Armed with the resulting list, they could walk through their neighborhood market, marking off items as they came across them. It was quick, efficient, and, in general, designed competently. The developers had made no effort to hide complexity. However, the user could make good use of the app within a couple of minutes, discovering further complexity when and if their wants and needs expanded.

GroceryIQ had no real competition in this rather narrow sector for several years. When a new app was released that also enabled the user to sync between devices, so that, for example, a wife could enter an item, such as a quart of milk, and it would show up in the husband’s shopping list a few seconds later, many people jumped ship, moving to the new product.

So why is this a case study of interest? Because GroceryIQ had had syncing between devices for more than five years before the competing product was released. It was easy to set up and very reliable. It was not purposely hidden, and its name, Sync, certainly was not open to misinterpretation. The problem was that it was on the More… submenu, rather than being constantly displayed at the bottom of the phone. An Active Discovery reminder, triggered on opening perhaps a few days after the app was installed, along with a follow-up every three months thereafter until Sync was set up, would have prevented a loss of customers five years later when the new app, “featuring Sync!” showed up.

 

 

  • Principle: Controls and other objects necessary for the successful use of software should be visibly accessible at all times

The object itself should either be in view or enclosed by an object or series of objects (documents within folders, menu items within a menu, for example) that, in turn, are visibly accessible at all times.

Exceptions can be made for systems that are used habitually, such as a mobile browser or reader, where:

  1. Screen size is so limited that it is impractical to display items not currently needed, and
  2. It would be difficult or impossible for the user to fail to trigger the appearance of the controls by accident, thus ensuring the user will discover their existence.

These exceptions cover standard and widely-used operating system objects and behaviors on mobile devices, as long as users are given simple and obvious access to a help guide for using them.

  • Principle: There is no “elegance” exception to discoverability

A few designers, having fallen in love with the clean lines of smartphone apps, thought it would be great to visit those same clean lines on giant-screen computers. Wrong! Hiding functionality to create the Illusion of Simplicity is an approach that saps user-efficiency and makes products an easy target for competitors.

  • Principle: With the exception of small mobile devices, controls do not belong in the middle of the content area

Smartphone and tablet controls are sometimes forced into the content area because the screens are so small that it is the only area there is. Even there, you need to provide a standard trigger, such as a tap in the middle of the content area, that will simultaneously expose all the icons and buttons representing all the hidden controls so that users don’t have to carry out a treasure hunt.

Web Pages & Cloud-based Applications

The first “inside-out” applications appeared inside web browsers as independent developers struggled to create complex sites and apps within the confines of a simple window in a simple tool designed for simple browsing of static pages. Even though the basic paradigm was shown inadequate by 1996, nothing at all has been done by standards committees to address the needs of these developers.

Developers working on complex sites and cloud-based web applications deserve access to the browser’s menu bar. That they don’t already have it is a continuing scandal. Imagine your boss telling you to whip up a competitor to Photoshop that uses Microsoft Word’s menu bar. The first words out of your mouth would be, “but that would mean I’d have to put all my menus in the content area of the window where the user’s image needs to be!” Insane, right? Yet that’s just what we do every time we create a web page.

 

On laptop and desktop computers, controls, and, in particular, hidden controls have no place inside the content area.

 

Case Study: Apple’s Inverted Applications
Apple, as of the early 2010s, began migrating controls from the area surrounding the content region of their Macintosh applications into the content region itself. The controls popped up in locations that would obscure the very content the user was attempting to affect. The user was often unable to move the controls far enough out of the way to be able to see what they were doing. Even when a control panel could be moved beyond the edges of the content window, it would revert to its original obscuring position the next time it was invoked. While this approach was sometimes a necessity for apps designed for phones, it made no practical sense for apps designed for the large screens found on traditional computers. Some of the affected apps would take up as little as 10% or 20% of the user’s screen, providing huge amounts of room for the controls suddenly being forced into the even smaller content region. It resulted in serious degradation of the user’s satisfaction and productivity.

 

 

  • Principle: Communicate your gestural vocabulary with visual diagrams

Include a help page that shows the gestures your app can understand. Present the page when the user first opens the app and make it clear where the user will find this help page after that. In a mobile app, make the icon representing the page constantly visible or have it form part of the standard set that appears around the periphery when the user touches the triggering area of the mobile screen. For magazines and similar media objects, make it the first open page (after the cover) of each and every issue, always there and available.

Instructions for Popular Photography for iPad, an interface worth exploring

  • Principle: Strive for Balance

It is not 1980, when most people had never seen a computer and we necessarily made everything highly visible. You can use subtlety in design: Don’t put an info icon next to every single item on the page. Instead, use overlays like this one from Google+ Snapseed that explain every symbol and every gesture at once:

Google+ Snapseed help overlay showing meaning of symbols as well as gestures

It is difficult to see with the overlay in place, but the developers have reinforced their Cancel and Apply arrows with permanent, written labels. How do you find out if such reinforcement is necessary? How do you find out whether the user can figure out what to press (in this case, an always-visible question mark at the extreme top right) to get help to begin with? Follow the next principle and apply the results.

  • Principle: User-test for discoverability

To discover what information you need to communicate and to ensure you’re successfully communicating it, you must do routine usability tests throughout a project. Test using a population that has your expected level of experience with the system and task domain. See if they can locate, identify, learn, and use the tools they need to perform the tasks you expect your users to perform. If they cannot, iterate the design until they can. Make use of Active Discovery, Dealer Modes, whatever you need to ensure your users can discover and learn the features of your product.

Not one of the errors I’ve discussed above would ever make it into production were user-experience groups conducting usability studies and altering the design based on those results.



Efficiency of the User

  • Principle: Look at the user’s productivity, not the computer’s

In judging the efficiency of a system, look beyond just the efficiency of the machine. People cost a lot more money than machines, and while it might appear that increasing machine productivity must result in increasing human productivity, the opposite is often true. As a single example, forcing customers to enter telephone numbers without normal spacing or punctuation saves a single line of code and a handful of machine cycles. It also results in a lot of incorrectly captured phone numbers, as people cannot scan clusters of ten or more digits to discover errors. (That’s exactly why phone numbers are broken up into smaller pieces.) The amount of time wasted by just one person in your company trying to track down the correct version of an incorrectly entered phone number would sweep away the few minutes it would have taken to code the entry form so users could scan and correct their errors. Wrong numbers also can and do result in lost sales. How many trillions of machine cycles would the profit from a single lost sale have covered?
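Accepting human-friendly phone numbers costs almost nothing. Here is one possible sketch (the helper names and the US-style grouping are merely illustrative): store only the digits, and display them in scannable groups:

    // Accept "(415) 555 0199", "415-555-0199", or "4155550199" alike;
    // store a canonical digit string, display a scannable grouping.
    function normalizePhone(raw: string): string {
      return raw.replace(/\D/g, "");             // keep digits only for storage
    }

    function formatUsPhone(digits: string): string {
      if (digits.length !== 10) return digits;   // leave unusual lengths alone
      return `(${digits.slice(0, 3)}) ${digits.slice(3, 6)}-${digits.slice(6)}`;
    }

    console.log(formatUsPhone(normalizePhone("415 555 0199"))); // "(415) 555-0199"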

  • Principle: Keep the user occupied

Typically, the highest expense by far in a business is labor cost. Any time an employee must wait for the system to respond before they can proceed, money is being lost.

  • Principle: To maximize the efficiency of a business or other organization you must maximize everyone’s efficiency, not just the efficiency of the IT department or a similar group

Large organizations tend to be compartmentalized, with each group looking out for its own interests, sometimes to the detriment of the organization as a whole. Information technology departments often fall into the trap of creating or adopting systems that result in increased efficiency and lowered costs for the information resources department, but at the cost of lowered productivity for the company as a whole. It is your job to run the studies that prove out whether new designs based on new, money-saving technologies will increase or decrease overall productivity among the affected workers and, if so, by how much and with what result to the corporation’s bottom line.

Work with HR or department heads to find out the average cost per hour of affected employees. (An honored rule of thumb is to take their hourly wage and multiply it by three to include all the other overhead associated with an employee, from rent to heat, lights, computer support, etc.) Multiply the overhead cost × the number of affected employees × the time an activity takes × the difference in productivity, positive or negative, in carrying out the activity to find the actual cost of a change. A positive number will help you sell your group and your design. A negative number will help keep your company from making a costly mistake.
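Here is the same arithmetic as a worked sketch; every number below is invented purely for illustration:

    // Worked example of the cost-of-change arithmetic; all figures are illustrative.
    const hourlyWage = 30;                       // dollars
    const overheadMultiplier = 3;                // rule of thumb: wage x 3
    const loadedCostPerHour = hourlyWage * overheadMultiplier;

    const affectedEmployees = 500;
    const hoursOnActivityPerYear = 100;          // per employee
    const productivityDelta = -0.05;             // a 5% slowdown caused by the change

    const annualCost =
      loadedCostPerHour * affectedEmployees * hoursOnActivityPerYear * productivityDelta;

    console.log(annualCost); // -225000: the change costs the company $225,000 a year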

  • Principle: The great efficiency breakthroughs in software are to be found in the fundamental architecture of the system, not in the surface design of the interface

This simple truth is why it is so important for everyone involved in a software project to appreciate the importance of making user productivity goal number one and to understand the vital difference between building an efficient system and empowering an efficient user. This truth is also key to the need for close and constant cooperation, communication, and conspiracy between engineers and human interface designers if this goal is to be achieved.

Look at the difference between the iPad and the netbook computers it crushed. The differences had nothing to do with what key you pressed to open an email. They had to do with things like not having to press any key at all.

  • Principle: Error messages should actually help

Error messages must be written by a skilled writer to:

  1. Explain what’s wrong
  2. Tell the user specifically what to do about it
  3. Leave open the possibility the message is improperly being generated by a deeper system malfunction

“Error -1264” doesn’t do any of these. Rare is the error message that covers even Point One well. Yours should cover all three. Your Quality Assurance group should be charged with the responsibility for reporting back to you any message that does not fulfill the criteria.
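One possible way to keep your own messages honest is to force them into a shape that demands all three parts; the field names and wording below are illustrative, not a prescription:

    // Illustrative shape for an error message that meets all three criteria.
    interface HelpfulError {
      whatHappened: string;   // 1. explain what's wrong, in the user's language
      whatToDo: string;       // 2. tell the user specifically what to do about it
      caveat: string;         // 3. leave open the possibility of a deeper malfunction
    }

    const example: HelpfulError = {
      whatHappened: "We couldn't save your document because the disk it lives on is full.",
      whatToDo: "Free some space or choose File > Save As to save it to a different disk.",
      caveat: "If the disk shows plenty of free space, something else may be wrong; " +
              "please contact support and mention error code -1264.",
    };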

Many other principles in this list affect efficiency, Latency Reduction and Readability in particular.



Explorable Interfaces

  • Principle: Give users well-marked roads and landmarks, then let them shift into four-wheel drive

Mimic the safety, consistency, visibility, and predictability of the natural landscape we’ve evolved to navigate successfully. Don’t trap users into a single path through a service, but do offer them a line of least resistance. This gives the new user, and the user who just wants to get the job done in the quickest way possible, a “no-brainer” way through, while still offering those who want to explore and play what-if a means to wander farther afield.

  • Principle: Sometimes you do have to provide deep ruts

The earlier your users are on the experience curve, the more you need to guide them. A single-use application for accomplishing an unknown task requires a far more directive interface than a habitual-use interface for experts. The deepest ruts are wizards.

  • Principle: Offer users stable perceptual cues for a sense of “home”

Stable visual elements not only enable people to navigate fast, they act as dependable landmarks, giving people a sense of “home.” A company logo on every page of a website, including every page of checkout, all enabling the user to escape back to the home page, makes users feel safe and secure. Paradoxically, such cues make it more likely that people will not escape back to the home page, secure in the knowledge that they easily can.

  • Principle: Make Actions reversible

People explore in ways beyond navigation. Sometimes they want to find out what would happen if they carried out some potentially dangerous action. Sometimes they don’t intend to find out, but they do anyway by accident. By making actions reversible, users can both explore and can “get sloppy” with their work. A perfect user is a slow user.

  • Principle: Always allow “Undo”

The unavoidable result of not supporting undo is that you must then support a bunch of confirmation dialogs that say the equivalent of, “Are you really, really sure?” Needless to say, this slows people down.

If you fail to provide such dialogs, in the absence of undo, people slow down even further. A study a few years back showed that people in a hazardous environment made no more mistakes than people in a supportive and more visually obvious environment, but they worked a lot slower, taking great care to avoid making errors. The result was a huge hit in productivity.

We usually think of the absence of Undo as the sign of lazy programming, but sometimes people do it on purpose. For example, some ecommerce sites want to make it hard for you to take things back out of your shopping cart once you’ve put them in there. This turns out to be a backwards strategy: An ecommerce study we did at the Nielsen Norman Group looked at what happens when merchants make it really easy to take things out of shopping carts. As might be expected, people visiting these merchants were much more willing to throw things in, figuring, “oh, well, I can always take it back out later.” Except they didn’t take things back out: the deletion rate was no different. These users just bought more stuff.

  • Principle: Always allow a way out

Users should never feel trapped inside a maze. They should have a clear path out.

Cancel & Wizards

Cancel is particularly important in Wizards. Let people leave at any time, but make sure to tell them where they can finish the task later on. When you user-test, bring back, two weeks later, the same people whom you had cancel in the middle of a task, and ask them to continue. Watch where they browse. If they browse to two different places in your menus, consider putting the function in both places.

  • Principle: Make it easy and attractive to stay in

A clear, visible workflow that enables people to understand where they are and move either backward or forward in a process will encourage people to stick with a task. Consider, as an example, a multi-step checkout procedure. Making the navigation visible by putting each step on a clearly-labelled tab will let people know where they are in the process. Clicking an earlier tab should allow people to jump back to correct an error or just change their mind by, for example, selecting a different delivery address. They should then be able to click on the tab they were originally on and resume their forward movement. When you either forbid people to move back or destroy all subsequent data if they do, they will not be happy with you. Even if they decide to grit their teeth and continue with the current sale, they are unlikely to ever return.



Fitts’s Law

  • Principle: The time to acquire a target is a function of the distance to and size of the target

Use large objects for important functions (Big buttons are faster). Use small objects for functions you would prefer users not perform.

Use the pinning actions of the sides, bottom, top, and corners of your display: A single-row toolbar with tool icons that “bleed” into the edges of the display will be significantly faster than a double row of icons with a carefully-applied one-pixel non-clickable edge between the closer tools and the side of the display. (Even a one-pixel boundary can result in a 20% to 30% slow-down for tools along the edge.)

While at first glance, this law might seem patently obvious, it is one of the most ignored principles in design. Fitts’s law (often improperly spelled “Fitts’ Law”) dictates that Macintosh pull-down menu acquisition should be approximately five times faster than Windows menu acquisition, and this is proven out.

Fitts’s law predicted that the Windows Start menu was built upside down, with the most used applications farthest from the entry point, and tests proved that out. Fitts’s law indicated that the most quickly accessed targets on any computer display are the four corners of the screen, because of their pinning action, and yet, for years, they seemed to be avoided at all costs by designers.

  • Multiple Fitts: The time to acquire multiple targets is the sum of the time to acquire each

In attempting to “Fittsize” a design, look to not only reduce distances and increase target sizes, but to reduce the total number of targets that must be acquired to carry out a given task. Remember that there are two classes of targets: Those found in the virtual world—buttons, slides, menus, drag drop-off points, etc., and those in the physical world—keyboards and the keys upon them, mice, physical locations on touch screens. All of these are targets.
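For those who want to compute rather than eyeball it, Fitts’s Law is commonly written in HCI work (the Shannon/MacKenzie formulation) as MT = a + b × log2(D/W + 1), where D is the distance to the target, W is its size along the axis of motion, and a and b are constants that must be measured for your device and user population. A sketch, with purely illustrative constants, that also sums over a sequence of targets per the Multiple Fitts point above:

    // Shannon formulation of Fitts's Law; the constants a and b are illustrative and
    // must in practice be measured for your pointing device and user population.
    function fittsTimeMs(distancePx: number, widthPx: number, a = 100, b = 150): number {
      const indexOfDifficulty = Math.log2(distancePx / widthPx + 1);  // in bits
      return a + b * indexOfDifficulty;
    }

    // Multiple Fitts: the time for a task is the sum over every target acquired.
    function taskTimeMs(targets: Array<{ distancePx: number; widthPx: number }>): number {
      return targets.reduce((sum, t) => sum + fittsTimeMs(t.distancePx, t.widthPx), 0);
    }

    // A big, close target beats a small, distant one:
    console.log(fittsTimeMs(200, 100)); // ≈ 338 ms
    console.log(fittsTimeMs(800, 20));  // ≈ 904 ms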

  • Principle: Fitts’s Law is in effect regardless of the kind of pointing device or the nature of the target

Fitts’s Law was not repealed with the advent of smartphones or tablets. Paul Fitts, who first postulated the law in the 1940s, was working on aircraft cockpit design with physical controls, something much more akin to a touch interface than to the indirect manipulation of a mouse. The pinning action of the sides and corners will be absent unless the screen itself is inset, but the distance to and size of the target continue to dictate acquisition times per Fitts’s law, just as always.

  • Principle: Fitts’s Law requires a stop watch test

Like so much in the field of human-computer interaction, you must do a timed usability study to test for Fitts’s Law efficiency.

For more on Fitts’s Law, see my A Quiz Designed to Give You Fitts



Human Interface Objects

Human-interface objects are separate and distinct from the objects found in object-oriented systems. Our objects include folders, documents, buttons, menus, and the trashcan. They appear within the user’s environment and may or may not map directly to an object-oriented program’s object. In fact, many early GUIs were built entirely in non-object-oriented environments.

  • Principle: Human-interface objects can be seen, heard, felt, or otherwise perceived

Human interface objects that can be seen are quite familiar in graphic user interfaces. Objects that are perceived by another sense such as hearing or touch are less familiar or are not necessarily recognized by us as being objects. Ring tones are auditory objects, for example, but we tend to just think of them as ring tones, without assigning any higher-level category to them.

  • Principle: Human-interface objects have a standard way of being manipulated

Buttons are pressed, sliders are dragged, etc.

  • Principle: Human-interface objects have standard resulting behaviors

Dropping a document on a trash can does not delete it, it stores it in the trash can. Selecting “Empty Trash” is necessary to actually delete it.

  • Principle: Human-interface objects should be understandable, self-consistent, and stable
  • Principle: Use a new object when you want a user to interact with it in a different way or when it will result in different behavior

If dropping a document on your delete-document icon will destroy it instantly and permanently, do not make it look like a trash can. People come with expectations about previously encountered objects. It’s important not to confuse or water down such expectations. For example, if you use a trash can icon, but instantly destroy documents dropped into it, it broadens the rule for trash cans. Instead of the rule remaining: “Dropping a document on a trash can does not delete it. Rather, it stores it in the trash can. Selecting ‘Empty Trash’ is necessary to actually delete it,” it will shift to, “Dropping a document on a trash can will destroy it either right now or sometime in the next six months to a year.” That is not only confusing for your users, it is damaging to every other developer that uses the trash can icon in the proper way.



Latency Reduction

  • Wherever possible, use multi-threading to push latency into the background

Latency can often be hidden from users through multi-tasking techniques, letting them continue with their work while transmission and computation take place in the background. Modern web browsers can pre-fetch data, reducing the dead time when the user reaches the end of a task and must wait for the next page to appear.
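A sketch of the pre-fetch idea in a browser context, using the standard fetch API; the cache structure and URLs are illustrative:

    // Illustrative background pre-fetch: start loading the likely next page while the
    // user is still reading the current one, then serve it instantly when requested.
    const prefetched = new Map<string, Promise<string>>();

    function prefetch(url: string): void {
      if (!prefetched.has(url)) {
        prefetched.set(url, fetch(url).then(resp => resp.text()));
      }
    }

    async function loadPage(url: string): Promise<string> {
      const pending = prefetched.get(url);
      return pending ?? fetch(url).then(resp => resp.text());
    }

    // e.g., call prefetch("/checkout/step-2") as soon as the user lands on step 1.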

  • Principle: Reduce the user’s experience of latency
    • Acknowledge all button clicks by visual or aural feedback within 50 milliseconds
    • Trap multiple clicks of the same button or object.

Because the Internet is slow, people tend to press the same button repeatedly, causing things to be even slower.
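Both sub-points above can be handled in a few lines; this sketch acknowledges the press immediately and ignores repeats until the slow work finishes (the "pressed" class name is an assumption):

    // Acknowledge the click immediately and ignore repeats until the work completes.
    function wireSubmitButton(button: HTMLButtonElement, doWork: () => Promise<void>): void {
      let busy = false;

      button.addEventListener("click", async () => {
        if (busy) return;                      // trap multiple clicks of the same button
        busy = true;
        button.classList.add("pressed");       // visual feedback well within 50 ms
        button.disabled = true;

        try {
          await doWork();                      // the slow part runs once, not once per click
        } finally {
          button.disabled = false;
          button.classList.remove("pressed");
          busy = false;
        }
      });
    }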

  • Principle: Keep users informed when they face delay
Chart of Delay Feedback Times & Indicators

  • Principle: Make it faster to begin with

Eliminate any element of the application that is not helping. Be ruthless.

The sluggish speed of the early web set users’ expectations extremely low. (It also burst the Internet bubble when people realized they could get in their car and drive round-trip to the shopping center in less time than it took to “trick” the website into selling them something.) They have become less forgiving as time has passed.

Mobile, which has an architecture more in keeping with traditional GUI applications than web browsing, has been reminding people that computers can be fast, and they are even more impatient with slow-downs. Wearables will come with an even higher level of expectation: No one waits to see what time it is, and they will not wait to see who is calling, what the temperature is outside, or any other information to be displayed.

Automotive applications today are oftentimes sluggish, suffering from a fatal cocktail of weak hardware, poor design/coding practices, and high latency. Consider the car hurtling down the road at 88 feet per second (27 meters per second) while the user, eyes fixed on the flat panel display, waits to learn which of AC/DC’s many fine works he’s currently enjoying. Imagine the rich irony when the accident report reveals it was “Highway to Hell.”



Learnability

Ideally, products would have no learning curve: Users would walk up to them for the very first time and achieve instant mastery. In practice, however, all applications and services, no matter how simple, will display a learning curve.

  • Principle: Limit the Trade-Offs

Learnability and usability are not mutually exclusive. First, decide which is the most important; then attack both with vigor. Ease of learning automatically coming at the expense of ease of use is a myth.

Do-It-Yourself Case Study: Ashlar-Vellum Graphite

Rent yourself a copy of the high-end CAD/CAM program from Ashlar-Vellum called Graphite. Notice how it took you less than 10 minutes to learn how to do productive work. Then try doing the same thing with the leading CAD/CAM brand. Notice how after six weeks you’re still just staring at the screen, wondering what to do. (You can skip the other leading brand, but do rent and learn from Graphite. It will change the way you design.)

How do you decide whether learnability or usability is most important? The first thing you must do is identify frequency of use: Are you working on a product or service that will be used only once or infrequently, or is it one that will be used habitually? If it’s single-use, the answer is clear: Learnability. If someone will use this every day, eight hours a day for the rest of his or her life, the answer is equally clear: Usability.

Next, who is the buyer? If the person who will use it habitually will also make the buying decision, a product’s reputation for learnability may be a key factor in making the sale. That’s why you want to identify the more important of the two, then attack both.

  • Principle: Avoid only testing for learnability

Much usability testing involves running a series of tests at regular intervals with your spending only 20 minutes to an hour with each subject you recruit. You end up knowing everything about the initial learning curve and nothing about either the long-term curve or the end-state level of productivity.

If you are working on an application that will be used habitually, go about it entirely differently: Work with HR to hire temporary workers. Then, have them spend a week or two coming up to speed on the interface, monitoring them with time tests to see what the overall learning curve and the eventual efficiency of your interface actually prove out to be.



Metaphors, Use of

  • Principle: Choose metaphors that will enable users to instantly grasp the finest details of the conceptual model

Good metaphors generate in the users’ minds a strong series of connections to past experiences from the real world or from a previous cyberspace encounter, enabling users to form a fast and accurate sense of your system’s capabilities and limitations.

  • Principle: Bring metaphors “alive” by appealing to people’s perceptions–sight, sound, touch, and proprioception/kinesthesia–as well as triggering their memories

Try making your concepts visually apparent in the software itself. If that proves impractical, make them visually apparent through an illustration. The illustration should be compact and meaningful. Test it to see if it works, then embed it in such a way that every user who needs to see it will see it.

 

Case Study

Apple’s HyperCard browser: This precursor to the web had a three-layer structure, with a Background layer in common with all cards in a deck (the equivalent of all pages on a website), a Foreground “card” layer, with elements pertaining to an individual card (page), and a logical control layer for the individual card, with all the buttons, etc. If you didn’t understand this concept, you couldn’t author in HyperCard, and few understood it until the graphic designer, Kristee Kreitman, drew an exploded picture showing the three layers. When we tested that, everyone got it instantly. Twenty pages of perfectly accurate text drafted by its inventor? Nothing. One picture? Total success.

HyperCard Three-Layer Exploded Diagram

 

 

  • Principle: Expand beyond literal interpretation of real-world counterparts

Most metaphors evoke the familiar, but can and usually should add a new twist. For example, an electronic newspaper might bear a strong resemblance to a traditional paper, but with hyperlinks that enable users to quickly dive as far into articles as their interest drives them, something quite impossible with their paper counterparts. Not only is there no need to slavishly copy a real-world object (skeuomorphism), but unnecessarily limiting the functionality of a software counterpart just to “perfect” the imitation is most often bad design.

The inverse of skeuomorphism is abstraction, a prominent feature in so-called flat design, a fashion that took hold in 2013, turning once well-understood icons and other elements into meaningless abstractions and even false symbols. (For example, the icon for the browser on the iPhone became a compass, only connected to the concept of the web through the vaguest of abstractions. The iPhone has an actual compass, so they turned its icon into… another compass! Two compass icons: One tells you which way is north and the other connects you to your bank account. The Settings icon had originally looked like the inner workings of a clock, clearly carrying the message that this is an app that will let you see and affect the inside workings of the iPhone. That was abstracted to the point that it looks exactly like a large industrial fan.)

  • Principle: If a metaphor is holding you back, abandon it

A metaphor is intended to empower your user. However, there are times when it can also hold back your design.

 

Case Study: Dish Pointer Maps

DP Maps, as originally released in 2009 for the iPhone, used the time-honored map metaphor to enable people to aim TV satellite dishes.

DP Maps

It would superimpose TV satellite aiming information, including the direction to the satellite (the green line), on Google mapping data. Sometimes location and direction were fairly easy to interpret, as in this case, where the user is standing in the parking lot of 1 Infinite Loop, Apple’s then-current headquarters. With such a unique location, surrounded by distinct landmarks, locating which building the green line should just miss when aimed correctly was pretty simple. It was not so simple when the user was standing in a wooded area surrounded by tens of thousands of trees, all of which look remarkably alike when viewed from outer space. That map metaphor had been the best the developer could do for the original desktop version, when installers would look at locations on their computers at the office before going out to worksites. The iPhone, however, offered a new opportunity.

A new pair of glasses

Oftentimes, in reworking an old idea, you will want to “put on a new pair of glasses” as a way of looking at the old problem from an entirely different angle. In this case, my recommendation to the developer was that he do that quite literally, scrapping the map metaphor entirely in favor of giving people magical sight, so they could simply look up and see the communications satellites orbiting 22,000 miles in space. That’s exactly what he did:

DP AR Pro

To use the replacement app, you walk around outside or scramble around on the roof until you can see the satellite(s) you need against clear sky without anything in the way. Mount the dish there, and you’re done. No map interpretation skills are needed.

 

 



Protect Users’ Work

  • Principle: Ensure that users never lose their work

This principle is all but absolute. Users should not lose their work as a result of error on their part, the vagaries of Internet transmission, or any other reason other than the completely unavoidable, such as sudden loss of power to a client computer without proper power protection. We’ve gotten so used to being the victim of data loss that we often don’t even notice it. So consider if what happens routinely on the web happened in real life:

You go into Harrod’s Department Store in London. After making your selections, you are asked to fill out a four-page form. A gentleman looks the form over, then points to the bottom of Page 3 at your phone number. “Excuse me,” he says, “Look there. See how you used spaces in your phone number?” When you nod, he continues, “We weren’t expecting you to do that,” at which point, he picks up the four-page form and rips it to shreds before handing you a new, blank form.

Of course, never in a thousand years would such an event take place at Harrod’s, but another venerated British institution did exactly that to me almost twenty years into the miracle of the world wide web, when I was invited to give them my emergency contact information for an upcoming flight into London. Every time I would fill out all eight fields on the form, it would come back with an error message about at least one field, having destroyed the entire contents of all eight of them! I’m sure it was carrying out this wanton destruction for my own good, but I could not for the life of me figure out from the messages what it actually wanted. Twenty minutes and two browsers later, I gave up. I’m sure other passengers were abandoning their efforts far earlier.

Travel sites, in general, think nothing of repeatedly tossing all the information the user has entered about cities, times, days of travel, frequent flyer numbers, anything that takes time and trouble to type in. The user may attempt nothing more radical than leaving a half hour later, but that is apparently grounds to destroy their choice of departure city and date as well as arrival city and date. If the user is impolite enough to go to the bathroom, well, that sort of activity poses a significant security risk, so of course their entire evening’s work must be destroyed, with a message explaining it’s been done for their own good.

Travel sites may be the tip of the iceberg, but websites in general are notorious for their cavalier attitude toward their users’ hard work, and it doesn’t stop there: Traditional applications continue to crash and burn, and the excuses for entire computer systems crashing and burning are at an end. Small portables can survive a power outage. It’s no longer acceptable that many of today’s high-end desktop computers and operating systems still do not support and encourage continuous-save. That, coupled with a small amount of power-protected memory, could eliminate the embarrassment of $5000 machines offering less reliability than 10-cent toys.
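Continuous save need not be exotic. Here is a minimal sketch for a browser form that writes the user’s draft out a moment after each pause in typing, so a crash costs seconds of work rather than hours; in a real product you would also push the draft to a server, and the names here are illustrative:

    // Illustrative continuous-save: persist the user's work shortly after each pause in typing.
    function enableContinuousSave(field: HTMLTextAreaElement, draftKey: string): void {
      let pending: number | undefined;

      field.addEventListener("input", () => {
        if (pending !== undefined) window.clearTimeout(pending);
        pending = window.setTimeout(() => {
          localStorage.setItem(draftKey, field.value);   // survives a crash or reload
          // A real product would also send the draft to the server here.
        }, 1000);                                        // one second after the last keystroke
      });

      // Restore whatever was saved last time, so nothing is lost.
      const saved = localStorage.getItem(draftKey);
      if (saved !== null) field.value = saved;
    }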



Readability

  • Principle: Text that must be read should have high contrast

Favor black text on white or pale yellow backgrounds. Avoid gray backgrounds.

  • Principle: Use font sizes that are large enough to be readable on standard displays

You must demand of your marketing people that they tell you what the expected range of your customers’ standard displays will be. You then need to work with your graphic design and engineering people to ensure that the text your software renders will show up at appropriate sizes across that range of displays. It need not be one-size-fits-all. For example, CSS can mold itself to the system in which it finds itself.

  • Principle: Favor particularly large characters for the actual data you intend to display, as opposed to labels and instructions.

For example, the label, “Last Name,” can afford to be somewhat small. Habitual users will learn that that two-word gray blob says “Last Name.” Even new users, based on the context of the form on which it appears, will have a pretty good guess that it says “Last Name.” The actual last name entered/displayed, however, must be clearly readable. This becomes even more important for numbers. Human languages are highly redundant, enabling people to “heal” garbled messages. Numbers, however, unless they follow a very strict protocol, have no redundancy, so people need the ability to examine and comprehend every single character.

  • Principle: Menu and button labels should have the key word(s) first, forming unique labels

Experienced users read only as much of an item name as is needed to differentiate among items. Highly experienced users actually trigger on the difference in the external shapes of the entire first word(s) without ever actually reading anything.

  • Principle: Test all designs on your oldest expected user population

Presbyopia, the condition of hardened, less flexible lenses, coupled with reduced light transmission into the eye, affects most people over age 45. Do not trust your young eyes to make size and contrast decisions. You cannot.

  • Principle: There’s often an inverse relationship between the “prettiness” of a font and its readability.

Specifically, anti-aliasing softens the edges of a font, giving it a much smoother appearance on the digital page. The problem is that the human vision system responds to sharp edges, so, in smaller font sizes, an anti-aliased font, while often appearing more attractive, can be quite difficult to comprehend. There are anti-aliasing techniques that specifically increase the sharpness of the edges the eye is seeking, so this is not strictly a black and white issue (so to speak), but it is definitely something of which you should be aware and not something in which every graphic designer has been schooled. You will want to run some reading-speed and comprehension tests on proposed font changes.



Simplicity

  • Principle: Balance ease of installation vs. ease of use

As designers, we need to strive to simplify users’ lives. That often requires a delicate balance between our effort to make installation of a product easier and making subsequent use of that product easier or better.

Consider the autofill feature for browsers: The user is required to enter and maintain a database of information the browser can then drop into a form at the user’s command. It takes time to set it up, and every time anything changes, it’s just one more record that has to be altered. What’s more, it often fails to work, either not responding or putting incorrect data all over the place.

Apple has simplified the setup process by enabling the user to link Safari’s autoFill to the user’s contact card in his or her address book. However, the ability of Safari to actually fill out a form is just as dismal as it’s ever been, largely because there is no standardization of labels, locations, or anything else in forms.

I have solved the autoFill problem with a more technically complex solution: I use an app called Keyboard Maestro that sits in the background looking for certain key combinations. When it finds one I’ve programmed, it automatically replaces the text I’ve typed with a string of text I’ve previously stored. Setup was definitely more difficult, but now when I open a form and a field calls for my first name, I type, “bbbb” and it’s replaced with “Bruce.” I type “aaaa” and my address appears, “pppp” and my phone number pops into place, etc. It takes me 30 seconds to fill out a form, longer than autoFill would if it actually worked, but this method works on every single form every single time. It saves me time, effort, and frustration.

Often, you can assume that one user in the house will be technically inclined. When you have a trade-off between simplicity of installation/set-up and ease of use, get together with your marketing people. If they tell you that you can depend on at least one reasonably clever or sophisticated user, do make life a bit more difficult at first if it will make subsequent use a lot simpler for everyone else. However, expend effort making both installation and operation as simple as possible. That’s the approach that Nest took, where one person in the house must go through a complex and confusing process to tie their products to the Internet, but thereafter the lives of everyone in the house become simpler.

(In Nest’s defense, they are doing their best to overcome a major flaw in the way Wi-Fi setup works. It requires users to leave their normal Wi-Fi and log in to a new “network” with a gobbledygook name which is, in fact, their Nest device. This is a weird, backwards activity that throws most users the first time they encounter it. It also requires their going into the “basement area” of their phone or tablet, a place most users avoid whenever possible. It’s all-around bad, and the committees that oversee the Wi-Fi protocol need to address the issue if connected devices are to take off on an expanded basis.)

  • Principle: Avoid the “Illusion of Simplicity”

In the early years of this century, Apple became so focused on generating the illusion of simplicity for the potential buyer that they began seriously eroding the productivity of their products. They thought they had a good reason: They wanted new products to look bright, shiny, and simple to potential users. That’s an excellent goal, but actual simplicity is achieved by simplifying things, not by hiding complexity. (See Visibility.)

It’s just fine to make your showroom products look simple, but, to the extent you want to hide complexity to avoid scaring away buyers, do so in the showroom, not in the home or office of the purchaser now trying to accomplish real work. I started putting a special Dealer Mode into Apple software in 1978, so that the product would look and act differently in the showroom than in the buyer’s home. Computers allow that. Somewhere along the line, people forgot.

  • Principle: Use Progressive Revelation to flatten the learning curve

It is OK to make the user’s environment simpler when they are learning by hiding more advanced pathways and capabilities, revealing them when users come to need them and know how to handle them. This is distinct from the illusion of simplicity, where necessary controls are made invisible or hidden in obscure and unusual places so that users have to go on treasure hunts to find the tools they need to use right now.

Progressive revelation can cut down on support costs by eliminating calls from users trying to understand advanced capabilities before they have learned enough about the task domain to need them. It can also raise costs if advanced features are not introduced when they are needed or are too well hidden.

  • Principle: Do not simplify by eliminating necessary capabilities

This became another Apple problem after the release of their mobile devices. In 2014 on a Mac, you can set an alarm that will trigger 90 minutes before a calendar event. On the iPad, you can set a trigger for either one hour or two, but not 90 minutes. If a person needs a warning 90 minutes before the event, that’s when they need the warning. Apple has “simplified” the interface by giving the user no way to set an arbitrary time. No weakness or defect in the underlying interface would prevent Apple from giving users this capability. It is a conscious decision to limit what people can do with the product.

The way you set a 90-minute warning on an iPad is to create a second event, 30 minutes before the real event, and set a 60-minute warning on that.

How is that simpler?
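
For contrast, here is a minimal sketch of the withheld capability: accept any offset, keep the presets for the showroom. The names are hypothetical, not Apple’s Calendar API:

```typescript
// Accept any reminder offset while still offering the common presets that
// keep the interface looking simple.

const PRESET_MINUTES = [5, 15, 30, 60, 120]; // what the demo shows

function reminderTime(eventStart: Date, offsetMinutes: number): Date {
  if (!Number.isInteger(offsetMinutes) || offsetMinutes < 0) {
    throw new RangeError("Offset must be a non-negative whole number of minutes");
  }
  return new Date(eventStart.getTime() - offsetMinutes * 60_000);
}

// Presets cover the demo; the free-form field covers real life.
const meeting = new Date("2014-06-02T14:00:00");
console.log(reminderTime(meeting, 90)); // 12:30, the 90-minute warning the user actually wanted
```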

Likewise, they have a very simple interface for finding photographs in your collection: You look through all your folders of photographs, one at a time, until you find the picture you’re looking for. Apple will neither display nor allow you to sort or search on the title, caption, or keywords you’ve carefully associated with your images. One can argue that the interface is simple: If you want to find a specific photo among the 20,000 on your iDevice, just look through the 73 folders you’ve created in iPhoto or Aperture to hold them until you find it. You don’t have to learn about searching, and you don’t have to remember the name; you just have to have 10 to 20 minutes on your hands to spend looking.

How is that simple?

Fortunately, after many years, help is at hand: Apps like Photo Shack HD (the HD is important) enable you to search on all the metadata that Apple is importing but refusing to show you. Hiding capabilities might be an acceptable trade-off for really advanced features, but search is not one of them: a remarkably high percentage of users (100%, to be exact) know how to search.
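
The capability being withheld is small. A minimal sketch, with illustrative types, of searching a photo library on the metadata the user has already supplied:

```typescript
// Match a query against the titles, captions, and keywords the user entered.

interface Photo {
  id: string;
  title: string;
  caption: string;
  keywords: string[];
}

function searchPhotos(library: Photo[], query: string): Photo[] {
  const q = query.trim().toLowerCase();
  return library.filter(
    (p) =>
      p.title.toLowerCase().includes(q) ||
      p.caption.toLowerCase().includes(q) ||
      p.keywords.some((k) => k.toLowerCase() === q)
  );
}

// 20,000 photos, one query, no 73-folder treasure hunt.
```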



State

  • Principle: Because many of our browser-based products exist in a stateless environment, we have the responsibility to track state as needed

Our systems should “know”:

  • Whether this is the first time the user has been in the system
  • Where the user was when they left off in the last session
  • What the user has found of interest based on time spent with a pointing device moving, objects being touched, etc., in different areas
  • Where the user has been during this session
  • Where the user is right now and what they are doing

and myriad other details. In addition to simply knowing where our users have been, we can also make good use of what they’ve done.
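
To make that concrete, here is a minimal sketch of such a record; the field names are my own invention, not any particular framework’s:

```typescript
// The kind of state worth tracking per user. The application, not the
// browser, owns this record.

interface SessionState {
  firstVisit: boolean;
  lastLocation: string;                    // where the user left off last session
  currentLocation: string;
  visitedThisSession: string[];
  interestSignals: Record<string, number>; // area -> seconds of dwell / interaction
}

function recordVisit(state: SessionState, location: string, dwellSeconds: number): SessionState {
  return {
    ...state,
    currentLocation: location,
    visitedThisSession: [...state.visitedThisSession, location],
    interestSignals: {
      ...state.interestSignals,
      [location]: (state.interestSignals[location] ?? 0) + dwellSeconds,
    },
  };
}
```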

One site with which you are familiar is so involved in and good at tracking state that it could be described as a state-tracking system that happens to do other stuff. That site is amazon.com. Their uncanny ability to make suggestions on what we might want to explore and buy is the result of their understanding our full history on their site. They know what expensive items we’ve come back to repeatedly in the past, what we’ve lingered over recently, and, based on the behavior of like-minded individuals, what would go well with what we’ve just purchased.

  • Principle: State information should be stored in encrypted form on the server when users log off

Users should be able to log off at work, go home, and take up exactly where they left off. Following the principle of Protect Users’ Work, whatever they were last working on should be preserved in its current condition.

A private service for doctors, Physicians On Line, does an excellent job with this. Doctors can be 95% of the way through a complex transaction, log off, log in again six weeks later from another part of the world, and the service will ask them if they want to be taken right back to where they were.
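
A minimal sketch of the mechanics, using Node’s built-in crypto module; the shape of the state object, key management, and the storage layer are all assumptions left to the reader:

```typescript
// Encrypt a user's session state before persisting it server-side at logoff,
// and decrypt it at the next login so the user can resume where they left off.

import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

interface SealedState { iv: string; tag: string; data: string; }

// key must be 32 bytes for AES-256-GCM
function encryptState(state: object, key: Buffer): SealedState {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(JSON.stringify(state), "utf8"), cipher.final()]);
  return {
    iv: iv.toString("base64"),
    tag: cipher.getAuthTag().toString("base64"),
    data: data.toString("base64"),
  };
}

function decryptState(blob: SealedState, key: Buffer): object {
  const decipher = createDecipheriv("aes-256-gcm", key, Buffer.from(blob.iv, "base64"));
  decipher.setAuthTag(Buffer.from(blob.tag, "base64"));
  const plain = Buffer.concat([decipher.update(Buffer.from(blob.data, "base64")), decipher.final()]);
  return JSON.parse(plain.toString("utf8"));
}

// At logoff: store encryptState(currentState, serverKey) alongside the user record.
// At next login, from any machine: decryptState(...) and offer to take the user right back.
```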

“Track State” came late to this list, in 1996. Up until then, everyone had been tracking state on their own, without question. Because web browsers failed to provide any tools beyond the purple color of a link indicating it had been previously visited, engineers took this to mean they no longer needed to concern themselves with state at all. To the contrary, what it meant is that, from that day until this, application engineers and designers have had to take over the full responsibility for tracking state that had historically been shared with the systems engineers, making the job that much harder.

  • Principle: Make clear what you will store & protect the user’s information

State data is neither good nor evil, but it can be put to both uses. You should make clear in your privacy policy that you will be saving data, making your case for why it is in the user’s interest. Any data from the user, including state data, should be encrypted and safeguarded.



Visible Navigation

(See also: Discoverability)

  • Principle: Make navigation visible

Most users cannot and will not build elaborate mental maps and will become lost or tired if expected to do so.

The World Wide Web, for all its pretty screens and fancy buttons, is, in effect, an invisible navigation space. True, you can always see the specific page you are on, but you cannot see anything of the vast space between pages. Once users reach our sites or web-based applications, we must take care to reduce navigation to a minimum and make sure the remaining navigation is clear and natural. Ideally, present the illusion that users are always in the same place, with the work brought to them, as is done with the desktop metaphor. This not only eliminates the need for maps and other navigational aids, it offers users a greater sense of mastery and autonomy.

While you may have a thousand pages on your site, if every one of them has the same heading and the same main and secondary menus, the illusion to users will be that they are always on the same page, with the content changing in one panel of that page. They can still become lost if they don’t know what panel they are viewing, so reinforce it by highlighting the menu items that produced that particular panel, as well as offering, in development systems that support it (and they all should), breadcrumbs, such as:

NN/g Home > AskTog > Interaction Design Section

to help them build a mental model. As with the inherent statelessness of the web (see Track State, above), we must go beyond blindly accepting what the web’s architects have given us by adding layers of capability and protection that users want and need. That the web’s navigation is inherently invisible is a challenge, not an inevitability.
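
A minimal sketch of that breadcrumbing, assuming a hand-maintained hierarchy map; the paths and labels here are illustrative:

```typescript
// Derive the trail from the site's hierarchy so every page reinforces
// where the user is.

interface Crumb { label: string; href: string; }

const hierarchy: Record<string, Crumb[]> = {
  "/asktog/interaction-design": [
    { label: "NN/g Home", href: "/" },
    { label: "AskTog", href: "/asktog" },
    { label: "Interaction Design Section", href: "/asktog/interaction-design" },
  ],
};

function renderBreadcrumb(path: string): string {
  const crumbs = hierarchy[path] ?? [];
  return crumbs.map((c) => c.label).join(" > ");
}

console.log(renderBreadcrumb("/asktog/interaction-design"));
// "NN/g Home > AskTog > Interaction Design Section"
```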

  • Principle: Limit screen counts by using overlays

In designing complex apps, strive for a minimal number of screens, each representing a separate and distinct task the user will be performing. When a user needs to perform a subtask, bring up an overlay that is smaller than full screen, so that users can see a darkened image of the main screen still present in the background. What is seen need not be memorized, so users need not remember how that overlay maps onto the screen behind it.
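
A minimal sketch of such an overlay in plain DOM code; the styling and structure are placeholders, not a recommendation for any particular framework:

```typescript
// Show a subtask above a dimmed copy of the main screen, so the user never
// loses the context they would otherwise have to memorize.

function openOverlay(contentHtml: string): void {
  const scrim = document.createElement("div");
  // Dim, but do not hide, the main screen behind the subtask.
  scrim.style.cssText =
    "position:fixed;inset:0;background:rgba(0,0,0,0.5);display:flex;align-items:center;justify-content:center;";

  const panel = document.createElement("div");
  panel.style.cssText = "background:#fff;max-width:60%;max-height:70%;overflow:auto;padding:1.5em;";
  panel.innerHTML = contentHtml;
  scrim.appendChild(panel);

  // Clicking outside the panel closes the subtask and returns the untouched main screen.
  scrim.addEventListener("click", (e) => {
    if (e.target === scrim) scrim.remove();
  });

  document.body.appendChild(scrim);
}
```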



Next Steps

If you want more, consider joining me for three days for my Interaction Design Course. It’s practical, engaging, and we always have a lot of fun. You’ll find the schedule close to the top in the right-hand column. Until then, thanks for reading.

(The copyright notice that follows is, essentially, a release for you to copy this post as much as you want, as long as you aren’t publishing it for money.)


Copyright

“First Principles of Interaction Design” is copyright 1978-2014 by Bruce Tognazzini. Permission to make copies for personal use is granted without reservation, provided this copyright notice remains on the copy. Please contact the author for permission to republish on a web site, to publish in bound form, or to make multiple copies.

  • Exception: Educators and in-house corporate trainers may make sufficient copies for their own students.
  • Exception: This work has become a standard for heuristic evaluation and cognitive walkthrough, as well as day-to-day design work, and nothing about this notice should limit its application to those endeavors, regardless of whether they are for commercial or non-commercial use, including sufficient reproduction of copies for teams carrying out such work.

No commercial use may be made of the work beyond these exceptions without permission. This notice must be retained together with any version of the work.

If you want to translate this work and make your result available for free, please contact me so we can exchange links.

In other words, I want people to make free and full use of this material, I just don’t want people to claim it as their own or to cynically make money off something that I am attempting to offer for free.

Providing Predictable Targets

Three principles form a foundation to the graphical user interface: Discoverability, Stability, and Visibility.  They stand in stark contrast to MS-DOS and the earlier generation of interfaces, and their presence swept all of those others away. All three principles were so ingrained in the culture, so absolutely inviolate, that I eventually dropped all of them entirely from my list of core principles as no longer necessary to mention. (I also don’t mention the need to breathe. Some things you just figure people know.) I eventually started talking about Visibility again, but only in regards to the web, where navigation is inherently invisible.

The old interfaces persisted as long as they did because they worked just fine for the people who created them, people who wrongly assumed that everyone else in the world shared their exceptional memories, off-scale IQs, and unbridled joy at the challenge of overcoming abstract, invisible interfaces. For many years, interfaces-for-the-rest-of-us ruled the world, and we let our guard down. Now, a new generation of people, of equally high IQ and an equal lack of understanding of how different they are from the rest of the population, are once again creating interfaces that are a joy for them and a continuing frustration for others. Stability, Discoverability, and even Visibility are now being widely violated. I can’t blame the newcomers because, after all, I pulled these principles out of my own “history book.” All three will soon be back in a newly-revised version of my First Principles.

Meanwhile, this month, let’s look at just one victim of this movement away from the central underpinnings of visual design, the Predictable Target, starting with an old friend, Fitts’s Law.

Fitts’s Law can accurately predict the time it will take a person to move their pointer, be it a glowing arrow on a screen or a finger tip attached to their hand, from its current position to the target they have chosen to hit. You can learn all about it in my riveting article, “A Quiz Designed to Give You Fitts,” but what the article doesn’t cover is a bit of history.
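
Before the history, the law itself, which is compact. A sketch with placeholder constants; the real a and b come from measuring a particular person with a particular pointing device:

```typescript
// Fitts's Law in its common Shannon formulation: movement time grows with
// the log of distance over target width.
// T = a + b * log2(D / W + 1)

function fittsTime(distancePx: number, widthPx: number, a = 0.1, b = 0.15): number {
  // Result in seconds for these placeholder constants.
  return a + b * Math.log2(distancePx / widthPx + 1);
}

// A big, nearby target is fast; a small, distant one is slow.
console.log(fittsTime(100, 200).toFixed(3)); // ~0.188 s
console.log(fittsTime(800, 16).toFixed(3));  // ~0.951 s
```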

Paul Fitts was not a computer guy. He was working on military cockpit design when he discovered his famous Law. Paul Fitts never had to deal with the issue of stability because stuff inside aircraft cockpits is inherently stable. The few things that do move only do so because the pilot moved them, as when he or she pushes a control stick to the side or advances the throttle forward. The rest of the targets the pilot must acquire—the pressure adjustment on the altitude indicator, the Gatling gun arm switch, the frequency dial on the radio, and the fuel pump kill switch—stay exactly where they were originally installed. Everything in that cockpit is a predictable target, either always in the same place or, in the case of things like the throttle, within a fixed area and exactly where you left it. Once you become familiar with the cockpit and settle into flying the same plane hour after hour after hour, you hardly look at your intended targets at all. Your motor memory carries your hand right to the target, with touch zeroing you in.

The early graphical user interface researchers came to the conclusion that GUIs, regardless of their inherent ability to slide objects around at every turn, should maintain targets with all the stability of an airplane cockpit. For years we followed that course with great success.

Let’s look at something that’s quite the opposite. Ever had someone do you a favor and “straighten up” your desk, room, or, worse, entire house? When you straighten things up yourself, you have a reasonable chance of remembering where you moved everything. When someone else does it, you haven’t a clue. And that should be a clue to us. When we unnecessarily move stuff around behind our users’ backs, we are causing trouble.  It’s going on now, and it’s time to stop.

Case study: Safari and Firefox tabbed browsers

Firefox tabs on the Mac at the time of this writing start out equally sized, shrinking only when there is no more room across the width of the window to fit them, and then only as much as is needed. For the longest time, no matter how many additional tabs you open, if you want to hit the second tab, you can return to the exact same spot on the screen and do so without fear of hitting either the first or third tab in the process. That second tab is a predictable target.


Firefox 20.0 for Mac with two tabs open

By the time habitual Firefox users consciously elect to hit the second tab with their mouse, they’ll find the mouse already hovering over or almost over the target, their subconscious having jabbed the mouse in the direction their motor memory suggested, similarly to the way a pilot’s hand will head for a control. Only the absence of a physical device at the destination then requires that the user look at the screen and make a small conscious correction, as necessary, to bring the mouse perfectly over the target before clicking. Both that initial, habitual jab and the correction-to-target follow the parameters of Fitts’s Law. Because the tab is both a predictable and a large target, the acquisition time is very short.

Now, let’s look at Safari. What Apple does is to immediately divide the entire available space between the currently-opened tabs.  If there’s one tab, it stretches the full width.  When you add the second, it opens on the opposite side of the window from the first, as seen below, only moving into the position seen in the Firefox example after two more tabs have been added.


Safari 6.0 for Mac with two tabs open

If you look at this purely from the standpoint of Fitts’s Law, it seems like the ideal solution: you always have the largest possible target, and the bigger the target, the faster you can acquire it. However, the Firefox target is already plenty big, and, much more importantly, before you can acquire any target, you must first know where it is, and that’s where this scheme fails badly: The location of the target is not predictable because the second tab moves around so radically depending on the number of tabs on the screen.

“Ah,” you might say, “but it is predictable as long as you know the rule!” Yes, it is. And that rule ends up dictating the following four-step procedure, which replaces Firefox’s one-step procedure (step three, below):

  1. Cease thinking about your task entirely.
  2. Look carefully at the tab bar and find the second tab wherever it may be right now.
  3. Go there and click it.
  4. See if you can pick up the thread of your thoughts somewhere near where you left off and continue working.

This, of course, is not the end of the world, more like a minor, if constant, annoyance. And it wouldn’t be a big deal if it were an isolated example. But more and more of these interruptions to high-level cognitive processing are popping up all the time.
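
To make the difference concrete, here is a small sketch contrasting the two tab-placement policies; the pixel values are made up:

```typescript
// Under a fixed-width scheme, the second tab's position does not move as
// tabs are added; under an equal-division scheme, it moves every time.

const WINDOW_WIDTH = 1200;
const FIXED_TAB_WIDTH = 240;

function fixedWidthLeft(tabIndex: number, tabCount: number): number {
  // Shrink only when the fixed width no longer fits.
  const width = Math.min(FIXED_TAB_WIDTH, WINDOW_WIDTH / tabCount);
  return tabIndex * width;
}

function equalDivisionLeft(tabIndex: number, tabCount: number): number {
  return tabIndex * (WINDOW_WIDTH / tabCount);
}

for (const count of [2, 3, 4, 5]) {
  console.log(`${count} tabs, second tab at: fixed ${fixedWidthLeft(1, count)}px, equal division ${equalDivisionLeft(1, count)}px`);
}
// fixed: 240, 240, 240, 240 (a predictable target)
// equal division: 600, 400, 300, 240 (moves every time a tab is opened or closed)
```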

The Apple Dock has such complex rules for the location of objects that I abandoned using it as a test case because the explanation was going to be so lengthy.  And I can guarantee you that few Apple users, including myself, could recite all its rules to you. Instead, the “working rule” is that “stuff dances around all the time, so if you want something, just start scrubbing with the mouse from one end to the other and you’ll find it in there somewhere. Probably.” (Apple makes the titles of even identical-looking documents invisible until you scrub over them so the dock will look prettier.)

In Safari, it took me two years to learn the rule for printing a PDF document. You have to click on a target that is always in the exact same place, which would make it completely predictable, except it’s invisible. I only discovered it even existed when I finally set aside an hour of my time and chased the solution down on Google. Invisibility and predictability don’t exactly mix.

[box]Why pick on Apple?

Two reasons: First, I use Apple almost exclusively, so I am subjected to examples every day. Second, Apple was the first to break the Predictable Target rule. Because their designs are so otherwise brilliant, they’ve been able to get away with it, frustrating their users, but not driving them off. However, others far less skilled are now following their bad example, seriously rolling back the clock on user experience. If you want to send me other bad examples from other people big enough to pick on, please do.[/box]

Several fundamental problems arise when you replace Predictable Targets with the kind of shifting objects governed by rules we’re seeing today:

  1. Users are often expected to infer these rules on their own, sometimes with nothing but a Google search to come to their aid. Users won’t do this.
  2. Users are always expected to memorize the rules, keeping all the conflicts straight in their heads. Users can’t do this.
  3. Users’ “hands” don’t “understand” rules. Users’ “heads” can understand rules, but “hands” work from habit, not intellect, so having locations shift according to rules guarantees slowdowns as users set aside what they are working on to consciously work through the problem of acquiring a moving target.

These problems are why the pioneers of the graphical user interface adopted the exact same principles as airplane designers before them. If you want people to hit targets, you, of course, make them visible, but you never, ever move them around. (It’s also why the first thing fighter pilots do, when entering combat, is move their planes around as rapidly and as randomly as possible—kind of like objects in the Apple Dock—so their enemy can’t smite them.)

Supplying predictable targets does not preclude you from having dynamic, exciting screens. It just means that the things users have to find and touch or click, over and over again, should not move around.  It means that when a user is finished using something the first time, the user should decide where it will stay in the future, not you. It means that the target is visible at all times. They don’t have to get near it before it mysteriously appears. It means putting the Continue button or its equivalent in the same place on every page. It also means, when you want the user not to continue without really considering what they are about to do, that you put it in a different place.

Both Safari and Firefox compress their tab widths beyond a certain count so that more tabs can appear on the screen. This goes against Predictable Targets, but it is also an example of good design. Why? Because the time hit from having to go fetch a tab that is currently off screen is greater than the time hit from having to make a small course correction to get to an existing tab that is on screen. If you do the study, you’ll find there is no appreciable time advantage in getting to the half-window-sized Safari tab shown above vs. the smaller, but still very good sized, Firefox tab, but there is a considerable loss an experienced user will face because Safari’s tab is not a predictable target. That’s why Safari’s decision to start at full width is bad, while both browsers’ electing to compress tab size to fit more tabs on the screen is good.

Predictable Target should appear high on your list of mandatory rules, only to be violated when it can be proven that another consideration, in a particular circumstance, will result in even greater productivity.

It’s becoming popular now to speak of the visual interface as one whose time is passing, to be replaced by voice and glance and who knows what. I suppose that’s going to happen just the way the mouse so successfully displaced the keyboard and TV got rid of radio as well as movies.

Yes, the fact is that old technologies do tend to persist, with the new taking their place beside them, and funeral plans for the graphical user interface may be a bit premature. Until such time as either the GUI passes on to its reward or humans spontaneously evolve to all think like engineers, Discoverability, Visibility, and Stability will continue to be vital to people’s comfort and success with visual interfaces. Predictable Targets, lying as it does at the confluence of these three principles, will likewise continue to be vital to people’s comfort and success.

If you happen to cross paths with one of those people who seems not to understand Predictable Targets, please send them a link to this article. They may switch occupations one day. You wouldn’t want them working on a fighter-plane cockpit, deciding between sorties that it would be a great idea to flip the positions of the Gatling gun arm switch and the fuel pump kill switch. Could lead to trouble.

The Forum

When you visit a forum, you are visiting my home. You will not see personal attacks on myself or other writers here because Siri automatically forwards them to the writer’s mom, along with a letter of explanation. My apologies. She’s rather strict about this. My long-time editor, John Scribblemonger, will then publish comments that are on point, but may edit for brevity and clarity. This being “asktog,” I will then often chime in, even if not explicitly asked.

I hope you will find the result worth reading, as well as joining.

The Third User

or Exactly Why Apple Keeps Doing Foolish Things

Apple keeps doing things in the Mac OS that leave the user-experience (UX) community scratching its collective head, things like hiding the scroll bars and placing invisible controls inside the content region of windows on computers.

Apple’s mobile devices are even worse: It can take users upwards of five seconds to accurately drop the text pointer where they need it, but Apple refuses to add the arrow keys that have belonged on the keyboard from day one.

Apple’s strategy is exactly right—up to a point

Apple’s decisions may look foolish to those schooled in UX, but balance that against the fact that Apple consistently makes more money than the next several leaders in the industry combined. While it’s true Apple is missing something—arrow keys—we in the UX community are missing something, too: Apple’s razor-sharp focus on a user many of us often fail to even consider: the potential user, the buyer.

[box]Who’s talking?

Bruce Tognazzini was hired at Apple by Steve Jobs and Jef Raskin in 1978, where he remained for 14 years, founding the Apple Human Interface Group and designing Apple’s first standard human interface. He is named inventor on 57 US patents ranging from an intelligent wristwatch to an aircraft radar system to, along with Jakob Nielsen, an eye-track-driven browser.

[/box]

During the first Jobsian era at Apple, I used to joke that Steve Jobs cared deeply about Apple customers from the moment they first considered purchasing an Apple computer right up until the time their check cleared the bank. Of course, in later years, the check was replaced by a credit card, and check clearance was replaced by the 15-day return period, but Steve’s and Apple’s focus remained the same.

What do most buyers not want?  They don’t want to see all kinds of scary-looking controls surrounding a media player. They don’t want to see a whole bunch of buttons they don’t understand. They don’t want to see scroll bars.  They do want to see clean screens with smooth lines. Buyers want to buy Ferraris, not tractors, and that’s exactly what Apple is selling.

The tipping point

While Apple is doing a bang-up job of catering to buyers, they have a serious disconnect at the point at which the buyer becomes a user.  That same person who was attracted to that bright and shiny computer in the showroom, as opposed to those dull-looking things in the Microsoft look-alike store, may not be so happy when denial breaks down, and he admits to himself that it’s so bright and shiny that the reflection of his office is blocking out the image on his screen.

(Steve had a habit of setting up impossible tasks for his people, only to have them overcome them anyway. This year, Apple released the latest round of bright and shiny monitors, except these don’t reflect the office to the exclusion of the screen: a 75% reduction in reflections, and just as glassy smooth. Some kind of magic.)

But let’s talk about software.  Let me offer two examples of Apple objects that aid in selling products, but make life difficult for users thereafter. Then I’ll talk about a simple, zero-downside solution that would solve the unnecessary problems Apple has been making for itself before presenting a very valuable lesson the rest of us can learn from Apple.

The Apple Dock

The Apple Dock is a superb device for selling computers for pretty much the same reasons that it fails miserably as a day-to-day device: A single glance at the Dock lets the potential buyer know that this is a computer that is beautiful, fun, approachable, and easy to conquer, one that won’t require a lot of reading. Of course, not one of these attributes is literally true, at least not if the user ends up exploiting even a fraction of the machine’s potential, but such is the nature of merchandising, and the Mac is certainly easier than the competition.

The real problem with the Dock is that Apple simultaneously stripped out functionality that was far superior, though less flashy, when they put the Dock in.  The Mac is a powerful computer with lots and lots of room.  There was no reason to strip anything out.  The flashy-demo Dock object and the serious-user objects could and should have continued to coexist.

Invisible Scroll Bars

“Gee, the screen looks so clean! This computer must be easy to use!” So goes the thinking of the buyer when seeing a document open in an Apple store, exactly the message Apple intends to impart. The problem right now is that Apple’s means of delivering that message is actually making the computer less easy to use!

When we were first implementing GUIs at Apple in the early 1980s, scroll bars were primarily an input device: Any document you saw on your screen was likely one you had written, destined eventually to be printed and distributed.  Today, most documents are web pages or email you’ve never laid eyes on before, and the scroll bar has become a vital status device as well, letting you know at a glance the size of and your current position within a document.

Hiding the scroll bar, from a user’s perspective, is madness. If the user wants to actually scroll, it’s bad enough: He or she is now forced to use a thumbwheel or gesture to invoke scrolling, as the scroll bar is no longer even present. However, if the user simply wants to see their place within the document, things quickly spiral out of control: The only way to get the scroll bar to appear is to initiate scrolling, so the only way to see where you are right now in a document is to scroll to a different part of the document! It may only require scrolling a line or two, but it is still crazy on the face of it! And many windows contain panels with their own scroll bars, so trying to trick the correct one into turning on, if you can do so at all (good luck with Safari!), can be quite a challenge.

Multi-scroll bars

This article in the WordPress editor in Safari, with the Mac set to “Always Show” scroll bars

(The scroll bars, even when turned on, are hard to see, with their latest mandatory drab gray replacing bright blue, and they are now so thin that they take around twice as long to target as earlier scroll bars. When a company ships products either before user testing or after ignoring the results of that testing, both their product and their users suffer.)
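
As an aside, the status half of the scroll bar’s job needs nothing exotic. A minimal sketch of a position indicator that is visible before the user ever scrolls; the element id is an assumption:

```typescript
// Show the user's position within the document at all times, without
// requiring a scroll gesture to summon it.

function showReadingPosition(indicatorId = "position-indicator"): void {
  const el = document.getElementById(indicatorId);
  if (!el) return;

  const update = () => {
    const doc = document.documentElement;
    const scrollable = doc.scrollHeight - doc.clientHeight;
    const pct = scrollable > 0 ? Math.round((doc.scrollTop / scrollable) * 100) : 100;
    el.textContent = `${pct}% through this document`;
  };

  update(); // visible immediately, before any scrolling
  window.addEventListener("scroll", update);
  window.addEventListener("resize", update);
}
```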

One might argue that eye-tracking could be used to turn on the scroll bar as needed, but let’s consider that: In the above example, if the center scroll bar suddenly hove into view every time your eyes wandered between the left and right columns, it would be more than a little distracting. OK, so then maybe we force the user to stare fixedly at the place the scroll bar should appear for some set amount of time. Hey, now there’s a bad idea!

Fortunately, the answer to both these and other problems is much simpler and needs no such sophisticated technology as I will soon reveal (I promise).  First, however, let me present a new model.

The User Spectrum

The UX community generally considers users to be arrayed along a spectrum that stretches from naive to expert.  Apple expands that array by prefacing it with buyers. The following chart shows that more complete spectrum, flowing from the buyer before the sale, through the dashed red line at the point of sale, and then moving along from naive to expert user with the continuing passage of time.

Supporting “The Third User”

Apple has concentrated virtually all their effort on potential users—buyers—and new users, leaving out, in the last several years, experienced users almost entirely. The User Experience (UX) community is in the opposite position, rarely considering our users’ wants, needs, or desires until the moment they first open their newly-purchased products.

Chart: the user spectrum, from buyer through naive user to expert user

Both our efforts should resemble the third line on this chart, with all of us properly supporting people through a continuum stretching from customers’ first faint desires to the point at which they’ve become seasoned professionals.

Industrial Design: The Source of Apple’s Secret

Apple has drawn from the lessons of industrial design, applying them not only to external hardware, but software as well, a brilliant step forward that has made their products as appealing when turned on as when turned off.

Industrial design & “the message”

A primary purpose of industrial design is to deliver a carefully-crafted message. Let’s look at two products, the tractor and the Ferrari.

"I should like to remove your arm"

“I am powerful, and I should like to remove your arm”

The raw form of a tractor delivers a simple, brutal message:  “I am powerful!”  Unfortunately, people often see an unwritten message accompanying it, such as “I should like to remove your arm,” or simply “I’m way too powerful for the likes of you!”


“I am amazingly fast, and you are devastatingly attractive.”

A Ferrari also delivers a power message, but not by displaying its engine.  Instead, the Ferrari’s message is indirect, rather than direct, the hallmark of industrial design.  The reality of the Ferrari’s powerful drive train is entirely hidden, skinned over with a second, completely independent message in shaped steel that not only screams power, but adds in the promise of speed, fun, and the granting of an instant increase in physical attractiveness to those who would possess it.

The reality and the message are truly independent. A sleek Italianate body could be gracing a Volkswagen Beetle chassis, and, as long as the buyer never made the mistake of turning the key in the ignition, the message would continue to ring out loud and true. What you are seeing in the photo above is actually a Sterling Nova composite body. The Nova, a design inspired by the Lamborghini Miura, was developed in England as an after-market body kit for bolting to, yes, a Volkswagen Beetle.

Industrial design: Borrow the aesthetic, ignore the limitation

While Apple has copied over the aesthetics of industrial design into the software world, they have also copied over its limitation:  Whether it be a tractor, Ferrari, or electric toaster, that piece of hardware, in the absence of upgradeable software, will look and act the same the first time you use it as the thousandth time. Software doesn’t share that natural physical limitation, and Apple must stop acting as though it does.

Apple need change nothing about their support for buyers and very little for new users.  They need only add support for experts, and they can do so without any effect whatsoever on either of the first two groups.

If Apple’s goal is to sell computers by having the interface look clean in the store, there’s no reason the interface need remain obscured after the sale. As soon as users get home and register, the software can flip the option to display scroll bars to Always, triggering the start of a gradual, planned transition from training-wheels to full-fledged computer-user. That’s good and proper design.

[box]Clutter (noise) vs. Dense Information

What we see as clutter (known as “noise” in Information Theory) when we look over someone’s shoulder is often a screen with a high concentration of information meaningful to them. If your personal aesthetic, by nature or by training, is tuned to seek visual simplicity, you may find that distressing. It is our job to remove real clutter—any tools or data not needed right now—but it is not our job to hide what experienced users need just to make ourselves feel better when we look at their screens.[/box]

The cost to Apple of ceasing to support expert users is spilling over into lost hardware sales. I myself have stopped buying both new iPads and iPhones because Apple’s increasingly powerful hardware is so badly crippled by its software. Having shot more than 20,000 digital photos, I’d like to access them on my iPad, which would take memory, lots and lots of memory. Not going to happen. Why? Because when I transfer my photos, Apple strips off all my titles, date, time, and location info, and keywords, then randomly shuffles all my folders. Individual photos are impossible to locate should I carry around more than 100 or so, and I already have plenty of memory for 100 photos.

Last year’s buyers, so confused then by what a pinch gesture was or how to use a trackpad, have become expert over the ensuing twelve months, ready to move up to the next level. In fact, there are millions of these now-expert users. Inside Apple today are employees who understand that. These same employees understand that shipping a product like Aperture that, after almost five years, is still not even feature complete and is rife with data-destroying bugs, is a bad idea. That having a flagship computer that has been obsolete for four years makes the entire company look bad. However, in recent years, their voices have not been heard.

Apple’s upper management probably lack the time to become expert users themselves. They need to start talking to people who are expert users.  People inside Apple.  People like me who have been in and around Apple for the last 35 years and, being expert users as well, understand the issue. (I’m available to anyone at Apple that would like to talk privately.)

Apple’s upper management needs to start listening to people in the press who keep writing article after article questioning why, if Apple won’t supply a “real” keyboard with arrow keys and the like for its mobile devices, it refuses to let people select one from the open market.

[quote]Dear Apple,
We wouldn’t keep jail breaking your phones by the millions if you were giving us what we want and need,
                                                                     —Your Loyal Users[/quote]

Apple’s market research has already been done for them: Millions of users jailbreak their iDevices to get at missing functionality. Apple need only look at what users are installing on those jailbroken phones to know exactly what experienced users are craving. Sometimes the jailbroken solutions are brilliant and should be bought and incorporated. Sometimes they’re clumsy, but they nonetheless serve to illuminate unmet needs.

There’s no reason in the world not to grow all Apple’s devices to meet the needs of users as they grow. It is only playing into competitors’ hands, hurting the very people Apple courted last year and the year before when they were new buyers instead of expert users.

It’s good to have a walled garden, but the walls have to be far enough apart not to crush the child as he or she grows. As the naive user hits their “teenage years,” they should have the right to make their scroll bars any color they want, not drab gray because that’s “in” this year.  If the user who’s now an all-grown-up expert favors productivity over visual simplicity, that user has the right to select a keyboard that’s actually functional. Even if the user wants to select a keyboard that’s provably less efficient, as the USS Saratoga was, that is still the user’s right (as much as it pains me, as a human-computer interaction designer, to say it).

The Buyer-User-Seller Spectrum

Earlier, I showed buyers becoming users and thus, by implication, ceasing to be buyers.  Actually, buyers never cease to be buyers. True, you have their money, but they can leave you in an instant.  Unless you never want to sell to them again, you should continue to consider them buyers.  They do make a transition to user, and they also make another important transition: They eventually become a key part of your sales force.

Chart persistence of buyer role

If you keep your customers happy, they will become unpaid ambassadors for your product. Keep them unhappy, and they will become unpaid disparagers of your products. Ask yourself: what do you put your own faith in, manufacturers’ puffery or experienced users’ Amazon reviews?

Chart Apple persistence of buyer role

Apple’s expert users are their largest, most influential sales force. By allowing support for expert computer users to languish over the last several years at the very time that they’ve been adding to the ranks of expert computer users, Apple has slowly been turning its largest, most influential supporters against it.

Apple has been shipping mobile devices for years now, but they’ve refused to allow their users to move beyond the barest naive-user stage. People don’t lose 50 IQ points when they pick up a mobile device.  When they set up an appointment on an iPad vs. a computer, they still want to set an alarm for 1 hour and 40 minutes beforehand, not be forced to choose between either one hour or two hours because someone at Apple decided a minute-changing interface would be too complex. Perhaps it would be for demoing purposes, but it makes experienced users feel as though Apple thinks they’re stupid. You might be able to get away with a lot of things with your customers, but calling them stupid is not one of them.

It’s not as though people haven’t been writing about it, talking about it, screaming about it.  They have, each and every day for years, and not just in tech publications.  It is damaging Apple’s market share, and it is killing the stock price. And it is completely unnecessary not only because it is easy enough to design an interface that will “unfold” to support users as they grow, but because, paradoxically, sometimes even buyers whom you would least expect want some complexity.

The complexity paradox

Among those who seek out complexity are people who are actually terrified of complexity, hence the paradox.

One of the keyboards I worked on at Apple we dubbed the USS Saratoga because it was approximately the size and shape of the giant aircraft carrier of that same name. It had every extra key anyone at Apple could think of, including many unrecognized by the Macintosh at the time.  Its extreme size pushed the mouse further from the user, causing a time delay each time the user moved between keyboard and mouse. It was inefficient and really scary looking!

The new keyboard flew off the shelf faster than jet fighters off the real Saratoga, and it wasn’t expert users in need of all those non-functional keys buying it: It was comparatively new users—mainly guys—seeking visual evidence that their little Macs were real powerhouse computers. The buyers went for it because they found it scary, knowing that, in turn, it would intimidate their friends.

Before the invention of the personal computer, I spent fifteen years selling consumer electronics and teaching sales techniques. (I was both one of Apple’s first employees and first dealers.) I found the best way to motivate sales was to demonstrate ease-of-learning and ease-of-use while simultaneously talking about power. Apple used to do that, with its ads for “munitions-grade” computers.  Now, it’s all toy-piano music and nursery-school software.

Apple has been letting Microsoft and Android “own” the power sphere for many years now.  I find no compelling reason for Apple not to grab some of that space back. Visual simplicity sells and should be jealously guarded, but a certain class of complexity sells as well.

[box]Update
Shortly after this article was posted, Apple hired Kevin Lynch as its new VP of Technology. Kevin Lynch understands these issues I’ve been discussing, giving me hope that experienced users and, in fact, all post-purchase users will have a new and stronger voice.[/box]

What the UX Community Needs to Do

I have talked a lot about what Apple needs to do, and you probably noticed that almost all of it was argument. It is usually hard to get through to Apple, yet what they actually need to change is rather simple. Members of the UX community may need to do a lot more, but I suspect far less persuading will be necessary.

Most people in the UX community have excluded buyers from their thoughts simply because the users they support don’t make their own buying decisions. I have three arguments for including buyers in every step of your design process regardless:

  1. Buyers will either see the software demo’ed, try it themselves, observe their users using it, or hear complaints, in particular, if users hate it.
  2. Users must psychologically “buy in.” You reduce training costs when software is attractive.
  3. Users, lately, have been buying things like iPhones on their own, regardless of corporate policy. Letting them do so then becomes corporate policy. Next thing you know, your company is out.

Samuel Johnson once remarked, “When a man knows he is to be hanged in a fortnight, it concentrates his mind wonderfully.” If you begin to think of your users as buyers, too, able to cut you off without a second’s thought should you fail to please them, you will discover that little spark of fear will concentrate your mind  wonderfully as well. You’ll end up with a product that is more attractive, easier to learn, and more productive because you will be motivated to make your user/buyer happy, not just efficient. (One can be miserable but efficient for only so long.)

If you’re not sure where to start with all this, begin where Apple did, with industrial design. Learn about it and then move on to motivational psychology. It adds an entirely new dimension to one’s understanding of people, a dimension directly applicable to buyers.

Become best friends with marketing. We in UX have always seen them as allies, but they should be even closer, acting as partners when we are designing for buyers. It’s time to start lurking in retail stores and eavesdropping on people dropping cash on consumer products. Find out what makes them tick.  Get a sense for what motivates them.

Here is a sprinkling of books that I have found particularly useful in understanding buyers and how they fit into the spectrum of users. Start here and branch out.

Designing for People by Henry Dreyfuss. Dreyfuss was the greatest industrial designer of the last century. He churned out everything from locomotives to cars to flying cars to the Trimline phone to Polaroid cameras to the round Westclox alarm clock to the equally round Honeywell thermostat, inspiration for the Nest.
I am a particular fan not only because he was a brilliant industrial designer, but because he understood the vital importance of usability testing to validate designs, specifically ensuring that the desires and needs of the full spectrum of users—buyers, naive, and expert—were being met. We have much to learn from him, and this book is an excellent starting point.
Henry Dreyfuss: The Man in the Brown Suit by Russell Flinchum. If Dreyfuss’s own book has whetted your appetite, this well-written biography will help sate it.
While Designing for People gives us insight into Dreyfuss’s mind at the time of its writing, this biography gives us insight into his thinking as it changed over the decades, adding dimensionality. It is filled with illustrations and personal details missing from Dreyfuss’s own work.  I have gleaned much from each.
Games People Play by Eric Berne.  Freudian psychoanalyst Eric Berne’s Transactional Analysis (TA) altered mainstream therapy and sales psychology forever. You can apply TA’s simple, reliable model of human behavior from first field study to final usability follow-up to ensure you cover both the wants and needs of all users, including buyers.
The initial 3000 print run turned into 5 million. A fun read, it will alter your view of users forever. I’ve read every one of Eric Berne’s books and research papers. This is a perfect starting point.
“An important book…a brilliant, amusing, and clear catalogue of the psychological theatricals that human beings play over and over again.”— Kurt Vonnegut, Jr.
Steve Jobs by Walter Isaacson. Yeah, that book.  If you haven’t read it, you should.  This man was the greatest industrialist of our time and was responsible for the creation of the most revolutionary products in our industry.
Steve Jobs was also one of the greatest human-computer interaction designers of all time, though he would have adamantly denied it. (That’s one of Apple’s problems today.  They lost the only HCI designer with any power in the entire company the day Steve died, and they don’t even know it.)
Walter Isaacson was able to get further into Steve Jobs’s head than any save perhaps Steve’s own wife.  It is a must-read for all of us, including those who might have thought we knew him.

 

All three of these men have had a profound influence on my life for decades.  I’m sure there are other people and other books of note that could help all of us to even-out our support of the spectrum of users.  Please help out by suggesting more books below.


All the rest of Tog’s Articles…

asktog.com has been publishing since the earliest days of the web, so early that the only index, Yahoo, was listing every site in the world in alphabetical order, leading me to choose the name asktog rather than simply tog.

Above (and here) you will find two section links.  Most of my writing is on Interaction Design, but I’ve also written from time to time on Living, covering travel, food, humor, as well as some of my responses to more general questions.

I am in the process of moving the entire site over to WordPress. With hundreds of pages in pure HTML, all to be converted and updated, that is going to take some time. Meanwhile, WordPress is convinced that no other pages exist and is bent on telling you there are no further columns unless I intercede. Hence this page.

So now you know.  There’s a full book’s worth of columns. Please click either of the section links above and enjoy,

-tog