Three principles form the foundation of the graphical user interface: Discoverability, Stability, and Visibility. They stand in stark contrast to MS-DOS and the earlier generation of interfaces, and their presence swept all of those others away. All three principles were so ingrained in the culture, so absolutely inviolate, that I eventually dropped them entirely from my list of core principles as no longer necessary to mention. (I also don’t mention the need to breathe. Some things you just figure people know.) I eventually started talking about Visibility again, but only with regard to the web, where navigation is inherently invisible.
The old interfaces persisted as long as they did because they worked just fine for the people who created them, people who wrongly assumed that everyone else in the world shared their exceptional memories, off-scale IQs, and unbridled joy at the challenge of overcoming abstract, invisible interfaces. For many years, interfaces-for-the-rest-of-us ruled the world, and we let our guard down. Now, a new generation of people, of equally high IQ and equal lack of understanding of how different they are from the rest of the population, are once again creating interfaces that are a joy for them and a continuing frustration for others. Stability, Discoverability, and even Visibility are now being widely violated. I can’t blame the newcomers because, after all, I pulled the principles out of my own “history book.” All three will soon be back in a newly-revised version of my First Principles.
Meanwhile, this month, let’s look at just one victim of this movement away from the central underpinnings of visual design, the Predictable Target, starting with an old friend, Fitts’s Law.
Fitts’s Law can accurately predict the time it will take a person to move their pointer, be it a glowing arrow on a screen or a finger tip attached to their hand, from its current position to the target they have chosen to hit. You can learn all about it in my riveting article, “A Quiz Designed to Give You Fitts,” but what the article doesn’t cover is a bit of history.
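For the curious, the Law itself is a one-line formula: the time to acquire a target grows with the log of the ratio of the distance to the target to the target’s width. A quick sketch in Python, with the constants `a` and `b` purely illustrative (in practice they are measured per device and per user):

```python
import math

def fitts_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time in seconds under the Shannon
    formulation of Fitts's Law: T = a + b * log2(D/W + 1).
    a and b are device- and user-specific constants measured
    empirically; the defaults here are purely illustrative."""
    return a + b * math.log2(distance / width + 1)

# A large, nearby target is acquired faster than a small, distant one.
print(fitts_time(distance=100, width=80))
print(fitts_time(distance=800, width=10))
```

Note what the formula takes for granted: you already know where the target is. Everything that follows is about what happens when you don’t.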
Paul Fitts was not a computer guy. He was working on military cockpit design when he discovered his famous Law. Paul Fitts never had to deal with the issue of stability because stuff inside aircraft cockpits is inherently stable. The few things that do move only do so because the pilot moved them, as when he or she pushes a control stick to the side or advances the throttle forward. The rest of the targets the pilot must acquire—the pressure adjustment on the altitude indicator, the Gatling gun arm switch, the frequency dial on the radio, and the fuel pump kill switch—stay exactly where they were originally installed. Everything in that cockpit is a predictable target, either always in the same place or, in the case of things like the throttle, within a fixed area and exactly where you left it. Once you become familiar with the cockpit and settle into flying the same plane hour after hour after hour, you hardly look at your intended targets at all. Your motor memory carries your hand right to the target, with touch zeroing you in.
The early graphical user interface researchers came to the conclusion that GUIs, regardless of their inherent ability to slide objects around at every turn, should maintain targets with all the stability of an airplane cockpit. For years we followed that course with great success.
Let’s look at something that’s quite the opposite. Ever had someone do you a favor and “straighten up” your desk, room, or, worse, entire house? When you straighten things up yourself, you have a reasonable chance of remembering where you moved everything. When someone else does it, you haven’t a clue. And that should be a clue to us. When we unnecessarily move stuff around behind our users’ backs, we are causing trouble. It’s going on now, and it’s time to stop.
Case study: Safari and Firefox tabbed browsers
Firefox tabs on the Mac at the time of this writing start out equally sized, shrinking only when there is no more room across the width to fit, and then only as much as needed. For the longest time, if you wanted to hit the second tab, you could return to the exact same spot on the screen and do so without fear of hitting either the first or third tab in the process. That second tab is a predictable target.
By the time habitual Firefox users consciously elect to hit the second tab with their mouse, they’ll find the mouse already hovering over or almost right over the target, their subconscious having jabbed the mouse in the direction their motor memory suggested, similar to the way a pilot’s hand will head for a control. Only the absence of a physical device at the destination then requires the user to look at the screen and make a small conscious correction, as necessary, to bring the mouse perfectly over the target before clicking. Both that initial, habitual jab and the correction-to-target follow the parameters of Fitts’s Law. Because it’s both a predictable and a large target, the acquisition time is very short.
Now, let’s look at Safari. What Apple does is to immediately divide the entire available space between the currently-opened tabs. If there’s one tab, it stretches the full width. When you add the second, it opens on the opposite side of the window from the first, as seen below, only moving into the position seen in the Firefox example after two more tabs have been added.
If you just look at this from the standpoint of Fitts’s Law, this seems like the ideal solution: you always have the largest possible target, and the bigger the target, the faster you can acquire it. However, the Firefox target is already plenty big, and, much more importantly, before you can acquire any target, you must first know where it is, and that’s where this scheme fails badly: the location of the target is not predictable because the second tab moves around so radically depending on the number of tabs on the screen.
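Just how radically the second tab moves is easy to quantify. The sketch below compares the two layout rules; the 1000-pixel bar width and 200-pixel fixed tab width are hypothetical numbers of my own, not the browsers’ actual values. The center of the second tab sits one and a half tab widths from the left edge:

```python
BAR_WIDTH = 1000   # hypothetical tab-bar width in pixels
FIXED_WIDTH = 200  # hypothetical Firefox-style fixed tab width

def second_tab_center_safari(tab_count):
    """Safari-style rule: tabs always divide the full bar width."""
    return (BAR_WIDTH / tab_count) * 1.5  # center of tab #2

def second_tab_center_firefox(tab_count):
    """Firefox-style rule: fixed width until the bar overflows,
    then shrink only as much as needed."""
    return min(FIXED_WIDTH, BAR_WIDTH / tab_count) * 1.5

for n in range(2, 7):
    print(n, second_tab_center_safari(n), second_tab_center_firefox(n))
```

Under the Safari-style rule, the center of the second tab drifts hundreds of pixels as tabs are added; under the Firefox-style rule it stays put until the bar finally overflows.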
“Ah,” you might say, “but it is predictable as long as you know the rule!” Yes, it is. And that rule ends up dictating the following four-step procedure, which replaces Firefox’s one-step procedure (step three, below):
- Cease thinking about your task entirely.
- Look carefully at the tab bar and find the second tab wherever it may be right now.
- Go there and click it.
- See if you can pick up the thread of your thoughts somewhere near where you left off and continue working.
This, of course, is not the end of the world, more like a minor, if constant, annoyance. And it wouldn’t be a big deal if it were an isolated example. But more and more of these interruptions to high-level cognitive processing are popping up all the time.
The Apple Dock has such complex rules for the location of objects that I abandoned using it as a test case because the explanation was going to be so lengthy. And I can guarantee you that few Apple users, including myself, could recite all its rules to you. Instead, the “working rule” is that “stuff dances around all the time, so if you want something, just start scrubbing with the mouse from one end to the other and you’ll find it in there somewhere. Probably.” (Apple makes the titles of even identical-looking documents invisible until you scrub over them so the dock will look prettier.)
In Safari, it took me two years to learn the rule for printing a PDF document. You have to click on a target that is always in the exact same place, which would make it completely predictable, except it’s invisible. I only discovered it even existed when I finally set aside an hour of my time and chased the solution down on Google. Invisibility and predictability don’t exactly mix.
Why do I keep picking on Apple? Two reasons: First, I use Apple almost exclusively, so I am subjected to examples every day. Second, Apple was the first to break the Predictable Target rule. Because their designs are otherwise so brilliant, they’ve been able to get away with it, frustrating their users but not driving them off. However, others far less skilled are now following their bad example, seriously rolling back the clock on user experience. If you want to send me other bad examples from other people big enough to pick on, please do.
Several fundamental problems arise when you replace Predictable Targets with the kind of shifting objects governed by rules we’re seeing today:
- Users are often expected to infer these rules on their own, sometimes with nothing but a Google search to come to their aid. Users won’t do this.
- Users are always expected to memorize the rules, keeping all the conflicts straight in their heads. Users can’t do this.
- Users’ “hands” don’t “understand” rules. Users’ “heads” can understand rules, but “hands” work from habit, not intellect, so shifting locations driven by rules guarantee slowdowns as users set aside what they are working on to consciously work through the problem of shifting-target acquisition.
These problems are why the pioneers of the graphical user interface adopted the exact same principles as the airplane designers before them. If you want people to hit targets, you, of course, make them visible, but you never, ever move them around. (It’s also why the first thing fighter pilots do, when entering combat, is move their planes around as rapidly and as randomly as possible—kind of like objects in the Apple Dock—so their enemy can’t smite them.)
Supplying predictable targets does not preclude you from having dynamic, exciting screens. It just means that the things users have to find and touch or click, over and over again, should not move around. It means that when a user is finished using something the first time, the user should decide where it will stay in the future, not you. It means that the target is visible at all times. They don’t have to get near it before it mysteriously appears. It means putting the Continue button or its equivalent in the same place on every page. It also means, when you want the user not to continue without really considering what they are about to do, that you put it in a different place.
Both Safari and Firefox compress their tab widths beyond a certain count so that more tabs can appear on the screen. This goes against Predictable Targets, but is also an example of good design. Why? Because the time hit from having to go fetch a tab that is currently off screen is greater than the time hit from having to make a small course correction to get to an existing tab that is on screen. If you do the study, you’ll find there is no appreciable time advantage to getting to the half-window-sized Safari tab shown above vs. the smaller, but still very good sized, Firefox tab, but there is a considerable loss an experienced user will face because the target is no longer predictable. That’s why Safari’s decision to start at full width is bad and both browsers’ electing to compress tab size to enable more tabs to appear on the screen is good.
Predictable Target should appear high on your list of mandatory rules, only to be violated when it can be proven that another consideration, in a particular circumstance, will result in even greater productivity.
It’s becoming popular now to speak of the visual interface as one whose time is passing, to be replaced by voice and glance and who knows what. I suppose that’s going to happen just the way the mouse so successfully displaced the keyboard and TV got rid of radio as well as movies.
Yes, the fact is that old technologies do tend to persist, with the new taking their place beside them, and funeral plans for the graphical user interface may be a bit premature. Until such time as either the GUI passes on to its reward or humans spontaneously evolve to all think like engineers, Discoverability, Visibility, and Stability will continue to be vital to people’s comfort and success with visual interfaces. Predictable Targets, lying as it does at the confluence of these three principles, will likewise continue to be vital to people’s comfort and success.
If you happen to cross paths with one of those people who seems not to understand Predictable Targets, please send them a link to this article. They may switch occupations one day. You wouldn’t want them working on a fighter plane cockpit, deciding between sorties that it would be a great idea to swap the positions of the Gatling gun arm switch and the fuel pump kill switch. Could lead to trouble.
When you visit a forum, you are visiting my home. You will not see personal attacks on myself or other writers here because Siri automatically forwards them to the writer’s mom, along with a letter of explanation. My apologies. She’s rather strict about this. My long-time editor, John Scribblemonger, will then publish comments that are on point, but may edit for brevity and clarity. This being “asktog,” I will then often chime in, even if not explicitly asked.
I hope you will find the result worth reading, as well as joining.
Why do you have zoom disabled for your site on iPad? The font isn’t exactly big, and the empty space on the right is just annoying.
Because I don’t know how yet to make WordPress|WooThemes|Safari enable it. If anyone knows the trick, please let me in on it. I hate it, too.
Meanwhile, if you hold your iPad vertically and press the Reader button up in the URL address field, you will get a full-width, nicely legible image.
In your meta tags. You have
Remove maximum-scale so the user can zoom.
The Mac OS X Finder is another excellent example of utter unpredictability. Siracusa used to cover this very well in his lengthy Mac OS X reviews on Ars Technica (great reads), but he finally gave up complaining.
This is one thing I miss from the good old Classic Mac OS Finder.
Another good article, Tog.
Apple’s inordinate fondness for low contrast/low visibility themes (not to mention writing everything important in 6 point type in barely distinguishable non contrasting shades of grey) has been getting much worse over the past few years, particularly in Lion and Mtn Lion with their many “only show when needed” features like the disappearing scroll bars or system preferences that only appear when ‘relevant’ or ‘needed’.
Apparently someone decided that if you weren’t scrolling, you didn’t need to see the scroll bars! Hello? Anybody home at Apple HQ? Does anybody at Cupertino use an Apple computer?
If I’m reading a 50 page article or on a web page, I want to know at a glance whether I’m near the top, middle, or end, and what part of the overall article, proportionately, is visible in this window. A lot of smart people thought about all that and made the information easily visible via a quick glance at the high contrast proportionate scroll bars in Mac Classic and Aqua interfaces. More recently some not so smart people decided to hide it away. 🙁
Fortunately they did at least leave an option to partially restore the functionality in a low-contrast, not very visible, narrow bar, if you know where to look, along with restoring the mouse scroll button to natural scrolling by counterintuitively – unchecking! – “Natural Scrolling”. These sorts of mindless changes for no useful purpose are like the designer of an automobile deciding that turning the steering wheel clockwise on a particular model of car should turn the car left instead of right, or whacking the centre of the steering wheel should turn the headlights off instead of honking the horn, or that the green light should be at the top of the traffic signal instead of the bottom, thereby guaranteeing a string of traffic accidents.
Fortunately, so far the iOS designers have resisted the pressure (from tech blogs with nothing better to write about, complaining that the interface is “stale”) to change it for no useful purpose. Having mostly got it right in the first place, any change is likely to unimprove it, like the dumb change of the phone number keypad to a white background with small, low-contrast grey numbers.
There is a wonderful saying: “If it ain’t broke, don’t fix it!”
The designers of Windows 8 are carrying Apple’s interface unimprovements to their (il)logical conclusion. In fact, the more experienced the Windows user, the more irritating they are going to find the changes. To use a single example, for decades now, clicking the red X in the top right corner of a window closed it and quit the program. When I first looked at Windows 8 in a local computer store a couple of days after it became locally available, I went to close the open program and said, WTF? I called an employee over at the store and asked, “How do I quit a program?” He didn’t know either. Nor did the next person. When I left the store, half a dozen experienced employees were still trying to figure out how to quit a program.
As it turned out, you unintuitively drag the title bar to the bottom of the display, which is a lot more trouble than clicking in the top right corner of the window, which, as your article nicely points out, until now was always in the same spot.
When I get into the shower and turn the left hand tap on, I expect hot water to come out. When I get to the street corner and turn the steering wheel clockwise, I expect the car to turn right. The default interface should have all the information turned ON, with an option to turn off or hide some items that don’t necessarily need to be visible.
When I looked at the recent update to iTunes, I said: “Where is the sidebar with all the sources of music, movies, etc”? Ditto Mail when I updated to Mtn Lion some time back. What kind of an idiot would hide away all your Mail folders in the mail program and your music sources in iTunes? What could they possibly be thinking? When would you not want to see your mail folders and music sources? Why is this even an option, never mind the default?
Anyway, keep at ’em, Bruce. We need people like you that ‘get it’ nagging at these nitwits designing the recent Apple interfaces before the Mac becomes indistinguishable from Windows 8.
You questioned whether anyone at Apple ever uses an Apple computer, and evidence would certainly suggest that no one in a position of power at Apple uses one regularly enough to experience the effect that cute displacing productivity is having.
The iPhone is not rolling backwards as rapidly as the Mac, but it is certainly failing to evolve based on common user experience.
Take the phone tree dance: You dial a toll-free number and enter phone-tree hell, quickly switching to speakerphone mode. You spend a minute or two listening to prompts and both choosing selections and entering account numbers, etc., with various long pauses in between as you are routed among various computers on various continents. Suddenly, without warning, you are talking to a live human. You reach for the speakerphone button to turn speakerphone off… And it isn’t there! The keypad is there instead. You fumble around for the right button to get rid of the keypad, punch the speakerphone button, then, finally, seconds having now passed, bring the phone to your ear. The person is gone.
(I know there’s a work-around: you just have to open your mouth and start using the speakerphone. But I’ve studied panic behavior. The person who was prepared to touch the speakerphone button to turn off speakerphone is very likely to not even consider the possibility of talking out loud to their phone. I’ve also studied speakerphone behavior. Only two groups, lawyers and MBAs, routinely talk out loud to their phones. It’s a power thing. Using a speakerphone to talk to someone who is not using a speakerphone is rude, and most people avoid doing so.)
The keypad should include a speakerphone button above and toward the right side, close to where it is when the keypad is absent. It’s not unreasonable that it was left out of the first release. The fact that it is still missing indicates a fundamental flaw in Apple’s process. That is a management problem and needs to be addressed, as do all the other badly trailing elements of this overwhelmed interface.
I agree with Terry 100% that the interface should not be overhauled for the sake of making it different. I said twenty years ago and I’ll say it again, if you attempt to repair something that isn’t broken, it will be.
What the iOS team should be allowed to do is finish the interface: fix photo transfer from iPhoto so folders arrive in proper order with labelled and key-worded photos, etc. A dedicated team should not just be allowed, but required, to walk around Apple with jailbroken iPhones, after which Apple should fill the voids they find with world-class solutions that invite everyone to the party, not just desperate and/or expert users. Another group should have their iPhones snatched back, replaced with the competition’s, and be assigned to find out what the competition is doing better, not with the aim of copying them as they did Apple, but of doing them one better.
Since you asked for another company big enough to pick on, here are the changes that made Gmail’s web client become insufferable to me even though I’ve been using it for around a decade:
1: I got annoyed with the way buttons started disappearing and appearing based on what I had highlighted. This is in the same spirit as the problems described in your article. Of course, it may be that this was always the case and I didn’t care before, but it definitely became a problem in conjunction with…
We can only hope that this monochrome fad—and a fad it is—will pass soon. As for the absence of labels, in 1985, the Mac Human Interface Group adopted the slogan, “A Word is Worth a Thousand Pictures,” when it had become apparent, even then, that there were too many icons and symbols in the Mac world to keep them all straight.
Re:Symbols. Here’s an example from a non-computing realm. We are all accustomed to clothing tags with washing instructions. In the last decade these tags have expanded to include symbols standing for wash temperature, dry temperature, whether washing/drying is allowed, etc. This is probably very helpful for non-English readers. However now some tags have symbols only and no text. I can never remember that the square with a circle in it and one dot means dry on low temp, two dots means medium temp and three dots means high temp. One extra line under the box means permanent press dry cycle and two lines under the box means gentle dry cycle. I had to print myself a guide to the symbols and place it where I sort the laundry so I can easily refer to it. Now sorting the laundry takes a little longer!
I really hope Jony Ive understands Fitts’s law and, going forward, improves this.
I turned off the Dock magnification years ago for these reasons. Moving targets are so unproductive. The Dock works well enough for me when the number of icons doesn’t change by more than a few (uncommon running apps appear and disappear), and I keep a shortcut of a list of extra things on the right side. I built my own predictability. But it needs a smart makeover. Again, if Jony is in charge now, I hope he is not just thinking of great visual design, but taking these kinds of things into account.
Tog, I’m really glad you’re still thinking and writing about these issues. Thanks! Your insights affect my work every day.
“A Quiz Designed to Give You Fitts” is not underlined, requiring me to move the mouse over it to discover that it is a link. That violates what you expound in this article by requiring me to get near the target before it reveals itself.
It’s not so much forgetting as having never known. Doctors learn what came before, then build upon that. People coming into our field often have the arrogance of their youth untempered by any knowledge of either HCI design theory or history. A newly-minted doctor like that might hit upon the idea of letting the bad humours out of people by bleeding them, not realizing that attaching dozens of leeches to people has already been found less than successful.
Some of the examples I offered this month, such as the Safari one, are subtle effects, and I would expect no one but the most senior designers to be equipped to make the right call, at least before starting user testing. Others, such as making the Print and Save buttons invisible in Safari PDFs, are an embarrassment, equivalent to having a young doctor trot into an established practice on the first day of work lugging a large jar of leeches, bent on attaching them to everyone who comes through the door.
It would not be the doctor’s fault. It would be the medical school’s fault. The kind and quantity of hapless errors we continue to see crop up in generation after generation of products indicate that the most fundamental principles of HCI design—discoverability, stability, visibility, and use no leeches—are still not being taught to graphic designers, industrial designers, or software engineers.
If I may suddenly spin into a new metaphor, how can we blame our young professionals for inventing a square wheel when we never bothered to teach them about round ones?
It’s a bit off-topic, but doctors brought back leech therapy. The leeches are raised and kept in a sterile environment. They are used to draw the blood out of hematomas (deep bruises with lots of bleeding), amputated digits after reattachment (the veins cannot keep up with arterial flow), and the blood that pools beneath skin grafts of burn patients (blood pushing up the graft will detach it).
Actually, I was aware of that which is why I mentioned “bad humors.” It’s funny how supposedly obsolete and even discredited treatments and technologies return to the fore.
The requirement for all links to be underlined was swept away by around 1998, replaced by a much more general de facto standard: Links have to look different. I preach a slightly tighter rule: Links have to look different, and at least one aspect of that must be that they appear in a different color. On the current website, the links are both in a different color and in bold. It often feels to those of us that work in HCI that we are not even invited to the table when design decisions are made. When we are invited to the table, however, we don’t get to run the show. A good final design is all about balance between engineering, marketing, graphic design, HCI, etc. In the early days, people needed a radical visual indication that something was one of those weird links they heard about. Today, people expect there to be links. Making pages today look as ugly as those early pages did would not serve us well, nor do links need to stand out the way they did.
Bravo Tog!! I spent 20 years trying to convince my students about this kind of thing, only to find, as you point out, that the engineers are getting to make these decisions. These (sometimes niggly) little problems can make a world of difference to user-friendliness, and thus to productivity.
So many of the bugs and foibles we endure seem of such little consequence. The Safari example I used is such a minor bother, it would hardly appear worth mentioning. It probably only costs me three to six seconds per day. The iPhone’s little text bubble appearing off the top of the screen when you try to select text near the top of the screen only costs me perhaps 30 seconds a day (plus the $2.25 I have to put in the Swear Jar).
Apple locates the Autocorrect window way up on the screen where the text will be entered instead of within or just above the screen keyboard where I’m actually looking when touching keys. The text in the Autocorrect window is also tiny and light blue. That double-whammy probably costs me about ten minutes a day. (And it would cost me at least $20 in the Swear Jar except I’ve got my wife and family convinced the official name of the service is “Damnyouautocorrect!”)
When you tie all these seemingly minor annoyances together, you end up with a significant productivity hit, whether you’re adding up the thousand tiny cuts an individual user takes from every little bug and foible encountered over the course of a day or tallying the hit just one foible costs collectively. Three seconds lost per user per day by 250 million users amounts to more than 208,000 hours lost every single day. That’s a big number, bigger than the time it would take to fix it. Considerably bigger.
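The arithmetic, for anyone who wants to check it (250 million users is, of course, a round illustrative figure):

```python
users = 250_000_000
seconds_lost_per_user_per_day = 3

seconds_per_day = users * seconds_lost_per_user_per_day  # 750 million seconds
hours_per_day = seconds_per_day / 3600
hours_per_year = hours_per_day * 365

print(f"{hours_per_day:,.0f} hours lost per day")
print(f"{hours_per_year:,.0f} hours lost per year")
```

That works out to roughly 208,000 hours per day, or on the order of 76 million hours a year, all from a single three-second foible.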
I see what you’re saying, but I think you have it backwards.
Safari has one rule: Tabs are *always* spread out across the full width of the window. This makes perfect sense.
Want the first tab? Go all the way to the left.
Want the last tab? Go all the way to the right.
Want the middle tab? Go to the middle.
This works whether you have 1 tab or 12. And the tabs are always the easiest to hit, because they’re always the maximum size they can be.
With Firefox, the last tab (if it’s the second of 2 tabs) is way off to the left. The last tab (if it’s the 3rd of 3 tabs) is still to the left. Only when you get to 5 or more tabs is the “last” tab off to the right–where you expect it.
Your logic assumes people want the same physical location rather than the same relative location when you have a *variable* number of items. I suppose some people do want the same physical location (but only up to a point, because they shrink), but I think Safari is more logical for most. It’s always the relative position that’s important.
Logic is such a high-level cognitive function that humans don’t even develop it until around age six. Determining location by applying logic necessitates interrupting the user’s workflow. Persistent objects that never move require no computation, no determination, no application of rules at all, and that’s the point. It’s not a matter of how simple, logical, or intuitive a set of rules might be: if there are any rules beyond “it’s where you found it last time,” the target isn’t persistent, motor memory is out the window, the task must be put on hold, and the user’s efficiency will suffer.
A very important point, and I have studied this specifically: Users are likely to report that interfaces with even frequent high-level cognitive interruptions are highly efficient. Furthermore, when tested with both interruptive and “clean” interfaces, they will report that the interruptive interface let them perform their task quicker! They may even report greater satisfaction using such an interface just because of the cognitive engagement involved. However, the stopwatch data will reveal the truth: the “clean” interface was the clear winner, with users sometimes taking as little as half the time to perform the assigned task.
Tog, can you please post a link to the study you speak of here, where frequent high-level cognitive interruptions appear to be highly efficient?
I would love to. Unfortunately, it was a research project I carried out while working deep in the bowels of a locked, secret laboratory inside a mysterious California corporation named after a popular fruit often gifted to educators. I have already revealed too much! I must think of my family!
In other words, what’s researched at Apple, stays at Apple. This seems unfair and contrary to the spirit of the collegial flow of free ideas until you consider that, for the last 30 years, every time Apple has released a product, some other company or companies have instantly and mercilessly ripped them off. Little wonder they don’t want to give these guys any extra help.
Tog, so sorry to hear you couldn’t publish it! I believe your work might finally explain the mysterious draw of Emacs and Vim to engineers.
How quickly they forget! Someone just has to wave a newer and tinier computer environment in front of their eyes and their minds blank out. Everything has to be rediscovered.
On the subject of browser tabs, but on a more positive note, I find Firefox’s Tab Groups (http://support.mozilla.org/en-US/kb/tab-groups-organize-tabs) to be a very elegant solution to the problem of an excess of tabs. It allows you to keep several groups of tabs on different subjects, so you only have a limited number of tabs at a time, while keeping an overall view that lets you see the whole picture and switch from one group to another. It’s you who manages your own space, so you know where things are, and you have a search if you can’t remember. It kills the tab overload problem that leads either to horizontal scrolling (as in Firefox, without the tab groups) or tabs so small they have no space for text (as in Chrome).
One of the reasons I use Chrome over Firefox or Safari is its slightly different tab interface. When opening new tabs Chrome and Firefox work exactly the same way, but when closing tabs Chrome is a bit different.
If you have many tabs open in Firefox they’ll gradually start to shrink so they all fit in the window, and as you close each one the tabs will grow to fill the available space. Chrome works the same way when opening new tabs. However, each time you close a tab in Firefox, the close box for the next tab jumps out from under your mouse as the tabs change size. You end up doing a maddening dance to chase down the close boxes, which is even worse if you’re using a trackpad rather than a mouse, because you have to mentally switch the trackpad’s function back and forth between a sideways cursor movement and the click-to-close movement. At least with a mouse you can do both at once. In Chrome, however, once you start closing tabs they all stay the same size and in the same position. This lets you keep the cursor in one place and close as many tabs as you want with zero mental effort.
Chrome’s tabs will stay the same size and in the same position for as long as the cursor is still in the tab bar. Only when you finally decide to move the cursor elsewhere will the tabs resize.
It could be argued that’s bad because it makes it too easy to inadvertently close things you didn’t intend to close, but I find that the comfort of not having to chase the close boxes (something you do dozens if not hundreds of times a day) far outweighs the very few times I’ve closed one too many tabs (which should be easily findable in the Recently Closed area of the History menu anyway). I’ll take the occasional sharp jab of a wrongly closed tab over the constant dull punishment of chasing close boxes any day of the week.
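The deferred-resize behavior described above is simple enough to sketch. The following is a hypothetical illustration in Python, not Chrome’s actual implementation; the class, constants, and method names are all invented for the example. The key idea is that closing a tab frees its slot without reflowing the remaining tabs, and the reflow is postponed until the pointer leaves the tab strip:

```python
# Hypothetical sketch of Chrome's deferred tab-resize behavior.
# Tab widths stay frozen while the pointer remains over the tab strip,
# so close boxes don't move; widths reflow only after the pointer leaves.

STRIP_WIDTH = 800   # assumed strip width, pixels
MIN_TAB_WIDTH = 40  # assumed minimum tab width, pixels

class TabStrip:
    def __init__(self, titles):
        self.tabs = list(titles)
        self.widths = {}
        self._reflow()

    def _reflow(self):
        # Divide the strip evenly, but never shrink below the minimum.
        per_tab = max(MIN_TAB_WIDTH, STRIP_WIDTH // max(1, len(self.tabs)))
        self.widths = {t: per_tab for t in self.tabs}

    def close_tab(self, title, pointer_in_strip):
        self.tabs.remove(title)
        del self.widths[title]
        if not pointer_in_strip:
            self._reflow()  # pointer elsewhere: resize immediately

    def pointer_left_strip(self):
        self._reflow()      # the deferred resize happens here

strip = TabStrip(["a", "b", "c", "d"])
width_before = strip.widths["a"]
strip.close_tab("b", pointer_in_strip=True)  # widths stay frozen
assert strip.widths["a"] == width_before
strip.pointer_left_strip()                   # now tabs grow to fill
```

The design choice being modeled is exactly the Predictable Target principle: the cost of a momentarily “wrong” layout is paid in exchange for close boxes that never move out from under the cursor.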
I just took the Quiz Designed to Give You Fitts, and netted, perhaps, a grade of 40–50%. That I did even that well is because I’ve been reading your articles for several years.
My question today is: I want to use a computer (and programs) designed according to those principles. Do any exist anymore? Suspecting the answer is “not really,” who comes closest? What about projects from people who don’t actually build the computers, but overlay, hack, reprogram, or otherwise fix what the Big Brothers are doing?
I don’t have an answer to your question as to who is doing it better, and, at least in the case of Apple, they continue to make it as difficult as possible for people to address their errors, omissions, and purposeful exclusions.
The application of Fitts’s Law ebbs and flows. People charged with design finally learn about it and then, a few years later, they’re replaced by a new batch who either have no knowledge of it or just believe that pretty design trumps usability. The problem is that we do not have an educated customer base who actually can tell good behavioral design from bad. All they know is that if it’s bright and shiny, it must be good. If people knew how much more productive they could be if, say, Apple added arrow keys to their mobile device keyboards so people wouldn’t have to futz around for the longest time trying to drop the text pointer with pixel accuracy, they would tell Apple to stuff it until they fixed the problem. Instead, they don’t know, and no third party can add such keys because a few graphic designers, with zero training in behavioral design, think the keyboard looks better without them.
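For readers who haven’t yet taken the quiz, the law itself is compact enough to sketch. Below is a minimal illustration in Python of the widely used Shannon formulation, MT = a + b · log2(D/W + 1), where D is the distance to the target and W is its width along the axis of motion. The coefficients a and b here are illustrative placeholders; real values must be measured for a given device and population:

```python
from math import log2

def movement_time(distance, width, a=0.0, b=100.0):
    """Predicted movement time in milliseconds (Shannon formulation
    of Fitts's Law). a and b are placeholder coefficients."""
    return a + b * log2(distance / width + 1)

# A big, near target is fast; a small, distant target is slow.
# With D = 480 and W = 32, the index of difficulty is log2(16) = 4 bits.
print(movement_time(480, 32))  # 400.0 (ms, with these placeholder values)
print(movement_time(480, 16))  # halving the width adds difficulty
```

This is why shrinking a close box, or making a button’s position unpredictable so that D grows, carries a measurable cost even when the screen looks cleaner.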
Another thing that’s bad about stretching the tabs to the full width is that the tabs become very large. When they’re that large, you can only see one tab at a time, and a quick glance at the tab bar may make it look like there is only one tab open. Users who aren’t familiar with the tab bar might be a bit confused by this as well, because the difference in appearance between selected and unselected tabs doesn’t have enough contrast.
Thanks for this timely article. As another example, in Gmail’s new “improved” composing pane, the Cc button is gone until you hover over where it should appear near the right, somewhere in the addressing area. This drives me nuts.
Compared to a computer, a fighter plane is rather a “single-tasking” device! What on earth (or in the sky) makes you think the “second tab” will always have the same information? That the use case for wanting to get to it is so similar every time that the muscle-memory effect has meaning?
In this case, Safari made the decision that having a bigger space to read the title of what the content actually is, is more important than providing a consistent target. That’s a reasonable tradeoff… in my mind the drawback is that it LOOKS less like a button at that width.
There are two competing, but still both true, thoughts: “if you can’t measure it, you can’t master it” vs. “not everything that counts can be counted, and not everything that can be counted counts.” You pay so much more attention to the former at the expense of the latter that it distorts your view. With Fitts’s Law, you have a very powerful tool for describing very simple, static interfaces. But with, say, the Mac vs. Windows menu bars? Sure, the Mac’s is faster to zip the mouse to, but with Windows, I never have to think about which window on the screen the menu bar refers to, unlike the Mac. If I have a large screen and am using both Firefox and Chrome, I have to burn a lot of time scanning to see whether the menu bar belongs to one or the other. That cognitive time and effort is a lot more significant than the physical motion, but because it’s tougher to make a good test for it (since most tests implicitly tell you what to do, so that A/B tests make more sense), it gets much less attention from you.
Your arguments re: tabs would be compelling if users were presented with longer titles as they opened more and more tabs, but, perversely, the more important it becomes that the user be able to read the titles, the less of each title is presented. Both systems largely fail in that regard, particularly in the case of two tabs, where whatever the other tab is, it is the one and only window you are not now looking at.
As for a fighter plane being a “single-tasking” device, as a pilot of far, far simpler aircraft, may I assure you it is not. Even the simplest of aircraft compels a pilot to carry out a wide range of tasks simultaneously and, unlike on a PC, they must be done with precision timing and in exact sequence. Deviation results in actual physical death.
I’ve heard the argument about the advantage of the menu bar being tied to the window before, and, on the surface, it makes sense. After all, if you carry out some action thinking it is going to affect Window A, but it affects Window B, you have definitely made an error. However, the root error occurred when you first thought you were working with Window A! True, the decoupling of the menu bar from the window made matters worse by lulling you into believing nothing had gone wrong when it already had, but the error was already made.
The solution for this is to slightly darken the non-active windows so the user never makes the error to begin with. It doesn’t take very much darkening, and it’s even been a feature of OSes from time to time. It just isn’t one right now, for no particularly apparent reason, probably because of simple corporate amnesia.
There’s a measurable hit every time you have to use the Windows menu scheme instead of the Mac’s. Over the course of a day, that can add up to minutes. Fixing the root problem is a better approach than limiting a single phase of the damage caused by an error arising from a different source.
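To make the “adds up to minutes” claim concrete, here is back-of-the-envelope arithmetic. Both numbers below are assumptions chosen purely for illustration, not measurements from any study:

```python
# Illustrative arithmetic only: both figures are assumed, not measured.
extra_cost_s = 0.5      # assumed extra seconds per menu access on Windows
accesses_per_day = 250  # assumed menu accesses in a working day

daily_penalty_min = extra_cost_s * accesses_per_day / 60
print(f"{daily_penalty_min:.1f} minutes/day")  # about 2.1 minutes
```

Even a sub-second penalty per access, multiplied across a day of heavy menu use, lands in the range of minutes; the real figures would have to come from stopwatch testing of the kind described earlier in this thread.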
I’m not quite sure why you are burning time “scanning to see if the menu screen is one or the other.” If you lose track of what you’re working on, you need to figure that out, and neither menu scheme will help you do that. Once you have figured it out, you click it once and proceed.