Here's how he begins his post:
There’s an excellent chance I’m being a complete fuddy-duddy, waving my arms and yelling at those damn kids to get off my lawn. That said, it’s a horrible idea to force everyone into touch-based computing.
The unveiling of Windows 8, coupled with Apple’s nudging of OS X closer to iOS with Lion, has me shaking my head.
Why monkey around with happy users by combining their experiences? Some people love iOS. Some people love OS X. I don’t think this is a peanut butter and chocolate moment though. Using Lion as an example, out-of-the-box it enables an option called “Move content in the direction of finger movement when scrolling or navigating” which creates backward scrolling for wheel mouses.
Disabling this option was one of the first things I had to do after my upgrade because every single window scrolled the wrong way.
Kuramoto then goes on to mention the drawbacks for keyboard/mouse users, the need to redesign the applications that run on these operating systems, and the ergonomic issues.
But in the process of replying to his post, I was forced to admit a couple of things.
I was forced to admit that the usability comparison between touch and keyboard/mouse is unfair. In a computing context, I have very little experience with touch computing (where I define touch computing as MOVING your fingers across a screen to perform a particular action). I have over a quarter century of experience in using a mouse (dating back to the Macintosh Plus), and I have over 35 years of experience in using a keyboard (dating back to Miss Jack's typing class). So I'm naturally going to find keyboards and mice easier to use than a touch screen. Give me a couple of years of experience with a touch screen, though, and I'll probably have a different view.
Why will I have a different view? Because I was also forced to admit that a touch screen is more intuitive than a mouse or a keyboard.
Think about how these older devices work.
Let's start with the keyboard, because it's been around longer. Whether you're using a typewriter or a computer or my beloved LG env3 phone, the way in which you use a keyboard is to set your hands on a table (or another flat object), then move your fingers around. When you move your fingers around on the table, things change in another location. As I type these words right now, the movement of my fingers is causing letters to appear on a screen that is several inches away from my fingers. If I were on a typewriter, something similar would be happening; my keystrokes would cause letters to appear on a piece of paper. While the actions and reactions are perfectly understandable in the realm of physics, the fact remains that there is a kind of separation between what my fingers do on a keyboard and the results of those actions.
If anything, a mouse - the vaunted mouse, part of the long-standing effort to make things easier - is even more of a problem than the keyboard. Take the classic example of using a mouse to drag a file to a trash can. In the real world, I would walk over to a piece of paper, pick it up with my hand, carry it to the trash can, and throw it in. But that's not how it happens in the so-called "intuitive" world of computing. Let's go through the steps:
- First, you take your right hand and rest it on a horizontal table. This table is several inches away from the item that you want to modify - namely, an icon on a computer screen.
- Next, you "move your mouse" to the file. This involves sliding your hand horizontally across the table, which causes a movement on the screen in front of you that is not identical to what your hand is doing. If I move my hand away from me, the pointer on the screen moves up. If I move my hand toward me, the pointer on the screen moves down. Your brain is performing all sorts of calculations to move things on the screen. You don't notice any of this, because you have 5 or 10 or 25 years of experience moving a mouse. But hand a mouse to someone who has never seen a computer before, and how would you explain it?
- But now it gets better. To get the file to the trash can, you have to perform something called "click and drag." In the real world, I would grab the paper with my hand, then hold it tightly as I walk to the trash can. But that's the real world. In the fake world, you hold a button down with one finger and then move your entire hand - and again, moving your hand away from you moves the item up on the screen. Take a moment and find a piece of paper near you. Tap your finger on the piece of paper. Now move your hand. Unless you press really hard, chances are that the paper is going to stay right where it is. And this is intuitive?
- OK, let's assume that your paper is near the trash can. In the real world, you could place your hand (the one holding the paper) right over the trash can and unclench it. When you do this, the paper surrenders to gravity and drops down into the trash can. (Well, unless the wind starts blowing.) In the fake world of computing, you have to move your hand until the trash can changes appearance (in the old days, a white trash can would turn black), and then lift the finger you pressed down earlier. While this action is admittedly more intuitive than the others, you're still depending on the trash can to change - something that doesn't happen in real life - and you're still moving your hand horizontally to make changes on a vertical plane.
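The steps above can be sketched as a toy state machine. Everything here (the `Desktop` class, the `press`/`move`/`release` handlers, the coordinates) is invented purely for illustration - it is not any real windowing API:

```python
# Toy simulation of dragging a file's icon to the trash can.
# All names and coordinates are made up for illustration only.

class Desktop:
    def __init__(self):
        self.icon_pos = (10, 10)          # where the file's icon sits
        self.trash_pos = (90, 90)         # where the trash can sits
        self.trash_highlighted = False    # the "trash can turns black" cue
        self.dragging = False
        self.trashed = False

    def press(self, pointer):
        # Hold the button down while the pointer is over the icon.
        if pointer == self.icon_pos:
            self.dragging = True

    def move(self, pointer):
        # Hand motion on the desk becomes pointer motion on the screen;
        # the icon follows only while the button is held down.
        if self.dragging:
            self.icon_pos = pointer
            self.trash_highlighted = (pointer == self.trash_pos)

    def release(self):
        # Lifting the finger "drops" the icon; the file lands in the
        # trash only if the trash can was highlighted at that instant.
        if self.dragging and self.trash_highlighted:
            self.trashed = True
        self.dragging = False
        self.trash_highlighted = False

desk = Desktop()
desk.press((10, 10))    # click on the icon
desk.move((50, 50))     # slide the hand; the icon follows
desk.move((90, 90))     # position the icon over the trash can
desk.release()          # let go; the file is deleted
print(desk.trashed)     # -> True
```

The same three handlers would serve a touch screen just as well; the only difference is that the pointer coordinates would come straight from where the finger touches the glass, with no desk-to-screen translation in between.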
Now imagine doing that same thing with a touch screen. There's still an air of unreality to the entire affair, but all of the mental calculations and gyrations have been removed. When I want to "click" on a file, I don't move my hand in one location to make a change several inches away. I touch the screen, and I make a change to an object that is underneath my finger. When I move my finger up, the icon for the file moves up. I don't have to move my hand away from me to make the file move up. When I position the file over the trash can, the trash can is right underneath my finger. I'm not doing something in one place to make a change in another place; my hand is directly interacting with the thing that I want to change.
Sounds more intuitive to me.
For the record, here's the original comment that I left at Jake Kuramoto's post:
First off, I hate Reese's Peanut Butter Cups.
Second off, I have a belief (completely unsubstantiated on my part) that keyboard/mouse hardware is simpler than touch screen hardware, and therefore is less prone to breaking. When certain family members bought the LG env Touch phone, I stubbornly stuck to the LG env3 (no touchscreen) for that reason.
Third off, I agree that using touch on some applications might not be all that efficient. Perhaps it's because of years of practice, but I can be very precise when I use a keyboard or use a mouse. Stick my fingers on a touchscreen, and I know that I'll have less precision. In my mind (again, probably due to lack of practice), a mouse allows me to specify a particular pixel, while a finger occupies a great blob of space on the screen with no precision whatsoever. I shudder to think of the users of my company's software using their fingers to precisely identify fingerprint ridge endings and bifurcations.
But if I'm honest with myself, I'll admit that given a few years of practice, we will be just as adept at using touchscreens as we are at using mice. After all, the touchscreen is more intuitive, since you are directly interacting with the item that you want to modify (rather than interacting with an item on the side of your computer, and having that cause changes several inches away on the screen of your computer). I can't remember how long it took me to learn how to use a mouse when I encountered my first Macintosh Plus in the mid-1980s, but I'm pretty sure that I didn't master the mouse on day one. After some practice, I could probably be just as good on touchscreens as I am with a mouse.
But I'm still worried about the touchscreen breaking.