Why Did Touch Take 4 Decades to Catch On?
theodp writes "You probably saw media coverage of Bill Gates showing off touch-screen technology to his CEO play group last week. With the introduction of the iPhone and iPod Touch, touch (and multi-touch) technology — which folks like Ray Ozzie enjoyed as undergrads way back in the early '70s — has finally gone mainstream. The only question is: Why did it take four decades for its overnight success? Some suggest the expiration of significant patents filed during the '70s and '80s may have had something to do with it — anything else?"
For the same reason as the Wiimote. (Score:5, Insightful)
This is for the same reason that command pipes/stdin/stdout will always be more useful in Unix-like OSes than they will in Windows: they essentially come with the system, and 90% of the programs are set up to use them. Same as why REXX was so much more successful on the Amiga than it ever was on any other OS. If the Wiimote had been an optional item, the software wouldn't have been there and the Wii would probably have been a flop.
All the bits of the puzzle have to come together (Score:5, Insightful)
Most technologies take a while to become mainstream. NAND flash was invented in 1988 and took almost 20 years to become mainstream. Linux was started in 1991 and is almost mainstream.
Not just patents (Score:5, Insightful)
I don't think it was just a matter of patents expiring, it was most likely because the technology was finally ready for it. In the past most touchscreen-equipped systems I've seen seemed to be pretty weak in every area except the touchscreen, these days the machines equipped with touchscreens are powerful enough to actually take advantage of the touchscreen capabilities.
That said, I'm still waiting for a tablet mac with multitouch tech and a built-in wacom tablet (like the Cintiq) so that I can use my hands to drag stuff around on my desktop and the stylus for actually drawing stuff.
/Mikael
Comment removed (Score:5, Insightful)
Not effective (at least to date) (Score:5, Insightful)
Plus, with any user interface, people need a certain confidence in the correspondence between what they do and what happens. When you push a button, you KNOW it got pressed. If you push a joystick left, you KNOW you're going left. That 'payoff' is like a contract between you and the machine that is reliably honored. But if pressing the screen where you believe you need to press may or may not do what you want, that contract gets shaky — especially since there's no click or motion to reinforce what you're doing. This, by the way, is why I think 'free space' VR controllers never caught on... at least until the Wii.
Still, software can create cues to take the place of physicality, and add 'grease' to avoid common miscues. Plus, having the screen be horizontal reduces fatigue.
But in the end, as archaic as the keyboard seems compared to touch and speech, it really is an incredibly expressive and low-energy-requirement device.
The simple answer (Score:5, Insightful)
The reason touch has become so popular lately is because it has only been recently that powerful chips have become small enough and that power (batteries) have become light enough that we can find use for this stuff right in our pockets--where a mouse/keyboard just isn't practical. (Unless you believe in thumb keyboards, but those are very cumbersome IMO.)
have we forgotten... (Score:3, Insightful)
CHEAP LCDs (Score:5, Insightful)
Because it is not as easy to use. (Score:2, Insightful)
... because it's a terrible interface (Score:5, Insightful)
You have to wave your arms around — which is very tiring (much more so than a couple of finger movements for a mouse). That means you can't keep it up for more than a couple of minutes. If you don't believe me, just try holding your arm outstretched for any length of time.
Second, it takes up an enormous amount of space. Your fingers don't have the dots-per-inch resolution of a mouse, so the interface area has to be bigger and therefore more expensive.
On a purely practical point, you also cover up the object you're addressing. Unless you have transparent fingers, you can't see all the detail of whatever's underneath. A basic and unresolvable design flaw.
Finally, there's the goo factor. Imagine all the smears, stains and gunge that will accumulate on the touch surface - both from your hands and everyone else who uses it. Apart from the obvious hygiene issues, the surface will get dirty. We know how annoying the occasional fingerprint is on a screen - now think what it'll be like when the screen is covered in grease and other smudges.
In summary, it never caught on. The only people who advocate it are those who've watched Minority Report a few too many times. It's not cool, it's not futuristic and hopefully is doomed to the junkheap of techno-history along with punch-cards and robo-vacuum cleaners.
Great! Now I have to wash the screen all the time. (Score:2, Insightful)
Re: ... because it's a terrible interface (Score:2, Insightful)
Millions of iPhones/iPod Touches sold beg to differ about it never catching on. Not to mention that the same technology will be making its way to consumer laptops and business conference rooms. People like this technology. Yes, you will get smears on it, but as with every technology, that will get better with every revision. Being so dead set against a technology is a character flaw. Be open to it, try it, and decide then. All the examples you gave were nothing but hypotheticals, when a consumer device has been on the market for almost a year.
This is cool technology. This demo [perceptivepixel.com] is by far my favorite.
Correct answer: Mu (Score:5, Insightful)
Touch didn't "just catch on." It's been around forever, has been evolving steadily, and is being used in more and more places. You're postulating that because the iPhone uses touch and Bill Gates did a demo, now, May 2008, it has "arrived"? Touch isn't just now "catching on"; it's simply becoming more and more common as the technology improves. The regular iPod has had a touch-sensitive wheel ever since the 2nd generation. Laptops have had trackpads for ages. PDAs have had touch-sensitive screens for as long as they've been around. I've seen touchscreen kiosks and ordering screens (Arby's used to have them). The only thing I can say is that as touch technology improves in the same way all technology improves — becoming cheaper and smaller, in addition to better — it's being offered in more devices where small and cheap matter, i.e., portables.
I had a touchscreen 17" CRT at home almost ten years ago, and while it was really neat--there's something really satisfying about actually pressing a link with your finger to 'click' on it--it was a pain (literally) to use for any extended amount of time. Touch works best when your arms can be at rest, which means your hands won't move much, which means a small device. Now, who wants to poke on a tiny screen on their desk, when they could instead use a mouse and keyboard to manipulate objects on a 20" screen? No one. So, where does that leave us? Where is touch useful? Ding ding ding! In tiny devices that are already in your hand. Or, to put it another way, it's not so much that touch is just now "catching on," it's that we're finally finding things that it's really good for. Like I said, a touchscreen is not a good replacement for a regular old mouse.
Multitouch is a nice new addition to touch technology, but you know what? I hardly ever use it on my iPhone. I rarely pinch to zoom in or out. I click and drag a lot, and double-tap to zoom in and out, but that's nothing that couldn't have been done on a mid-90s Palm.
Heres why. (Score:4, Insightful)
This applies to the touchscreen tabletops and such. We've had this technology for a while, but it's only now that the price to produce and sell the product is within reach of both huge corporations and smaller companies.
For the past thirty years, most fast food stores were using the standard hierarchical register machine: green display, a keypad that added an item, top-to-bottom entry, and it was very difficult to go back to the top of the list to modify a mistake. Now you go to McDonald's and they have touchscreen displays: they show an image of the food (Big Mac), you press the items the customer wants or doesn't want (lettuce, ketchup, mustard), and there's the order; if you need to correct a mistake, you can easily click an item and fix it.
Could they have had this type of register earlier? Yeah, but they weren't cost-effective until about 2000, when I believe they started to slowly replace the older registers.
The point is, the technology is there; it's just a matter of making it cheap enough and affordable for companies and people to develop it and buy it.
The day I have a table the size of my kitchen table that can support six people playing an RTS, all through touch screens — none of that voice crap that I've seen on YouTube — and we're yelling out commands and tactics to each other against six other people in another room, will be the day I crap my pants.
Re:Not effective (at least to date) (Score:5, Insightful)
I agree. The iPhone interface is just so amazing. The other day I was in a big box store and we looked at the GPS units they had. The only thought I had about any of them was, "these touch screens are so hard to use."
I realized that was because none of them supported multi-touch. To zoom in you had to go press a little software button, and it would zoom in one level. The levels are all arbitrary. Dragging the map was often relatively unresponsive, if you were even allowed to do it. Compared to the small amount of time I've messed with iPhones (I don't own one) it was just annoying. The interface on the iPhone is just so much better for the map.
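The difference is easy to express in code: with multi-touch, the zoom factor falls continuously out of the distance between two fingers, instead of being quantized into arbitrary preset levels. Here's a minimal sketch of the idea (the function names are mine, not from any real GPS or iPhone firmware):

```python
import math


def distance(p, q):
    """Euclidean distance between two touch points (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])


def pinch_scale(start_touches, current_touches):
    """Continuous zoom factor from the ratio of finger separations.

    start_touches / current_touches: pairs of (x, y) contact points.
    A ratio of 2.0 means the fingers moved twice as far apart, so
    the map should be drawn at twice the scale -- no fixed levels.
    """
    d0 = distance(*start_touches)
    d1 = distance(*current_touches)
    if d0 == 0:
        return 1.0  # degenerate gesture; leave the scale alone
    return d1 / d0


def button_zoom(level, step=1):
    """The one-level-at-a-time alternative: zoom is quantized into
    whatever discrete levels the device designer picked."""
    return level + step
```

The pinch version gives the user direct, proportional control; the button version forces the device's arbitrary levels on them, which is exactly the complaint above.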
It's the same thing at my local Borders. They've always had customer terminals around the store to look up books and such, as long as I've lived here. But a few years ago they replaced some with touch screen devices. Now I think they all are.
Before, they just had a mouse and a keyboard. I could type in what I wanted quickly and browse easily using the mouse.
Now they are touchscreen devices. Half the time they don't even seem to respond to my finger touch. I've never been able to decide if I'm touching too fast or too slow, too hard or too soft. Sometimes it works, sometimes it doesn't. The keyboard buttons (which are at least 1-1.5" on each side) are hard to hit with any accuracy. Sure, my fingertip is smaller than the button, but I can't seem to press them accurately. Note that this isn't a calibration problem: once I've figured out how far off the individual machine is, it's still hard to hit the right button. There's no tactile feedback of any kind, and the auditory and visual feedback, when present, is often 100-200ms late and thus useless.
Basically, it's a pain to use. They took an easy interface everyone knew how to use, dumbed it down, made it far less useful, and spent a bunch of money in the process.
Yet I could type on the tiny iPhone keyboard pretty well within seconds of trying. Clearly it was well written, with touch screens in mind. Compare that to the Borders system, which, from what I can tell, is just a fancy website with the touchscreen operating as a mouse, distilling whatever you do into a standard mouse click. That throws away all the subtle differences that could be used to help figure out what you're trying to do.
This is with relatively powerful computers (1GHz plus). Imagine how well touch interfaces could have been done 15 years ago with a 25 to 100MHz processor. Think how useful touch interfaces were 20+ years ago, when most people only had character-based displays and were using DOS.
It's only now that we're getting the necessary precision, processing power, and experience to start making good (multi)touch interfaces.
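The "distilling into a mouse click" point is the crux: a raw touch stream carries information (contact duration, travel, timing) that a click-emulation shim simply discards. A hypothetical sketch of what's lost — this is an illustrative model, not any particular kiosk's driver:

```python
from dataclasses import dataclass


@dataclass
class TouchSample:
    x: float
    y: float
    contact_radius_mm: float  # how much of the fingertip is down
    t_ms: int                 # timestamp


def distill_to_click(samples):
    """What a naive touch-to-mouse shim does: keep only the first
    contact position, throwing away size, duration, and motion."""
    first = samples[0]
    return (first.x, first.y)


def classify_gesture(samples, tap_ms=250, slop_px=10):
    """What a touch-aware UI can do with the same samples:
    distinguish a quick tap from a drag using duration and travel.
    Thresholds here are illustrative guesses, not platform values."""
    dt = samples[-1].t_ms - samples[0].t_ms
    dx = abs(samples[-1].x - samples[0].x)
    dy = abs(samples[-1].y - samples[0].y)
    if dt <= tap_ms and dx <= slop_px and dy <= slop_px:
        return "tap"
    return "drag"
```

A system that only ever sees the distilled click can never tell a hesitant press from a swipe, which is one plausible reason the kiosk described above feels so unresponsive.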
Not enough computer power. (Score:3, Insightful)
Until recently, hardware did the heavy lifting of user interaction. A keyboard or a mouse does a lot of complicated things to feel right to the user, yet outputs a simple, well-qualified signal to the computer.
Today all of that complexity, and more, is handled in software by the UI at the expense of other activities — something that, until relatively recently, systems were too CPU-bound to afford.
The last piece was the elegant realization of the idea that fires everyone else's imagination — in this case, the iPhone.
But just as with the advancements in keyboards, mice, trackpads, and game controllers, we have only seen the beginning.
My hope is that this will also catch on with the tablet form factor, where somebody will wake up and realize that the best place for the menu on a tablet is probably not the upper right-hand corner, where a righty will obscure the screen — and that it probably deserves to live along the right-hand side for most items, looking a lot more like the Office ribbon than the standard menu bar.
This is cool, though: we are on the cusp of the next wave of UI, the one that comes after the current mouse-oriented menu-and-panel methods. It will be cool!
Re:All the bits of the puzzle have to come togethe (Score:2, Insightful)
Re:Greasy.. (Score:3, Insightful)
No, they're not; fingerprints are still an eyesore on monitors.
There are some appallingly grotty screens around work — and they're not touch screens! Some people feel the urge not just to point at the screen, but to tap it with a finger for emphasis. Plastic LCD screens aren't as abrasion-resistant as the CRT monitors they replaced, so when people do clean the thing with whatever dust-laden rag was handy, they often leave a permanent scuff mark.
Look, but don't touch.
Re:it didn't. touch never caught on. (Score:5, Insightful)
All kinds of bank machines and kiosks have had touch screens for years. It's not the touch screens that caught on. It's everything else that caught up -- and got cheap enough for consumer goods.
Re:Not effective (at least to date) (Score:5, Insightful)
This is an important point. Touch screen interfaces are much less abstract than non-touch interfaces. You're actually physically manipulating real little "objects", rather than issuing commands. The problem with this is, the first time you try to drag something, or scroll or zoom, and the interface element you're working with doesn't follow your finger, you're sunk. The whole illusion is shattered, and the UI feels extremely awkward.
This requires a fair bit of graphics processing capability, certainly by the standards of portable devices.
Even the iPhone's hardware isn't quick enough to scroll complex web pages like this one. So what Apple did, rather cleverly, is that instead of slowing down scrolling (and failing to track the finger) until the device catches up, the device simply keeps on smoothly scrolling, filling the spaces it hasn't had a chance to draw yet with a checkerboard pattern, which provides a spatial reference.
There are other little things like this that make the device feel more responsive as well. For instance, if you try to scroll off the top of a web page (or other vertically scrollable view), the phone will let you -- the scroll will keep right on following your finger. Then, once you let go, the view will bounce back.
These kinds of tricks were not particularly obvious. Natural-feeling touch UI requires an entirely new vocabulary of UI behaviors, and that's just starting to emerge now.
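The overscroll behavior described above can be sketched in a few lines. This is an illustrative model of "rubber-band" scrolling under my own assumptions, not Apple's actual implementation: while the finger is down, the view keeps following it past the edge with diminishing resistance (so tracking is never broken), and on release the only legal resting place is the edge itself.

```python
def rubber_band_offset(drag, limit, resistance=0.55):
    """Visual overscroll while the finger drags past the edge.

    drag: how far past the edge the finger has pulled (>= 0)
    limit: maximum visual overscroll, approached asymptotically
    The view keeps following the finger with diminishing returns,
    so it feels attached to the finger without running away.
    """
    # Diminishing-returns curve: 0 at drag=0, approaches `limit`
    # as the drag grows, and is strictly increasing in between.
    return (1.0 - 1.0 / (drag * resistance / limit + 1.0)) * limit


def release_position(overscroll):
    """On finger-up, the view animates (bounces) back to offset 0;
    the overscroll amount only determines the animation's start."""
    return 0.0
```

The point of the curve is exactly the "contract" discussed earlier in the thread: the interface element never stops responding to the finger, even at the boundary where a naive implementation would simply clamp and go dead.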
Re:Because haptics is important. (Score:3, Insightful)
There is also the added complexity of navigating the customized menu of the DVD itself, particularly on DVDs with multiple menu pages (for complete collections).
Doing something as simple as switching from watching Sky News to watching a DVD will involve:
1. Switch DVD player on.
2. Place DVD in DVD player.
3. Wait for copyright notice to play.
4. Wait for the menu to appear.
5. Ensure the universal remote is in DVD mode.
6. Figure out whether left arrow or down arrow moves between menu options.
7. Wander around until correct menu item is found.
8. Press [Play].
9. Adjust volume until sound level is in comfort zone.
10. Watch DVD.
11. Press stop to end DVD.
12. Remove DVD.
13. Switch DVD player off.
14. Adjust satellite/TV volume to get back into comfort range (do this repeatedly due to stupid adverts maximising the sound level they are allowed to play at).
Even freeview satellite offers 300+ channels, and the channels are not easily identified. BBC 1 Scotland is something like channel 941; BBC London is something like channel 944.
Re:Correct answer: Mu (Score:3, Insightful)
For me, the probable next step forward is when we get better haptics integrated with touch. For example, though the iPhone is decidedly neat, it's tricky (by comparison with a normal mobile phone) to use without looking at it since you can't feel where the buttons are. And I don't know about you, but I don't want to have to look at what my fingers are doing in order to use a device (yes, I can touch type)...
Re:Not effective (at least to date) (Score:3, Insightful)
The term in the programming community for that was, IIRC, 'gorilla arm', from the way your arm felt when you tried to move it after a while. I suspect, though, that a significant part of this was due to the technology: a display was a large CRT with a depth roughly equal to the width of the screen, and there was a mindset that the display was something that sat out in front of you to look at. Aside from a number of 'table' video games, it took LCD displays to make the display something you could reasonably mount in the same position as the key panel of a register. If the iPhone's touch screen had to be used held at eye level at arm's length, I expect it would still be looking for its first thousand sales.
Re:For the same reason as the Wiimote. (Score:3, Insightful)
They did (Score:5, Insightful)
Touch didn't catch on for personal monitors because it is inferior to a mouse as an input device. It works for kiosks and ATMs because people don't use them for long periods, are in a better posture for touch screens, and because touch screens are obviously much sturdier than mice. They've been used in PDAs for decades, so the "iPod Touch" is hardly an instance of "catching on": the original Palm had a touch screen, as did the Newton (though both were designed for a stylus).
twas a good question (Score:3, Insightful)
Now, the one place they made it HUGE was in restaurants, where hardly a place lacks one (or a half dozen) now. Context is key: a place to enter orders without hunting for a keyboard, a place to manage table occupancy, integrated with the credit card system to avoid an extra piece of hardware that could break — the new systems had it all, and have had it for over ten years now.
Re: ... because it's a terrible interface (Score:3, Insightful)
millions of iPhones/iTouches sold begs to differ about it never catching on.
In any case, I've only seen a few iPhones/iPod Touches since they were introduced. They might be common in the US, but I wouldn't say that's necessarily true anywhere else. Not that 'catching on' only in the US is anything to sneeze at, especially from a $ standpoint, but still. From what I've read, some people like it and some people hate it.
Re:Clumsy... (Score:4, Insightful)
Re: ... because it's a terrible interface (Score:3, Insightful)
You don't have to wave your hands around in thin air for this interface to be hands down (erm, OK, pun intended) the best thing since sliced bread. You can interact with all the items on your computer almost like items on your desk, revert to a keyboard as necessary, and make the mouse a needless abstraction — one that gets even less convenient as you crank up your acceleration to deal with larger and larger amounts of screen real estate (sacrificing precision in the process).
It takes more than a touch to be useful (Score:3, Insightful)
There are some applications where they provide the most functional user interface; Apple uses them to great advantage on their iPhone and iPod Touch. It allows rich user interaction on a pocket sized device; no room there for a keyboard or fancy set of buttons. They're not so useful on something like a laptop; there's a keyboard that's much more useful - and the software to make any kind of use of a laptop touch screen is yet to be developed.
Something tells me that history will repeat itself again. Someone will create a workable touch screen interface for general purpose computers, then a major software company will "borrow" the idea and popularize it. The innovators won't get a dime - or any recognition - but the technology will finally break through to the general public.
Re:Clumsy... (Score:2, Insightful)
A mouse doesn't even make sense outside of 2D — look at how CAD engineers all have spinny-balls and knobs and all kinds of other input devices on their desks — so as revolutionary as the mouse was, it was still just "2D done right", and not the epitome of HCI. And even in 2D, anybody who's at all serious about precision seems to have other input devices: every webcomic I've read in the past 5 years is either scanned from paper, or drawn on a Wacom tablet (or Cintiq). If the mouse is so great compared to more direct input devices, why are people who care spending hundreds (or thousands, in the case of the Cintiq) of dollars for alternatives?
Well, except probably Dinosaur Comics.
I'm no Apple fanboi (I use Linux at home), but these kinds of innovations live at the intersection of hardware and software, and Apple has proved to be the best in that area. Only a couple of other companies have shown themselves to be any good at it — Palm and Tivo come to mind. But Tivo isn't really a device for which touch makes sense, and Palm's big innovation (Graffiti) was kind of a special-purpose alternative to touch.
Re:it didn't. touch never caught on. (Score:4, Insightful)
Talk all you want about old keyboards but don't imply that they evolved into the iPhone.
Re:it didn't. touch never caught on. (Score:2, Insightful)
In other news, apparently nobody has bought the leading handheld video game machine [wikipedia.org], leaving analysts puzzled as to how it managed to sell over 70 million units in the past 4 years.
Re:it didn't. touch never caught on. (Score:2, Insightful)
Re:For the same reason as the Wiimote. (Score:5, Insightful)
Palms didn't do handwriting recognition, they did custom glyph recognition. You had to adapt to the device, rather than the other way round, which makes recognition a far simpler problem. So Windows was in fact infinitely better at handwriting recognition, because Palms didn't even bother to try.
Re:Clumsy... (Score:3, Insightful)
Or are you suggesting that Picasso should have finger-painted and not used brushes? I mean, most digital artists use tablet interfaces... I myself am using a Wacom Intuos tablet at this very moment, so it's not as if every brush is shaped like a bar of soap. I suggest you consider in your metaphor that the mouse is the handle of the brush, but not the head. It may look clumsier than only using fingers, but the variety of tips offers better control in applying paint than fingertips do...