
Ideas For the Next Generation In Human-Computer Interfaces 170

Posted by timothy
from the literally-exploding dept.
Singularity Hub writes "For decades our options for interacting with the digital world have been limited to keyboards, mice, and joysticks. Now, with a new generation of exciting interfaces in the pipeline, our interaction with the digital world will be forever changed. Singularity Hub looks at some amazing demonstrations, mostly videos, that showcase new ways of interacting with the digital world." Along similar lines, reader shakuni points out a facial expression-driven user interface reported on News.com for operating, say, an iPhone, explaining "This device is tiny and fits into the ear and measures movements inside the ear due to changes in facial expression and then uses that as input triggers. So [tongue out] starts or stops your iPod Touch; [wink] rewinds to the last song; and [smile] replays the same song."
This discussion has been archived. No new comments can be posted.


  • Ah-Choo! (Score:4, Funny)

    by Something Witty Here (906670) on Sunday March 08, 2009 @02:33PM (#27113617)

    And when you sneeze, it reboots!

  • voice control (Score:5, Insightful)

    by Keruo (771880) on Sunday March 08, 2009 @02:34PM (#27113623)

    When Windows 95 arrived, I played around with its voice recognition.
    I wasn't quite impressed with it, since the only command I got working properly was "fuck", which caused the machine to reboot.

    Although voice control has interesting potential, it's not optimal for most situations (think open cubicle office).

    • by sam0737 (648914)

      If the voice recognition works without the voice...

      I think there was an experiment about that? Like probing the nerve that controls the vocal cords; the last time I read about it, it could recognize 4-5 distinct states after training. Yes, that's still far from today's voice recognition, but only then would I consider actively using it. Otherwise I think I would lose my voice in a few days, not to mention the privacy issues that come with it.

    • Re: (Score:3, Insightful)

      by Superdarion (1286310)
      Well yeah, but think about it: your brain can differentiate between your boss calling you a useless waste of oxygen from inside his office and the giggles of your coworkers outside.

      The aim for technology is, of course, that a microphone can do the same.

      And it makes sense that Windows would understand "Fuck", being the word that it hears the most.
      • Re:voice control (Score:4, Insightful)

        by MBGMorden (803437) on Sunday March 08, 2009 @03:59PM (#27114119)

        I don't think that's the only hurdle to overcome. In a lot of cases, I just don't think voice control is very useful beyond a novelty. I played with it a number of years ago. After a bit of training, it was recognizing my commands pretty well. Thing is, it was tedious as hell to do things with voice control. I spent 10x longer doing things with voice commands, simply for the novelty of it.

        Seriously, for people who have ever done tech support this should be obvious: even with a human, whose reasoning skills are superior to the best voice recognition system out there, if I am standing there telling them what to do in order to perform an action on the computer, it takes all of one minute before I'm asking them, "You know, how about you let me sit there for a second and I'll take care of it?" (a nicer version of the "MOVE!" part from Jimmy Fallon's Nick Burns - The Company Computer Guy skit on SNL). Most of us can simply do things much faster with our hands than we can explain them.

        Now, if we could truly step into the realm of Star Trek and have a virtual AI running the computer - then it might have some application (i.e., "Computer - pull up a list of hotels in Miami on Labor Day weekend"). Otherwise, simply as a replacement input device, no matter how good it gets at recognizing commands, I just don't see the use.

        • Re:voice control (Score:5, Interesting)

          by ShieldW0lf (601553) on Sunday March 08, 2009 @04:47PM (#27114385) Journal
          I don't think that's the only hurdle to overcome. In a lot of cases, I just don't think voice control is very useful beyond a novelty. I played with it a number of years ago. After a bit of training, it was recognizing my commands pretty well. Thing is, it was tedious as hell to do things with voice control. I spent 10x longer doing things with voice commands, simply for the novelty of it.

          I feel the same way about it. But my brother swears by it... he can have his hands full of scientific equipment and still issue commands to his computer which is interfacing with the tools he's using.

          I could see this sort of tech being really useful for those who wish to access reference materials while their hands are full too... be it doctors who have their hands covered in blood switching to a different monitor or mechanics who have their hands covered in grease switching to a different schematic.

          Personally, some days I'd give my left nut for a good heads up display and a glove with an integrated chording keyboard and touch pad. If I could do my work lying on my back instead of sitting in this chair, I probably wouldn't have to go to the chiropractor.
        • "Computer - pull up a list of hotels in Miami on Labor Day weekend"

          Florida has hotels that move?

    • Re: (Score:3, Funny)

      by jeffehobbs (419930)

      Back when Mac OS 9 had kind-of-sort-of voice control, you could launch programs by putting them in a specific folder. I made an alias for "Unreal" -- which took up 190 MB of RAM and took about 3 minutes to load on my PM 7500. Whenever someone would come over my dorm room to use my computer, I made a point of mentioning very loudly how something was "UNREAL!" -- and then they got to sit there while 'Unreal' loaded, very, very slowly.

    • by Hatta (162192) on Sunday March 08, 2009 @03:32PM (#27113951) Journal

      I wasnt quite impressed with it, since the only command I got working properly was "fuck" which caused the machine to reboot.

      Are you sure you have the causation straight on that one? When I used Windows 95, it was the other way around.

      • Yes he does:
        1. Computer BSODs
        2. "Fuck"
        3. Computer reboots (or not, given you have a BSOD in the way)

    • Re: (Score:3, Insightful)

      by DrBuzzo (913503)
      Current generations of voice control are quite good and usable. It always seemed that voice control was central to the human-computer interface in sci-fi visions of the future. Star Trek and such: nobody ever interacts with the computers aside from asking them to do something. Other visions of the future always had voice control to turn the temperature of a room up or down and do other such things.

      That kind of thing is now entirely doable and entirely affordable with only nominal hardware. The a
      • Yet it has never really taken off except in niche markets.

        The company I work for sells an air traffic control simulator. Voice recognition is used by the component which simulates aircraft so you can give them voice commands.

        A conventional flight sim could work in a similar way to send voice messages to ATC.

        My wife uses a gnome desktop for her business. Because she uses so many different functions her UI is quite cluttered. I had a look in synaptic and found gnome-voice-control so I will give it a go.

    • Re: (Score:3, Funny)

      by Arancaytar (966377)

      which caused the machine to reboot.

      But with Windows 95, what didn't? ;)

      I think I could have duplicated that effect without even a microphone. (Though whether the "fuck" would be the cause or effect of the reboot is another matter.)

    • Re: (Score:2, Insightful)

      by high_rolla (1068540)

      Voice control has some potential, but I think it is one of those technologies that should be a complement to existing input mechanisms (i.e. keyboard and mouse).

      E.g., when doing my normal work I want to use the keyboard and mouse, as they are more efficient and flexible. Then the phone rings, I pick it up, and shortly into the conversation I realise that this is going to be a longer call. At that point I just say "computer, save document" rather than having to go back to the keyboard and mouse to do so.

    • by Jurily (900488)

      Although voice control has interesting potential, its not optimal for most situations. (think open cubicle office)

      Think of any situation where you're doing anything other than controlling your computer. The mouse and the keyboard differentiate. Voice recognition doesn't. And face recognition? Now you can't even read a joke without deleting your hard drive.

      There's a reason why mice have buttons.

  • by Bovius (1243040) on Sunday March 08, 2009 @02:34PM (#27113633)

    Seems like there are some other practical interface options for the iPod.

    * Snoring: stop playing music
    * Gagging: remove song from playlist
    * Startled jump, clenched jaw and frantic grasping at earbuds: reduce volume

  • by Tokerat (150341) on Sunday March 08, 2009 @02:36PM (#27113639) Journal

    I can see useful applications for this, but I hope there is a switch I have to depress while I make the gesture, plus a "hold" switch so I can lock gestures on or off at all times. For example, if I catch my wife cheating and I look stunned, I don't want that to accidentally push the "panic" button on my car alarm so my nosy neighbor starts poking around during the ensuing drama. That would certainly be a small and silly example of this technology making life more difficult instead of better.

    ...not that I'd ever be able to get a wife (let alone a girlfriend), but at least I made a good car analogy ;-)

    • by Belial6 (794905)
      Well, if you do get a wife, let me know, because if you consider finding her sleeping around to be a small and silly example, I am definitely going to want to 'meet' her. ;)
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      A facial expression driven interface is an absurd idea for the vast majority of users. People's hands are wired to move and manipulate objects. That is why our hands are so effective as "human output devices". Our facial expressions are tied to our emotions. Even if we can get around the weirdness of detaching smiles from happiness and winks from flirtation and so on, there's still the problem that doing that kind of stuff physically feels awkward if it has no emotional content behind it.

    • But wouldn't it be great if you caught your wife cheating on you and your sound system started playing O Fortuna?!

      I can't wait to get a wife and catch her cheating on me!
    • "...not that I'd ever be able to get a wife (let alone a girlfriend), but at least I made a good car analogy ;-)"

      With your sense of humor, it would be a pity if you didn't :)

  • "This device is tiny and fits into the ear and measures movements inside the ear due to changes in facial expression and then uses that as input triggers. So [tongue out] starts or stops your iPod Touch; [Wink] rewinds to the last song; and [smile] replays the same song."

    Sneeze a few times, and you've just sent an email to your boss calling him a fat ignorant pig.

    Get the hiccups, and your car repeatedly accelerates and brakes, causing multiple accidents...

    And the world ends when the president, grimacing whil

  • Changing my TV with hand motion? Right now this would never work; think of all the uncontrolled facial expressions people use all the time. As for the voice commands someone else mentioned: I used to like them, assuming you could record the commands and train the system. Otherwise the computer will pick up anything that has about the same length as the command. That, and my wife thought I was crazy playing a game and yelling commands at my computer.

    • We've had voice control technology for a long time. The problem with it now is exactly what you mention. It needs major improvement.
    • by cp.tar (871488)

      I have toyed with voice commands, some. And I did not like it. Still don't.
      If it's not something I can do silently, while talking to someone and without looking ridiculous, I'm not using it.

      Now, a lighter version of those VR gloves that were touted as the future of human-computer interaction, where there would be a few sensors near my fingertips, I could live with. Even typing would work, though it would look silly.
      I do not want to keep turning this wink/smile/nod "feature" on and off all the time. If I h

      • by tomhudson (43916)

        You know those appendages with opposable thumbs that are usually used for tool manipulation?

        ... so you use your "appendages with opposable thumbs" to "manipulate" your "tool."

        Let me guess - you're posting this on slashdot because you've nick-named your "tool" as "CowboyNeal".

        • by cp.tar (871488)

          Wow. From human-computer interaction to verbing my noun in two posts.
          Whatever happened to those 5.25" drives that were to be used for cyber sex? You should have mentioned them as well.

          Still, not a very poor troll. I'll give it a 3 out of 5.

    • by maxume (22995)

      "UP. UP. UP. UP. DOWN." is a nightmare. "Play the most recent Colbert." is a dream.

      Much of the problem stems from trying to bolt a new control system onto the old interface, rather than creating a new interface that works well with the new control. Facial gestures for control sounds dumb, but I wouldn't mind a television that turned down the volume if I stopped paying attention.

      • by Haeleth (414428)

        "UP. UP. UP. UP. DOWN." is a nightmare. "Play the most recent Colbert." is a dream.

        No, it's a fantasy. We can't even do reliable voice transcription, and we can't parse natural-language commands even when they're input as text, so combining the two and trying to parse natural-language voice commands is pure science fiction at the moment.

        And if we ever do build a computer that's capable of it, I doubt it'll be interested in playing the most recent Colbert. It'll probably be too busy enslaving us.

    • So, you want a Bene Gesserit Television set?

      Seriously though, I bet sign language actually would make a decent input method. It's word based rather than character based, so input should be faster for experienced users, and the actions have a much wider range of motion which should prevent repetitive stress injury.

      Further, one thing people forget about with voice control is bandwidth. Specifically, the lack of it in an office setting. You can't have a small shared office with a group of people talking to

  • by PPH (736903) on Sunday March 08, 2009 @02:40PM (#27113669)
    I still think that people using BlueTooth headsets look like they're off their meds, walking down the street, talking to themselves. This'll open up whole new Vistas of crazy-looking people. Is he having a seizure or just skipping through his iPod's playlist?
    • . . . will probably make folks look worse than a botched Botox job. I guess the device will come with a warning and legal disclaimer: "If you can no longer hold your eyelid open, discontinue the winking process."

    • by baKanale (830108)
      You know there's a problem when you see crazy people talking to themselves and it takes you a minute to realize they're NOT talking on a BlueTooth headset.
      • Re: (Score:1, Troll)

        by icebike (68054)

        The talkers (crazy or bluetooth) do not present a risk to you.

        Why so paranoid? If you leave BOTH groups alone there is not a problem.

        • by PPH (736903)

          The talkers (crazy or bluetooth) do not present a risk to you.

          I disagree! The crazy people are largely harmless. But the guy on the BlueTooth headset could be an investment banker, trying to make a deal that could sink the stock market another 1000 points. Those people need to be dealt with immediately.

    • No kidding. I saw one at the train station in Frankfurt last week. At that time of night, it's fairly common to see drunk people chattering more or less aimlessly to themselves or to anyone who will listen, so he wasn't out of place. But what confused me was how well-dressed the guy was. Business suit, tie, expensive shoes...

      Then I realized he was holding a telephone conference. :P

  • I want the Cylon water interface (for my toaster, obviously), but this is the closest thing I can find:

    http://www.technovelgy.com/ct/Science-Fiction-News.asp?NewsNum=1650 [technovelgy.com]

  • There is a simple reason for that, it requires learning.

    I've given this some thought, and there are several basic problems that need to be overcome with the current computer/human interface:

    1 - It is not intuitive, no matter how much we as a society now accept as normal for computers

    2 - computers require a special lexicon to communicate with.

    3 - computers do not fix themselves: if you have a maid/servant it's ok if they are ill for a couple of days, but if you have to be the doctor too, it doesn't work well

    • by icebike (68054)

      > if you have a maid/servant it's ok if they are ill for a couple of days,

      Good example! That ought to represent about .000002% of all slashdotters.

    • by bami (1376931)

      > If the computer itself presented information in a 3D world to the user, it would be intuitive to understand what the user needs to do. To get an idea of what I mean, think of something like SecondLife as the interface on your screen, or the window manager. On the screen is a user customized 'world' that contains 3D icons as part of its makeup. So the user moves their avatar to their 'office' and the objects there represent those functions that the user associates with the 'office'. A trip to the 3D kit

    • by gilgongo (57446)

      There is a simple reason for that, it requires learning.

      I've given this some thought, and there are several basic problems that need to be overcome with the current computer/human interface:

      I have no real problem with the idea of emulating real-world conditions in order to make things easier to use, but the fact is that once a system has been learnt, most people have the capacity to do a hell of a lot more with it than real world emulations can handle, because computing is, by its nature, not "natural" in that respect. We have not evolved the physical means to cope with large and highly variant data sets. While you might instinctively arrange a bunch of blocks on a table by size (if size r

  • by Joce640k (829181) on Sunday March 08, 2009 @02:54PM (#27113745) Homepage

    We'll all have to sit infuriatingly still if we want to listen to some music.

  • They became aware of each other. It's only a matter of time before they become aware of themselves!
  • So [tongue out] starts or stops your iPod Touch

    Wouldn't that be an iPod lick?

    It would also make listening to KISS and singing along as Gene pretty much impossible.

  • "This device is tiny and fits into the ear and measures movements inside the ear due to changes in facial expression and then uses that as input triggers.

    I don't advocate gambling, but a device disguised as a pair of hearing aids that incremented a count with a left eye blink and decremented a count with a right eye blink could be used for card counting.
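    The blink-to-counter idea above is simple enough to sketch. This is a toy illustration, assuming a hypothetical sensor that reports blink events as "L" (left eye) or "R" (right eye); those event names and the function are invented here, not part of any real device.

    ```python
    # Toy sketch of the blink-driven counter described above.
    # "L"/"R" are hypothetical event codes from an imagined hearing-aid sensor.

    def run_count(blinks):
        """Increment on a left-eye blink, decrement on a right-eye blink."""
        count = 0
        for blink in blinks:
            if blink == "L":
                count += 1
            elif blink == "R":
                count -= 1
        return count

    print(run_count(["L", "L", "R", "L"]))  # three increments, one decrement -> 2
    ```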

  • Control via thought patterns.

    They already have animals controlling robot arms with their thoughts.

    When you think of, say, a "pink fried tapir", it will produce a distinct thought pattern.

    1) Get a "super PDA" sort of stuff hooked up to look for your thought patterns.
    2) Think up a really unique thought pattern to get the computer to "start listening"
    3) Think up a really unique thought pattern to get the computer to "stop listening"
    4) Think up various distinct thought patterns and link them with various PDA actio
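    The listen/ignore protocol in those steps boils down to a tiny state machine: a distinct "start" pattern arms the device, a "stop" pattern disarms it, and anything recognized in between is looked up in an action table. A minimal sketch, with all pattern names and actions invented for illustration:

    ```python
    # Hypothetical thought-pattern -> action table (names made up).
    ACTIONS = {"pink fried tapir": "open_notes", "blue humming walrus": "next_song"}

    def process(patterns, start="wake up", stop="go to sleep"):
        """Only patterns seen between the start and stop patterns trigger actions."""
        listening = False
        triggered = []
        for p in patterns:
            if p == start:
                listening = True
            elif p == stop:
                listening = False
            elif listening and p in ACTIONS:
                triggered.append(ACTIONS[p])
        return triggered

    # The first "pink fried tapir" is ignored: the device isn't listening yet.
    print(process(["pink fried tapir", "wake up", "pink fried tapir",
                   "go to sleep", "blue humming walrus"]))  # -> ['open_notes']
    ```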
  • Just got back from CeBIT, tried out an eye tracking device made by Tobii [tobii.com]. I guess the technology has been around for a while now (the girl at the stand said they've been in business since 2003 I think) but I've never had a chance to try it out myself. Very, very impressive.

    Basically you control the mouse pointer with just your eyes. The calibration is dead simple, you just need to look at two corners of your screen and that's it. The accuracy of the device amazed me completely. The sentiment is perhaps best
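    A two-corner calibration like the one described is, at its simplest, a per-axis linear map: readings taken while looking at two opposite corners fix the scale and offset for each axis. Real trackers use more points and a fancier model; this back-of-the-envelope sketch (all values hypothetical) just shows the idea:

    ```python
    # Two-point gaze calibration sketch: map raw sensor coordinates to pixels.

    def make_calibration(raw_tl, raw_br, width, height):
        """raw_tl/raw_br: raw readings while looking at the top-left and
        bottom-right corners of a width x height screen."""
        sx = width / (raw_br[0] - raw_tl[0])
        sy = height / (raw_br[1] - raw_tl[1])

        def to_screen(raw):
            return ((raw[0] - raw_tl[0]) * sx, (raw[1] - raw_tl[1]) * sy)

        return to_screen

    # Made-up raw corner readings for a 1920x1080 screen.
    gaze = make_calibration((0.2, 0.1), (0.8, 0.9), 1920, 1080)
    print(gaze((0.5, 0.5)))  # midway between the corners -> approximately (960, 540)
    ```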

  • I have a terrible time trying to wink -- it's going to be impossible for me to go back to the last song.
    I think facial gestures, while cool, are a problem because of all the unconscious movements people make in a day... you'd constantly be reaching for the manual hold button.
    A scenario: say I'm listening to music and jogging down the street. A 3 year old comes up, and isn't a brat, and I smile.
    Crap, song repeat.
    So I wink, and, well, because I am an awful winker, I'd make all kinds of scary facial gestures
  • Pen, printing press, and keyboard. I don't think we're about to come up with a new way any time soon.

    Speech to text is still evolving but has major problems, some inherent (such as the fact that others have to listen to what you're saying to your computer). Touch screens are the best bet for new improved user interfaces. The only new kind of interface that will really revolutionize computers will be a neural interface, and we're years (maybe decades) away from that, not to mention the moral issues should w

  • Awful idea. It is tiring enough to have to make facial expressions to interact with people. When I interface with my computer I don't want to waste that effort.
  • And it gets ever harder to tell people who are crazy from those who are using modern technology...

    Talking to themselves? They might be crazy... or maybe they have a really well-hidden cellphone. Weird facial expressions that don't appear to relate to the environment? Crazy... or thinking about philosophy, or using one of these.

    Now we need to get close enough to see if they smell funny ... and some geeks smell funny anyhow. :(

    • by maxume (22995)

      Or just don't worry about obviously crazy people (unless they are coming at you with a knife or something). For the most part, the quietly crazy are much more dangerous.

  • by rAiNsT0rm (877553) on Sunday March 08, 2009 @03:53PM (#27114085) Homepage

    Everyone is close but just missing the boat in my opinion. Touch is the way to go but NOT directly on the display screen. A second screen (similar to the dual screened OLPC concept, or a Nintendo DS) that can be customized by each app or else function as a standard pointer/multi-touch input. It has to be essentially a full-on touchscreen display with full color and solid refresh rate.

    This would spur all kinds of new interactions, games, and input.

    • The Wacom Cintiq [wacom.com] monitor/tablet connected as a secondary display gets you pretty close to that... if you're willing to spend the cash. It doesn't have complex haptic feedback and requires a stylus, but it's a step in the right direction.

  • Blue sky (Score:4, Insightful)

    by YourExperiment (1081089) on Sunday March 08, 2009 @04:09PM (#27114183)

    While it's great that all this research into potential future interfaces is being done, a lot of them are terribly impractical. I just wish we could get the simple things right with our present day interfaces.

    How about a jog wheel / thumb wheel that actually allowed different speeds of movement (i.e. true analog) instead of being just a disguised rocker switch? How about a mouse wheel that didn't force me to move slowly through documents a line at a time, but instead had the same capability for fast and slow movement as the mouse sensor itself?

    These are things that would actually be useful now, and are simple to implement with current technology. Perhaps companies could get these right today, in addition to investing in all this blue sky research.

  • by Anonymous Coward on Sunday March 08, 2009 @04:57PM (#27114457)

    And no mention of graphics tablets, which have been available from retailers for as long as the mouse. I admit these weren't too popular until the Wacom units were combined with Photoshop in the 90s, but people did buy and try the Koala pads. MIDI has been a significant input device group too. Touchpads are also left out. Stylus interfaces like the Newton and Palm... geeze, the list goes on.

    Singularity Hub doesn't sound like much of an authority. Thanks for the heads-up, Timothy, but a self-submitted, shallow adver-blog like that is what invites accusations of slashvertisement. Better to have specific interface news posts run on, well, Slashdot.

    (No mention of the Powerglove? I mean where's the love?)

  • by stoicio (710327) on Sunday March 08, 2009 @05:02PM (#27114493) Journal

    I prefer tried and true ergonomic interfaces. For this reason I suggest
    levers and foot pedals. All lever interfaces should have a grip lock
    to keep them from moving by themselves.

    There should also be two large dials to allow for precision X/Y
    axis movement of the cursor.

    Random numbers should be generated with a large wheel that has a rubber
    stop and pins. Simply spin the big wheel for a random number.

    There should be cranks on the side and top of the monitor to allow
    the view to be scrolled.

  • smile-left wink-left wink-left wink-yawn-right wink-smile-frown-slap forehead

    opens root console access.
  • Why Go Backwards? (Score:4, Interesting)

    by gilgongo (57446) on Sunday March 08, 2009 @05:29PM (#27114703) Homepage Journal

    Disclaimer: I am a UI designer, and it's been the way I've earned my living for the past eight years.

    All the "revolutionary" UIs that we've seen, like Siftables and Perceptive Pixel, appear to make a major assumption that I don't accept: that dispensing with the virtualisation of data and our interaction with it is automatically good.

    Bringing data and its manipulation "into our world" (as the Siftables guy puts it) seems to me to be a completely retrograde step. One of the reasons why we have computers in the first place is because our world and our physiology is in fact VERY BAD at manipulating large numbers of objects, or pouring paint from one place to another to create the right colour. Keyboards and mice, command lines and pipes, even folders and sub-folders (maybe), are several orders of magnitude better and more flexible at controlling the entropy that we need to control in order to get stuff done. We spent the last 10,000 years working that out - why the hell are we trying to re-discover our inefficiencies?

    I suspect the reason for this is because designing improvements to current UI is in fact very, very hard indeed. Of course, there is another reason: self-promotion by academics hoping to be given jobs heading up large corporate R&D departments for ten times their MIT salaries. But I'll let that pass.

    Basically, anyone who thinks humans have a future in significant problem-solving through the manipulation of real-world objects either doesn't understand the past, or is so used to the efficiencies that current human-computer UI models bring that they have ceased to understand them. The key to this understanding is an extreme abstraction of the real world, not its re-creation.

    • by foqn1bo (519064)

      While this is a good point, your analysis (and in fact, every single Slashdot headline about HCI) assumes that interaction design is limited to designing interfaces intended as replacements for generic, everyday data manipulation, such as mice/keyboards/etc. The real assumption we should be confronting is the notion that interface design is like a diagram of human evolution, or that innovations should necessarily replace everything that has come before them.

      Siftables, for instance, make little

      • by gilgongo (57446)

        Somewhere in between is the realization that we ought not only to rethink our one-size-fits-all perspective on technology, but that we should also keep in mind the way new technologies affect the meaning of interaction.

        Agreed. So perhaps the problem here is indeed simply one of MIT academics' egos and career enhancement. Pushing "Next Generation In Human-Computer Interfaces" rather than simply saying "Hey look - here's an idea kids might like!"

  • *nod, smile, tongue, wink wink, tongue*

    - "Is that guy mentally retarded?"

    - "No, he's just operating his iPod. Note the earplugs."

  • HMD of at least 800x600 with headtracking and a virtual desktop that is, say, 4000x3000. Looking spatially by moving my head is a lot easier than tabbing or switching desktops, and I could retain the spatial-memory of my always open windows.

    I've got multiple monitors now, but there is only so far that can go, physically. Sure, I'd lose the peripheral vision of those other screens, but I could have screens all around me instead of just what fits on my desk.

    The vuzix vr920 looks tempting, but it is only 64

  • The first of the new paradigms should be "you can customize it to your liking." I have a Kindle, which I absolutely love, but I would change a couple of the buttons around and actually disable one if I could - I should be able to configure them how I want them, but I can't. I also have a couple of video game consoles and would love to be able to set the control configurations up for all my shooter-type games to work the same way. Jump should (for me) always be in a certain spot, same for shooting, etc. It's

  • This is fun. I'm sure it's a great device, and I can think of loads of things to control with it. But I swear to you, when I'm listening to music, the last thing I'm going to do is put something in my ear!

  • A couple of companies (http://www.emotiv.com and http://www.ocztechnology.com/ [ocztechnology.com]) are already working on products utilizing direct "mind control" style interfaces (previously posted on ./ here: http://hardware.slashdot.org/article.pl?sid=08/03/22/138201 [slashdot.org] among others).

    Interestingly both of those products also utilize facial expression recognition to supplement the basic "mind reading" done by the probes attached to your head!

    I'd like to see where this technology goes outside of the gaming industry, far better

  • How about a mouse with a joystick on top? that would leave one hand for all motion control and the other for the rest of the keyboard. Perhaps there is such a product already and I am just unaware of it.
  • What's with the product placement - why not "say, a Motorola V980"? TFA doesn't even mention the iPhone, AFAICT. This is Slashdot, not some dumbed-down tabloid; you can say "phone" without us having to be given an example of one.

    Next we'll be having "Now you can view a webpage... on the iPhone" - but wait, we did have that one last week.

  • If you watched the Parody video for 'surface' you can skip this post...

    SO rather than improve upon the interfaces that we have, everybody is looking for something brand new so that all the software and hardware we've already bought and learned how to use effectively can be flushed down the toilet and we can buy all new technology, which is slower, less efficient and requires re-learning everything. Great. Go Humanity.

    All I want is a tactile touch surface for my laptop... use that morphing technology from th
