
Windows 7 Touchscreen Details Emerging

Posted by Soulskill
from the fingerprints-on-the-windows dept.
nandemoari writes "Microsoft has revealed more about Windows 7 and its support for touch screen technology. The system sounds impressive; however, reports suggest it has a high error rate. In an early version of the system, Microsoft found some problems. For example, both the zoom and rotate functions worked less than 75% of the time, often because the computer confused the two. To rectify this, engineers redesigned the system so that it only looks for gestures specifically relevant to the program being used. This made a significant improvement: the zoom gesture was now recognized 90% of the time. The problem is that even a 90% success rate may be too low. If you can imagine how frustrating it would be if one in ten keystrokes or mouse movements didn't do what you intended, you can see why touch screen technology will need to be even more reliable if it's to truly improve the user experience. PC Authority has a related story about statements from HP, who don't expect such technology to replace keyboards and mice any time soon."
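The fix the summary describes, restricting the recognizer to gestures the active program actually supports, can be sketched roughly. This is purely illustrative (the function names, thresholds, and the idea of ranking by gesture "strength" are assumptions, not Microsoft's actual design):

```python
import math

def classify(p1_start, p1_end, p2_start, p2_end, allowed):
    """Classify a two-finger gesture as 'zoom' or 'rotate' by comparing
    how much the finger distance changed vs. how much the angle between
    the fingers changed, considering only gestures the active program
    supports (the `allowed` set)."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    scale_change = abs(dist(p1_end, p2_end) / dist(p1_start, p2_start) - 1.0)
    angle_change = abs(angle(p1_end, p2_end) - angle(p1_start, p2_start))

    # Rank candidates by strength, but only among allowed gestures.
    candidates = {"zoom": scale_change, "rotate": angle_change}
    candidates = {g: s for g, s in candidates.items() if g in allowed}
    if not candidates:
        return None
    return max(candidates, key=candidates.get)

# An app that supports both gestures can still confuse them...
print(classify((0, 0), (0, -10), (100, 0), (100, 10), {"zoom", "rotate"}))
# ...but an app that only zooms can never misfire as rotate.
print(classify((0, 0), (0, -10), (100, 0), (100, 10), {"zoom"}))
```

Shrinking the candidate set per program cannot make the raw measurement better, but it removes entire classes of misclassification, which is consistent with the jump from under 75% to 90% recognition described above.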
  • Geeze (Score:2, Interesting)

    by djupedal (584558) on Saturday March 28, 2009 @04:18AM (#27368737)

    You'd think that with that 'big ass table [youtube.com]' they've been so proudly parading around they'd have this figured out.

    I mean, letting everyone think it was a touch screen, when in reality it uses several cameras down below the glass to track motion - you'd hope they'd get it right when it came to something that actually utilized touch...why are we not surprised to learn they've stuffed this up.

  • Can you imagine... (Score:2, Insightful)

    by djupedal (584558) on Saturday March 28, 2009 @04:23AM (#27368749)

    ...how much of a flop the iPhone would be if it had the same operational statistics?

    Can we face facts now and admit that MS's rudimentary implementations of touch tech will never be more than a high school science project? Huh...please?

    Their efforts are nothing more than routine fluff to scam investors. C'mon...let's get real and let's all let MS know so they can get off the stage already.

    • by JaredOfEuropa (526365) on Saturday March 28, 2009 @04:37AM (#27368805) Journal
      The iPhone gets it wrong as well from time to time. Most notably in the email app, where you can scroll the list of emails up and down, or swipe across a mail to bring up a delete button. If I try to scroll, it always does. But if I try to swipe an email, half the time it thinks I want to scroll. Oh, and don't get me started on the times the iPhone thinks it's being rotated sideways and goes into landscape mode, when I am merely placing it flat on the desk.

      But seriously, I've yet to come across a device with a touch screen that is as responsive and accurate as the iPhone's. Once you get used to that, other devices feel clumsy and sluggish. Especially that MS big ass table; I've played around with it for a bit, but it is hardly ready for any serious use.
    • by Anonymous Coward on Saturday March 28, 2009 @07:18AM (#27369357)
      Microsoft is trying to implement touch technology across an entire operating system and all its userland components.

      Apple has implemented touch technology on a specialized device with specialized hardware made specifically by Apple that, while quite impressive in what functions it can perform, is not even close to the broad range of tasks a fully fledged desktop/laptop computer and associated operating system performs.

      Which do you think might be slightly more difficult to implement?
      • by xouumalperxe (815707) on Saturday March 28, 2009 @11:04AM (#27370465)

        Apple has done it on their MacBooks, too, since the newer unibody ones have a trackpad that does 1-4 finger gestures. I'm also not entirely sure what "the broad range of tasks a fully fledged desktop/laptop computer and associated operating system performs" is supposed to mean. We're talking about an input device, and being able to recognize a number of gestures associated with that device. If you're trying to transform it into an "operating system-wide touch technology" you're already doing it wrong!

        • by Mattsson (105422) on Sunday March 29, 2009 @03:53AM (#27377537) Homepage Journal

          Isn't the big difference that Apple currently use a trackpad that can register multitouch while this is to be a touchscreen that can register multitouch?

          I'd figure that things like distinguishing between, say, a click on a button and a multitouch-gesture where the first touch hit that button and the other fingers touch the screen half a second later, would be somewhat of a problem.
          On one hand, you want a quick and responsive interface that doesn't wait 500ms before acting on your input.
          On the other hand, you want the interface to be able to figure out what you wanted to do, like rotating a picture, not what you actually did, which might have been to first close the picture and then rotate the screen.

          I have these kinds of problems all the time with my iPhone, especially when using it one-handed.
          A zoom becomes a button-click, a slide becomes a button-click, a button-click becomes a slide, etc.
          On a small device which presents you with one application at a time and which isn't used extensively for productivity, this isn't much of a problem.
          If put to use in a multi-window environment where having your input misinterpreted might lose you hours of work or slow down your productivity significantly, it's a huge problem.
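The timing trade-off this comment describes (wait for more fingers vs. respond instantly) can be illustrated with a toy dispatcher. The 200 ms window and all names here are made-up illustrations, not anything Microsoft or Apple documented:

```python
GESTURE_WINDOW = 0.2  # seconds to wait for a second finger (assumed value)

def interpret(touches):
    """Decide between a single-finger tap and a multi-finger gesture.

    `touches` is a list of (finger_id, timestamp) events. If a second
    finger arrives within GESTURE_WINDOW of the first, treat the input
    as a multitouch gesture; otherwise commit to a tap. A longer window
    misreads fewer gestures but makes every tap feel laggy, which is
    exactly the 500ms dilemma described above."""
    if not touches:
        return None
    first_time = touches[0][1]
    fingers = {fid for fid, t in touches if t - first_time <= GESTURE_WINDOW}
    return "gesture" if len(fingers) > 1 else "tap"

print(interpret([(1, 0.00), (2, 0.15)]))  # second finger in time -> gesture
print(interpret([(1, 0.00), (2, 0.50)]))  # second finger too late -> tap
```

Whatever value the window takes, some user will land on the wrong side of it, which is why the misreads described in the parent comment are hard to eliminate entirely.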

          • by xouumalperxe (815707) on Monday March 30, 2009 @04:06AM (#27385629)

            Isn't the big difference that Apple currently use a trackpad that can register multitouch while this is to be a touchscreen that can register multitouch?

            Other than the tablet-like 1:1 mapping of screen and input device, I can't think of a single conceptual difference between a trackpad and a touchscreen. So the real question is: is Microsoft trying to do anything funky with the absolute positioning of your fingers?

      • by SuperKendall (25149) on Saturday March 28, 2009 @02:00PM (#27371963)

        Apple has implemented touch technology on a specialized device with specialized hardware

        You could describe any computer like that.

        The fact is that while the device is specialized, the software and applications are not especially so - at least beyond the fact that the development API has a lot of mechanisms to take touch into account. But they don't really lack for any of the other APIs you'd get with a desktop, that 99% of application developers use today.

        Even the touch API is generalized, to where you could use different hardware as long as you could get points where users were touching on screen.

    • by hkmwbz (531650) on Saturday March 28, 2009 @11:08AM (#27370485) Journal
      The iPhone lacks responsiveness as well. No touch device I've tried gets it right. Including the iPhone.
    • by Locutus (9039) on Saturday March 28, 2009 @11:16AM (#27370537)

      Those were basically full-blown desktop-CPU-powered machines, and Windows could not keep up with the touch interface. Doesn't that shock anyone, considering Apple and others are doing it on much lower-end hardware? On top of that, Windows 7 is supposed to be a performance-focused release, the first ever from Microsoft, and it's supposed to scale down to netbooks too.

      I wonder how they'll market their way out of this one? Maybe pop open a dialog box asking if they really wanted to do that.

      FYI, I had to use Windows XP (fully updated) recently and it blows me away how unresponsive all of Microsoft's dialogs and utilities are. I was constantly stuck watching an hourglass and could not operate any other tabs or features until the first task was completed. This was on a dual-core 3GHz system with 2.5GB of memory, so it was not the CPU or swapping. Holy shit Batman, is this what people have been using? IBM got rid of this kind of thing in 1992 with what is called multi-threading. I guess ignorance is bliss, because I could not stand using that system and waiting so much. And what is with all the damn reboots after installing an application?

      It should be pretty obvious, Microsoft still sucks at creating a usable OS if you progress beyond a beginner level of usage. Or you just like doing things very slowly. IMO

      LoB

  • Not convinced (Score:5, Insightful)

    by Mattb90 (666532) <matt@allaboutgames.co.uk> on Saturday March 28, 2009 @04:24AM (#27368753) Homepage Journal
    I'm not convinced that touch screens can replace a keyboard and mouse on a desktop, or even a laptop, for some time. Text is the big issue, and I can't see myself being able to achieve the same typing speed on a touch screen until there's some really good haptic feedback in place. While handwriting technology has come on in leaps and bounds, I already type faster than I can write, so this wouldn't be helpful to me. For the mouse, there are definitely places where touch would work better, particularly for new users, but the precision of a mouse is better for certain applications (notably gaming) compared to stubby fingers and having them block your view of the screen. Even if Microsoft can get touch working nicely in Windows 7, it's still going to be quite some time until I'll be getting rid of my keyboard and mouse.
    • Re:Not convinced (Score:5, Informative)

      by Vectronic (1221470) on Saturday March 28, 2009 @05:09AM (#27368907)

      It doesn't have to replace the keyboard and mouse; in most cases it just has to add to them. The mouse didn't replace the keyboard.

      There are some things that would be quicker with touch, some with the mouse, some with the keyboard. For instance, touch would probably be better for hitting (Reply To This) than having to grab the mouse, align it, and click; the keyboard is better for writing this, and the mouse is better for selecting/editing text.

      Plus, because of the difference with accuracies, things like Virtual 3D spaces, you could use touch as a main anchor to hold/move an object, the mouse to operate on the object, simultaneously.

      Or in 2D spaces, like Photoshop, you could use the mouse for painting/drawing, but touch to control rate of flow, or pressure, in a far more natural/instinctual way.

      Also, touch would be better for public access stuff, because there's no separate keyboard and mouse, no edges, no breakable/grabbable/tied-down inputs (ie: pens on chains) etc, and potentially they can be self-cleaning too...

      • Re:Not convinced (Score:3, Interesting)

        by somersault (912633) on Saturday March 28, 2009 @06:48AM (#27369219) Homepage Journal

        There are uses for touch in mobile and public devices sure, but why would you stretch over to your screen just to hit "reply to this" on slashdot when you have the mouse right next to you? There is no reason to have a touch screen in any situation where you can conveniently use a mouse or even a touchpad - which is mostly limited to sitting at a desk of course, but that's probably still where the majority of PC based computing is done these days, even with the rise in popularity of laptops. I would probably prefer a touchscreen over one of those little rubbery keyboard nipples though - I'm not a big fan of joysticks for mouse input, despite getting used to it on my PS3.

      • by initialE (758110) on Saturday March 28, 2009 @06:51AM (#27369227)

        In certain scenarios it may well replace the kb and mouse, especially in the mobile market. Tablet PCs without a stylus or keyboard, interactive TVs, etc.

      • by backwardMechanic (959818) on Saturday March 28, 2009 @06:51AM (#27369229) Homepage
        My network analysers have touch screens, keypads (on the instrument), mice and keyboards. I hate it. Apart from the fingerprints all over the screens, you always have to move your hands - screen-keypad-screen-mouse-keyboard - it makes simple operations tedious.

        I like to use the keyboard for most things, and a mouse is perfect for some. The problem is when an application requires you to continuously mix the two. Do you trust MS not to further complicate this?
      • by darkitecture (627408) on Saturday March 28, 2009 @07:07AM (#27369297)
        Here's a novel idea:

        For every example you gave, I don't see one that could not be achieved with more accuracy and less hassle by having a mouse in each hand with a different colored cursor. Except for maybe public access, which, let's face it, is served very well by current touch technology already.

        But selling people another $5 mouse just wouldn't be as cool, hypeworthy or anywhere nearly as marketable now, would it?
      • by Hao Wu (652581) on Sunday March 29, 2009 @12:08AM (#27376517) Homepage
        People seem to be ignoring how quickly your arm will get tired from holding it up. Try an experiment: point at your screen and see how long it takes before it's annoying...
    • by VinylRecords (1292374) on Saturday March 28, 2009 @05:22AM (#27368953)

      Well hopefully we won't have to worry about typing in a few years if speech to text is improved upon.

      The mouse, though, I don't see getting replaced, at least not in the near future.

      • by somersault (912633) on Saturday March 28, 2009 @06:53AM (#27369239) Homepage Journal

        I can't see speech to text working in most noisy office environments (meaning even if the computer is 100% accurate, do you want to have to say "computer" before every command, or have to tell it when you start and stop dictating, then forget to tell it to stop note taking when you get a phone call and have to get it to delete the last 5 minutes of text, etc.. a computer that can intelligently tell the context of your speech is a long way off..), or situations where you want to keep your work and/or personal conversations confidential. I see the mouse as far more likely to be replaced than any text input device.. it is a good piece of kit, but there are many more possibilities for replacing mice than keyboards.

    • by peragrin (659227) on Saturday March 28, 2009 @06:47AM (#27369211)

      you want me to tell you a secret?

      The mouse wasn't used on most computers up until the early 90's. It is an add-on.

      Touch screens are an addition rather than a replacement. Indeed, in many places a touchscreen and keyboard will work better than a mouse and keyboard, as you can have a smaller footprint for the "interface".

      Also, MSFT's problem is that it is using the same gesture for two different functions. Draw a lower-case l and an i on the screen and have each do different things; of course the system gets confused. Of course, it is also standard MSFT to do stupid things like that.

  • Best of both worlds? (Score:5, Interesting)

    by Max Romantschuk (132276) <max@romantschuk.fi> on Saturday March 28, 2009 @04:27AM (#27368771) Homepage

    I don't think the keyboard+mouse combo needs replacing, for most applications. But I do see immense potential in touch screen tech.

    My "dream desk"? A huge normal monitor, a keyboard+mouse combo, and a horizontal touch screen / tablet beside them.

    Touch manipulation just makes more sense on a horizontal surface to me. Touch wouldn't hurt on vertical monitors, but it's not for continuous work. So give me a solution where I can, say, doodle a graphic on my touch screen / tablet lying on my desk, but don't make me give up my keyboard and mouse or hover my arms in the air for that.

    Also, a horizontal touch screen would be an ideal secondary controller for games and stuff... :)

    • by LiquidCoooled (634315) on Saturday March 28, 2009 @06:24AM (#27369129) Homepage Journal

      I have spent a good part of the last 12 months working with a touch device and I agree with you.

      At the same time though using a small touch screen for notetaking and drawing is practical - I have mine sat within reach most of the time.

      The biggest problem as you say is gorilla arm, my tablet sits in a larger enclosure that lets me rest my wrist whilst still allowing me to write and draw and control what I'm doing.

      I am working up towards creating a touchable wall-mounted display and think about it more and more as my UI takes shape. Long-duration stroking is out of the question, but it's practical to have a pokable touch panel as long as the UI allows it.

      Here's where I've got so far: http://www.youtube.com/watch?v=iMXp0Dg_UaY [youtube.com]

      regarding the microsoft problem, I have had heated discussions with people about gesture control over the top of standard UI elements and the fact the system and user will be confused by them.
      Android and the iPhone both suffer from this mixing up of metaphors and would be better having good clear decisive UI instead of magic wands.

      • by Max Romantschuk (132276) <max@romantschuk.fi> on Saturday March 28, 2009 @07:32AM (#27369433) Homepage

        Nice project! I have to show it to some friends of mine who have Maemo devices. I wish I could afford getting one myself, but as it is now I don't really need it... want it though. ;)

      • by mattack2 (1165421) on Saturday March 28, 2009 @03:13PM (#27372529)

        Android and the iPhone both suffer from this mixing up of metaphors and would be better having good clear decisive UI instead of magic wands.

        Could you give specific examples? I'm not saying I disagree, I just honestly can't think of what you're referring to. I've never used Android, so iPhone examples would be preferable IMHO, but both are useful.

        It seems to me that "gesture control" is often used as a faster way of doing something otherwise -- in the same way that keyboard equivalents to menu items are a faster way than moving the mouse to the menu, clicking and releasing on the menu item.

        For example, swipe on a mail message to bring up the delete button. You could instead 'go into' the message and hit the trash button. (Though I guess a counter-example is that swipe is the only way to delete a podcast for example, unless you want to do all of the managing through a computer.)

      • by A_Non_Moose (413034) on Sunday March 29, 2009 @10:24AM (#27379287) Homepage Journal

        ...long duration stroking is out of the question,

        I'm afraid this will be taken completely out of context.

        Like now.

        Uh, what to say?...ah, yes: "you must be new here".

        Backup sarcastic comment (gotta have a backup): "you're doing it wrong".

    • by somersault (912633) on Saturday March 28, 2009 @06:56AM (#27369247) Homepage Journal

      I think having the touch screen sloped towards you would be better, otherwise you'd have to crane over it to use it.. you don't get touch feedback like you do on an actual keyboard so you'd need to be looking at it all the time

    • by initialE (758110) on Saturday March 28, 2009 @07:14AM (#27369331)

      I'm waiting for a touch-screen-on-e-paper keyboard that can reconfigure itself to whatever application you are running. The tactile response is an issue though.

    • by pavon (30274) on Saturday March 28, 2009 @12:53PM (#27371203)

      I agree with most of that but,

      Touch manipulation just makes more sense on a horisontal surface to me.

      Real-world experience has shown that an angled surface is better than either horizontal or vertical, which is why nearly everyone who draws for a living, like draftsmen and animators, uses angled "drafting" tables. The fact that the images on the screen can't fall off (unlike paper, which has to be taped down) eliminates the only (minor) downside that an angled table normally has. It would also prevent you from setting coffee cups on it, which is probably a good idea :)

      But I agree, I would love a big drafting-table like touch-screen.

  • by Hognoxious (631665) on Saturday March 28, 2009 @04:28AM (#27368773) Homepage Journal

    A story about touch screens that doesn't say they cure cancer and solve world hunger?

    What is this site, and where is the real slashdot?

  • by Chlorine Trifluoride (1517149) on Saturday March 28, 2009 @04:31AM (#27368789)
    That unpossible!
  • by Psychotria (953670) on Saturday March 28, 2009 @04:33AM (#27368795)

    Why waste developer time on this for a consumer OS? Fair enough if they were developing an OS for a kiosk or touch phone or something. But Windows 7 is supposed to be for regular PCs. The last thing I, or anyone I know, wants is touchscreen capabilities. Not to mention that I've never seen a touchscreen in a retail store. I don't want fingerprints all over my monitor. I can interact with my OS just fine with my keyboard and mouse. Thanks.

    • by JaredOfEuropa (526365) on Saturday March 28, 2009 @04:45AM (#27368825) Journal

      Windows 7 is supposed to be for regular PCs. The last thing I, or anyone I know, wants is touchscreen capabilities.

      I, for one, am waiting for usable touchscreen technology to find its way into "regular" PCs. I'd love to have a keyboard-and-mouse-less flat panel computer in the kitchen or the living room, for light browsing, looking at photos, ordering pizza, that sort of stuff. Think a supersized iPhone, on a pedestal or even hand-held. HP already have one in their line-up if I'm not mistaken. And Windows 7 having native touchscreen capability will probably make for better quality touchscreen support.

      • by Psychotria (953670) on Saturday March 28, 2009 @04:52AM (#27368859)

        Yeah, I agree. But for what you're saying, wouldn't a "web browser" work? Why waste all this time integrating the crap into Windows 7 when a custom / stripped-down OS that can just run a web browser would do the trick? I'm not arguing that there are no good uses for touchscreens--there are. I am arguing that building the functionality into a mainstream OS is not necessary. I am guessing that MS wants Windows 7 to be used in offices; I am pretty sure that my typing speed in Word (for example) is gonna drop when I am using an on-screen keyboard. But if my job was based solely on web browsing or ordering pizzas or turning my lights down, I'd be a CEO.

      • by TheRaven64 (641858) on Saturday March 28, 2009 @09:26AM (#27369939) Journal
        You can pick up touch screens quite cheaply if you look. I have a 200MHz Pentium that has a touch screen TFT (12", I think), with the computer built into the back. I picked it up on eBay a few years ago for under £100. Building it into a coffee table has been on my to-do list for ages, but I never got around to it. The CPU isn't up to much, but it's more than enough for an X server and a music player, with other apps being run on a more powerful machine.
    • by Soul-Burn666 (574119) on Saturday March 28, 2009 @05:44AM (#27369017) Journal

      Why? Because laptops are a big market and touchscreens are becoming quite popular. My Kohjinsha SX3 has a touchscreen and it helps a lot, even when not in tablet mode.

      Better touchscreen support could open up new ideas for ultra-mobile computing. Perhaps an on-screen multi-touch keyboard with haptic response and futuristically even memory plastic for actual physical response.

      A dream of mine is a foldable screen where one side acts as a virtual keyboard and the other as a screen. When angled, it's like a laptop and when flattened, it's a full sized tablet.

    • by Sepodati (746220) on Saturday March 28, 2009 @06:46AM (#27369205) Homepage

      I can interact with my OS just fine with my keyboard and mouse.

      Then you're not the intended customer, so don't buy one! I don't want a semi-truck but that doesn't mean they shouldn't be built because other people can surely put them to good use.

      -John

    • by liquiddark (719647) on Saturday March 28, 2009 @09:39AM (#27370007)
      Methinks you've missed the hundred or so embedded Windows error screens on TDWTF.
    • by westlake (615356) on Saturday March 28, 2009 @12:59PM (#27371279)
      Not to mention that I've never seen a touchscreen in a retail store. I don't want fingerprints all over my monitor.

      The HP TouchSmart Computer [youtube.com] has been around for quite some time now.

  • by PolygamousRanchKid (1290638) on Saturday March 28, 2009 @04:40AM (#27368811)

    At work, my Monitor is at the edge of my desk, my legs are up on the desk, and I'm leaned back as far as my chair goes. I could not reach the touch screen without leaning forward and up, and that would take effort. And I am a lazy-ass critter, why else would I work in that position in the first place?

    At home, things are no different, I usually work with my legs up on the sofa, sitting up, but leaned back on the comfy cushions. Again, touch screen out of reach.

    So this technology really doesn't interest me.

    Well, maybe if the touch screen came with a big, long stylus, or I could use a sawed-off cue stick. However, I might get in the habit of whacking the touch screen with the stylus when I get angry about something on the screen.

    • by Sycraft-fu (314770) on Saturday March 28, 2009 @06:23AM (#27369123)

      The idea for a touch screen wouldn't be to use it in a traditional office environment. You'd be working in a setting where you were near the screen. One possibility would be to have the screen laying flat with you over it like a work surface. Another might be screens mounted on a wall that you got information from. Still another would be like a tablet PC, but without the need for a special stylus.

      It really isn't a "This is going to replace your workstation" sort of thing. I don't think many people in the industry are deluding themselves into thinking it is a mouse/keyboard replacement; rather, it is just another input option. There are situations where mouse/keyboard doesn't work well, and maybe a touch screen would be a good choice.

      Like I said, one area where I could definitely see it is wall-mounted information displays. You have some big screens that show system status or something. Well, it's not convenient to have a mouse and keyboard anywhere there. So you've got three options:

      1) No control. The displays show only what they were configured to. You can walk up and look, but you can't request more/different info.
      2) Remote control. You have a station somewhere nearby that actually controls the displays. If you want them to change, you need to go to that.
      3) Touchscreen. Just touch the displays to change them.

      I could see a powerful touch screen (as in one with these new features like zoom, rotate and such) being really useful for status readouts. If you see something going on, go up and zoom in for a better look and maybe call up more info.

      So not the Next Big Thing(tm) in computers, but a neat addition, if it works well.

    • by eebra82 (907996) on Saturday March 28, 2009 @08:41AM (#27369739) Homepage
      It's a sad day when Slashdot moderators mod parent post insightful. He is basically saying that he doesn't touch a feature that requires touches. On a related note, I am allergic to poison.
  • by Anonymous Coward on Saturday March 28, 2009 @04:42AM (#27368813)

    There is a distinction between gesture recognition and direct manipulation. With direct manipulation there is no recognition step (no fancy algorithms like those used in speech recognition). Successful multitouch applications use direct manipulation.

    The basic mistake here is that MS is trying to make old programs work with multitouch gestures. For multitouch, the UI of the applications needs to be redesigned and reprogrammed. There is no way around that.
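The distinction this comment draws can be made concrete. Under direct manipulation there is nothing to "recognize": the two fingers' movement is applied to the object as a combined scale and rotation every frame, so "zoom vs. rotate" never has to be guessed. A rough sketch, not any real toolkit's API:

```python
import math

def direct_manipulate(p1_old, p2_old, p1_new, p2_new):
    """Direct manipulation: no classification step. Compute the scale
    and rotation implied by how the vector between two fingers changed,
    and apply both to the object simultaneously."""
    def vec(a, b):
        return (b[0] - a[0], b[1] - a[1])

    old = vec(p1_old, p2_old)
    new = vec(p1_new, p2_new)
    scale = math.hypot(*new) / math.hypot(*old)
    rotation = math.atan2(new[1], new[0]) - math.atan2(old[1], old[0])
    return scale, rotation

# Fingers that spread apart while twisting produce both effects at once:
s, r = direct_manipulate((0, 0), (100, 0), (0, 0), (0, 200))
print(round(s, 2), round(math.degrees(r), 1))  # 2.0 90.0
```

Because the object simply follows the fingers, a slightly ambiguous motion produces a slightly mixed transform instead of a wrong discrete command, which is why this style feels more forgiving than gesture classification.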

  • I think I'm with HP on this one, I can't see touch interfaces becoming popular on the desktop.

    Having to reach up and touch the screen is physically more demanding than resting your hands on a keyboard or mouse. You also don't get the same tactile response as you do when pushing a key or clicking a mouse button.

    Touch makes more sense for mobile devices where a full size keyboard or mouse is not available, and maybe on laptops where you could use the touchpad for gesture input. Even then, it's not always the best option as an on-screen keyboard means less space for viewing the content.

    • by spydabyte (1032538) on Saturday March 28, 2009 @06:36AM (#27369167)
      What do you mean you're with HP? They have a whole campaign [hp.com] devoted to desktop touch screens...

      As for most of the other blatantly wrong comments, I think it's incredibly important to develop this. Everyone is only considering touch screens as the main outputs. But what about a dedicated input device being a touch screen? Like the Optimus Maximus [artlebedev.com], but far cheaper and more diverse. This one application voids both this post [slashdot.org] and this post [slashdot.org], the two highest rated comments in this thread. There are so many applications of multi-touch technology, and only R&D will get us there.
      • Stop and think about how you would use the kind of device you are suggesting for a moment.

        The Optimus keyboard really isn't such a good idea, because to use it as anything other than a standard QWERTY layout keyboard you have to look at it to see what the keys now do. Wouldn't it make more sense to just display some keyboard shortcuts on the screen so you can touch-type them?

        The keyboard's main function (text input) typically does not change, as 90% of the time that is what you are doing with it. Shortcuts are additional to that; they don't normally replace it. You are working with information on the screen, information you want to see without your hands in the way, and which you can more easily use the mouse to manipulate instead of having to look down, refocus and touch a button. Having a separate input device which you can use blind is by far the best solution.

        I'm not saying that there are not applications for touchscreens, just that they are unlikely to replace the keyboard and mouse any time soon. Unless you can give me some specific examples with common applications (e.g. word processor, web browser etc).

      • by sponga (739683) on Saturday March 28, 2009 @11:06PM (#27376143)

        Yah, it is pretty sad that the comments modded up are out of touch with what is going on in the real world.

        I know that we could introduce some of this into our engineering house for laying out blueprints and designs, give it time and eventually AutoCAD will have some integrated stuff to take advantage of this. In fact I am sure some of us younger engineers would jump on the newer technologies.

  • by FranTaylor (164577) on Saturday March 28, 2009 @04:51AM (#27368851)

    Why should this be different from any other Microsoft product?

  • by Z00L00K (682162) on Saturday March 28, 2009 @04:53AM (#27368863) Homepage

    Fingerprints are always a problem when you use touch screens.

    Especially after a snack or a meal.

    Aside from that, users also have different movement patterns, which means every user has to be recognized and the device has to learn each user's behavior.

    And don't forget that gestures have different meanings in different cultures.


  • by ionix5891 (1228718) on Saturday March 28, 2009 @05:07AM (#27368899)

    are there any Linux alternatives yet?

  • Screens are for our eyes, and we use our hands for input. Screens are set up vertically because that's good for our eyes. Keyboards are placed on our desks because that's how our arms are built. Moving the screen to the table, or moving my hand to interact with a vertical panel, is no good!

    Gestures are actually good if they could be done on the keyboard. I owned a TouchStream LP keyboard (which uses the exact multitouch technology that Apple acquired; google for an image of it), and I enjoyed doing gestures on it. However, the lack of touch feedback is a show-stopper.

    If someone could make a keyboard where each individual keycap is a simple touch pad, and hence can sense the movement of fingers and thus gestures, while still preserving the feel of pressing a key, that would be killer hardware!

  • by 9Nails (634052) on Saturday March 28, 2009 @06:29AM (#27369153)

    Sitting here at my desktop, monitor out of arms reach, I can't help but think that Touch is a useless feature. I'm not going to be swayed to Touch as a feature until I can make use of it. Perhaps if I was a notebook user I'd reconsider my enthusiasm. But that said, I think there's a way that they can attract desktop users...

    Some company needs to completely replace the 10-key pad on a desktop keyboard with a touch screen. It should be the same size as the 10-key pad or larger, and feature a 10-key on/off switch. In the 'on' mode, you would use it as a normal 10-key. In the 'off' mode, it would give the user a touch device which could manipulate the images on the monitor. The user might see a selection box on-screen targeting the area of the pad that is available. Touch gestures would allow manipulation of the desktop. Of course a mouse would still be used for most point and click interactions. It probably should use OLED for high angle visibility and should have soft ridges for tactile feedback when you enable 10-key.

  • by therufus (677843) on Saturday March 28, 2009 @06:30AM (#27369155)

    I've never used such a device but I can see where a computer may become confused. When we touch the screen and move our fingers, we would more than likely change the distance between our fingers unintentionally when rotating in a circle. Maybe the solution is simple.

    When the computer detects that 2 fingers are on the screen, maybe simply displaying a circular template that the user could follow when rotating, and lowering the sensitivity, would work. You could just create some kind of shadowing overlay or something. If the user wants to zoom, then just doing the usual 'fingers-closer' or 'fingers-further-apart' would work too.

    I'd like to see touchscreen implemented correctly. There are so many areas where just 'grabbing' something on screen and moving it would be so much more user friendly. Particularly when it comes to people with disabilities.
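    The thresholding idea above can be sketched in a few lines. Everything here is invented for illustration (the function name, the start/end-point parameterization, and the threshold values); a real recognizer would work on a continuous stream of touch samples, but the disambiguation logic is the same: compare how much the finger spread changed against how much the angle between the fingers changed, and refuse to guess when both are tiny.

    ```python
    import math

    def classify_gesture(p1_start, p2_start, p1_end, p2_end,
                         zoom_threshold=0.15, rotate_threshold=0.20):
        """Classify a two-finger gesture as 'zoom', 'rotate', or None (ambiguous).

        Compares the relative change in finger spread against the change in
        the angle of the line joining the two fingers; whichever dominates
        (and clears its threshold) wins.
        """
        def dist(a, b):
            return math.hypot(b[0] - a[0], b[1] - a[1])

        def angle(a, b):
            return math.atan2(b[1] - a[1], b[0] - a[0])

        # Relative change in the distance between the two fingers.
        spread_change = abs(dist(p1_end, p2_end) - dist(p1_start, p2_start)) / dist(p1_start, p2_start)

        # Change in the angle of the inter-finger line, wrapped into [0, pi].
        angle_change = abs(angle(p1_end, p2_end) - angle(p1_start, p2_start))
        angle_change = min(angle_change, 2 * math.pi - angle_change)

        if spread_change < zoom_threshold and angle_change < rotate_threshold:
            return None  # too small to call either way: ignore, don't guess
        return "zoom" if spread_change >= angle_change else "rotate"
    ```

    The `None` branch is the point: a recognizer that ignores ambiguous input feels far less broken than one that picks the wrong command 10% of the time.
    
    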

    • by HTH NE1 (675604) on Monday March 30, 2009 @11:52AM (#27390183)

      I've never used such a device but I can see where a computer may become confused. When we touch the screen and move our fingers, we would more than likely change the distance between our fingers unintentionally when rotating in a circle. Maybe the solution is simple.

      I have a simple solution: stop trying to separate them into two discrete commands and just do both. Why is this even an issue?

      If you're trying to do precise manipulations, then don't use your fingers. Fingers are clumsy; that's why we invented tools, even if it's a virtual tool. You wouldn't want your doctor performing surgery using just his fingers to rip you open. Even with a mouse, if you want to drag out a circle, you use the circle tool and hold down a key to tell the software to constrain the result to circles and exclude ellipses. With that modifier, your inability to drag out a perfect circle becomes an ability to fine tune amongst possible perfect circles at subpixel accuracy.
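      For what it's worth, the "just do both" approach decomposes naturally: treat the vector between the two fingers as a single handle and read off scale and rotation together, rather than forcing the gesture into one discrete command. A minimal sketch (the function name and the start/end-point parameterization are my own assumptions):

      ```python
      import math

      def zoom_and_rotate(p1_start, p2_start, p1_end, p2_end):
          """Return (scale, rotation_radians) for a two-finger gesture,
          applying both transformations simultaneously instead of
          classifying the gesture as one or the other."""
          def vec(a, b):
              return (b[0] - a[0], b[1] - a[1])

          v0 = vec(p1_start, p2_start)  # inter-finger vector at gesture start
          v1 = vec(p1_end, p2_end)      # inter-finger vector at gesture end

          scale = math.hypot(*v1) / math.hypot(*v0)
          rotation = math.atan2(v1[1], v1[0]) - math.atan2(v0[1], v0[0])

          # Normalize rotation into (-pi, pi] so small wiggles stay small.
          if rotation > math.pi:
              rotation -= 2 * math.pi
          elif rotation <= -math.pi:
              rotation += 2 * math.pi
          return scale, rotation
      ```

      The unintentional spread change that confuses a zoom-vs-rotate classifier just becomes a scale factor near 1.0 here, which is usually what the user meant anyway.
      
      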

  • by owlstead (636356) on Saturday March 28, 2009 @07:04AM (#27369279)

    It may be all the rage, but I almost never use zoom and/or rotate. Normally I take some time to set up my application right, and after that I use them only if a document (e.g. a webpage) misbehaves. Once the character size is correct, I use the scroll bars. Now, the auto-rotate on cameras, PDAs, photo frames, etc., that's something I find truly useful. I wish I had it for my LCD screen.

    Am I the only one that thinks that rotate and zoom are both rather pointless things to optimize? I've got a MS 4000 keyboard at work, and while it is a brilliant keyboard, I've literally *never* used the zoom function on it and you cannot reprogram it to do scrolling.

  • by dpbsmith (263124) on Saturday March 28, 2009 @07:05AM (#27369291) Homepage

    But Microsoft can fix this easily.

    When you touch the screen and it's not clear what you want, an animated character can pop up and say "Hi! It looks like you're trying to rotate the screen image!" and coach you on how to bend your fingers into the right position to meet the software's expectations.

    To prevent errors, when you're done, a dialog box can pop up saying "Do you really want to rotate the screen image? Allow/deny." Then there will be no errors... or any errors that do occur can be blamed on the user.

    And, of course, there can be a Screen Rotation Wizard to give you a simple six-screen walkthrough, and context-sensitive Help available simply by tapping your ring finger in the northeast quadrant of the screen while you're making your gesture.

    The Microsoft Way is that the computer should control the user, not the other way around. Once the touchscreen programmers absorb this fundamental principle, all their problems can be easily solved.

  • by Frankie70 (803801) on Saturday March 28, 2009 @07:20AM (#27369363)

    Porn!!!

  • by commodore64_love (1445365) on Saturday March 28, 2009 @07:26AM (#27369399) Journal

    I'm a nerd. I have no muscles to hold-up my arm. Pushing a mouse across a pad is about all the effort I can muster.

  • by toppavak (943659) on Saturday March 28, 2009 @08:05AM (#27369571)
    Touchscreen phones seem to be accepted fine with a 90% success rate. I've always noticed that whenever people who don't use one every day pick up an iTouch/iPhone, for example, the error rate in typing and gestures can be even higher than 10%, yet this does not deter people because the basic functionality like scrolling and "clicking" works fine. In my experience, people are willing to chalk up errors in slightly more involved gestures to "getting used" to the particular touchscreen's properties. I know several people running various 7 beta/RC builds on their tablets full-time who are absolutely in love with it. I'm considering getting a Lenovo X200 tablet later this year and I look forward to being able to try out 7 on it.
  • by Zero__Kelvin (151819) on Saturday March 28, 2009 @08:10AM (#27369599) Homepage
    Description: A system of providing a consistent user interface to facilitate an improvement in ease of use.

    I can't believe someone else has not thought of that before [slashdot.org]! Anyone can clearly see why Microsoft is famous for their innovations! (i.e. as long as they don't follow the link and read it)
  • by Prototerm (762512) on Saturday March 28, 2009 @09:03AM (#27369825)

    Imagine what the screen will look like with all that orange Cheetos stuff smeared all over it.

    Lame!

    And I won't get into what peanut butter and jelly will look like.

  • by Ancient_Hacker (751168) on Saturday March 28, 2009 @09:44AM (#27370033)

    This industry has such a short memory. Some of you may remember the HP computer, the one with the butterfly on the screen? And the smiling actors touching the screen? HP blew about $85 million advertising that computer and technology.

    It turned out people did not like touching their screens, for many reasons:

    (1) If the room temp is above 75F or you're nervous about getting this paper done on time, you'll leave a smudge every time you touch the anti-reflective coating.

    (2) A finger is not a very precise pointing tool.

    (3) After 30 minutes of pointing you get heavy-arm syndrome, or if you persist, the B-24 pilot arm. (The B-24 had an extremely hard-to-turn-and-pull steering yoke; B-24 pilots could be distinguished by their Schwarzenegger-sized biceps.)

    (4) The third time your finger misses the "save" menu item and hits "exit", you swear and give up using the touchscreen.

    Yes, I know, youngsters, you think touchscreen technology has improved over the last 15 years, but human fingers and arms and sweat glands have not.

  • All this pinging back and forth over gesture-based interfaces vs. keyboards and mice... is wasted on me. I just want a nice wireless subvocal interface between me and my devices. When I use it I can speak aloud and the device won't take action because it knows I am, well, vocal. I guess the only danger of subvocal would be found in meetings at work where I find myself muttering profanities under my breath after the idiotic comments people make ;p

    As far as the actual subvocal transfer to my device... I am honestly hoping someone can come up with a fairly small implant that goes near the vocal cords, or attaches to a nerve which drives them (a little scary), that could be powered by my body and reliably transmit the information over a distance up to the length of my arm, where it could then interface with something more powerful to transmit over larger distances if needed (but generally not needed).

    All of this of course just being a stop-gap solution to when we have true brain/nervous system level interfaces.

  • by mocm (141920) on Saturday March 28, 2009 @11:17AM (#27370541) Homepage

    touchscreen cleaning products

  • by His Shadow (689816) on Saturday March 28, 2009 @11:58AM (#27370761) Homepage Journal
    Those looking forward to "gorilla arm" computing can rejoice. Unless multitouch gestures are supported on laptop trackpads, touchscreens on desktops are just a curiosity for kiosks and schoolkids.
  • by zerofoo (262795) on Saturday March 28, 2009 @06:28PM (#27374163)

    This isn't new. I've been giving Microsoft products hand gestures for years.

    -ted
