Demo of Laptop/Tabletop Hybrid UI
TheGrapeApe writes "The ACM Symposium on User Interface Software and Technology (UIST) has an interesting proof-of-concept video up demonstrating the use of cameras and laser pico-projectors to 'extend' a laptop's user interface to adjacent surfaces. The video demonstrates simple gestures like tapping and dragging being captured on the 'extended' surface. While the prototype appears somewhat cumbersome, it's easy to see how it might be more elegantly integrated into the hardware with more R&D."
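The summary doesn't say how the system distinguishes a tap from a drag, but a common approach with camera tracking is to look at how far the fingertip travels while in contact with the surface. Here's a minimal sketch of that idea; it assumes an upstream vision pipeline already produces per-frame fingertip samples, and all names and thresholds are hypothetical:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Sample:
    """One frame of fingertip tracking on the projected surface."""
    x: float
    y: float
    touching: bool  # whether the fingertip is in contact with the surface

def classify_gesture(samples, move_thresh=10.0):
    """Classify a contact episode as 'tap' or 'drag'.

    A tap is a contact whose total fingertip travel stays below
    move_thresh (in projector pixels); anything that moves farther
    while touching counts as a drag. The threshold is illustrative,
    not taken from the demo.
    """
    contact = [s for s in samples if s.touching]
    if not contact:
        return None  # finger never touched the surface
    # Sum the distance between consecutive contact frames.
    travel = sum(
        hypot(b.x - a.x, b.y - a.y)
        for a, b in zip(contact, contact[1:])
    )
    return "drag" if travel >= move_thresh else "tap"
```

A real implementation would also have to debounce noisy contact detection and handle multiple fingers, but the travel-distance test above is the core of the tap/drag split.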
Tony Stark already has this... (Score:3, Interesting)
Re:Man that was bad (Score:2, Interesting)
I agree the quality of the samples was pretty bad, but I think the point was to show that the technology is out there and can be developed further (provided they get enough investors) into something usable and practical.
I work in an accounting firm and I can totally see this being used to scan documents and other related material to send to clients for quick sharing and transfer of information. Sure, you can walk it over to the scanner and email from there, but anytime you can keep a user at their desk, you increase efficiency and work output. That's just one possible scenario, of course.
How about in the medical world? You could use this to scan MRIs or patient history files and import them into a shared database. Or maybe specific articles in publications could be scanned for research. The billing dept. could place an encounter form on the desk, and the system could recognize the form type and drop it into the patient's file for later submission to the payer, resulting in faster reimbursement to the doctor. I'm sure someone else could think of even more practical applications...
I'd be very interested to demo this product once they make advancements.