Catch up on stories from the past week (and beyond) at the Slashdot story archive

 



Google Displays Technology

Adjusting to Google Glass May Be Hard 154

New submitter fluxgate writes "Steve Mann (whom you might know for having pioneered wearable computing as a grad student at MIT back in the 1990s) writes in IEEE Spectrum magazine about his decades of experience with computerized eyewear. His article warns that Google Glass hasn't been properly engineered to avoid creating disorienting effects and significant eyestrain. While it's hard to imagine that Google has missed something fundamental here, Mann convincingly describes why Google Glass users might experience serious problems. Quoting: 'The very first wearable computer system I put together showed me real-time video on a helmet-mounted display. The camera was situated close to one eye, but it didn’t have quite the same viewpoint. The slight misalignment seemed unimportant at the time, but it produced some strange and unpleasant results. And those troubling effects persisted long after I took the gear off. That’s because my brain had adjusted to an unnatural view, so it took a while to readjust to normal vision. ... Google Glass and several similarly configured systems now in development suffer from another problem I learned about 30 years ago that arises from the basic asymmetry of their designs, in which the wearer views the display through only one eye. These systems all contain lenses that make the display appear to hover in space, farther away than it really is. That’s because the human eye can’t focus on something that’s only a couple of centimeters away, so an optical correction is needed. But what Google and other companies are doing—using fixed-focus lenses to make the display appear farther away—is not good.'"
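The optical point in the quote can be made concrete with a quick back-of-the-envelope calculation (my own illustration, not from the article): the eye's focusing demand, measured in diopters, is just the reciprocal of the viewing distance in meters, which shows why a display a couple of centimeters from the eye needs a collimating lens.

```python
def accommodation_demand(distance_m: float) -> float:
    """Focusing demand in diopters for an object at the given distance.

    Diopters are the reciprocal of the distance in meters. A display
    2 cm from the eye demands 50 D -- far beyond the roughly 10 D a
    young adult eye can accommodate -- which is why a fixed-focus lens
    is used to push the virtual image farther away.
    """
    return 1.0 / distance_m

# Raw display sitting 2 cm from the eye: impossible to focus on.
print(accommodation_demand(0.02))  # 50.0 D
# Virtual image moved out to 1 m by the lens: comfortable to focus on.
print(accommodation_demand(1.0))   # 1.0 D
```

Mann's complaint is that this fixed virtual distance never matches the varying distance of the real scene seen by the other eye.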
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by jesushaces ( 777528 ) on Friday March 01, 2013 @08:04PM (#43050853)
    It's in TFA (sorry, I'm new. won't do it again)

    Research dating back more than a century helps explain this. In the 1890s, the renowned psychologist George Stratton constructed special glasses that caused him to see the world upside down. The remarkable thing was that after a few days, Stratton’s brain adapted to his topsy-turvy worldview, and he no longer saw the world upside down. You might guess that when he took the inverting glasses off, he would start seeing things upside down again. He didn’t. But his vision had what he called, with Victorian charm, “a bewildering air.”

    Also, for more info on Stratton's experiment check http://en.wikipedia.org/wiki/George_M._Stratton#Wundt.27s_lab_and_the_inverted-glasses_experiments [wikipedia.org]

  • by jabberw0k ( 62554 ) on Friday March 01, 2013 @08:28PM (#43050991) Homepage Journal

    The word is disorienting, I have been reliably informated. Your misuse of suffixes must be cessated and desistated, or your poetic license will be cancellated. Although "(dis)orientation," "information," "cessation," and "cancellation" are nouns, the corresponding verbs are "(dis)orient," "inform," "cease," and "cancel" -- no "-ate" at the end.

  • by maxwell demon ( 590494 ) on Friday March 01, 2013 @08:36PM (#43051047) Journal

    But people usually don't run around holding their smartphone in recording position, because it would be tiring and look silly. Google Glass is always in recording position by default, thus removing an important barrier to constant recording. And there will surely be an incentive to keep the camera always on (so that virtual objects can be put in the right place, or so you can get extra information about what you currently see).

    Imagine a simple application that uses face recognition and image search to find out the name of the person you are currently looking at, and displays it close to that person. An immensely useful application if you tend to forget people's names, or have problems recognizing people. However, it means that (a) the wearer will immediately know the names of all people they see (as long as they are stored in the system), thus reducing your privacy relative to the wearer, and (b) Google will know the position of any person the wearer sees and the system can identify, even if that person has never used anything associated with Google in their lifetime, thus reducing your privacy against Google. And if you ask how that image gets into the Google system: for example, some friend of theirs has stored a photo on Picasa.
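    The lookup step of the application described above can be sketched in a few lines (a toy illustration with made-up data; a real system would use embeddings from a face-recognition model and a gallery harvested from uploaded photos):

```python
import math

# Hypothetical gallery mapping known face "embeddings" to names. In a real
# deployment these vectors would come from a recognition model, and the
# gallery from photos stored with the service (e.g. a friend's upload).
GALLERY = {
    (0.1, 0.9, 0.3): "Alice",
    (0.8, 0.2, 0.5): "Bob",
}

def identify(embedding, threshold=0.5):
    """Return the gallery name nearest to the observed embedding,
    or None if no stored face is close enough."""
    best_name, best_dist = None, float("inf")
    for vec, name in GALLERY.items():
        dist = math.dist(vec, embedding)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

print(identify((0.12, 0.88, 0.31)))  # Alice -- close match in the gallery
print(identify((0.5, 0.5, 0.9)))     # None -- unknown face
```

    The privacy problem is in the gallery, not the algorithm: once your face vector is stored, every wearer's camera becomes a sensor that reports where you are.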

  • That's a pretty big assertion. How does it feel to be old enough that you need to make up excuses about young people so you don't have to think about your age?

    Going into wearable computing, especially Glass, and not knowing of Steve Mann would be like looking into fast food burgers and not stumbling upon McDonalds*

    Is it all old dead-tree stuff? Really?

    http://eyetap.org/publications/index.html [eyetap.org]

    As if the guy who has been wearing computer glasses he built himself wouldn't use digital storage.

  • Re:So... (Score:5, Informative)

    by Austerity Empowers ( 669817 ) on Friday March 01, 2013 @09:00PM (#43051195)

    I don't know, some of us are very sensitive to these sorts of things, while others not so much.

    People still think I'm making stuff up when I say "shaky-cam movies make me vomit" -- or Portal 2, for that matter. Most people have absolutely no problems; a few feel mildly queasy. But some of us get physically ill. Shaky-cam movies continue, and don't announce themselves as such until AFTER they've taken your money, and some video game companies still restrict FOV options or don't provide ways of disabling "head bob" and other disorienting effects. They simply don't believe there's a problem, and their testers aren't picking up on it (perhaps having been desensitized by long hours anyway).

    I don't think they missed anything "fundamental", but it would not surprise me at all if they missed something significant but outside their test group.

  • by mill3d ( 1647417 ) on Friday March 01, 2013 @09:25PM (#43051335)

    The Apache system completely replaces the field of view of the targeting eye and is designed to work alongside binocular vision, overlaying data atop what is seen by both eyes, albeit in different colors (augmented reality). The perspective remains the same for both eyes, though.

    The problem with Glass seems to be in forcing a spatially unrelated image onto one eye, making the focus shift from the environment to the Glass display; the strain comes from the other eye having to focus somewhere in mid-air. That's unnatural, and the focus has to be forced without a distinct object to look at.

  • by FatLittleMonkey ( 1341387 ) on Friday March 01, 2013 @10:16PM (#43051649)

    And a percentage of pilot-candidates flunk out because they can never adapt to it. The rest have to be trained to it. Not something you want in a general consumer device.

    That said, I don't see Mann's objection. His first display worked like the Apache system, with the same problems. Google Glass works differently to both.

  • by maxwell demon ( 590494 ) on Saturday March 02, 2013 @05:10AM (#43052969) Journal

    No matter whether you want to fight against it or adapt to it, the first step in either case is awareness of the problem. Only if you are aware of it can you decide how to act on it, make an informed decision, and take appropriate precautions. Such precautions may be quite simple, like asking everyone coming into your home to leave their Google glasses outside, to protect your privacy in your own home.

