New submitter fluxgate writes "Steve Mann (whom you may know as the grad student who pioneered wearable computing at MIT back in the 1990s) writes in IEEE Spectrum magazine about his decades of experience with computerized eyewear. His article warns that Google Glass hasn't been properly engineered to avoid disorienting effects and significant eyestrain. While it's hard to imagine that Google has missed something fundamental here, Mann convincingly describes why Google Glass users might experience serious problems. Quoting: 'The very first wearable computer system I put together showed me real-time video on a helmet-mounted display. The camera was situated close to one eye, but it didn’t have quite the same viewpoint. The slight misalignment seemed unimportant at the time, but it produced some strange and unpleasant results. And those troubling effects persisted long after I took the gear off. That’s because my brain had adjusted to an unnatural view, so it took a while to readjust to normal vision. ... Google Glass and several similarly configured systems now in development suffer from another problem I learned about 30 years ago that arises from the basic asymmetry of their designs, in which the wearer views the display through only one eye. These systems all contain lenses that make the display appear to hover in space, farther away than it really is. That’s because the human eye can’t focus on something that’s only a couple of centimeters away, so an optical correction is needed. But what Google and other companies are doing—using fixed-focus lenses to make the display appear farther away—is not good.'"
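The optical trick Mann describes follows from the thin-lens equation: place a display just inside the lens's focal length and the eye sees a virtual image hovering much farther away. A minimal sketch of that arithmetic (the 2 cm display distance and 2.2 cm focal length here are illustrative assumptions, not the actual Google Glass optics):

```python
def virtual_image_distance(d_obj_cm, focal_cm):
    """Thin-lens equation: 1/f = 1/d_o + 1/d_i.

    If the object (the display) sits closer than the focal length,
    d_i comes out negative: a virtual image on the same side as the
    object, which the eye perceives at that larger distance.
    """
    return 1.0 / (1.0 / focal_cm - 1.0 / d_obj_cm)

# Hypothetical numbers: display 2 cm from the eye, lens focal length 2.2 cm.
d_i = virtual_image_distance(2.0, 2.2)
print(f"Virtual image at {abs(d_i):.1f} cm")  # display appears ~22 cm away
```

A display 2 cm away is far inside the eye's near point, but the fixed lens moves its apparent position out to a comfortable focusing distance. Mann's complaint is that this distance is fixed: the eye's accommodation is locked to one depth while the rest of the scene sits at other depths.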