The Science Behind the InfinitEye's Panoramic Virtual Reality Headset 42

muterobert writes "The Oculus Rift has competition, and it's incredible. The InfinitEye has a 210 degree field of view (compared with the Oculus Rift's 90) and surrounds your peripheral vision in the game completely. Paul James from RoadToVR goes in-depth with the team behind the new device and finds out how high-FOV Virtual Reality really works. Quoting: 'At the present time, we are using 4 renders, 2 per eye. Left eye renders are centered on left eye, the first render is rotated 90 left and the second looks straight ahead, building two sides of a cube. Right eye renders are centered on its position, the first is rotated 90 degree right and the second looks straight ahead, two sides of another cube. We then process those renders in a final pass, building the distorted image.'"
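The four-pass layout described in the quote can be sketched as follows. This is a hedged illustration, not the InfinitEye team's actual code: the rotation convention (positive yaw = turn left, camera looking down -z) and the numpy helper are assumptions made for the sketch.

```python
import numpy as np

def yaw_matrix(degrees):
    """Rotation about the vertical (y) axis; positive = turn left."""
    t = np.radians(degrees)
    return np.array([[ np.cos(t), 0.0, np.sin(t)],
                     [ 0.0,       1.0, 0.0      ],
                     [-np.sin(t), 0.0, np.cos(t)]])

# Conventional view direction: the camera looks down -z.
FORWARD = np.array([0.0, 0.0, -1.0])

# Four render passes, two per eye: each eye gets one forward-facing view
# and one rotated 90 degrees outward, forming two faces of a cube per eye.
render_passes = {
    "left_eye_side":   yaw_matrix(+90),  # rotated 90 degrees left
    "left_eye_front":  yaw_matrix(0),    # looks straight ahead
    "right_eye_side":  yaw_matrix(-90),  # rotated 90 degrees right
    "right_eye_front": yaw_matrix(0),    # looks straight ahead
}

for name, rot in render_passes.items():
    view_dir = rot @ FORWARD
    print(name, np.round(view_dir, 3))
```

A final pass would then resample these four images into the distorted output, as the quote describes.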
This discussion has been archived. No new comments can be posted.
  • Out of date (Score:2, Interesting)

    by Anonymous Coward

    Oh come on, I thought we had gotten over the whole "strapping-two-screens-to-your-head-like-we-did-in-the-70s" thing.

    This has far more potential and could actually be mounted into glasses you could wear, unlike screen-strapping tech like Oculus and its clones, which haven't progressed much since the 70s (complete with Fresnels, lol)

    • Issues... (Score:5, Informative)

      by Junta ( 36770 ) on Tuesday November 26, 2013 @08:59PM (#45533613)

      First, for the apples-to-apples portion of the discussion, that display technology has a 45 degree FoV. Given that the article is about a project largely of interest because it is ambitious enough to reach 210 degrees, much higher than the still-respectable 90 degrees of the Rift, bringing a 45 degree FoV product into the discussion isn't immediately helpful. Now you *could* be suggesting that the technology could do better if they wanted, but until that's demonstrated it would be risky to assume so. The most optimistic reference material I could find about that sort of design said '100 degree FoV could be possible' based on designs that achieved 60 degree FoV (but that's not exactly an apt comparison, since that material predates DLP, which means it isn't quite talking about the Avegant solution). In short, Avegant is aiming squarely at private consumption of video content rather than immersion.

      Second, a significant driver for these new projects is a realization that HMD isn't a market that can drive a lot of custom, one-off design work right now. In order to get to a technology that people can actually *get* at an approachable price, they are working to leverage mass-market display technologies that are largely paid for by their use in tablet and similar form factor applications. DLP into the eye is a bit more custom and will probably not be as cheap.

      Also, this discussion is solely about the display technology, but a very large part of the work that Oculus is focused on is motion tracking, which is a pretty critical component.

      Finally, at least that prototype doesn't exactly look like the poster child of 'glasses you could wear', it's still pretty bulky.

      I'm not saying that Avegant should pack up and go home; it could be very promising. But that's not a good reason to tell Oculus and InfinitEye that they are on a dead-end path either. Avegant doesn't waste available resolution like the alternatives do, but currently there is no solution that leverages the full resolution of the utilized technology while also providing an immersive FoV, and the former point might be moot if the tablet manufacturers continue their one-upmanship to the tune of 3840x2160 7" displays.
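The resolution-versus-FoV trade-off in the comment above can be made concrete as angular resolution (pixels per degree). A minimal sketch, where the per-eye pixel counts and FoV figures are rough assumptions for illustration, not official specs for either device:

```python
# Rough angular-resolution comparison (pixels per degree of horizontal FoV).
# Panel resolutions and FoV figures here are assumptions for illustration,
# not official specs for either device.
devices = {
    # name: (horizontal pixels per eye, horizontal FoV in degrees)
    "Rift DK1 (assumed)":   (640, 90),    # one half of a 1280x800 panel
    "InfinitEye (assumed)": (1280, 210),  # one full 1280x800 panel per eye
}

for name, (px, fov) in devices.items():
    print(f"{name}: {px / fov:.1f} px/deg")
```

Under these assumed numbers the two come out roughly comparable per degree, which is the sense in which a wider FoV "spends" the same panel resolution over more of your visual field.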

      • Curved LED displays are possible. There are a few demo mockups of watch-form-factor displays out there. It's still going to be tricky to build a lens system to project the image as if it were further away.
        • Curved displays will be the future, yes. The *far* future.
          As said by others in this thread, the Oculus and InfinitEye projects try to make the technology cheap, affordable, and mass-produced. Thus they concentrate on the cheapest and most available hardware.
          Currently, 5" to 7" *flat* displays are ubiquitous in tablets, and thus are very easy to source for Oculus and InfinitEye (and thanks to the "Retina" fad launched by Apple, these screens have high resolutions too, thus they look nice even if blown up to the full

      • I thought Oculus Rift was supposed to provide a 110 degree field of view, which is what their web site says. Are they changing that for the consumer model?
    • this has far more potential

      If immersive effect is what you're going for, the potential of this technique is in fact severely limited.
      First let's strip away some marketing mumbo jumbo:

      The "projecting directly onto the retina" pitch is bull.
      Unless you want to venture into eye surgery, you can't bypass the optics of the cornea etc. ("lazers" or no "lazers"), so any light that looks like it comes from a particular direction has to actually arrive from roughly that direction. It follows that some part of the chain has to physically cover

  • Latency? (Score:3, Insightful)

    by Scowler ( 667000 ) on Tuesday November 26, 2013 @08:41PM (#45533503)
    I RTFA, but didn't see a latency (from sensor to screen redraw) spec. Isn't that supposed to be a pretty important criterion for these devices?
    • Yes, among other things. They use 2 renders per screen, so you should expect about 16 ms more latency than the Oculus. They also have double the vertical pixels compared to the Oculus, so you might expect more latency from the computation too. Latency comes down to several factors: rendering latency, screen latency, and everything in between. Pixel-to-pixel latency is very much dependent on the screen type. Oculus has said they're looking into OLED screens that have minimal latency (fe
      • Specifically, I'm more concerned with the lag between accel/gyro positional reading and screen update. If this lag gets too high, you get a tearing effect which can seriously degrade the overall VR experience.
        • Both the Oculus and this one seem to use ~1000 Hz tracking, so that's not an issue; tearing comes from display refresh, not from the tracking. In a video I saw some guys at Oculus break down the latency, and it was something like 2 ms for USB, 16 ms for game engine/rendering, ~16 ms for screen refresh at 60 fps/Hz, and 16 ms for pixel switching.
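The figures quoted in that comment sum to a rough motion-to-photon budget. A sketch using the comment's approximate numbers (illustrative estimates, not measurements):

```python
# Rough motion-to-photon latency budget, summing the figures quoted in the
# parent comment (illustrative estimates, not measurements).
budget_ms = {
    "usb_tracker":    2,   # sensor sample reaching the host over USB
    "engine_render":  16,  # one frame of game engine / rendering time
    "screen_refresh": 16,  # scan-out at 60 Hz
    "pixel_switch":   16,  # LCD pixel response time
}

total_ms = sum(budget_ms.values())
print(f"estimated motion-to-photon latency: {total_ms} ms")  # 50 ms
```

This also makes clear why the display stages dominate: the tracker itself contributes only a few milliseconds, so faster tracking alone can't fix perceived lag.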
  • I remember the first VR fad in the 90s... it seemed like such a neat idea. However, the graphics were horrible, frame rates sucked, head tracking was laggy, headsets were bulky, screens were blurry, FOV was too small, and people were still trying to figure out 3D movement control schemes. I've felt that ever since around 2004 we've been ready to give VR another shot, now that we've fixed or have the technology to fix every single one of those problems. And it seems like a lot of different companies are go
  • Bad article (Score:1, Insightful)

    by Insomnium ( 1415023 )
    So many things wrong. TFA is comparing horizontal and vertical FOVs. Both sets have about a 90 degree vertical FOV. The Oculus Rift has about a 110 degree horizontal FOV, and human vision maxes out around 180. Also, peripheral vision is not very sharp. The Oculus really isn't having much of a problem with FOV anyway. The main visual-candy problem seems to be the DPI, among a few other things. This item does not even improve on the real issues of the Oculus, like motion sickness, positional tracking, and others. TFA seems to be an advertisement,
    • Fully agreed. I'll be happy to trade off FOV and some PPI if it means we can keep screen refresh rate up, sensor latency down, and resulting rotational error low. Bonus if these things can be done on battery power, and not require being connected to a wall outlet all the time.
      • Batteries wouldn't overcome the need to have a connection to your video card. I read somewhere that Oculus think it will be 5 years before that is overcome.

          • We don't need to be connected to a PC once mobile processors are beefy enough. Today's high-end mobile processors are quite beefy; still not quite beefy enough, but not that far off.
            • Why couldn't the video be wirelessly sent to the screen? The WiiU has a screen that gets its picture wirelessly sent to it by the system. It runs off of battery, and the WiiU console has to be able to drive two separate video outputs at once: one for the TV and all the players using the Wiimotes, and a separate video signal sent to whoever is using the gamepad. It doesn't seem too far off from a head-mounted display. Just send both pictures to the HMD.
              • It could, but the latency would be way too much on current systems. 60 fps at 1080p is a minimum, and total latency is around 45-50 ms right now (from movement to photon) and it needs to get lower. Being wireless would add to that latency too.

        • by drkim ( 1559875 )

          Batteries wouldn't overcome the need to have a connection to your video card...

          Unless your video is coming out of a wearable.

          A backpack with wearable computer, batteries, and a wireless link for net gaming. Wires out of the backpack to your mic, HMD, controller and headphones.

          If you're using a desktop, but you're on a Virtuix Omni, the problem is not so much the length of your cables, as much as them getting twisted around.

          • Or wirelessly sent to the wearable display. The WiiU gamepad gets its picture sent wirelessly from the console, and it runs on battery for a few hours before you need to plug it in.
      • Agreed, some time in the not-too-distant future this concept could be a portable console. A beefy enough battery and future mobile chips will be able to run a decent game in 1080p in stereo. A couple of years and it will be so, if Oculus (or someone else) manages to break through in the PC market.
  • With these somewhat asymmetric FOVs, a single number doesn't provide enough information to understand what you're getting.
    What's needed now is the "inside angle" and the "outside angle", where:
    - inside angle = how much either eye can see toward the other eye
    - outside angle = how much either eye can see away from the other eye
    (in either case, measure the angle from "straight ahead" over to the cut-off point where you can no longer see anything)
    In a symmetric system, both of these numbers are the same (or pre
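The inside/outside-angle bookkeeping proposed above can be sketched as a small helper. This is a hypothetical function, not an established convention, and it makes the simplifying assumption that both eyes sit at roughly the same point and are mirror-symmetric:

```python
# Sketch of the inside/outside-angle bookkeeping proposed above.
# Assumes both eyes are roughly colocated and mirror-symmetric.
# All angles are measured from "straight ahead" for one eye, in degrees.

def fov_summary(inside_deg, outside_deg):
    """Return (per-eye FoV, total horizontal FoV, binocular overlap)."""
    per_eye = inside_deg + outside_deg  # one eye's full horizontal span
    total = 2 * outside_deg             # outer edge of left eye to outer edge of right
    overlap = 2 * inside_deg            # region both eyes can see
    return per_eye, total, overlap

# Symmetric example: each eye sees 45 deg inward and 45 deg outward.
print(fov_summary(45, 45))   # (90, 90, 90)

# Asymmetric example: 45 deg inward, 105 deg outward per eye.
print(fov_summary(45, 105))  # (150, 210, 90)
```

The asymmetric example shows why a single "210 degrees" number is ambiguous: the same total FoV can come with very different per-eye spans and binocular overlap.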

"If the code and the comments disagree, then both are probably wrong." -- Norm Schryer