Games Hardware

Jack McCauley's Next Challenge: the Perfect Head-Tracker For VR (ieee.org)

Tekla Perry writes: He used a webcam and LEDs to do position tracking for the Oculus DK2, but Jack McCauley, co-founder of Oculus and now working independently, says that's the wrong approach. He likes the laser scanning system of the HTC Vive better, but says it's just not fast enough. McCauley thinks he can do better, using a design approach borrowed from picoprojectors. Speaking at this week's MEMS Executive Congress, he said that better head-position tracking, not more expensive screen technology, will solve the problem of VR sickness.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • I think everyone wants better tracking; even Oculus would probably agree that V2 will have something better. But this is version 1. Calm down: things are going to change and get better, just as cell phone capabilities did. I am sure I will be getting an Oculus CV1 when it comes out. I really do not have the room for the Vive myself, but I cannot say I would not buy one in the future, maybe at version 2 or so.

  • by jeffb (2.718) ( 1189693 ) on Friday November 06, 2015 @10:53PM (#50881637)

    Sure, we need better tracking, with higher temporal and spatial resolution and lower noise. But to get low-enough latency from position sensing to image display, you need high performance at every stage -- position acquisition, image computation, data transfer to your display, and display refresh rate. If the total of all those latencies exceeds the maximum tolerable delay, you lose. If you're going to get sick with total delays above 10 ms, and your display refreshes at less than 100 Hz, there's nothing you can do -- until you get a higher-refresh-rate display.
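The latency-budget argument can be sketched numerically. The stage values below are illustrative assumptions for the sake of the arithmetic, not figures from the thread or any real headset:

```python
# Illustrative motion-to-photon latency budget for a VR pipeline.
# All stage latencies are assumed example numbers, not measurements.
stages_ms = {
    "position_acquisition": 1.0,       # tracker sample + transfer
    "image_computation": 5.0,          # render one frame
    "data_transfer": 1.0,              # GPU -> display link
    "display_refresh": 1000.0 / 90.0,  # worst case: one full refresh at 90 Hz
}
total = sum(stages_ms.values())
budget_ms = 20.0  # a commonly cited tolerable motion-to-photon delay
print(f"total latency: {total:.1f} ms (budget {budget_ms} ms, ok={total <= budget_ms})")
```

The point of the sum is the one made above: the display's refresh period alone can eat most of the budget, so no amount of faster tracking rescues a slow panel.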

    • A 100 Hz display is probably good enough as long as it's stroboscopic and not sample and hold.

    • Re: (Score:2, Interesting)

      It's not just about latency. If your display is near zero latency and you don't account for head movements, you're going to cause sickness. That is an inevitable consequence of binocular vision. Latency is >10 ms in human photoreceptors, and adaptation to head position is based on vestibular feedback downstream. Latency is not the limitation. Matching self-motion to visual motion is the limitation. If these are mismatched, latency won't matter and people will get sick. I applaud the dude for realizing this.
      • Comment removed based on user account deletion
        • Re: (Score:2, Insightful)

          by Anonymous Coward

          What on earth do you mean by "traveling at 1 MS[sic]"? The units in question here are milliseconds (ms); they are not a velocity, or angular velocity, or anything that makes sense in the way you are attempting to use them.

        • This is wrong. Google "flicker fusion frequency".
      • I meant "position acquisition" in the general sense, which includes head position. As you say, head position is critically important, and we currently don't have any way to manipulate the vestibular feedback system. (And if you ever do develop one, good luck convincing users to let you mess with their inner ear.)

        The problem is that increased latency anywhere in the pipe translates to positional inaccuracy during slew operations -- the faster the slew, the bigger the error, and that's what leads to VR sickness.
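The slew-error point is simple to put numbers on: the angular error is just head angular velocity times end-to-end latency. The 300 deg/s head-turn figure below is an illustrative assumption, not a number from the thread:

```python
def slew_error_deg(head_speed_deg_s: float, latency_ms: float) -> float:
    """Angular tracking error accumulated during a constant-speed head slew."""
    return head_speed_deg_s * latency_ms / 1000.0

# A quick head turn can exceed 300 deg/s (assumed illustrative figure).
for latency_ms in (1, 10, 20):
    err = slew_error_deg(300.0, latency_ms)
    print(f"{latency_ms:2d} ms latency -> {err:.1f} deg of tracking error")
```

At 20 ms of total latency a fast turn puts the rendered world several degrees away from where the vestibular system says it should be, which is the mismatch being discussed above.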

        • Interesting, my take-away from the article was that the display isn't being updated with the right information, i.e., perhaps different parts of the visual field should be updated differently based on feedback from the head-motion monitor, and current software isn't yet doing the optimal job given the inputs it gets. That was my impression from wearing one. But yeah, manipulating the vestibular feedback system would be a cool addition to any helmet. I think tcms or eventually optogenetics is up to the job.
    • by KGIII ( 973947 )

      How would these work for not just virtual reality but for augmented reality? Is there a chance at putting a camera on the front of them and passing that through to the VR headset and then intersecting that with data? I'm thinking something more utilitarian... I'll try to think of an example...

      Say you're looking at a new house - you're there physically. You toss on a headset and scan the room by looking around. Then you put furniture in the room. Then you can move the furniture around and check it out.

  • Throwing horizontal and vertical laser planes across the space 1,000 times per second is pretty trivial, and so is 10,000 per second ... is 0.1 ms not fast enough?

    • You want to shine lasers into people's eyes??
      • Sure, a 1500 nm range IR laser plane scanned at a couple thousand RPM with good interlocks. Why not?

        That range gets absorbed inside the bulk of the eye by the way, which is why it's slightly misleadingly known as eye-safe. Longer wavelengths dump most of their energy on the cornea and shorter wavelengths dump most of their energy on the retina, so they have much lower damage thresholds.

    • On reading the article, he just wants to use MEMS because supposedly that's necessary, which is bullshit. The only problem with larger scanners is that they might be a little noisy; 1,000 scans per second with a polygon or small flat mirror is going to be quiet enough for a prototype, though.

      I'd worry more about the detectors: doing sub-ns time-to-digital conversion is not really hard, but doing it cheaply for a lot of detectors is going to take a bit of R&D.

    • by grumbel ( 592662 )

      I think it's not so much that it isn't fast enough, but that you end up with a big box with a big spinning mirror. What he wants to do is take the same technology and replace the spinning mirror with a DLP-style micromirror [wikipedia.org]; that way you could do the same thing much smaller, cheaper, and with less power, which in turn would allow you to have more of those devices for better tracking.
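For concreteness, here is the arithmetic behind the sweep-rate and timing claims in this subthread, assuming one full 360-degree mirror rotation per sweep (an assumption for illustration; the posters didn't specify the geometry):

```python
# Update interval and angular resolution of a spinning laser-plane scanner.
# Assumes one full 360-degree rotation per sweep (illustrative geometry).
SWEEPS_PER_S = 10_000   # the faster rate quoted in the thread
DEG_PER_SWEEP = 360.0

interval_ms = 1000.0 / SWEEPS_PER_S                # time between position fixes
deg_per_ns = DEG_PER_SWEEP * SWEEPS_PER_S / 1e9    # angle swept per nanosecond

print(f"new position fix every {interval_ms} ms")
print(f"sub-ns detector timing resolves {deg_per_ns:.4f} deg per ns")
```

This is why the detector timing, not the scan rate, is the interesting engineering problem: at 10,000 sweeps per second the plane crosses a full degree in under 300 ns, so sub-ns time-to-digital conversion is what buys angular precision.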

  • Maybe it will work if you are sitting in a chair in real life and in the game, but if the screen shows running and the body isn't moving, some people are still going to get sick.

