Oculus Rift VR Headsets - Page 11 - AVS Forum
post #301 of 329 Old 06-09-2014, 11:02 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by Joe Bloggs View Post

Surely those points are going to make judder easier to see too.
Only if you're not eye-tracking. You have to factor eye-tracking into this.

If you're eye-tracking ultralow-persistence motion that's framerate-refreshrate synchronized -- e.g. viewing motion tests on a CRT, such as www.testufo.com/photo -- then there's no judder and there's no stutter. When persistence is so low that the motion is effectively point-sampled, there's none of the stutter/judder you get from things like 50% persistence (180-degree stutter). Not 50% persistence, not 25% persistence, but as close as possible to point-sampled persistence (i.e. persistence that is as tiny a fraction of a refresh cycle as possible): THEN you can simultaneously eliminate judder AND motion blur in the eye-tracking scenario. I think you already know very well what I am talking about. Now, if you keep your eyes still, there are some strobe effects. Motion blur ruins your ability to accurately and comfortably eye-track, while you can easily eye-track when there is zero blur (before the stroboscopic effect begins to bother you). When you fixate your gaze on something scrolling on a screen, there are no strobing effects.

In fact, I wrote about the problem of finite framerates in the Area 51 Display Research Forum at the Blur Busters Forums: So what refresh rate do I need? [Analysis]

One particularly good section is the one about the strobing effect of finite framerates, which can still be seen even at 120fps. This is the "judder" you talk about, and this is still visible at 1000fps.
Quote:
Originally Posted by Mark Rejhon 
There are some additional factors that may require going well beyond 75Hz, and possibly far beyond (e.g. 1000Hz), including the stroboscopic effect. (See Why We Need 1000fps @ 1000Hz This Century).

Several sources already explain what you describe, but here's a good demonstration:
View http://www.testufo.com/photo#photo=eiffel.jpg which I've embedded below using the [testufo] tag.

Stare at a stationary point in the middle of the top edge of this moving animation.

TestUFO Animation: Scrolling Eiffel Tower

1. Put your finger at the top edge of this animation. Keep the finger stationary.
2. Stare at your finger (or next to your finger). Keep your eyes and finger stationary.
3. As the antenna part of Eiffel tower scrolls under your finger, you will see multiple antennas appear
(the stroboscopic effect -- kind of like the reverse version of the phantom array effect -- stationary eye but a series of static images that represent moving object at a finite refresh rate)
4. This problem is most pronounced at 60Hz (e.g. antennas 16 pixels apart at 960 pixels/second)
5. This problem still exists at 120Hz (e.g. antennas 8 pixels apart at 960 pixels/second)

This will work even on the slowest laptop LCD panels, too.
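The spacing in steps 4 and 5 is just the motion speed divided by the refresh rate; the same number is also how much intentional per-frame motion blur would be needed to fully mask the effect. A minimal sketch of that arithmetic in Python (illustrative only, not TestUFO code; the function name is made up):

Code:
# Gap between the duplicate "phantom array" images seen while fixating on a
# stationary point, for an object moving at constant speed.
# The same value is the per-frame GPU motion blur needed to mask the effect.
def strobe_step_px(speed_px_per_sec, refresh_hz):
    return speed_px_per_sec / refresh_hz

for hz in (60, 120, 240, 1000):
    print(hz, "Hz:", strobe_step_px(960, hz), "px apart at 960 px/s")
# 60 Hz: 16.0 px apart, 120 Hz: 8.0 px, 240 Hz: 4.0 px, 1000 Hz: 0.96 px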

The same kind of situation occurs when you spin your mouse pointer in a circle on a black background: it's not a continuous blur. Even a 1000Hz mouse will show only 120 'copies' of the cursor per second (at a 120Hz refresh rate) when you spin the pointer rapidly in a circle while staring at a stationary point in the middle of your monitor (this is also known as the 'mousedropping' effect).

The only way to eliminate all stroboscopic effects like this, without adding motion blur back, is flicker-free persistence at ultrahigh frame rates (by interpolation if necessary, but preferably real frames), so that there's continuous motion rather than static frames that can cause stroboscopic interactions (phantom array, mouse-dropping effect, wagonwheel effect, etc).

You can add (1/fps)th of a second worth of intentional/artificial motion blur to mask this effect, much like movies (35mm film) do: to fix the strobing, filmmakers add intentional motion blur. For example, at 1000 pixels/second and a 16 pixel step per frame (60Hz), you could add 16 pixels of intentional GPU-effect motion blurring to eliminate this stroboscopic effect.

However, adding motion blur is very bad when you want to simulate virtual reality (as both you and I already know from John Carmack's talks, and from Oculus). For this use case, you want 100% of all motion blur to be 100% natural, created inside the human brain if possible -- no externally added motion blur as a band-aid. Also, motion blur is undesirable to a lot of readers on Blur Busters, who come to this very site in pursuit of the elimination of motion blur. So someday in the future, we'd want to attempt strobe-free low persistence. To do 1ms persistence without flicker/strobing/phosphor/etc, you need to fill all of the 1ms timeslots in a second, and that means 1000fps@1000Hz to achieve low persistence with no form of light modulation. That, as you can guess, is quite hard to do with today's technology, so strobing is a lot easier.
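To make the arithmetic in that last point explicit: strobe-free (sample-and-hold) low persistence means every frame is lit for its entire refresh cycle, so the refresh cycle itself must be as short as the target persistence. A minimal sketch (illustrative Python, not from any Oculus or Blur Busters source):

Code:
# Refresh rate needed for flicker-free low persistence (no black gaps at all):
def strobefree_refresh_hz(persistence_ms):
    return 1000.0 / persistence_ms

print(strobefree_refresh_hz(1.0))   # 1000.0 -> 1000fps@1000Hz for 1ms persistence
print(strobefree_refresh_hz(0.5))   # 2000.0 -> 2000fps@2000Hz for 0.5ms persistence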

75Hz completely solves the motion blur problem by allowing low persistence above the flicker fusion threshold. However, it doesn't solve 100% of the problem of making virtual imagery completely indistinguishable from real life. Certainly, it's often "good enough", and it will probably have to be good enough for the next decade (or few).

There is a Law of Persistence: 1ms of persistence (strobe length) translates to 1 pixel of motion blurring at 1000 pixels/second. Decay curves (e.g. phosphor) complicate the math, but strobe backlights such as LightBoost, ULMB and BENQ Blur Reduction are essentially near-squarewave and very accurately follow this law, to the point where I've begun to call it the "Blur Busters Law of Persistence". This does make some assumptions (no other weak links, stutter-free, frame rate matching refresh rate, perfectly smooth VSYNC ON motion such as stutter-free TestUFO motion, and motion speeds slow enough that random eye saccades are an insignificant factor).

I find I can eye-track moving objects on screen accurately (i.e. the ability to count the eyes in the TestUFO alien, which are single pixels) up to approximately 3000 pixels/second from arm's length away from a 24" monitor. Different humans will have different eye-tracking speeds, but this kind of defines the bottom-end persistence that we need, since 1ms of persistence at 3000 pixels/second blurs the alien's eyes to 3 pixels wide rather than 1 pixel. This is the reason why I told BENQ to support a 0.5ms strobe width in their new firmware (they listened; now we just have to wait for the fixed XL2720Z firmwares to ship), since I can apparently just barely detect the motion clarity difference between 0.5ms persistence (strobe length) and 1.0ms persistence (strobe length). For 1080p 24" at arm's length, most people track reasonably accurately at 960 pixels/second. Others track at 2000 pixels/second before eye tracking can't keep up; I find I cap out at approximately that motion speed. During 3000 pixels/second TestUFO animations, this means the difference between 1.5 pixels of motion blurring (insignificant blurring at http://www.testufo.com/ghosting#pps=3000 or http://www.testufo.com/photo#pps=3000 ) versus 3.0 pixels of motion blurring (alien eyes blurred at http://www.testufo.com/ghosting#pps=3000 as well as window frames of buildings blurred at http://www.testufo.com/photo#pps=3000 ). I have this beta firmware installed on my XL2720Z, and it confirmed my findings: 1ms persistence is not the final frontier.

So I recommend manufacturers start considering 0.5ms persistence, and not stop at 1.0ms persistence. This will become even more demanding in the VR era, with panning during fast head-turning speeds and 4K screens (twice as many pixels to track across), so 0.25ms might actually produce a human-noticeable improvement over 0.5ms (e.g. 8000 pixels/second during slow head turning -- creating 2 pixels versus 4 pixels of motion blur at 0.25ms versus 0.5ms persistence). For now, 1ms persistence (LightBoost 10%) is sufficiently low to satisfy the majority of the population, as you still get a lot of brightness loss trying to achieve lower persistence, and compensating with brighter strobes gets expensive (e.g. custom backlights/edgelights). That said, you can still just do 75Hz with, say, 0.25ms persistence and call it a day, unless you were concerned about stroboscopic effects.
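Here is that Law of Persistence written out as a quick sketch (illustrative Python, under the same assumptions stated above: square-wave strobe, frame rate matching refresh rate, accurate eye tracking):

Code:
# Eye-tracking motion blur: blur (px) = persistence (ms) * speed (px/s) / 1000
def tracking_blur_px(persistence_ms, speed_px_per_sec):
    return persistence_ms * speed_px_per_sec / 1000.0

print(tracking_blur_px(1.0, 1000))   # 1.0 px  (the basic 1ms -> 1px @ 1000 px/s rule)
print(tracking_blur_px(0.5, 3000))   # 1.5 px  (hard to see)
print(tracking_blur_px(1.0, 3000))   # 3.0 px  (alien eyes / window frames visibly blurred)
print(tracking_blur_px(0.25, 8000))  # 2.0 px  (fast VR panning, 0.25ms persistence)
print(tracking_blur_px(0.5, 8000))   # 4.0 px  (fast VR panning, 0.5ms persistence)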

We have been stuck with stroboscopic effects ever since humankind invented the concept of frame rates / refresh rates with the zoetropes and kinetoscopes of the 19th century. We have never yet been able to record and play back continuous motion in a truly framerateless manner, so for now we have the artificial invention of the frame rate, since it's the easiest way to virtually represent motion.

The lighting industry has done several studies about human detection of stroboscopic effects from flickering light sources (it's a good reason why fluorescent ballasts have gone electronic and often use >10 kHz rather than strobing at 120Hz). The stroboscopic-effect detection threshold (phantom array detection) can be quite high, even 10,000Hz for a portion of the human population -- see this lighting industry paper -- so that will roughly define the refresh rate we need, although we could get by with just 1000fps@1000Hz + 1ms of GPU-effect motion blurring (fairly imperceptible, but enough to prevent the wagonwheel effect).

stroboscopic_detection.png

I totally agree with the individuals at Valve and Oculus about the elimination of the vast majority of artifacts with low persistence at >75Hz -- this is definitely the sweet spot, as you've described. Mind you, it doesn't completely eliminate all differences between virtual imagery and real-life imagery; we will still need >1000fps@1000Hz to pull off the "real life indistinguishability" feat, or some kind of future framerateless continuous-motion display, or even a display that refreshes faster only where the eye is looking, etc. By going to low persistence via strobing, we solve a large number of VR problems; it's just that low persistence using today's technology necessitates strobing, and that problem is unsolvable without going to ultrahigh framerates. (0.5ms = 2000fps@2000Hz needed for flickerfree low persistence with zero strobing, zero light modulation.)

Corollary/TLDR: As you said, low-persistence 75Hz+ is definitely the sweet spot that solves a lot of problems. However, 75Hz is still not enough to pass a theoretical Holodeck Turing Test ("Wow, I didn't know I was standing in a Holodeck. I thought I was standing in real life."), because there still remain side effects of finite framerates that prevent motion from fully mimicking the completely step-free continuous motion of real life (no judder, no stutter, no wagonwheel artifact, no blur, no strobing, no visible harmonics between framerate and refreshrate, no phantom array, no mousedropping effect). To do so via a finite refresh rate, we need ultrahigh framerates synchronized to ultrahigh refresh rates -- 4-digit -- in order to completely solve all possible human-detectable side effects of a finite frame rate, achieving low persistence via continuous light output, without strobing/phosphor/light modulation, to achieve simultaneously the completely stepfree, strobefree, and blurfree motion necessary to mimic real life.

Very interesting talk though -- and we need more people like you, visiting this brand new forum which launched barely more than a month ago!

Also, here are photos of the Eiffel Tower Test. You stare at a stationary point on the screen while the Eiffel Tower scrolls past; the higher the refresh rate, the closer together the duplicate images (less strobe effect). The same problem occurs on any finite-refresh-rate display (CRT, LCD, plasma, whatever).

stroboscopic-60hz.jpg

stroboscopic-120hz.jpg

The same problem occurs for CRT and LCD, strobed and non-strobed, flicker and flickerfree, phosphor and phosphorless.
Quote:
Originally Posted by Joe Bloggs View Post

Real life (what virtual reality is trying to re-create) doesn't blank out 9/10ths of the time.
This is correct. The problem is that real life is not composed of frames either; the artificial invention of the "frame rate" in the 19th century is merely an attempt to represent motion.
With any finite framerate there are visual artifacts: you must choose either motion blur OR strobing. What has been definitively proven is that if you were forced to choose one or the other for VR, you definitely don't want motion blur.
Quote:
Originally Posted by Joe Bloggs View Post

He's not going to say the new Oculus judders when you move your head. His job is to promote it.
It has nothing to do with self-promotion. I don't think you realize the magnitude of the importance of low persistence for VR purposes. Most people who have witnessed it with their own eyes agree on its importance.

The judder is not visible when you're accurately eye tracking, and it's much easier to eye-track low-persistence at 4000 pixels/second than eye-track high-persistence at 1000 pixels/second. The eye-tracking comfort improvement of low-persistence outweighs the motion blur, when you're doing fast head tracking.
Quote:
Originally Posted by Joe Bloggs View Post

Couldn't that allow you to do head tracking with less delay?
Lowering lag is important too. But don't forget that motion blur adds perceived latency. This is critically important -- and a surprising finding. It's been shown in tests that reducing motion blur also reduces the perception of latency. Life is not composed of a series of static frames. As your eyes track a moving object, they are in a different position at the end of a refresh than at the beginning of a refresh.

I am familiar with the judder you talk about here, but you're somewhat misdiagnosing it -- confusing the lesser evil with the bigger evil. Some of my work is going into upcoming scientific papers, so I am extremely experienced with stutter, judder, and motion blur, and all the corresponding mathematical tradeoffs between them, in the eye-tracking-motion versus eye-not-tracking-motion situations.

On low-persistence displays such as my BENQ-Z Series or LightBoost, I can pass the TestUFO Panning Map Readability Test. This is a very slow 1200 pixels/second motion, similar to the panning speed caused by a slow 20-degree-per-second head turning. You can't read this map on a 60Hz LCD or a non-strobed 120Hz LCD. But the map labels are readable on a CRT, on a LightBoost monitor, or on Oculus DK2. You can read text on a paper book that's sliding past your face at a few inches per second. But you can't read tiny text scrolling past at that speed on a 60Hz LCD or full persistence display such as DLP (unless it's using black frame insertion or high refresh rate to compensate). Now, you need a display that can pull off that virtually zero-blur feat, for comfortable head-tracked VR.
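For readers wondering how a head-turn rate maps to an on-screen panning speed, the conversion is just degrees per second times pixels per degree. A minimal sketch (illustrative Python; the 60 pixels-per-degree figure is an assumption chosen to be consistent with the 20 deg/s vs 1200 px/s example above):

Code:
# On-screen panning speed produced by a head turn (or equivalent eye tracking):
def pan_speed_px_per_sec(head_turn_deg_per_sec, px_per_degree=60):
    return head_turn_deg_per_sec * px_per_degree

print(pan_speed_px_per_sec(20))    # 1200 px/s, the slow head turn above
print(pan_speed_px_per_sec(100))   # 6000 px/s, a brisker head turn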

The artificial existence of the "framerate" means the beginning of a frame's visibility is at a different time than the end of its visibility -- as you track moving objects on a screen, your eyes are in a different position at the beginning of a frame cycle than at the end of it. This blurs the frames across your retinas. Framerateless (infinite-framerate) technology that would fix this does not exist, so you have to add black periods between the frames, so the eye can point-sample the frames instead.

I wish we didn't need black frame insertion, but there is no other way to pull off just 1ms persistence. 1000fps@1000Hz is the only flickerfree/strobefree alternative to making the frame visible for just 1ms with large black gap till next 1ms frame.

During the eye-tracking (on a panning scene, like www.testufo.com/photo), 75Hz with 2ms strobe flashes, has exactly the same motion blur and same zero-judder effect as flickerfree 500fps@500Hz. We're of course discounting the non-eye-tracking situation. Some people, no doubt will be annoyed by judder and might even prefer motion blur, but this would be a minority of people.
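A quick sanity check of that equivalence, using the same persistence arithmetic as above (illustrative Python; assumes accurate eye tracking and frame rate matching refresh rate):

Code:
# During eye tracking, blur depends only on how long each frame stays lit,
# not on the refresh rate itself.
speed = 1000.0                                             # px/s panning speed
blur_75hz_2ms_strobe  = 2.0 * speed / 1000.0               # 75Hz, 2ms strobe flash
blur_500hz_samplehold = (1000.0 / 500.0) * speed / 1000.0  # 500fps@500Hz, lit full refresh (2ms)
print(blur_75hz_2ms_strobe, blur_500hz_samplehold)         # 2.0 2.0 -> identical tracking blur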

The bonus is that the Oculus DK2 low-persistence mode can be enabled/disabled, so that flicker-sensitive people can enjoy full persistence for completely flickerfree operation (at the cost of motion blur, of course). OLED rolling scans are extremely adjustable. Several new monitors have adjustable persistence (e.g. BENQ Z-Series persistence is adjustable from 0.5ms through 5.0ms, or flickerfree/strobefree mode), so it's not like low persistence is being forced upon your eyes, if you do genuinely prefer nausea-inducing motion blur during virtual reality. Recent convention-center tests with the public have shown that there's more VR nausea from motion blur than VR nausea from flicker/strobing. The anti-strobe people do not realize the wonderful flexibility that is emerging in these new displays, to give users the choice, and some people don't realize that the human-visible side effects of finite framerates can actually stay visible all the way up to several thousands of frames per second, due to the stroboscopic effect. However, that is by far a lesser problem than motion blur in VR.

Joe, before you reply to my post, read Pages #1, #2, #3, #4, #5 of So what refresh rate do I need? [Analysis] [very good one!], as well as Highest perceivable framerate?, as well as Michael Abrash's Down the VR Rabbit Hole: Fixing Judder. This is full of useful information about judder and how it relates to eye tracking. And there is a lot of great stuff written by many other motion-fluidity-experienced people, including Paul Bakaus and others. And shall I go on? Once you read these, you'll realize that Palmer isn't marketing -- he's just restating known scientific fact and new VR knowledge that the lack of motion blur makes head-tracked VR far, far more immersive. We don't care if it's Oculus or another vendor -- low persistence is that important for VR, or VR won't ever become mainstream due to everything looking more artificial (it's not possible to comfortably eye-track faster than slow motion speeds, due to motion blur).

Eventually we'll have 4K or 8K OLED VR that's fully configurable from flickerfree-to-nearzero persistence, and everybody's happy for now until the infinite-framerate refreshrateless Holodeck arrives (to permit crystal clear real-life motion without needing strobing). But as we all know, that infinite framerate dream is probably not going to happen this century -- but 4K OLED definitely will.

TL;DR: Since infinite framerate is impossible, and we're stuck with finite framerates, strobing (well above a comfortable flicker fusion threshold) is generally the lesser evil for virtual reality, compared to the mandatory motion blur of a flickerfree display (at current technologically achievable refresh rates). Eye-tracking comfort is of paramount importance during rapid virtual reality movements, and there's no way to eliminate eye-tracking-based motion blur without an ultrahigh framerate (or infinite/framerateless continuous motion) unless you point-sample the frames by adding large black gaps between frames and making the frame visibility as brief as possible. This hugely reduces nausea, as the motion blur creates massively more nausea than strobe/judder effects (during non-eye-tracking situations).
RLBURNSIDE and Joe Bloggs like this.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #302 of 329 Old 06-10-2014, 06:08 AM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,070
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 232 Post(s)
Liked: 637
Quote:
Originally Posted by Mark Rejhon View Post
 
Quote:
Originally Posted by Joe Bloggs View Post

Real life (what virtual reality is trying to re-create) doesn't blank out 9/10ths of the time.
This is correct. The problem is that real life is not composed of frames either; the artificial invention of the "frame rate" in the 19th century is merely an attempt to represent motion.

 

I almost commented on this statement of Joe's as well, and then erased it because I realized that his is a statement of what happens before image information lands on the retina.

 

In both real life and VR there's an eye involved feeding information to the brain in a format that isn't a frame at a time.  His statement has to do with before it even gets to the eye: in real life, light hits the back of the eye continually.  In VR, the eye receives it in brief pulses.


WARNING: You have now entered a no @#$%tard zone. Please gather your anti-vaccine propaganda nonsense and slowly back out the way you came in.
tgm1024 is offline  
post #303 of 329 Old 06-11-2014, 11:11 PM
AVS Special Member
 
Wizziwig's Avatar
 
Join Date: Jul 2001
Location: SoCal, USA
Posts: 1,157
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 77 Post(s)
Liked: 93
So I was able to spend a few minutes with the DK2 prototype at E3 and play the Alien and Valkyrie games. They had 2 other games but with a 1 hour line and limited time inside the booth, I didn't try the others. I picked the games where I could also evaluate OLED black level performance.

Here are my impressions:

Let's start with the positives:

Despite some light leaking in at the bottom of the mask, black-level performance was excellent. Especially in Alien. It helps a lot with immersion and fear level for horror games.

The head straps felt more comfortable than the DK1 - maybe it was lighter?

No noticeable lag and very smooth head panning even at extreme angles.

Aliasing is much improved due to the higher resolution. This is especially obvious in the distance.

Now to the negatives:

The biggest flaw for me was the fact that the device tracks your head and not your eyes. So you need to lock your vision into a forward/center stare and let your neck do all the work. It would take some brain training to adjust to that. If you try to move your eyes instead, you will look into the border/peripheral vision area where the optics of the device make everything blurry. Only the center area appears clear so you need to keep moving your head so that the point of interest remains dead center. You end up with a very small useful FOV. It drove me nuts in the Alien game because the motion tracker in your hand was at the edge of the screen and I had to keep moving my head to see it clearly. I don't remember this issue from the original DK1 but I tested it a very long time ago. Either it wasn't noticeable in the older demos or they changed something with the optics of the device. Maybe the interpupillary distance didn't match my eyes? I mentioned it to 2 different booth attendants and they didn't know how to fix it.

I didn't notice any improvement in persistence or motion blur. There was no flicker and the display was very bright so I'm not sure if it was even doing the scanning refresh. I asked and they claimed all units were in the low-persistence mode. To be honest, it was very difficult to evaluate motion blur because you couldn't do eye-tracking for very long before hitting the out-of-focus borders. Overall, I think the motion was fine but not up to CRT levels yet.

The image itself looked a bit like a mis-converged projector because they are using an OLED screen with pentile pixel structure where neighboring pixels share colors (http://www.oled-info.com/pentile). All on-screen text and smaller details had a colored fringe that got worse in the screen borders.

One issue remaining from the first devkit is that it's like wearing a diving mask suctioned to your face. Your face sweats a lot even after a few minutes - luckily it doesn't fog up like a diving mask.

Conclusion:

Overall, I was hoping for a larger improvement from DK1. There are still some significant problems to overcome and I can see why they took the facebook money to help solve them. In its current state, I don't think it's ready for consumers. Maybe they can add the eye tracking tech mentioned in this article:

http://www.wired.co.uk/news/archive/...t-eye-tracking

Edit: Some additional info came to light after people received their units which explains some of the issues I saw above. See here:
Oculus Rift VR Headsets
Oculus Rift VR Headsets
Joe Bloggs likes this.

Last edited by Wizziwig; 08-19-2014 at 01:16 AM.
Wizziwig is online now  
post #304 of 329 Old 06-12-2014, 04:17 AM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,482
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 21 Post(s)
Liked: 60
Quote:
Originally Posted by Wizziwig View Post
....and I can see why they took the facebook money to help solve them. In its current state, I don't think it's ready for consumers. Maybe they can add the eye tracking tech mentioned in this article:

http://www.wired.co.uk/news/archive/...t-eye-tracking
Thank you for the report and the link. The need to add eye tracking is an entirely new level in this game. It seems eye tracking is now inevitable, but perfecting it for consumers will take a lot of time even with unlimited financial backing from FB. And who knows what problems might still be hiding there. My impression now is that 'perfect' VR is always one year away.

irkuck
irkuck is offline  
post #305 of 329 Old 06-12-2014, 05:33 AM
AVS Special Member
 
Joe Bloggs's Avatar
 
Join Date: Apr 2007
Location: UK
Posts: 2,533
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 110 Post(s)
Liked: 55
The problem with eye tracking is that, just like head tracking, there's always going to be a delay. You move your eyes quickly; it then has to detect the new position, then render a frame showing the new position and output it to the display.
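The delays in that loop add up stage by stage; a rough motion-to-photon budget looks something like the sketch below (illustrative Python; every number is a placeholder guess, not a measured Oculus figure):

Code:
# Hypothetical motion-to-photon latency budget for an eye/head-tracked frame.
pipeline_ms = {
    "tracker sample":        2.0,
    "transfer to host":      1.0,
    "render frame":          8.0,
    "wait for next vsync":   6.5,   # average at 75Hz
    "scanout + persistence": 2.0,
}
print(sum(pipeline_ms.values()), "ms motion-to-photon")   # ~19.5 ms with these guesses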
Joe Bloggs is offline  
post #306 of 329 Old 06-12-2014, 12:03 PM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,070
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 232 Post(s)
Liked: 637
What exactly is "eye tracking" as it relates to goggles? You wouldn't want the image to move with your eyes. If it did, when you held your head still and moved your eyes you'd always be staring at the same spot.

WARNING: You have now entered a no @#$%tard zone. Please gather your anti-vaccine propaganda nonsense and slowly back out the way you came in.
tgm1024 is offline  
post #307 of 329 Old 06-12-2014, 01:30 PM
AVS Special Member
 
Joe Bloggs's Avatar
 
Join Date: Apr 2007
Location: UK
Posts: 2,533
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 110 Post(s)
Liked: 55
Quote:
Originally Posted by tgm1024 View Post
What exactly is "eye tracking" as it relates to goggles? You wouldn't want the image to move with your eyes. If it did, when you held your head still and moved your eyes you'd always be staring at the same spot.
You may want it to change the focus of the rendering to whatever your eyes are focusing on (e.g. close up vs far away). Also to detect the true position of the eyes, as not everyone's are 65mm or whatever apart. See the Wired article in one of the posts above.

Last edited by Joe Bloggs; 06-12-2014 at 01:34 PM.
Joe Bloggs is offline  
post #308 of 329 Old 06-12-2014, 08:10 PM
AVS Special Member
 
Wizziwig's Avatar
 
Join Date: Jul 2001
Location: SoCal, USA
Posts: 1,157
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 77 Post(s)
Liked: 93
Quote:
Originally Posted by Wizziwig View Post
Overall, I was hoping for a larger improvement from DK1. There are still some significant problems to overcome and I can see why they took the facebook money to help solve them. In its current state, I don't think it's ready for consumers.
Just a quick update to yesterday's impressions posted above. A co-worker of mine was still in line when I walked out of the booth, so I asked him to look for the problems I found.

Today I was able to get his reaction. He hated the whole thing even more than me but for different reasons. Being a game designer, he focused on the terrible controls in Alien. He didn't notice the focus and color fringing issues that I saw.

So either I'm more picky than most or this is an issue that depends on the viewer - similar to DLP rainbows.
Wizziwig is online now  
post #309 of 329 Old 06-13-2014, 03:01 AM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,482
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 21 Post(s)
Liked: 60
Quote:
Originally Posted by Joe Bloggs View Post
The problem with eye tracking is that, just like head tracking, there's always going to be a delay. You move your eyes quickly; it then has to detect the new position, then render a frame showing the new position and output it to the display.
Delay can be solved with proper electronics, but this will take a lot of effort and time. OR announced that its product will come in 2015; the question is how refined it will be.

irkuck
irkuck is offline  
post #310 of 329 Old 06-25-2014, 12:38 PM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,482
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 21 Post(s)
Liked: 60
OR is totally massacred... at least price-wise

irkuck
irkuck is offline  
post #311 of 329 Old 06-25-2014, 01:23 PM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,070
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 232 Post(s)
Liked: 637
Quote:
Originally Posted by irkuck View Post
....
Periodically the google guys crack me up.

Whether it's genuine or calculated or in the middle, they come across as likeable goofballs once in a while. Perhaps to soften their image as the evil empire to startups.

WARNING: You have now entered a no @#$%tard zone. Please gather your anti-vaccine propaganda nonsense and slowly back out the way you came in.
tgm1024 is offline  
post #312 of 329 Old 06-25-2014, 08:53 PM
Advanced Member
 
RLBURNSIDE's Avatar
 
Join Date: Jan 2007
Posts: 736
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 24 Post(s)
Liked: 129
Quote:
Originally Posted by Mark Rejhon View Post

....
Eventually we'll have 4K or 8K OLED VR that's fully configurable from flickerfree-to-nearzero persistence, and everybody's happy for now until the infinite-framerate refreshrateless Holodeck arrives (to permit crystal clear real-life motion without needing strobing). But as we all know, that infinite framerate dream is probably not going to happen this century -- but 4K OLED definitely will.

TL;DR: Since infinite framerate is impossible, and we're stuck with finite framerates, strobing (well above a comfortable flicker fusion threshold) is generally the lesser evil for virtual reality, compared to the mandatory motion blur of a flickerfree display (at current technologically achievable refresh rates). Eye-tracking comfort is of paramount importance during rapid virtual reality movements, and there's no way to eliminate eye-tracking-based motion blur without an ultrahigh framerate (or infinite/framerateless continuous motion) unless you point-sample the frames by adding large black gaps between frames and making the frame visibility as brief as possible. This hugely reduces nausea, as the motion blur creates massively more nausea than strobe/judder effects (during non-eye-tracking situations).
Great post. I saw a version of the Oculus last year at my job; it was cool but still needed a lot of work. Mostly it was the huge pixels that bugged me, and I did feel some nausea, but I'm looking forward to seeing the improvements.

Any chance you're attending Siggraph this year? I'll be there. Send me a msg if you go.
RLBURNSIDE is offline  
post #313 of 329 Old 06-25-2014, 08:57 PM
Advanced Member
 
RLBURNSIDE's Avatar
 
Join Date: Jan 2007
Posts: 736
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 24 Post(s)
Liked: 129
Quote:
Originally Posted by tgm1024 View Post
Periodically the google guys crack me up.

Whether it's genuine or calculated or in the middle, they come across as likeable goofballs once in a while. Perhaps to soften their image as the evil empire to startups.
4K, smartphones, and VR are a natural mix. This looks fun; I'm going to try it.
RLBURNSIDE is offline  
post #314 of 329 Old 07-29-2014, 10:16 AM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,482
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 21 Post(s)
Liked: 60
OR DK2 has arrived with the new OLED display and 1920x1080 resolution. However, even this display is not ideal. Due to pentile arrangement there is a honeycomb matrix effect. That is to say, ‘pixels’ on the DK2 are still easily distinguishable.

irkuck
irkuck is offline  
post #315 of 329 Old 07-29-2014, 09:02 PM
Senior Member
 
golem's Avatar
 
Join Date: Dec 2002
Location: SoCal
Posts: 308
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 5 Post(s)
Liked: 27
Yeah, I got mine too. It works well for what it is (a developer's kit) but I think it's still far from being consumer ready.
golem is online now  
post #316 of 329 Old 07-29-2014, 09:34 PM
AVS Special Member
 
Wizziwig's Avatar
 
Join Date: Jul 2001
Location: SoCal, USA
Posts: 1,157
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 77 Post(s)
Liked: 93
Looks like this reviewer found some of the same issues I noticed at E3: reduced FOV, chromatic aberration at the edges of the display, color fringing on text, and still-noticeable blur on blacks. I was unlucky enough to demo games which were almost entirely black - this made any improvement in motion blur almost impossible to see.


At least the captured video of the space flight sim finally explains that crazy loss of focus and increase in color fringing at the edges of the display. They seem to be intentionally adding color mis-convergence to compensate for crappy optics that suffer from chromatic aberration. A software solution is not the right way to fix this, and I hope they'll find a way to include better quality lenses while sticking to consumer-friendly prices.
Wizziwig is online now  
post #317 of 329 Old 07-30-2014, 03:58 AM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,482
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 21 Post(s)
Liked: 60
Quote:
Originally Posted by golem View Post
Yeah, I got mine too. It works well for what it is (a developer's kit) but I think it's still far from being consumer ready.
Would be nice to add some concrete details supporting this opinion.

Quote:
Originally Posted by Wizziwig View Post
At least the video captured of the space flight sim finally explains that crazy loss of focus and increase in color fringing at the edges of the display. They seem to be intentionally adding a color mis-convergence to compensate for crappy optics that suffer from chromatic aberration. A software solution is not the right way to fix this and I hope they'll find a way to include better quality lenses while sticking to consumer friendly prices.
Could it be that there is some vicious circle here: increased resolution and contrast of the display make higher demands on the quality of the optics?

irkuck
irkuck is offline  
post #318 of 329 Old 07-30-2014, 10:50 AM
Senior Member
 
golem's Avatar
 
Join Date: Dec 2002
Location: SoCal
Posts: 308
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 5 Post(s)
Liked: 27
Quote:
Originally Posted by irkuck View Post
Would be nice to add some concrete details supporting this opinion.
The pixel density isn't high enough; it's easy to see the SDE (screen-door effect). The panel's response rate isn't fast enough; there's lots of ghosting. And there's the chromatic aberration. All of this results in the impression of bad image quality. You can also see the borders of the panel, akin to looking through binoculars.
golem is online now  
post #319 of 329 Old 07-30-2014, 01:02 PM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,482
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 21 Post(s)
Liked: 60
Quote:
Originally Posted by golem View Post
The pixel density isn't high enough; it's easy to see the SDE (screen-door effect). The panel's response rate isn't fast enough; there's lots of ghosting. And there's the chromatic aberration. All of this results in the impression of bad image quality. You can also see the borders of the panel, akin to looking through binoculars.
Since this is OLED@1920x1080 I wonder what is needed to eliminate these problems, or are they eliminable at all? A 4K display for the start?

irkuck
irkuck is offline  
post #320 of 329 Old 07-30-2014, 01:18 PM
AVS Special Member
 
sytech's Avatar
 
Join Date: Nov 2011
Posts: 1,060
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 101 Post(s)
Liked: 148
Quote:
Originally Posted by irkuck View Post
Since this is OLED@1920x1080 I wonder what is needed to eliminate these problems, or are they eliminable at all? A 4K display for the start?
Virtual Retina Displays like Avegant Glyph eliminate the pixel structure by beaming the image directly into your eye. That is why I think it will be the ultimate VR solution once they find a way to increase FOV. Supposedly, military grade VRDs can have up to 110 degrees FOV, so it should eventually be possible. I would guess 4-5 years until there would be advanced enough eye tracking and optics to make it happen at a consumer acceptable price.
sytech is offline  
post #321 of 329 Old 07-31-2014, 12:40 AM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,482
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 21 Post(s)
Liked: 60
Quote:
Originally Posted by sytech View Post
Virtual Retina Displays like Avegant Glyph eliminate the pixel structure by beaming the image directly into your eye. That is why I think it will be the ultimate VR solution once they find a way to increase FOV. Supposedly, military grade VRDs can have up to 110 degrees FOV, so it should eventually be possible. I would guess 4-5 years until there would be advanced enough eye tracking and optics to make it happen at a consumer acceptable price.
I know the Glyph very well, being one of its Kickstarter supporters, precisely because of the fascinating visual technology it uses. My question was rather limited to the standard displays used in VR sets like the OR. It seems that even a full 1080p HD OLED display still does not solve the artefact problem caused by the discrete pixel matrix. The question is then what display resolution would be needed to solve this problem entirely -- 4K? This is important since, if it turned out that even with 4K the picture is not perfectly clean and some shimmering artefacts are still visible, it would mean the Glyph is the only way to go.
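One rough way to frame the resolution question is angular pixel density versus the eye's acuity. A back-of-envelope sketch (illustrative Python; the 100-degree FOV and ~60 pixels-per-degree acuity figures are assumptions, not OR specs):

Code:
# Horizontal pixels per degree for a panel split between two eyes.
def pixels_per_degree(horizontal_px_per_eye, fov_deg):
    return horizontal_px_per_eye / fov_deg

print(pixels_per_degree(1920 / 2, 100))   # ~9.6 px/deg for a shared 1080p panel
print(pixels_per_degree(3840 / 2, 100))   # ~19.2 px/deg for a shared 4K panel
print(60 * 100)                           # ~6000 px per eye needed for ~60 px/deg over 100 deg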

irkuck
irkuck is offline  
post #322 of 329 Old 07-31-2014, 08:39 AM
Senior Member
 
golem's Avatar
 
Join Date: Dec 2002
Location: SoCal
Posts: 308
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 5 Post(s)
Liked: 27
Quote:
Originally Posted by irkuck View Post
Since this is OLED@1920x1080 I wonder what is needed to eliminate these problems, or are they eliminable at all? A 4K display for the start?
This is an interesting solution by Nvidia

http://www.extremetech.com/extreme/1...-of-each-other
golem is online now  
post #323 of 329 Old 07-31-2014, 02:46 PM - Thread Starter
AVS Special Member
 
barrelbelly's Avatar
 
Join Date: Nov 2007
Posts: 1,688
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 58 Post(s)
Liked: 231
http://www.engadget.com/2014/07/31/o...g/?.tsrc=yahoo

BTW...Luckey, Iribe, Abrash and Carmack are all on record saying the ultimate VR experience will take place when the market has access to 16k-24k of resolution or higher. And they think that can happen over the next 20 years as VR becomes more widely used in a huge range of industries utilizing its unique capabilities. But they also stated that a very good to excellent VR experience can be achieved in the interim with 1080p per eye to 8k (4k per eye). I think these guys are ready right now, from a technical and partnership standpoint, to launch a consumer version of OR. Their statements, even setting aside the FB acquisition, suggest they are delaying so that a full range of game developers and accessory manufacturers have a wide stable of products ready to feed the VR beast. As far as video tech is concerned, they have come a long way since the prototype stage. And I suspect they will always push the envelope on best-in-class, practical A/V tech.
barrelbelly is offline  
post #324 of 329 Old 07-31-2014, 04:07 PM
Advanced Member
 
brwsaw's Avatar
 
Join Date: Aug 2012
Posts: 533
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 38 Post(s)
Liked: 64
Hardly worth noting, but (it got me excited and wishing for a new PC) Oculus is mentioned on War Thunder's PC setup page.
It might not be a game for everyone, but it's nice to see it already listed and available to use.

slow going
brwsaw is offline  
post #325 of 329 Old 07-31-2014, 11:01 PM
AVS Special Member
 
Wizziwig's Avatar
 
Join Date: Jul 2001
Location: SoCal, USA
Posts: 1,157
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 77 Post(s)
Liked: 93
Quote:
Originally Posted by irkuck View Post
I know the Glyph very well, being one of its Kickstarter supporters, precisely because of the fascinating visual technology it uses. My question was rather limited to the standard displays used in VR sets like the OR. It seems that even a full 1080p HD OLED display still does not solve the artefact problem caused by the discrete pixel matrix. The question is then what display resolution would be needed to solve this problem entirely -- 4K? This is important since, if it turned out that even with 4K the picture is not perfectly clean and some shimmering artefacts are still visible, it would mean the Glyph is the only way to go.
In addition to hiding the visible pixel matrix, they may need higher resolutions for other reasons.

If they can't figure out a way to fix chromatic aberration optically via better lenses or coatings, then they will require some insane resolutions to fix it via software. The problem is that light is refracted/bent through the lenses similar to a prism. This ends up splitting the red from the blue very noticeably. They try to compensate for this in software by shifting the red in the opposite direction on the GPU before it hits the lens. This is a very bad solution because the amount of shift required varies across the screen and does not fall into exact pixel sized amounts. You will always see some color fringing somewhere on the screen. This was my biggest gripe about the DK2.
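For readers curious what that software correction looks like, here's a toy sketch of the idea (illustrative Python, not Oculus SDK code; the scale constants are made up): the red and blue channels are sampled at slightly different radial scales so that, after the lens disperses them, they land back on top of each other.

Code:
# Toy pre-correction for lateral chromatic aberration: sample red and blue at
# slightly different radii from the lens centre so the lens's dispersion cancels.
def precorrect_uv(u, v, centre=(0.5, 0.5), scale_r=0.994, scale_b=1.006):
    du, dv = u - centre[0], v - centre[1]
    red   = (centre[0] + du * scale_r, centre[1] + dv * scale_r)
    green = (u, v)
    blue  = (centre[0] + du * scale_b, centre[1] + dv * scale_b)
    return red, green, blue

# Near the edge, the three channels come from visibly different spots, which is
# why sub-pixel accuracy matters and the fringing never fully disappears:
print(precorrect_uv(0.95, 0.5))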

The pentile OLED pixel structure is also not helping them since it effectively reduces the available resolution even further.

I wish I had the time at E3 to see what Sony was offering. Given their history of producing some excellent front projectors, they might be better at designing a good light path for their headset.
Wizziwig is online now  
post #326 of 329 Old 07-31-2014, 11:32 PM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,482
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 21 Post(s)
Liked: 60
Quote:
Originally Posted by barrelbelly View Post
http://www.engadget.com/2014/07/31/o...g/?.tsrc=yahoo

BTW...Luckey, Iribe, Abrash and Carmack are all on record saying the ultimate VR experience will take place when the market has access to 16k-24k of resolution or higher. And they think that can happen over the next 20 years as VR becomes more widely used in a huge range of industries utilizing its unique capabilities. But they also stated that a very good to excellent VR experience can be achieved in the interim with 1080p per eye to 8k (4k per eye).
Then I think the Glyph is ultimately the winning technology, and it is coming much faster than in 20 years. That is because, even with the current 720p per eye, the Glyph has no reported artefacts due to the pixel matrix or optics. The only issue is the limited FOV, but using 1080p chips can easily solve this. I believe there is essentially no need for more than 1080p in the Glyph, maybe just chips in a slightly different format that better covers the full FOV, e.g. 21:9.

irkuck
irkuck is offline  
post #327 of 329 Old 08-01-2014, 07:23 PM - Thread Starter
AVS Special Member
 
barrelbelly's Avatar
 
Join Date: Nov 2007
Posts: 1,688
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 58 Post(s)
Liked: 231
Quote:
Originally Posted by irkuck View Post
Then I think the Glyph is ultimately the winning technology, and it is coming much faster than in 20 years. That is because, even with the current 720p per eye, the Glyph has no reported artefacts due to the pixel matrix or optics. The only issue is the limited FOV, but using 1080p chips can easily solve this. I believe there is essentially no need for more than 1080p in the Glyph, maybe just chips in a slightly different format that better covers the full FOV, e.g. 21:9.

Not so fast. If the Glyph is the winning tech...which I doubt...it too will just be a tech in the Facebook Oculus Rift stable. Or the Sony Morpheus toolbox, IMO.
http://www.forbes.com/sites/dorothyp...won-comic-con/
I think the skeptics are completely underestimating the power of social media with the marriage of Facebook & Oculus Rift. And I think that calculus will reveal itself more with movies, sports, and TV shows until full VR games are more available. Super Bowl parties with friends in different places and countries can take on a whole new scale with VR.
barrelbelly is offline  
post #328 of 329 Old 08-04-2014, 11:11 AM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,482
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 21 Post(s)
Liked: 60
Quote:
Originally Posted by barrelbelly View Post
Not so fast. If the Glyph is the winning tech...which I doubt...it too will just be a tech in the Facebook Oculus Rift stable. Or the Sony Morpheus toolbox, IMO.
I meant the technology, not its owner. A Glyph buyout is possible, but it is more complicated than it seems since the technology behind it is from Texas Instruments. It is unclear how much control TI wants to exert over it.

Quote:
Originally Posted by barrelbelly View Post
http://www.forbes.com/sites/dorothyp...won-comic-con/
I think the skeptics are completely underestimating the power of social media with the marriage of Facebook & Oculus Rift. And I think that calculus will reveal itself more with movies, sports, and TV shows until full VR games are more available. Super Bowl parties with friends in different places and countries can take on a whole new scale with VR.
There are sky-high visions and there are muddy details on the ground. At this point it is not yet clear if the OR mud can be completely eliminated; what is clear, though, is that without a perfect consumer experience OR will not take over.

irkuck
irkuck is offline  
post #329 of 329 Old 08-17-2014, 06:29 PM
Advanced Member
 
DLPProjectorfan's Avatar
 
Join Date: May 2010
Posts: 791
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 5 Post(s)
Liked: 33
Quote:
Originally Posted by Wizziwig View Post
So I was able to spend a few minutes with the DK2 prototype at E3 and play the Alien and Valkyrie games.
....
Overall, I was hoping for a larger improvement from DK1. There are still some significant problems to overcome and I can see why they took the facebook money to help solve them. In its current state, I don't think it's ready for consumers.
Has anyone here seen or tried out the Gameface Labs Mark 5 prototype HMD at either E3 or the other electronics shows yet?
The Gameface Mark 5 prototype HMD has a higher resolution than even the Oculus Rift, at 1440p (2.5K, 2560×1440), with Nvidia's Tegra K1 driving it.
DLPProjectorfan is offline  