
4k by 2k or Quad HD...lots of rumors? thoughts? - Page 113

post #3361 of 3670
Obviously, the unblurred one.

A similar phenomenon occurs during video gaming in these situations, e.g. display-based motion blur masking lowness of framerate:

-- Console gaming on plasma 30fps versus LCD 30fps.
Some people say they prefer console gaming on LCD 30fps because it looks smoother (because motion blur is masking the lowness of the 30fps). Though it is a personal preference if you prefer LCD motion blur versus the plasma double-image effect (flickering/strobing edges during motion).

-- LightBoost (LCD motion blur elimination) on 120Hz computer monitors, when playing at low framerates.
LightBoost eliminates motion blur so dramatically that low framerates are much easier to distinguish from higher framerates, since there's no motion blur to mask the lowness of the framerate.
So as a result, for LightBoost to really be worth it, you need triple-digit framerates (since LightBoost behaves like a 120Hz CRT).

-- CRT 30fps-vs-60fps is easier to see versus LCD 30fps-vs-60fps. (e.g. TestUFO: Framerate 30fps vs 60fps Animation -- play this on a CRT, then play this again on LCD). This is because motion blur on most LCD's is masking the lowness of the framerates, making them less distinct than on a CRT.

Intentional source-based blur (film) is just another technique to mask the lowness of the 24fps film framerate.
Edited by Mark Rejhon - 7/27/13 at 11:55pm
post #3362 of 3670
Quote:
Originally Posted by Mark Rejhon View Post

Obviously, the unblurred one.

A similar phenomenon occurs during video gaming in these situations, e.g. display-based motion blur masking lowness of framerate:

-- Console gaming on plasma 30fps versus LCD 30fps.
Some people say they prefer console gaming on LCD 30fps because it looks smoother (because motion blur is masking the lowness of the 30fps). Though it is a personal preference if you prefer LCD motion blur versus the plasma double-image effect (flickering/strobing edges during motion).

-- LightBoost (LCD motion blur elimination) on 120Hz computer monitors, when playing at low framerates.
LightBoost eliminates motion blur so dramatically that low framerates are much easier to distinguish from higher framerates, since there's no motion blur to mask the lowness of the framerate.
So as a result, for LightBoost to really be worth it, you need triple-digit framerates (since LightBoost behaves like a 120Hz CRT).

-- CRT 30fps-vs-60fps is easier to see versus LCD 30fps-vs-60fps. (e.g. TestUFO: Framerate 30fps vs 60fps Animation -- play this on a CRT, then play this again on LCD). This is because motion blur on most LCD's is masking the lowness of the framerates, making them less distinct than on a CRT.

Intentional source-based blur (film) is just another technique to mask the lowness of the 24fps film framerate.

 

Correct, but he's extrapolating this to a wrong conclusion.  His assertion was that as the clarity of the image increases with spatial resolution, the frame rate *NEEDS* to be increased.

 

I believe he's looking at this as if a lower resolution ball is performing the same thing as intentional blurring.  But when a ball moves in front of a 100 dpi camera, it's got X number of pixels to smear through (while the shutter is open).  When that same ball moves in front of a 200 dpi camera, it's got 4*X the number of pixels to smear through.  (Or if it makes it more clear you can think of this as 2*X if you want to constrain the notion to a single axis corresponding to ball travel).

 

The higher resolution gives you more pixels of that same blur, but no less smoothness of motion.


Edited by tgm1024 - 7/28/13 at 10:05am
post #3363 of 3670
Quote:
Originally Posted by tgm1024 View Post

Correct, but he's extrapolating this to a wrong conclusion.  His assertion was that as the clarity of the image increases with spatial resolution, the frame rate *NEEDS* to be increased.

I believe he's looking at this as if a lower resolution ball is performing the same thing as intentional blurring.  But when a ball moves in front of a 100 dpi camera, it's got X number of pixels to smear through (while the shutter is open).  When that same ball moves in front of a 200 dpi camera, it's got 4*X the number of pixels to smear through.  (Or if it makes it more clear you can think of this as 2*X if you want to constrain the notion to a single axis corresponding to ball travel).

The higher resolution gives you more pixels of that same blur, but no less smoothness of motion.
But it wasn't motion blur that was causing things like the film grain to be so reduced, and making the motion appear smoother/each individual frame less easy to see. There was no motion blur in either video. The 2nd one was blurred, but only to simulate lower detail - defocus, or the processing they do on Blu-ray which removes fine detail, or SD vs HD or 4K. If the fine detail has been captured and is displayed, including film grain, each separate frame will be more noticeable than if it was lower resolution/blurry/defocused/lacking in detail. 4K will allow fine detail, including film grain, to be much easier to see (and so make each frame stand out more) than the lower resolution formats - depending on the individual film/scan.
Quote:
His assertion was that as the clarity of the image increases from spatial resolution, that the frame rate *NEEDS* to be increased.
Quote:
The higher resolution gives you more pixels of that same blur.
Quote:
it's got 4*X the number of pixels to smear through.
If you increase the pixel resolution 4x (spatial resolution), for anything that moves, especially the camera, what advantage has the increased pixel resolution given you when it just gives you 4x as many pixels of blur - unless you increase the frame rate (assuming everything else is the same, including camera moves)? There's also the fact that people will be viewing at closer viewing distances (relative to screen size)/on bigger TVs, as discussed before, which also increases the need for higher fps.
Edited by Joe Bloggs - 7/28/13 at 2:43pm
post #3364 of 3670
Quote:
Originally Posted by Joe Bloggs View Post

If you increase the pixel resolution 4x (spatial resolution), for anything that moves, especially the camera, what advantage has the increased pixel resolution given you when it just gives you 4x as many pixels of blur - unless you increase the frame rate (assuming everything else is the same, including camera moves)?

 

"Why not HFR" is a different question.  And 4K is a win either way, but IMO, it's far less important than frame rate is at this point in time.  I'm the biggest proponent of increased frame rate you'll ever find.  But 4K doesn't somehow make it more important.  Frame rate and apparent spatial resolution are definitely linked, but not the causal direction you've described.  Apparently I can't seem to reach you on this.  So I'll let the disagreement go for now.

 

In any case, convince you or not, the true bottom line of all of this I suppose is that both seem to be on their way, which is great.  I just hope it's 120fps sometime in my lifetime.....

post #3365 of 3670
Quote:
Originally Posted by Joe Bloggs View Post

If you increase the pixel resolution 4x (spatial resolution), for anything that moves, especially the camera, what advantage has the increased pixel resolution given you when it just gives you 4x as many pixels of blur - unless you increase the frame rate (assuming everything else is the same, including camera moves)? There's also the fact that people will be viewing at closer viewing distances (relative to screen size)/on bigger TVs, as discussed before, which also increases the need for higher fps.
There are additional variables that need to be thrown onto the table:

(1) Film based material / viewing distance
One is likely viewing the 4K material from the same distance as the 2K material, so the angular blur is the same: even though you have 2 pixels of blur, they span the same distance as 1 pixel of blur at half the resolution.

(2) Video league framerates (60fps, 120fps)
Once your framerate is sufficiently high, it is not always necessary to further increase it to reduce motion blur -- you can simply shorten the refreshes themselves by adding black periods between refreshes. This can introduce flicker, but if it's 120Hz flicker, it is not noticeable by most people.

When source-based blur is not the limiting factor, 60fps with 50%:50% blackframe insertion has exactly the same amount of motion blur as sample-and-hold 120fps@120Hz (flicker-free)
50%:50% black frame = 50% less motion blur
75%:25% black frame = 75% less motion blur
90%:10% black frame = 90% less motion blur
etc.

This is not practical for 24fps material, but practical for 60fps-and-up material, because flicker/strobing effects can be made beyond human vision's ability to detect. This is one huge reason I'm a big advocate of 120fps@120Hz native material, because it easily permits flicker-based technology without objectionable flicker (to most eyes) and without needing to use interpolation. Alas, 120fps@120Hz is currently the territory of computers playing video games via a powerful GPU.
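
To make the arithmetic behind those percentages explicit, here's a quick Python sketch. The only assumption is that perceived motion blur scales with how long each refresh stays lit; the duty cycles are the idealized figures from the list above, not measurements of any specific panel.
Code:
def persistence_ms(refresh_hz, visible_fraction):
    """How long each refresh is actually lit, in milliseconds."""
    return 1000.0 / refresh_hz * visible_fraction

baseline = persistence_ms(60, 1.0)               # 60Hz sample-and-hold: ~16.7 ms
for visible in (0.50, 0.25, 0.10):
    reduced = persistence_ms(60, visible)
    print(f"{round((1 - visible) * 100)}%:{round(visible * 100)}% black frame -> "
          f"{1 - reduced / baseline:.0%} less motion blur")

# 60fps with 50%:50% BFI lights each frame for ~8.3 ms -- the same persistence
# as flicker-free 120fps@120Hz sample-and-hold:
print(persistence_ms(60, 0.5), persistence_ms(120, 1.0))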
post #3366 of 3670
Quote:
Originally Posted by Mark Rejhon View Post

This is one huge reason I'm a big advocate of 120fps@120Hz native material, because it easily permits flicker-based technology without objectionable flicker (to most eyes) and without needing to use interpolation.

 

ABSOfreakingLUTELY!!!!!!!!!!!!!!!!!!!!!!!!!

post #3367 of 3670
Quote:
Originally Posted by Mark Rejhon View Post

24fps@24Hz CRT (1:1 pulldown) completely eliminates motion blur and judder for movies, but is super duper flickery.
That you can't see motion judder with 24 fps video at 24 Hz doesn't mean that it goes away since the flicker is absolutely horrible at that refresh rate. That would be like saying there is no difference between 16 mm film and 35 mm film since you can't see the difference between them when they are compared on a VHS tape. One problem can cover up another problem. I would say that the low refresh rate covers up the motion judder.

Quote:
Originally Posted by Mark Rejhon View Post

IMPORTANT NOTE:
Historically, many people's definitions of "judder" / "stutter" can vary, so to clarify, for the purposes of this *specific* reply:
As an extreme case, I define plain movie "judder" as the sense of motion not looking as smooth/fluid as 60fps@60Hz (e.g. sports on CRT -- perfectly fluid motion, no blur, no stutter, no judder).
I don't think you can get perfectly fluid motion at 60 fps. In the BBC study there was some motion blur even at 300 fps with very quick moving video content. There are diminishing returns the higher the frame rate goes, so I don't think consumer video needs to go higher than 120 fps, but I think it would be worth it to go beyond 60 fps.

Quote:
Originally Posted by irkuck View Post

This review of Asus 32" 4K monitor shows even better current problems with the 4K. There is in fact no real 4K electronics to drive 4K panels which is amazing taking into account efforts put to make those ultra high-end panels. If everything goes well the 4K electronics might become available mid-2014.
Isn't this mainly a problem caused by NVIDIA? NVIDIA doesn't properly support displays that use DisplayPort MST because they are trying to protect their professional graphics cards. ATI doesn't do that, and once VESA officially adds support for tiled displays later this year, then ATI video cards should no longer have a problem with it.

Quote:
Originally Posted by tgm1024 View Post

Why?
Motion judder is caused by the amount of angular change between frames. If you go from 30 degrees field of view (1080p) to 60 degrees field of view (4K) then you double the amount of angular change at the same frame rate. IMAX tested a frame rate of 48 fps in the early 1990's though sadly due to cost reasons they gave up on the idea and it ended up being used only in theme park rides.
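
To put rough numbers on that, here's a hypothetical worked example - the 2-second pan is an invented figure, and the 30/60 degree viewing angles are just the ones used above:
Code:
def degrees_per_frame(fov_deg, pan_seconds, fps):
    """Angular jump between frames for an object crossing the whole screen."""
    return fov_deg / (pan_seconds * fps)

for fov_deg in (30, 60):
    for fps in (24, 48, 60):
        print(f"{fov_deg} deg FOV @ {fps} fps: "
              f"{degrees_per_frame(fov_deg, 2.0, fps):.3f} deg/frame")

# Doubling the field of view at the same frame rate doubles the angular jump
# between frames; doubling the frame rate brings it back down.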
post #3368 of 3670
Quote:
Originally Posted by Mark Rejhon View Post

(1) Film based material / viewing distance
One is likely viewing the 4K material from the same distance as the 2K material, so the angular blur is the same: even though you have 2 pixels of blur, they span the same distance as 1 pixel of blur at half the resolution.
The recommended viewing distance for 4K is closer than for 1080p. Screen sizes for 4K TVs will generally be bigger than for 1080p ones. The ratio of screen size to viewing distance needs to be bigger than for 1080p, otherwise you won't be able to see the extra spatial resolution that 4K gives over 1080p.

e.g. if you are currently just able to resolve the full 1080p resolution from your current TV and viewing distance, and you want to keep the same viewing distance but be able to resolve the full 3840x2160 resolution, you'd need a UHDTV twice the width (or diagonal) of your current 1080p TV.
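
A quick sanity check of that, assuming the usual ~1 arcminute-per-pixel (20/20 acuity) figure - the 55"/110" sizes are only examples:
Code:
import math

ARCMIN = math.radians(1.0 / 60.0)   # ~1 arcminute, a common 20/20 acuity figure

def max_full_res_distance_m(diagonal_in, vertical_pixels, aspect=16.0/9.0):
    """Farthest viewing distance at which individual pixels can still be resolved."""
    height_m = diagonal_in * 0.0254 / math.sqrt(1.0 + aspect**2)
    pixel_m = height_m / vertical_pixels
    return pixel_m / math.tan(ARCMIN)

print(max_full_res_distance_m(55, 1080))    # ~2.2 m for a 55" 1080p set
print(max_full_res_distance_m(55, 2160))    # ~1.1 m: same size in 4K -> sit half as far
print(max_full_res_distance_m(110, 2160))   # ~2.2 m again: same distance needs 2x the diagonal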
Edited by Joe Bloggs - 7/28/13 at 7:44pm
post #3369 of 3670
Quote:
Originally Posted by Richard Paul View Post

I don't think you can get perfectly fluid motion at 60 fps. In the BBC study there was some motion blur even at 300 fps with very quick moving video content. Their are diminishing returns the higher the frame rate goes so I don't think consumer video needs to go higher than 120 fps but I think it would be worth it to go beyond 60 fps.
I agree 300fps is not the final frontier.

...However, there are extra variables.
...I should point out that you can have 60fps@60Hz with LESS motion blur than 300fps@300Hz, if 60fps is a CRT (phosphor causing 1ms of motion blur), and 300fps is a traditional LCD (1/300sec sample-and-hold creating 3.3ms of motion blur).
...Motion blur is dictated by the length of the visible refresh, not by the framerate/Hz. Instead of increasing framerate=Hz, you can also add black periods between frames (aka flicker) to reduce motion blur.
...In other words, 60fps frames flashed for 1 millisecond each can create less motion blur than 300fps frames continuously displayed for a whole 1/300sec (sample-and-hold), which creates eye-tracking-based motion blur.

For a motion animation of eye-tracking-based motion blur, view this on your current LCD display, and then again on a CRT:
TestUFO Sample-And-Hold Blur Animation: www.testufo.com/#test=eyetracking

For a motion animation of how flicker can reduce motion blur, here's a 50%:50% duty cycle flicker that reduces motion blur by 50%:
TestUFO Black Frame Insertion Demo: www.testufo.com/#test=blackframes

View these webpages in a supported browser such as Chrome or IE10+ on a sufficiently recent computer, for fast animation.
As you can see by the "see-it-for-yourself" animations, you do not necessarily need to raise framerate to reduce motion blur.
Ideally, we should raise to 120fps or 240fps, then use strobing beyond this to improve motion clarity even further.

New strobe backlights such as "LightBoost" (in computer monitors) as well as Sony's interpolation-free "Motionflow Impulse" (in new Sony HDTV's) are excellent examples of motion blur reduction via pure flicker techniques, similar to a CRT. It is noted that LightBoost computer monitors reduce motion blur in computer games by more than 80% WITHOUT increasing framerate or Hz. Look at the LightBoost testimonials as well as the LightBoost media coverage (ArsTechnica, AnandTech, TFTCentral, etc). At the 10% setting, LightBoost behaves as a 5:1 black frame insertion (85%:15% black frame:visible frame duty cycle) which reduces motion blur by 85% relative to regular 120Hz LCD, and by 92% relative to regular 60Hz LCD.
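
For anyone who wants numbers on the persistence argument, a rough rule of thumb is blur width ~ persistence x eye-tracking speed. A small sketch (the persistence values and panning speed are illustrative assumptions, not measurements of any particular panel):
Code:
def blur_px(persistence_ms, tracking_speed_px_per_s):
    """Approximate eye-tracking motion blur width in pixels."""
    return persistence_ms / 1000.0 * tracking_speed_px_per_s

speed = 960   # px/s, e.g. an object crossing a 1920-wide screen in 2 seconds

cases = {
    "60Hz sample-and-hold LCD":         1000.0 / 60,    # ~16.7 ms
    "120Hz sample-and-hold LCD":        1000.0 / 120,   # ~8.3 ms
    "300Hz sample-and-hold LCD":        1000.0 / 300,   # ~3.3 ms
    "60fps CRT (~1 ms phosphor)":       1.0,
    "Strobe backlight (~1.4 ms flash)": 1.4,
}
for name, p_ms in cases.items():
    print(f"{name}: ~{blur_px(p_ms, speed):.1f} px of blur at {speed} px/s")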
Edited by Mark Rejhon - 7/29/13 at 12:56pm
post #3370 of 3670

 

Quote:
Originally Posted by tgm1024 View Post

Quote:
Originally Posted by Joe Bloggs View Post

Quote:
Originally Posted by Chronoptimist View Post

The source rate for film is 24fps. There is no need for anything above 30Hz with 4K on the consoles. Only PC use can really take advantage of 60Hz and above, at 4K resolutions.
Judder will be more noticeable on TVs at 4K

 

Why?

 

Quote:
Originally Posted by Richard Paul View Post

Motion judder is caused by the amount of angular change between frames. If you go from 30 degrees field of view (1080p) to 60 degrees field of view (4K) then you double the amount of angular change at the same frame rate.

 

What??  How does the field of view change with resolution?  Resolution increases the number of pixels of any particular field of view, doubling it doesn't double the screen, nor does it double the angular travel of any moving object.  4K doesn't double the FOV.  According to that logic, then in your example 8K would be 180 degrees FOV which is impossible.

 

Wait.  You shouldn't have used parentheses: You mean 30 degrees AT 2K, vs. 60 degrees AT 4K?  This has nothing to do with the argument at all!  The problem then is the 30 vs. 60, NOT the resolution.  The travel across the screen (or the travel across the CCD array for that matter) has doubled, regardless of how many pixels it takes to render it.  You're not following the argument.  What I'm trying to carefully explain is that judder is NOT more noticeable at 4K than it is at 2K, which was the statement.  It was in response to this idea that HFR was driven by the increase in resolution, which it was not.

 

Just for an example, take the following (some information not needed, but useful because we keep being derailed): Keep the frame rate the same.  Keep the shutter speed the same.  Keep the camera's physical sensor size (in millimeters) the same.  Compare 2K to 4K.

 

A ball moving from right to left will smear across the CCD/CMOS/whatever array the same distance regardless of the resolution.  From frame to frame it will hop the same physical distance in the sensor-space and in screen space in the end.  You will have more pixels describing that very same blur.  And that very same hop.  An increase in resolution by itself does not mandate an increase in frame rate at all.
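
A minimal sketch of that thought experiment (all the physical numbers are invented purely for illustration):
Code:
sensor_width_mm = 30.0        # physical sensor width -- held constant
ball_speed_mm_s = 60.0        # speed of the ball's image across the sensor
shutter_s = 1.0 / 48.0        # 180-degree shutter at 24 fps -- held constant
frame_s = 1.0 / 24.0          # frame rate -- held constant

smear_mm = ball_speed_mm_s * shutter_s   # blur while the shutter is open
hop_mm = ball_speed_mm_s * frame_s       # frame-to-frame jump

for name, h_pixels in (("2K", 2048), ("4K", 4096)):
    px_per_mm = h_pixels / sensor_width_mm
    print(f"{name}: smear {smear_mm:.2f} mm = {smear_mm * px_per_mm:.0f} px, "
          f"hop {hop_mm:.2f} mm = {hop_mm * px_per_mm:.0f} px")

# The physical smear and hop are identical in both cases; 4K simply uses
# twice as many pixels per axis to describe them.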


Edited by tgm1024 - 7/29/13 at 10:10am
post #3371 of 3670
Has anyone heard about the new standard being completed? I'm assuming more details of the official specs will be available at IFA or CEDIA
post #3372 of 3670
The first 4K content service is live: ODEMAX



post #3373 of 3670
Saw Smurfs 2, shot with Sony's F65 4K cameras - spectacular! I think they are better than Red's. However, everything I've seen on Sony's 4K TVs did not look nearly as good. I'll be watching 4K movies at the theater for the foreseeable future.
post #3374 of 3670
Quote:
Originally Posted by Bill View Post

Saw Smurfs 2, shot with Sony's F65 4K cameras - spectacular! I think they are better than Red's. However, everything I've seen on Sony's 4K TVs did not look nearly as good. I'll be watching 4K movies at the theater for the foreseeable future.
Hope that someone shoots movies with the F65 and puts them up on Odemax.

Odemax is not just for movies shot on Red cameras.

But as it's really only Red cameras and the Sony F65 and F55 that can shoot high-end 4K, there will be a lot of Red footage there, as there are more Red cameras in the hands of movie makers than Sony 4K cameras.

As for being better than Red: Smurfs 2 has been released in 4K, so that's an advantage.
Have you seen Oz the Great and Powerful, The Hobbit, The Great Gatsby, Jack the Giant Slayer etc. in the same cinema? (I chose those titles for the comparable mix of live action and CGI).

The F65 is a very good camera, or rather a very good sensor, but the camera is very large and costs twice the price of a Red Epic.
It also has a sensor that is four years newer than the RED MX (Mysterium-X) sensor.
So why does Sony need to make such a huge camera to make the images look good?
Does Sony need a lot of in-camera processing to make good images?

Fact is that Sony themselves have almost beaten the F65 with their own F55, in price, size and image quality.

The first tests of the new sensor for Red Epic (Dragon) have just landed and the consensus is that it beats the sensor in the F65, and again, the F65 is so large that it is not something you can work with without a large crew.
Try to put the F65 on your shoulder or a Steadicam.

Sony F65 vs Red Epic camera bodies.

post #3375 of 3670
Yes, saw all those movies in the same cinema, just saw Wolverine 3D also. Not even close to Smurfs. As far as size- look at TV studio HD cameras. For studios, is size relevant to shooting movies? When I go to the movies I don't care about the camera size. If Sony's and Red's new cameras are better- Wow! Is Odemax going to use codecs equal to the theater? Odemax will be good for those that like to watch movies again but I'm not one of them. I want 4K TV. I watch far more TV than movies. I'm not holding my breath waiting for that.
Edited by Bill - 8/4/13 at 11:11pm
post #3376 of 3670
Quote:
Originally Posted by Bill View Post

Yes, saw all those movies in the same cinema, just saw Wolverine 3D also. Not even close to Smurfs. As far as size- look at TV studio HD cameras. For studios, is size relevant to shooting movies? When I go to the movies I don't care about the camera size. If Sony's and Red's new cameras are better- Wow!
Camera size and weight are important for faster and more fluid production. Not so much in a studio, but most movies rely very much on location shooting.
For most movies, 75% of the shots are done in a "hand held" configuration.
Quote:
Is Odemax going to use codecs equal to the theater? Odemax will be good for those that like to watch movies again but I'm not one of them.
Odemax delivery is based on the RedRay player.
The compression codec is developed by RED, and is based on the same compression technology RED use in their cameras.

Cinemas use the JPEG 2000 codec with huge files.
This is not practical for delivery to home theatres.

Odemax is a new type of distribution system that is also going to be used for cinemas, with a full ticket accounting system, but probably mostly for independent cinemas.

The RedRay encoder can compress 4K (or up-convert 2K) with the following bitrates.
36Mb/s Cinema quality.
18Mb/s premium HT display quality.
9Mb/s for small screen displays.

Beta tests show the following data rates/sizes:
Quote:
9Mb/s: 68.8MB per minute (90min = 6.2GB)
18Mb/s: 134MB per minute (90min = 12GB)
36Mb/s: 265MB per minute (90min = 23.9GB)

These are current data rates/specs for 4096x2160 projects.
Link
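
Those beta figures line up with the stated bitrates if you do the straight arithmetic (video only, ignoring audio and container overhead, which presumably explains the small differences):
Code:
def mb_per_minute(mbps):
    """Megabytes per minute for a stream of the given megabits per second."""
    return mbps / 8.0 * 60.0

def movie_gb(mbps, minutes=90):
    return mb_per_minute(mbps) * minutes / 1000.0

for mbps in (9, 18, 36):
    print(f"{mbps} Mb/s: ~{mb_per_minute(mbps):.1f} MB/min, "
          f"90 min = ~{movie_gb(mbps):.1f} GB")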

Quote:
I want 4K TV. I watch far more TV than movies. I'm not holding my breath waiting for that.
Be careful. It might take so much time for 4K to enter broadcast that you might suffocate.
There is no problem transmitting 4K over satellite or fast cable, but 4K broadcast cameras and support equipment for live 4K don't really exist yet.

Here is a behind-the-scenes video from the shooting of Elysium, shot on Red Epic cameras. Notice how many of the cameras are not used on stationary tripods. Camera size/weight matters for the poor guys shooting this.
post #3377 of 3670
Quote:
Originally Posted by coolscan View Post

For most movies, 75% of the shots are done in a "hand held" configuration.

 

[.........]

Here is a behind-the-scenes video from the shooting of Elysium, shot on Red Epic cameras. Notice how many of the cameras are not used on stationary tripods. Camera size/weight matters for the poor guys shooting this.

 

It's occurred to me that since the style of shot is now that bouncing hand-held NYPD Blue hooey for just about everything, there would be a lot of cost savings.  Setup must be a fraction of what it used to be: in the older days, capturing nearly any kind of lateral motion used to require a tracking shot.

post #3378 of 3670
It will benefit Passive 3D, but it may negatively impact 2D.
post #3379 of 3670
Quote:
Originally Posted by cartrakes View Post

It will benefit Passive 3D, but it may negatively impact 2D.

huh? you should quote what you are posting in regards to, so people other than you know what you're talking about.
post #3380 of 3670
This new Panasonic VIERA® 65" Class WT600 Series Ultra HD TV claims to have been fitted with HDMI 2.0.
post #3381 of 3670
Quote:
Originally Posted by coolscan View Post

This new Panasonic VIERA® 65" Class WT600 Series Ultra HD TV claims to have been fitted with HDMI 2.0.

Even if it turns out not to have HDMI 2.0, it does have DisplayPort 1.2a, which is currently available and can do 4K@60Hz.
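
Rough numbers on why 4K@60Hz is out of reach for HDMI 1.4 but fine for DisplayPort 1.2 and HDMI 2.0. This only counts the active-pixel payload (blanking adds more), and the link rates are the commonly cited approximate payload figures after 8b/10b coding - ballpark, not spec quotes:
Code:
def uncompressed_gbps(width, height, fps, bits_per_pixel=24):
    """Active-pixel video payload in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

# Approximate usable payload after 8b/10b coding -- ballpark figures.
links_gbps = {
    "HDMI 1.4":         8.16,
    "HDMI 2.0":        14.40,
    "DisplayPort 1.2": 17.28,
}

need_30 = uncompressed_gbps(3840, 2160, 30)   # ~6.0 Gb/s
need_60 = uncompressed_gbps(3840, 2160, 60)   # ~11.9 Gb/s
print(f"4K@30Hz needs ~{need_30:.1f} Gb/s, 4K@60Hz needs ~{need_60:.1f} Gb/s (plus blanking)")
for name, capacity in links_gbps.items():
    verdict = "enough" if capacity > need_60 else "not enough"
    print(f"{name}: ~{capacity:.2f} Gb/s payload -> {verdict} for 4K@60Hz at 24-bit colour")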
post #3382 of 3670
Quote:
Originally Posted by sytech View Post

Quote:
Originally Posted by coolscan View Post

This new Panasonic VIERA® 65" Class WT600 Series Ultra HD TV claims to have been fitted with HDMI 2.0.

Even if it turns out not to have HDMI 2.0, it does have DisplayPort 1.2a, which is currently available and can do 4K@60Hz.

 

The claim is clear: HDMI 2.0 right on the propaganda page that coolscan posted.  :)  However, I'm noticing that they've left off a "specs" page.  And there's no price.

 

Also, hitting the "Store Locator" yields "we cannot find your model number".  LOL!

 

But in any case, this is good news to have happened at this point: we're still in August.

post #3383 of 3670
Quote:
Originally Posted by tgm1024 View Post

The claim is clear: HDMI 2.0 right on the propaganda page that coolscan posted.  :)  However, I'm noticing that they've left off a "specs" page.  And there's no price.

Also, hitting the "Store Locator" yields "we cannot find your model number".  LOL!

But in any case, this is good news to have happened at this point: we're still in August.

I wonder if they are going to eventually change over the PS4 to HDMI 2.0. Like when they added HDMI support to the Xbox 360. Maybe I should hold off my PS4 purchase.
post #3384 of 3670
I've been told this is one of the next generation TVs that will be introduced at IFA next week in Germany. It's not planned for quite a while, and HDMI 2.0 will be out by the time Panasonic begins production.
post #3385 of 3670
Quote:
Originally Posted by sytech View Post

I wonder if they are going to eventually change over the PS4 to HDMI 2.0. Like when they added HDMI support to the Xbox 360. Maybe I should hold off my PS4 purchase.
They probably will; but what benefit would HDMI 2.0 bring to the PS4?
post #3386 of 3670
Quote:
Originally Posted by sytech View Post

Quote:
Originally Posted by tgm1024 View Post

The claim is clear: HDMI 2.0 right on the propaganda page that coolscan posted.  :)  However, I'm noticing that they've left off a "specs" page.  And there's no price.

Also, hitting the "Store Locator" yields "we cannot find your model number".  LOL!

But in any case, this is good news to have happened at this point: we're still in August.

I wonder if they are going to eventually change over the PS4 to HDMI 2.0. Like when they added HDMI support to the Xbox 360. Maybe I should hold off my PS4 purchase.

Both consoles eventually wind up selling newer models w/ the newest/latest input. It would allow GAMES to play at 4K res, because it can handle the 50-60fps vs the 24-30 that movies use (HDMI 1.4b). This of course is meaningless unless there's a boatload of 4K HDMI 2.0 sets out there, and developers are convinced there's a demand to produce a 4K game. It'll eventually happen at some point, but not in the near future unfortunately.
Edited by rightintel - 8/28/13 at 7:00pm
post #3387 of 3670
Quote:
Originally Posted by rightintel View Post

Both consoles eventually wind up selling newer models w/ the newest/latest input. It would allow GAMES to play at 4K res, because it can handle the 50-60fps
The consoles barely have enough power to render games above 30fps at 1080p. There will be no 4K60 games on either system.
post #3388 of 3670
Quote:
Originally Posted by rightintel View Post

This of course is meaningless unless there's a boatload of 4K HDMI 2.0 sets out there, and developers convinced there's a demand to produce a 4K game. It'll eventually happen at some point, but not in the near future unfortunately.

 

Game consoles, not games.  It is no harder to produce a game at 1080p than it is at 2160p because the output resolution is not a property of the game itself.  Games render on the fly, and most of the rendering issues are based entirely on the hardware support.  The game itself is written precisely the same.  And even if the software remains aware of the output resolution, a rippling surface of water is tessellated into triangles depending upon what the output abilities are.  Games have *always* been written to handle multiple output resolutions, because the meshes are resolution independent.  The only drawback is the resolution of the textures.
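
A toy illustration of the point (made-up numbers and a bare-bones projection, not any console's actual pipeline): the mesh and camera stay identical, and only the final viewport mapping - plus the per-pixel fill cost - depends on the output resolution.
Code:
import math

def perspective_project(point, fov_deg=90.0):
    """Project a camera-space point to normalized [-1, 1] device coordinates."""
    x, y, z = point
    f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)
    return (f * x / z, f * y / z)

def to_viewport(ndc, width, height):
    """Map normalized device coordinates to pixel coordinates."""
    nx, ny = ndc
    return ((nx + 1.0) * 0.5 * width, (1.0 - (ny + 1.0) * 0.5) * height)

triangle = [(-1.0, 0.0, 5.0), (1.0, 0.0, 5.0), (0.0, 1.5, 5.0)]   # the "mesh" -- never changes

for resolution in ((1920, 1080), (3840, 2160)):
    pixels = [to_viewport(perspective_project(v), *resolution) for v in triangle]
    print(resolution, [(round(px), round(py)) for px, py in pixels])

# Same mesh, same camera, same projection; only the final viewport mapping
# (and the per-pixel fill/texture cost) changes with the output resolution.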

post #3389 of 3670
Quote:
Originally Posted by tgm1024 View Post

Game consoles, not games.  It is no harder to produce a game at 1080p than it is at 2160p because the output resolution is not a property of the game itself.  Games render on the fly, and most of the rendering issues are based entirely on the hardware support.  The game itself is written precisely the same.  And even if the software remains aware of the output resolution, a rippling surface of water is tesselated into triangles depending upon what the output abilities are.  Games have *always* been written to handle multiple output resolutions, because the meshes are resolution independent.  The only drawbacks are the resolution of the textures.
But meshes aren't resolution independent. If you have an object (such as a character) made out of a set number of polygons, on a higher resolution display it will still have that limited number of polygons, and the fact it is made up of a set number of polygons will probably be more obvious. Also, the games for a console will try to do as much as possible at a given frame rate/resolution before it takes too long to draw the frame. So the game will limit the number of polygons/objects (as well as, like you said, textures) on screen at a given time, or limit other things, to the capabilities of the console. So it isn't just the texture resolution that is the limitation, but other capabilities - how many polygons/objects can be drawn and other limitations of the console in terms of memory/processing speed.
Edited by Joe Bloggs - 8/29/13 at 4:29pm
post #3390 of 3670
Quote:
Originally Posted by Joe Bloggs View Post
 
Quote:
Originally Posted by tgm1024 View Post

Game consoles, not games.  It is no harder to produce a game at 1080p than it is at 2160p because the output resolution is not a property of the game itself.  Games render on the fly, and most of the rendering issues are based entirely on the hardware support.  The game itself is written precisely the same.  And even if the software remains aware of the output resolution, a rippling surface of water is tesselated into triangles depending upon what the output abilities are.  Games have *always* been written to handle multiple output resolutions, because the meshes are resolution independent.  The only drawbacks are the resolution of the textures.
But meshes aren't resolution independent. If you have an object (such as a character) made out of a small number of polygons, on a higher resolution display it will still have that limited number of polygons, and the fact it is made up of a set number of polygons will probably be more obvious. Also, the games for a console will try to do as much as possible at a given frame rate/resolution before it takes too long to draw the frame. So the game will limit the number of polygons/objects (as well as, like you said, textures) on screen at a given time to the capabilities of the console. So it isn't just the texture resolution that is the limitation, but other capabilities - how many polygons/objects can be drawn and other limitations of the console in terms of memory/processing speed.

 

In games where the fundamental mesh components are entirely spelled out and not (as in my example) made with complex objects broken into triangles pre-rendering (dependent upon what the hardware sees), you don't gain components by going up in resolution.  But that doesn't mean that meshes aren't resolution independent.  They are entirely independent of the number of pixels used to display them, because any one component is drawn with as many pixels as necessary to fill out its size in the screen coordinate system.

 

The bottom line is that you'll see plenty of games previously run at lower resolutions run just fine with spectacular results at 4K.  There will be no waiting for or lack of "4K games" as was suggested.
