
The Official Xbox One thread... - Page 270

post #8071 of 14765
Quote:
Originally Posted by mboojigga View Post

Campaign looks great from what I have seen. The multiplayer doesn't look as good.

They talked about "looks" regarding visuals; all the gameplay stuff they talked about was "not sure who it's for" and "linear scripted QTE." The only titles they keep raving about are Titanfall and Forza.
post #8072 of 14765
Quote:
Originally Posted by samendolaro View Post

I have an 80" Sharp and sit between 10 and 12 feet away.
I can tell if something is in 720p, but I can't tell if it is 720p upscaled to 1080p. As long as they don't leave it in native 720p it may be a bit softer, but overall it won't be noticeable.

It's being upscaled either way; you're just noticing the difference in quality between the console's scaler and your TV's. That you can notice a difference at all means you can resolve more than 720p from your distance.
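
If it helps picture it, here's a toy sketch of why scaler quality is visible at all. This is Python/NumPy and purely illustrative: the grayscale test frame and both filters are my own stand-ins, not what any console or TV actually implements.

Code:
import numpy as np

def upscale_nearest(frame, out_h, out_w):
    # Nearest neighbor: every output pixel copies the closest source pixel.
    # 1280x720 -> 1920x1080 is a non-integer 1.5x step, so every other
    # source row/column gets doubled -- that uneven duplication reads as
    # blockiness on hard edges.
    in_h, in_w = frame.shape
    ys = np.arange(out_h) * in_h // out_h
    xs = np.arange(out_w) * in_w // out_w
    return frame[ys][:, xs]

def upscale_bilinear(frame, out_h, out_w):
    # Bilinear: every output pixel blends the four nearest source pixels.
    # Softer, but the blend hides the 1.5x unevenness -- roughly the
    # baseline that better TV/console scalers improve on.
    in_h, in_w = frame.shape
    y = np.linspace(0, in_h - 1, out_h)
    x = np.linspace(0, in_w - 1, out_w)
    y0, x0 = np.floor(y).astype(int), np.floor(x).astype(int)
    y1, x1 = np.minimum(y0 + 1, in_h - 1), np.minimum(x0 + 1, in_w - 1)
    wy, wx = (y - y0)[:, None], (x - x0)[None, :]
    top = frame[y0][:, x0] * (1 - wx) + frame[y0][:, x1] * wx
    bot = frame[y1][:, x0] * (1 - wx) + frame[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# A 720p grayscale frame with one hard diagonal edge -- worst case for scaling.
ys, xs = np.mgrid[0:720, 0:1280]
frame = (xs > ys * 16 / 9).astype(np.float64)
blocky = upscale_nearest(frame, 1080, 1920)
smooth = upscale_bilinear(frame, 1080, 1920)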
post #8073 of 14765
Quote:
Originally Posted by darthrsg View Post

They talked about "looks" regarding visuals; all the gameplay stuff they talked about was "not sure who it's for" and "linear scripted QTE." The only titles they keep raving about are Titanfall and Forza.

See, I am excited for Titanfall, but that will increase once 2014 arrives for the spring release. Didn't the developers say this game isn't all QTE? Isn't the description pretty much the same as God of War, which is an epic linear scripted QTE?
post #8074 of 14765
Quote:
Originally Posted by bd2003 View Post

Those charts aren't really applicable to games, because film doesn't have any issues with aliasing, texture crawl, etc. Antialiasing isn't good enough to cover all those flaws completely. Even at your distance, a 1080p game will still look much better than a 720p game. It might not be sharper, but it'll be more stable and free of rendering and scaling artifacts.
Those charts are based on the capability of the human eye to resolve detail. They are applicable to any source material.
post #8075 of 14765
Quote:
Originally Posted by Kimeran View Post

But you have to have such a large monitor to get the benefit of 1080p.

Here is a link to what I am talking about:
http://www.rtings.com/info/television-size-to-distance-relationship

I only have a 55" display and it is 1080p, and I sit about 10 feet away (I may be a little further). So I am barely in the window to see any benefit of watching 1080p versus 720p. 4K is of basically no significance to me at all unless I get a TV that is a minimum of 80".

So count me as one for the crowd of wanting 720p with higher frame rate.
I've been saying some of this regarding the move to 4K. Most people already aren't resolving 1080p fully. Personally, I'm about 15' back from a 120" 1080p image, so I'm covered... but I still feel no need for 4K any time soon.

Still, none of this addresses the fact that the current gen can already do 720p/60 with reasonable fidelity. I'm all for piling on the pretty shaders and particle effects, but 1080p should really be the standard for this new generation. That said, there are always issues with launch titles, so I imagine it will all work out fairly quickly. And all the criticisms aside, Ryse does look insanely pretty in motion.
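
For anyone who wants to check the chart math themselves, here's a quick sketch. It's Python, and it assumes the usual 20/20 benchmark of roughly one arcminute of resolvable detail per pixel; the 55" example sizes are mine, not rtings'.

Code:
import math

def max_resolving_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
    # Farthest distance (feet) at which average 20/20 vision -- about one
    # arcminute of resolvable detail -- can still distinguish individual
    # pixels. Sit farther back and the extra pixels are wasted on acuity alone.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / horizontal_px
    return pixel_pitch_in / math.tan(math.radians(1 / 60)) / 12

for label, px in [("720p", 1280), ("1080p", 1920), ("4K", 3840)]:
    print(f'55" {label}: fully resolvable inside '
          f'~{max_resolving_distance_ft(55, px):.1f} ft')
# 55" 720p:  ~10.7 ft
# 55" 1080p: ~7.2 ft
# 55" 4K:    ~3.6 ft

At about 10 ft on a 55" set you land right around the 720p/1080p boundary, which is exactly the "barely in the window" situation described above.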
post #8076 of 14765
http://news.cnet.com/8301-10805_3-57603478-75/microsoft-plans-more-tv-shows-for-the-xbox-one/?part=rss&subj=news&tag=title

I love this capability and strategy with Xbox. It is the only feature outside of VR/OR compatibility that would persuade me to buy an Xbox One between months 1-6. I am one of those who think they got the TV integration part right but just miscommunicated it. In fact, they oversold it too much up front. But if they gave me the ability to purchase all the ESPN, Discovery, Weather and Science channels a la carte up front, I'm all in. Especially if they allowed me to also flush 100% of those worthless, biased, propaganda-driven cable news shows along with the archaic movie channels. If they extend this to an a la carte rollout of popular programs, I'll rethink my decision.
post #8077 of 14765
Quote:
Originally Posted by barrelbelly 
But if they gave me the ability to purchase all the ESPN, Discovery, Weather and Science channels a la carte up front, I'm all in. Especially if they allowed me to also flush 100% of those worthless, biased, propaganda-driven cable news shows along with the archaic movie channels. If they extend this to an a la carte rollout of popular programs, I'll rethink my decision.
I wouldn't hold your breath for this. If anything it seems like they've embraced the bundled strategy of cableco/satellite with their passthrough-overlay setup. At this point a la carte TV is still the unicorn of the entertainment industry.
post #8078 of 14765
Quote:
Originally Posted by bd2003 View Post

They'd kill that franchise if they ever dropped it to 30fps.

It'll be pretty hilarious if Titanfall does become the new king of shooters though. The same dev team using the same engine (by tyrantII's definition) will have dominated an entire genre, arguably the entire industry, for over a decade. That's insane.

Call of Duty hasn't been 60fps on console since MW2, and it's more popular than ever. And that's even WITH the shrinking view angle and software 'tricks' to exact more performance. The average Call of Duty player doesn't care about any of that. Of course, the average Call of Duty player isn't the same person as the average Call of Duty player from 4 years ago...
post #8079 of 14765
Quote:
Originally Posted by c.kingsley View Post

Those charts are based on the capability of the human eye to resolve detail. They are applicable to any source material.

I think there is quite a difference between movies and games. A movie isn't really rendering graphics at a particular resolution; it is being captured and displayed. Perhaps when some gamers downsample to 1080p from higher resolutions it is something similar, but even then real-life detail versus computer-rendered graphics don't really match up.
post #8080 of 14765
Kingsley and Anderson are right. It's not strictly about resolution. I owned the original Serious Sam and played that game to death on the PC at 1600x1200 at 60+ fps. It was one of the best LAN games ever. Pile in tons of humans (16? forgot) against a massive army of AI enemies. And it pales in comparison to the new Serious Sam HD, which runs at 720p but has far more complex models and much better textures. The lighting is better, the sound is better, and the graphics are hugely better. The only disappointing part is the smaller number of human players.

And then if you compare Serious Sam 3, which runs sub-720p, it still looks far superior to the original Serious Sam.

And once again, these are launch titles. Compare what was on the 360 and PS3 at launch versus a year-plus down the road, and it was night and day.
post #8081 of 14765
Quote:
Originally Posted by freemeat View Post

I think there is quite a difference between movies and games. A movie isn't really rendering graphics at a particular resolution; it is being captured and displayed. Perhaps when some gamers downsample to 1080p from higher resolutions it is something similar, but even then real-life detail versus computer-rendered graphics don't really match up.
The facts are the facts, this isn't a matter where opinions get to win. Resolvable resolution vs distance does not depend on the source material. Since this is the AVScience forum, if you'd like to not take my word for it, go ask it in the TV forums and run it past the collective intelligence of the ISF calibrators, TV engineers, etc.
post #8082 of 14765
Quote:
Originally Posted by bd2003 View Post

They'd kill that franchise if they ever dropped it to 30fps.

It'll be pretty hilarious if Titanfall does become the new king of shooters though. The same dev team using the same engine (by tyrantII's definition) will have dominated an entire genre, arguably the entire industry, for over a decade. That's insane.

Source is a bit more advanced (and versatile) than the COD engine, but it still has its limitations. My guess is they did so because it's familiar, and a DX11.1+ Source 2 should be out by the time they start the sequel, setting them up perfectly to make the switch.

Meanwhile Activision will still be piecemeal-patching that grandfathered engine and its revolutionary fish AI.

:)
post #8083 of 14765
Quote:
Originally Posted by c.kingsley View Post

The facts are the facts, this isn't a matter where opinions get to win. Resolvable resolution vs distance does not depend on the source material. Since this is the AVScience forum, if you'd like to not take my word for it, go ask it in the TV forums and run it past the collective intelligence of the ISF calibrators, TV engineers, etc.

I'm not arguing resolvable resolution vs. distance. My point was that, in motion, the way games are rendered and their image quality are quite a bit different from a movie's at the same final 1080p output. All 1080p images are not the same, and more importantly they are not the same in motion.
post #8084 of 14765
Quote:
Originally Posted by freemeat View Post

I'm not arguing resolvable resolution vs. distance. My point was that, in motion, the way games are rendered and their image quality are quite a bit different from a movie's at the same final 1080p output. All 1080p images are not the same, and more importantly they are not the same in motion.
That is because there are other factors, like texture resolution, which make up the image a game presents to the television. This is why two games of equal resolution can look vastly different from one another. It is also why if you compare a modern game like Crysis 3 at 1080p to an old game like HL2, WoW, etc., the old games won't look nearly as good. There are more variables in the perceived-realism equation than screen resolution. You could have 4K screen resolution with 8-bit textures and it would still look like an 8-bit game.
post #8085 of 14765
Quote:
Originally Posted by c.kingsley View Post

That is because there are other factors, like texture resolution, which make up the image a game presents to the television. This is why two games of equal resolution can look vastly different from one another. It is also why if you compare a modern game like Crysis 3 at 1080p to an old game like HL2, WoW, etc., the old games won't look nearly as good. There are more variables in the perceived-realism equation than screen resolution. You could have 4K screen resolution with 8-bit textures and it would still look like an 8-bit game.

Exactly, which is why I posted this yesterday ;) : "Resolution is just one factor. I really don't want devs locking themselves into 1080p no matter what. Going 720p or less means they can do so much more in other areas that affect graphics; polygons, textures, physics, number of items on screen, AI, etc. all take up resources."

That's also why filming real life is quite a bit different from 1080p rendering in a game. A movie at DVD resolution still has far more detail in its textures in motion than 1080p games. You simply aren't dealing with a fixed texture when recording real life: when you zoom the camera in, the "texture" of real life doesn't stay the same number of pixels as it would in a game (granted, many games swap in higher-res textures, but not infinitely).
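
A rough illustration of that last point, for what it's worth. This is Python using the standard log2 mip-selection rule; the texture size, mip count, and zoom values are made-up example numbers.

Code:
import math

def mip_level(texture_px, surface_screen_px, num_levels):
    # Standard mip selection: pick the level where one texel covers about
    # one screen pixel (level = log2 of texels-per-pixel). Zooming in
    # shrinks texels-per-pixel; once it hits 1 the renderer is magnifying
    # the finest level and no new detail appears -- unlike a film camera,
    # which keeps resolving more real-world detail as it zooms.
    texels_per_pixel = texture_px / surface_screen_px
    level = int(math.log2(max(texels_per_pixel, 1.0)))
    return min(level, num_levels - 1)

# A 2048-texel texture with 12 mip levels, viewed at increasing zoom:
for screen_px in (128, 512, 2048, 8192):
    print(screen_px, "px on screen -> mip level", mip_level(2048, screen_px, 12))
# 128 -> 4, 512 -> 2, 2048 -> 0, 8192 -> 0 (zoomed past native: detail tops out)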
post #8086 of 14765
Quote:
Originally Posted by freemeat View Post

Exactly, which is why I posted this yesterday ;) : "Resolution is just one factor. I really don't want devs locking themselves into 1080p no matter what. Going 720p or less means they can do so much more in other areas that affect graphics; polygons, textures, physics, number of items on screen, AI, etc. all take up resources."

That's also why filming real life is quite a bit different from 1080p rendering in a game. A movie at DVD resolution still has far more detail in its textures in motion than 1080p games. You simply aren't dealing with a fixed texture when recording real life: when you zoom the camera in, the "texture" of real life doesn't stay the same number of pixels as it would in a game (granted, many games swap in higher-res textures, but not infinitely).
I'm not sure what you're arguing about, then. I said that resolvable detail doesn't depend on the source material. You disputed that "because games and movies are different." Which is it? If you agree that resolvable detail (pixel resolution) doesn't depend on the source material, then I'm not sure why we're going back and forth.
post #8087 of 14765
Quote:
Originally Posted by c.kingsley View Post

I'm not sure what you're arguing about, then. I said that resolvable detail doesn't depend on the source material. You disputed that "because games and movies are different." Which is it? If you agree that resolvable detail (pixel resolution) doesn't depend on the source material, then I'm not sure why we're going back and forth.

It all goes back to the post from bd2003, which is what you were quoting: "Those charts aren't really applicable to games, because film doesn't have any issues with aliasing, texture crawl, etc." Video game rendering simply doesn't match real life and how real life is captured for viewing and displayed in motion. We'll just have to agree to disagree.
post #8088 of 14765
Quote:
Originally Posted by freemeat View Post

It all goes back to the post from bd2003, which is what you were quoting: "Those charts aren't really applicable to games, because film doesn't have any issues with aliasing, texture crawl, etc." Video game rendering simply doesn't match real life and how real life is captured for viewing and displayed in motion. We'll just have to agree to disagree.
Yes, and bd2003 said something that was not true. Those charts are applicable regardless of source material. This isn't an agree-to-disagree situation; it is a simple fact.
post #8089 of 14765
Quote:
Originally Posted by c.kingsley View Post

Those charts are based on the capability of the human eye to resolve detail. They are applicable to any source material.

Sure, but the problem is that games tend to have "details" that make them look worse than other video content. Increasing the resolution beyond what that chart says you can resolve makes those anomalies less noticeable.

It's almost impossible to describe by words alone, but visually side by side you'd have no problem discerning the difference between 720p and 1080p well beyond what that chart would suggest.
post #8090 of 14765
Quote:
Originally Posted by bd2003 View Post

It's almost impossible to describe by words alone, but visually side by side you'd have no problem discerning the difference between 720p and 1080p well beyond what that chart would suggest.
Maybe if you're some kind of 2 sigma outlier. Those charts deal with resolvable pixel resolution based on average 20/20 vision and that does not depend on the source material.
post #8091 of 14765
Do you guys remember way back at the original Xbox One reveal event, when they were taking questions afterwards, and one guy asked Phil Harrison if cord cutters would be able to use the XB1 features for TV watching? I remember Phil Harrison saying something along the lines that they would have a solution for everybody eventually. He said something about an OTA TV tuner box that had HDMI out working just fine with the XB1.

Now that the system is like 65 days away or whatever, has anybody heard any more talk about an OTA TV tuner with HDMI out and the XB1?

Or are cord cutters screwed when it comes to using the fancy TV options of the XB1?
post #8092 of 14765
Quote:
Originally Posted by c.kingsley View Post

Maybe if you're some kind of 2 sigma outlier. Those charts deal with resolvable pixel resolution based on average 20/20 vision and that does not depend on the source material.

Dunno if it's even worth trying to explain what I mean if it's that black and white to you.
post #8093 of 14765
Quote:
Originally Posted by Anthony1 View Post

Do you guys remember way back at the original Xbox One reveal event, when they were taking questions afterwards, and one guy asked Phil Harrison if cord cutters would be able to use the XB1 features for TV watching? I remember Phil Harrison saying something along the lines that they would have a solution for everybody eventually. He said something about an OTA TV tuner box that had HDMI out working just fine with the XB1.

Now that the system is like 65 days away or whatever, has anybody heard any more talk about an OTA TV tuner with HDMI out and the XB1?

Or are cord cutters screwed when it comes to using the fancy TV options of the XB1?
I have not, but I don't see any reason why you couldn't use a standard OTA tuner with HDMI in theory. You'll probably enter your zip code and it'll merge the relevant guide data, which is how it works in WMC now.
post #8094 of 14765
Quote:
Originally Posted by bd2003 View Post

Dunno if it's even worth trying to explain what I mean if it's that black and white to you.
I think you're the one in need of an explanation because you think a basic fact of home theater science is open to debate. There are countless threads on these forums about screen resolution and seating distance, I suggest you go search them and find the answers you still need because I'm not going to derail this thread any further on a topic which has been explained to death already.
post #8095 of 14765
Quote:
Originally Posted by c.kingsley View Post

I think you're the one in need of an explanation because you think a basic fact of home theater science is open to debate. There are countless threads on these forums about screen resolution and seating distance, I suggest you go search them and find the answers you still need.

Well, you conveniently ignored the important part, missing the point entirely.
post #8096 of 14765
The most noticeable difference between 720p and 1080p from a distance is aliasing. Take TLOU on PS3, for example, a beautiful game running at 720p. I'm playing it from 10 ft away on a 55" TV, and some of the buildings have jagged edges that are noticeable to anyone even 12-15 ft away. If the game were rendered in 1080p, edges would be more precise and therefore not jagged.

Also, I think one poster made a great point earlier. If the camera zooms in on a real-life image, detail becomes much greater. If you zoom in on a video game, it exposes how well or poorly the textures were rendered.

You can argue the distance-to-resolution chart all you want, but if devs can't properly dispose of aliasing in a 720p video game, then it throws that argument out the window.
post #8097 of 14765
Quote:
Originally Posted by Captain Gregg View Post

The most noticeable difference between 720p and 1080p from a distance is aliasing. Take TLOU on PS3, for example, a beautiful game running at 720p. I'm playing it from 10 ft away on a 55" TV, and some of the buildings have jagged edges that are noticeable to anyone even 12-15 ft away. If the game were rendered in 1080p, edges would be more precise and therefore not jagged.
To be precise: they would appear less jagged at 1080p than 720p. Alternatively, they could use anti-aliasing to solve this problem more effectively than throwing more resolution at the problem. At this stage in the game nothing should have "jaggies" on it. I haven't played a PC game on moderate hardware with less than 4x AA in almost 10 years.
post #8098 of 14765
Quote:
Originally Posted by c.kingsley View Post

To be precise: they would appear less jagged at 1080p than 720p. Alternatively, they could use anti-aliasing to solve this problem more effectively than throwing more resolution at the problem. At this stage in the game nothing should have "jaggies" on it. I haven't played a PC game on moderate hardware with less than 4x AA in almost 10 years.

OK, well someone was trying to argue there is no difference between 1080p film and a 1080p video game, which is false. Real life doesn't have aliasing. Aliasing is noticeable on a 55" screen from 12-15 ft away. Therefore the resolution-to-viewing-distance chart doesn't apply as well to video games. Does it?
post #8099 of 14765
I think I understand where BD is coming from.

In film your image is being captured from a solid source of much higher resolution. Let's just say there are no pixels in real life... so in effect any video is downsampled.

However, in a game nothing has been captured. The game is generating pixels to make the image. In some cases the game needs to be upsampled for a better picture... therefore you are saying that the higher the native resolution, the better, so other effects can come into play?

Do I have that right?

Anyway, I still don't think games need to be displayed at 1080p if it results in too much of a decrease in frame rate. What I think is more important is for the video to have better anti-aliasing and such for smoother and better images... basically I am saying they should focus more on all the other areas of graphics rather than simply getting to that 1080p mark, as I see them as more important.

1080p is still a good target, as I am sure many will benefit from it. How many have big enough displays or sit close enough, though, I honestly don't know.
post #8100 of 14765
Quote:
Originally Posted by c.kingsley View Post

To be precise: they would appear less jagged at 1080p than 720p. Alternatively, they could use anti-aliasing to solve this problem more effectively than throwing more resolution at the problem. At this stage in the game nothing should have "jaggies" on it. I haven't played a PC game on moderate hardware with less than 4x AA in almost 10 years.

There's no form of AA out there that can completely get the job done. The MSAA you're used to using certainly can't. There's always SSAA, but it doesn't make sense to use that unless you're already using every pixel.
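
To make the SSAA trade-off concrete, here's a toy sketch in Python/NumPy. The "renderer" is just a hard diagonal edge and the 2x factor is arbitrary, but it shows both why supersampling smooths edges and why it's so expensive.

Code:
import numpy as np

def render_edge(h, w):
    # Stand-in renderer: one sample per pixel of a hard diagonal edge.
    # With no AA, the stair-stepping lands directly in the output.
    ys, xs = np.mgrid[0:h, 0:w]
    return (xs > ys * 1.5).astype(np.float64)

def ssaa(h, w, factor=2):
    # Supersampling AA: render at factor x the target resolution in each
    # dimension, then box-filter down. Edge pixels average factor**2
    # samples and take fractional values instead of jumping 0 -> 1 --
    # which is also why it costs factor**2 as much fill rate.
    hi = render_edge(h * factor, w * factor)
    return hi.reshape(h, factor, w, factor).mean(axis=(1, 3))

aliased = render_edge(720, 1280)
smoothed = ssaa(720, 1280, factor=2)  # 4x the rendering cost of aliased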