
Xbox getting the shaft on FPS - Page 4

post #91 of 647
This is an endless, pointless argument that goes on every generation. If you don't care about exclusives and you don't have a bunch of friends that play on LIVE, get the PS4.

I like exclusives on both of them so I will eventually get a PS4 as well. I bought the Xbox One first because I have friends on LIVE and Titanfall looks cool as hell. I also prefer the Xbox controller for shooters due to the analog sticks being offset.

I have no interest in Second Son so there isn't anything coming out on PS4 that I want for a while.
post #92 of 647
Quote:
Originally Posted by blklightning View Post

It's not getting the shaft; the Xbox is lacking quite a bit in power compared to the PS4. It only gets more obvious from here. Imagine how much better the graphics on the Xbox could have been had MS sunk that hundred bucks into the processor and GPU instead of a stupid camera that no one wants. Had they done that, I'd own one right now.

I love this argument, I really do, because it shows a clear lack of information and knowledge about how hardware actually works. Hardware-wise, the systems are 99% identical.

You know why the PS4 has a slight edge in visuals and framerate, but takes longer to load and is a bit slower at everything else? Because ALL of its RAM is VRAM: GDDR5, the kind of memory that's better suited to graphical processing. It's why PCs still use normal RAM alongside VRAM. This is also why the XBO has the edge in multitasking functionality; it's using the right kind of RAM for it.
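For what it's worth, here's a back-of-the-envelope comparison using the widely reported memory specs. These are press-coverage numbers, not anything official, so treat the output as ballpark:

Code:
# Theoretical peak bandwidth: bytes per transfer x transfers per second.
# Figures are the commonly reported specs, not official measurements.

def peak_bandwidth_gbs(bus_width_bits, effective_gtps):
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    return (bus_width_bits / 8) * effective_gtps

ps4_gddr5 = peak_bandwidth_gbs(256, 5.5)    # 256-bit GDDR5 @ 5.5 GT/s -> 176 GB/s
xb1_ddr3  = peak_bandwidth_gbs(256, 2.133)  # 256-bit DDR3-2133 -> ~68 GB/s
xb1_esram = 102.0                           # reported ESRAM figure, but only 32 MB of it

print(f"PS4 unified GDDR5: {ps4_gddr5:.0f} GB/s")
print(f"XB1 DDR3 pool:     {xb1_ddr3:.0f} GB/s (+ ~{xb1_esram:.0f} GB/s ESRAM scratchpad)")

That's the shape of the trade-off: one big fast pool that's simple to feed the GPU from, versus a big slow pool plus a tiny fast one you have to manage by hand.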

And clearly, you also do not own an XBO, or you wouldn't be complaining about the Kinect. Yeah, the Kinect on the 360 was horrid, but the 2.0's functionality, ease of use, and accuracy are pretty dang good. I'll take that functionality over a $100 cheaper system with nothing I want to play.

Even still, neither system holds a candle to my PC when it comes to performance, so why do I even have an XBO? 3 exclusive series, that's it.

Again, fanboys on both sides need to stfu about specs when the majority of them clearly have no idea how the hardware even works. A bigger number doesn't always mean better...
post #93 of 647
Correct me if I'm wrong here buuuuuut these two systems are basically assembled from the same bin of parts, aren't they??? With marginal "on paper" spec differences, but otherwise.....I thought they were on the path to being PC clones of one another, just hammered with different OS kernels. Sure, that's the 50k-foot look, but......when it boils down to it....these could be the two most identical systems ever released. It's not like anyone ever confused the output of the Ataris or Intellivisions or Colecovisions, or even the Genesis and SNES......the Xbox and PS2 began to look the same until, towards the end, the Xbox emerged as slightly better at widescreen/480p gaming than the PS2 was ready for.

But the PS3 and 360 were two systems grown on separate worlds...and the games wound up looking exactly the same. Exactly.

I love it when reviewers have to hype something up to get readers, so they'll blow up some paper number like..."The Xbox One has 3.2 million JigglyBits, and the PS4 has 3.8 million JigglyBits, and that means a significant advantage for PS4!!!" (or Xbox, depending on the site)......but in the real world that on-paper difference might translate to: the game loads 4 seconds faster on one system.....or the game has a 5 fps advantage on one over the other, etc. If it turns out these systems are NOT equal as time shakes them both down......I'll consider myself schooled, and be very interested to read up on whatever bottleneck was created to limit the power of one over the other.

But right now there's no reason to think that Next-Gen won't be just like Last-Gen......two systems producing games that look exactly the same.....
post #94 of 647
Quote:
Originally Posted by HeadRusch View Post

Correct me if I'm wrong here buuuuuut these two systems are basically assembled from the same bin of parts, aren't they??? With marginal "on paper" spec differences, but otherwise.....I thought they were on the path to being PC clones of one another, just hammered with different OS kernels. Sure, that's the 50k-foot look, but......when it boils down to it....these could be the two most identical systems ever released. It's not like anyone ever confused the output of the Ataris or Intellivisions or Colecovisions, or even the Genesis and SNES......the Xbox and PS2 began to look the same until, towards the end, the Xbox emerged as slightly better at widescreen/480p gaming than the PS2 was ready for.

But the PS3 and 360 were two systems grown on separate worlds...and the games wound up looking exactly the same. Exactly.

I love it when reviewers have to hype something up to get readers, so they'll blow up some paper number like..."The Xbox One has 3.2 million JigglyBits, and the PS4 has 3.8 million JigglyBits, and that means a significant advantage for PS4!!!" (or Xbox, depending on the site)......but in the real world that on-paper difference might translate to: the game loads 4 seconds faster on one system.....or the game has a 5 fps advantage on one over the other, etc. If it turns out these systems are NOT equal as time shakes them both down......I'll consider myself schooled, and be very interested to read up on whatever bottleneck was created to limit the power of one over the other.

But right now there's no reason to think that Next-Gen won't be just like Last-Gen......two systems producing games that look exactly the same.....

We already know the reason the XBOne is less powerful than the PS4: because of the on-die SRAM, they had to use a smaller GPU core. The SRAM takes up a large amount of die space. Without it there could have been room for a larger GPU, but it would have also needed to be a completely different design in that case. Sony took a big gamble with the GDDR5 and it paid off. At first they were only going to have 4GB of memory, but in the end they were able to ship 8GB, which really helped them and hurt Microsoft, since it brought the PS4's memory capacity up to par with the XBOne's.
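To put rough numbers on the GPU gap (again, these are the commonly reported CU counts and clocks, so ballpark only):

Code:
# GCN GPUs do 2 FLOPs per shader ALU per clock (fused multiply-add),
# with 64 ALUs per compute unit. CU counts/clocks are the reported figures.

def tflops(compute_units, clock_ghz, alus_per_cu=64):
    return compute_units * alus_per_cu * 2 * clock_ghz / 1000.0

ps4 = tflops(18, 0.800)   # 18 CUs @ 800 MHz -> ~1.84 TFLOPS
xb1 = tflops(12, 0.853)   # 12 CUs @ 853 MHz -> ~1.31 TFLOPS

print(f"PS4 ~{ps4:.2f} TFLOPS, XB1 ~{xb1:.2f} TFLOPS "
      f"(~{(ps4 / xb1 - 1) * 100:.0f}% raw compute gap)")

That ~40% raw compute gap is the number that keeps popping up in these comparisons.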

This link shows some good info about the XBOne and PS4 dies.

http://www.extremetech.com/gaming/171735-xbox-one-apu-reverse-engineered-reveals-sram-as-the-reason-for-small-gpu
post #95 of 647
I'm thinking specifically for Tomb Raider it's coming down to the number of GPU cores. TressFX runs on those GPU cores, and it doesn't look like there's any difference in that effect on either platform. So the hair physics is swallowing up a larger proportion of the GPU resources on the Xbox, leaving less for the rest of the graphics. Since they didn't reduce resolution to compensate, frame rate is left to take the hit.
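As a toy illustration of the "fixed cost eats a bigger slice of a smaller GPU" point (made-up milliseconds, purely to show the shape of the problem):

Code:
# Toy model: if the hair sim is a fixed amount of GPU work, a slower GPU
# spends a larger share of its 33.3 ms frame budget (30 fps) on it.

FRAME_BUDGET_MS = 1000 / 30   # ~33.3 ms per frame at 30 fps

def leftover_ms(hair_ms, gpu_scale):
    """gpu_scale is relative GPU throughput (1.0 = baseline)."""
    return FRAME_BUDGET_MS - hair_ms / gpu_scale   # same work takes longer on a slower GPU

# Hypothetical: hair costs 5 ms on the faster GPU; the other has ~0.71x its compute.
print(f"Faster GPU: {leftover_ms(5.0, 1.00):.1f} ms left for everything else")
print(f"Slower GPU: {leftover_ms(5.0, 0.71):.1f} ms left for everything else")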
post #96 of 647
Quote:
Originally Posted by bd2003 View Post

I'm thinking specifically for Tomb Raider it's coming down to the number of GPU cores. TressFX runs on those GPU cores, and it doesn't look like there's any difference in that effect on either platform. So the hair physics is swallowing up a larger proportion of the GPU resources on the Xbox, leaving less for the rest of the graphics. Since they didn't reduce resolution to compensate, frame rate is left to take the hit.
It is hard to say what really happened considering that the ports were done by two totally different developers. The PS4 has better specs on paper but there is no reason that both of these systems shouldn't be able to hit 60fps locked. The reason neither can do it is because we're simply too early in the cycle. Any performance delta between these two games by different developers does not allow any rational observer to draw any conclusion whatsoever. Anyway, back to my PC games until these systems percolate for a few more months.
post #97 of 647
Quote:
Originally Posted by c.kingsley View Post

It is hard to say what really happened considering that the ports were done by two totally different developers. The PS4 has better specs on paper but there is no reason that both of these systems shouldn't be able to hit 60fps locked. The reason neither can do it is because we're simply too early in the cycle. Any performance delta between these two games by different developers does not allow any rational observer to draw any conclusion whatsoever. Anyway, back to my PC games until these systems percolate for a few more months.

People forget how demanding Tomb Raider was on PCs, especially with TressFX enabled... even in mid-2013, a lot of PC gamers just left it turned off because of the performance hit, especially if you had an Nvidia card (which is the majority of gamers). Enabling tessellation also caused a big hit on performance. The patch helped but it still pushed a lot of PCs hard.

And talking specs is kind of ridiculous, because a lot of people don't really understand the specs and are resorting to regurgitating false arguments pushed out by people looking to make some things look favorable and other things look problematic, depending on their platform's strengths and weaknesses.

If you really wanted top-of-the-line graphics, you wouldn't get a PS4 or Xbox One, you would get a PC. You get one of the consoles for the games, ease of use, the social networks, etc. PC gaming has come a long way, but it's still more solitary than the Xbox. On the Xbox, well over half of the fanbase chats and communicates. PC gamers use voice chat more often now than back in the day, but it's still a small proportion; most PC gamers are fine chatting in text boxes. In a lot of games they might as well be playing against bots, because if you never hear from or chat with the person on the other side, the only way you know they're human is that the game doesn't have bots. The most social game on PC is still WoW.
post #98 of 647
Quote:
Originally Posted by DaGamePimp View Post

I don't know about that, 'graphically' speaking RYSE is about the most impressive game I have played on either console (and I have 10+ games between them already).

Killzone is amazing as well and plays much better than RYSE but I'll give the edge to RYSE on visuals alone (it is Crytek after all).

Jason

I actually think NBA 2K14 is the best-looking next-gen game I've played, but Ryse is a close second. I played the PS4 version of NBA 2K14, but I've heard both versions are mostly identical with some minor differences. Killzone can be stunning at times as well.

XB1 might be a bit underpowered, but if 2K on the Xbox One is as good as the PS4 version, and Ryse looks the way it looks, then we know the XB1 is capable of some very nice visuals. It might take more talented programmers to get there, but things could be worse.
post #99 of 647
I have 2K14 on PS4 and I would still say RYSE looks better overall.

The only really impressive visuals in 2K14 are the characters in the real-time cinematics; great presentation, though.

During 2K14 gameplay it's an obvious jump over last gen but nothing compared to RYSE or KZ: SF (during gameplay).

Just my opinion. ;)

Jason
post #100 of 647
Quote:
Originally Posted by c.kingsley View Post

It is hard to say what really happened considering that the ports were done by two totally different developers. The PS4 has better specs on paper but there is no reason that both of these systems shouldn't be able to hit 60fps locked. The reason neither can do it is because we're simply too early in the cycle. Any performance delta between these two games by different developers does not allow any rational observer to draw any conclusion whatsoever. Anyway, back to my PC games until these systems percolate for a few more months.

No way to know 100% for sure, but it shouldn't be controversial to suggest when it comes down to raw computation of physics, more cores = better. You're certainly right that there's no reason either shouldn't be able to hit 60 if they took the time to profile performance and set the graphics accordingly - if the x1 couldn't handle the same level of quality, whether it's hair, shaders or whatever...they should have rolled it back until it could.

I'm sure Digital Foundry will have a painstakingly detailed analysis in a few days. If they wanted to get to the bottom of it, they could profile the PC version to see where TressFX had the largest hit... IIRC, the closer Lara's head was to the camera, and the more motion in the hair, the worse it was. So if the X1 takes a bigger hit in those situations, you could make a pretty good case her fancy-looking hair is the problem.
post #101 of 647
This is why I haven't bought an Xbox One yet. I keep hearing all this negativity, and seeing that the PS4 is outperforming it makes me just want to keep my PS4 and that's it. Hopefully these things can change for the One in the near future.
post #102 of 647
Quote:
Originally Posted by bd2003 View Post

No way to know 100% for sure, but it shouldn't be controversial to suggest when it comes down to raw computation of physics, more cores = better. You're certainly right that there's no reason either shouldn't be able to hit 60 if they took the time to profile performance and set the graphics accordingly - if the x1 couldn't handle the same level of quality, whether it's hair, shaders or whatever...they should have rolled it back until it could.

I'm sure Digital Foundry will have a painstakingly detailed analysis in a few days. If they wanted to get to the bottom of it, they could profile the PC version to see where TressFX had the largest hit... IIRC, the closer Lara's head was to the camera, and the more motion in the hair, the worse it was. So if the X1 takes a bigger hit in those situations, you could make a pretty good case her fancy-looking hair is the problem.
I don't think there is any reason both machines shouldn't be able to play the game at 1080p/60. The fact that they don't speaks to the development houses more than the capability of the PS4/XB1. And therein lies my point: the games had two different developers. I'm fairly certain that a year to 18 months from now, as development tools and experience improve, we're going to look back on these discussions and have a good laugh. There is quite a bit more juice in both of these consoles; it's just going to take devs time to find it and take advantage of it. Remember, they can't do that on PC because they're coding to a rainbow of platforms. So PC games improve through brute force...
post #103 of 647
Quote:
Originally Posted by c.kingsley View Post

I don't think there is any reason both machines shouldn't be able to play the game at 1080p/60. The fact that they don't speaks to the development houses more than the capability of the PS4/XB1. And therein lies my point: the games had two different developers. I'm fairly certain that a year to 18 months from now, as development tools and experience improve, we're going to look back on these discussions and have a good laugh. There is quite a bit more juice in both of these consoles; it's just going to take devs time to find it and take advantage of it. Remember, they can't do that on PC because they're coding to a rainbow of platforms. So PC games improve through brute force...

I'm not sure how correct it is, but I've read some things that suggest the bottleneck between the ESRAM and the rest of the system is the big reason for all the issues. The ESRAM is very fast, but you can only queue up so much data to it before it's just waiting on the DDR3. It's also partly the reason MS went with the GPU they did: anything bigger wouldn't have added any performance gains (the GPU would just sit idle waiting for data to get from DDR3 to the ESRAM). One guy's conclusion (he works on engine coding and AA implementation) is that you're going to see a lot of devs having to choose between 1080p or 60fps, not both, especially on deferred rendering engines, which are becoming more popular as the gen goes on.

The API is going to get much better, but some are claiming 32MB of ESRAM is just too small to run 1080p/60fps with high-resolution textures. I'm not sure I believe that, but what we've seen so far does suggest there will be hurdles to get over that can be pinned directly on the hardware choices.
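The 32MB claim is easy to sanity-check, at least in rough form: a deferred renderer keeps several full-resolution render targets alive at once, and at 1080p they add up fast. The target count and format below are typical of deferred engines, not from any specific game:

Code:
# One 1080p render target at 4 bytes per pixel (e.g. RGBA8).
bytes_per_target = 1920 * 1080 * 4                      # ~7.9 MB each

# A typical deferred G-buffer: 4 color targets plus a 4-byte depth/stencil buffer.
gbuffer_mb = (4 + 1) * bytes_per_target / (1024 ** 2)

print(f"One 1080p RGBA8 target: {bytes_per_target / 1024 ** 2:.1f} MB")
print(f"4 G-buffer targets + depth: {gbuffer_mb:.1f} MB, vs 32 MB of ESRAM")

So a fat 1080p G-buffer genuinely doesn't fit; devs have to shrink targets, tile the work, or spill some of it to DDR3, all of which costs quality or time.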

DF's article should be out later today, and if the XB1 version of TR:DE is mostly hovering around 45 rather than 35, that's going to be good news and a testament to CD and their sister studios.
post #104 of 647
Quote:
Originally Posted by Artman22 View Post

This is why I haven't bought an Xbox One yet. I keep hearing all this negativity, and seeing that the PS4 is outperforming it makes me just want to keep my PS4 and that's it. Hopefully these things can change for the One in the near future.

You buy an Xbox for the exclusives, same as the PS4 (assuming either has exclusives that interest you). Multi-plat games then become an issue of where all your friends play, assuming you have some to play with. I just bought an Xbox knowing it's not as powerful as the PS4, but I did it because I want what it offers. I will buy a PS4 for inFamous once it releases. Exclusives sell the consoles to me.
post #105 of 647
Quote:
Originally Posted by pcweber111 View Post

You buy an Xbox for the exclusives, same as the PS4 (assuming either has exclusives that interest you). Multi-plat games then become an issue of where all your friends play, assuming you have some to play with. I just bought an Xbox knowing it's not as powerful as the PS4, but I did it because I want what it offers. I will buy a PS4 for inFamous once it releases. Exclusives sell the consoles to me.

Exactly, it is similar to the Wii and Wii U. From a gaming aspect you don't expect them to have the best multiplat games, but you buy them for their first-party offerings and exclusives. For example, if you have to have the next Gears of War or Halo, those will be on the Xbox One only.
post #106 of 647
Quote:
Originally Posted by pcweber111 View Post

You buy an Xbox for the exclusives, same as the PS4 (assuming either has exclusives that interest you). Multi-plat games then become an issue of where all your friends play, assuming you have some to play with. I just bought an Xbox knowing it's not as powerful as the PS4, but I did it because I want what it offers. I will buy a PS4 for inFamous once it releases. Exclusives sell the consoles to me.
That's the same for me. I used to play all multiplatform games on my 360 simply because of the controller and the party system, so that I could talk to friends even when I was playing a single-player game. The only downside to that is they don't shut up when you are trying to hear what's going on in a game, so I would end up turning the chat volume all the way down and would forget they were even there.
post #107 of 647
Quote:
Originally Posted by Artman22 View Post

This is why I haven't bought an Xbox One yet. I keep hearing all this negativity, and seeing that the PS4 is outperforming it makes me just want to keep my PS4 and that's it. Hopefully these things can change for the One in the near future.

Well if you are thinking about Tomb Raider, here is the breakdown:

http://www.eurogamer.net/articles/digitalfoundry-2014-tomb-raider-definitive-performance-analysis

Quote:
GAMEPLAY
Xbox One
Lowest FPS: 18fps
Highest FPS: 30fps
Average FPS: 29.84fps

PS4
Lowest FPS: 33fps
Highest FPS: 60fps
Average FPS: 50.98fps

Across the whole selection of clips, the effect of the Xbox One frame-rate cap is dramatic - with a 78 per cent frame throughput increase on the Sony hardware, where the engine is allowed to display a newly generated frame as soon as it is ready rather than waiting for the next 33.33ms refresh as is the case on Xbox One.
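Worth noting how those numbers relate: the 78 per cent figure is DF's frame-throughput count across all their clips, which isn't quite the same thing as comparing the published averages. A quick check on the table above:

Code:
# DF's published numbers from the gameplay clips.
xb1_min, xb1_avg = 18.0, 29.84
ps4_min, ps4_avg = 33.0, 50.98

print(f"Average FPS gap: {(ps4_avg / xb1_avg - 1) * 100:.0f}%")   # ~71% from averages alone
print(f"Minimum FPS gap: {(ps4_min / xb1_min - 1) * 100:.0f}%")   # ~83% at the worst case

Either way you slice it, the gap is large.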

Edited by freemeat - 1/27/14 at 12:29pm
post #108 of 647
Holy crap, that is a huge difference. I may just hold off on getting TR, grab a PS4 in February, and get it then.
post #109 of 647
It looks like the framerate fluctuates wildly on the PS4 version. I would think you'd get a better, more consistent result had they locked it down at something like 45 fps.
post #110 of 647
Quote:
Originally Posted by tsaville View Post

It looks like the framerate fluctuates wildly on the PS4 version. I would think you'd get a better, more consistent result had they locked it down at something like 45 fps.

 

A locked 45fps would look horrible on a 60Hz TV; there would be constant judder.

Those fluctuations likely aren't as noticeable in gameplay as they are on the graph, and the majority of the game isn't as intensive as the segments they chose to analyze. I'd still have turned the settings down if I had a choice though. :P

post #111 of 647
Quote:
Originally Posted by tsaville View Post

It looks like the framerate fluctuates wildly on the PS4 version. I would think you'd get a better, more consistent result had they locked it down at something like 45 fps.

It sounds like I might prefer the XBOne version. I would rather have framerates that are mostly consistent between 24 and 30, even if lower, than have them vary wildly between 33 and 60. I would think those bigger swings would be more noticeable and jarring. I guess I would need to try them both out first to see exactly how it looks to me. Locking it at 45 would have made it more consistent, but then wouldn't that cause other issues?
post #112 of 647
Quote:
Originally Posted by bd2003 View Post

A locked 45fps would look horrible on a 60Hz TV; there would be constant judder.

Those fluctuations likely aren't as noticeable in gameplay as they are on the graph, and the majority of the game isn't as intensive as the segments they chose to analyze. I'd still have turned the settings down if I had a choice though. :P

That makes sense. Honestly I didn't see a huge difference in the side by side videos, but the quality of the video wasn't great.
post #113 of 647
Quote:
Originally Posted by tsaville View Post

It looks like the framerate fluctuates wildly on the PS4 version. I would think you'd get a better, more consistent result had they locked it down at something like 45 fps.

Kinda. The low is pretty bad, but an average close to 51 means it's not frequent, and most of the time it's got to be hovering 45-60. Capping it for the exception would leave a lot of performance on the shelf, especially since judder is much more jarring at lower frame rates. Drops from 60 to 45 during play aren't going to be game changers.

The XB1 version looks like its average is actually being undercut by the cap (it probably regularly goes into the 40s), but because judder is worse with swings around the 30s (lower frame rates), they did this to offer the best presentation.

So the 78% peak difference isn't really correct. That said, the pesky 40% number popped up again when comparing the min frame rates.

It'd be interesting to see these tests rerun with a version that disables TressFX, especially the min values, since even a good PC rig would see huge FPS drops if the camera swung too close to Lara's hair.
Quote:
Originally Posted by aaronwt View Post

It sounds like I might prefer the XBOne version. I would rather have framerates that are mostly consistent between 24 and 30, even if lower, than have them vary wildly between 33 and 60. I would think those bigger swings would be more noticeable and jarring. I guess I would need to try them both out first to see exactly how it looks to me. Locking it at 45 would have made it more consistent, but then wouldn't that cause other issues?
Quote:
Originally Posted by tsaville View Post

That makes sense. Honestly I didn't see a huge difference in the side by side videos, but the quality of the video wasn't great.

BD's got it right. You'd be trading much better input response for slightly less judder. Both versions have the same IQ otherwise; the only differences are the framerate and the judder it causes.
Edited by TyrantII - 1/27/14 at 11:14am
post #114 of 647
Quote:
Originally Posted by tsaville View Post

That makes sense. Honestly I didn't see a huge difference in the side by side videos, but the quality of the video wasn't great.

Of course you didn't see a difference, the video was running at 30fps!
post #115 of 647
Quote:
Originally Posted by bd2003 View Post

Of course you didn't see a difference, the video was running at 30fps!

LOL! I think I need to stay out of these FPS discussions; clearly not my area of expertise. :D
post #116 of 647
Quote:
Originally Posted by tsaville View Post

LOL! I think I need to stay out of these FPS discussions; clearly not my area of expertise. :D

You bring up a good problem though! YouTube isn't up to the task of showing off next-gen previews, let alone the crappy Flash players on most gaming websites. Developers and publishers really need to start pushing out files and asking that they be shown in the best quality, so users can see what they're really getting with games like Ryse, Killzone, and TR.

Right now it's like marketing HD Blu-rays by giving out DVD media; or worse… on VHS.

http://www.gamersyde.com is good in a pinch, but you still have to download their uncompressed vids (and they rely on the publishers to send them footage).
post #117 of 647
Quote:
Originally Posted by aaronwt View Post

It sounds like I might prefer the XBOne version. I would rather have framerates that are mostly consistent between 24 and 30, even if lower, than have them vary wildly between 33 and 60. I would think those bigger swings would be more noticeable and jarring. I guess I would need to try them both out first to see exactly how it looks to me. Locking it at 45 would have made it more consistent, but then wouldn't that cause other issues?

A locked framerate is only a good thing if the refresh rate is an even multiple of it, which is why everything is locked to 30 on 60Hz sets. Generally, the higher your refresh rate, the less fluctuations matter. 60 fluctuating to 50 is a lot less jarring than 30 fluctuating to 24. It's also considerably less noticeable if there's heavy action on screen; judder caused by fluctuations is most noticeable during something like a smooth, consistent camera pan.
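Here's a minimal sketch of why, simulating idealized v-synced frame pacing (this assumes each frame is simply held for a whole number of refreshes):

Code:
# With v-sync on a 60Hz set, each frame persists for a whole number of
# 16.7 ms refreshes. 60/30 = 2 exactly, so every frame holds for 2 refreshes
# (smooth). 60/45 = 1.33, so frames alternate between 1 and 2 refreshes,
# and that uneven persistence is what reads as judder.

def cadence(fps, hz=60, frames=8):
    refreshes, shown = [], 0
    for i in range(1, frames + 1):
        target = round(i * hz / fps)   # refresh on which frame i ideally flips
        refreshes.append(target - shown)
        shown = target
    return refreshes

print("30 fps:", cadence(30))   # [2, 2, 2, 2, 2, 2, 2, 2] -> even persistence
print("45 fps:", cadence(45))   # [1, 2, 1, 1, 2, 1, 1, 2] -> uneven, judders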

Don't let the fluctuations fool you, most "locked" 60fps games have a similar level of flux, including CoD, which is known for its smooth frame rate. If you've played MW3 or Black Ops 2 on 360, this should offer a similar level of performance. I personally feel like DF is trying to throw the X1 a bone so as not to enrage any of their diehard MS readership, but side by side there's no question the PS4 version will look much nicer.
post #118 of 647
Quote:
Originally Posted by tsaville View Post

It looks like the framerate fluctuates wildly on the PS4 version. I would think you'd get a better, more consistent result had they locked it down at something like 45 fps.

It has been very smooth for me. It has some of the old-style Halo/Killzone 2 stutters on occasion when entering a new area, but it doesn't feel like it is going up and down all the time. Some massive explosions have lowered it some, for sure.
post #119 of 647
Are there demonstrations to download for the Lara Croft game on the PSN and XBL?
post #120 of 647
Quote:
Originally Posted by aaronwt View Post

Are there demonstrations to download for the Lara Croft game on the PSN and XBL?

I don't think so. There's a huge impressions thread at NeoGAF. I do wish we had more demos this generation, instead of what seems like fewer. I want to try NBA 2K14 and Call of Duty, and I will want to try plenty of future titles. Even if they release the demo a week or two after release, that is fine with me. I imagine the developers don't want to lose sales from blind buys, so releasing a demo early isn't always a good idea.