
The Official Xbox One thread... - Page 293

post #8761 of 14773
leaning, headtracking, and voice commands in the battlefield 4 settings...

post #8762 of 14773
Quote:
Originally Posted by onlysublime View Post

leaning, headtracking, and voice commands in the battlefield 4 settings...



THIS is awesome news.

Little things like this will make games easier and more fun to play. I can't tell you how many times I fumbled with the controller trying to press the d-pad in order to ask someone for ammo or a medkit.

Headtracking seems interesting... so will that take the place of the right joystick when it's on?
post #8763 of 14773
That is cool. I'd bet there is a little bit of a learning curve getting used to the head tracking, but after that it should be a great feature--especially the leaning.
post #8764 of 14773
Thread Starter 
I always ask for a med kit when I need one on top of pressing the button anyway, so if I can just do that with my voice, even better.
post #8765 of 14773
How does head tracking work? If I turn my head, I can't see my TV.
post #8766 of 14773
I doubt you turn your head 90 degrees; it should be sensitive enough for a subtle turn of the head.
post #8767 of 14773
Quote:
Originally Posted by centking View Post

How does head tracking work? If I turn my head, I can't see my TV.

Obviously your movements are exaggerated on-screen so you can still see your TV.
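The "exaggerated movements" idea boils down to applying a gain to the measured head angle, so a subtle real turn becomes a large in-game one. A toy sketch in Python (the gain and clamp values are invented for illustration, not anything the actual Kinect implementation uses):

```python
# Toy head-tracking mapping: a small real-world head rotation is
# amplified into a larger in-game camera rotation, so you never have
# to look away from the TV. Gain/clamp values are invented.

def head_to_camera(head_deg, gain=6.0, max_deg=90.0):
    """Map a measured head angle (degrees) to an in-game camera angle."""
    camera = head_deg * gain
    # Clamp so a noisy or extreme reading can't whip the camera around.
    return max(-max_deg, min(max_deg, camera))

print(head_to_camera(10))   # a subtle 10-degree turn -> 60.0 degrees in-game
print(head_to_camera(45))   # clamped at 90.0
```

A real implementation would also smooth the input over several frames to hide sensor jitter.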
post #8768 of 14773
Check out Track IR for PC; it's the same as that but with no goofy gear to wear! Man, I hope they nail it.
post #8769 of 14773
Suspend/resume supported at launch

http://m.neogaf.com/showthread.php?t=704413

I'm pretty sure we knew it was coming at launch but at least we know it's for sure now lol. This is going to be awesome for us folks who constantly get interrupted by wife n kids!
post #8770 of 14773
lots of Track IR vids out there and the users swear by it...

here's a vid from 2008.
post #8771 of 14773
Ryse Confirmed At 30fps 900p For Xbox One
Quote:
Well folks, this is it. The numbers are in and the resolution and frame rates are finalized for Crytek's Ryse: Son of Rome. The game is legitimately capped at 30 frames per second and a resolution of 1600x900. This is how Microsoft will start the next generation of console gaming for the Xbox One.

DualShockers spotted the confirmation from an interview Digital Foundry conducted with Crytek's CEO Cevat Yerli, where the head-honcho of the tech-heavy software company relished in the sub-par settings, surrounding the disappointment many gamers are feeling with words of encouragement and reason...
Quote:
Developers always have to choose whether they go for 60 or 30fps, depending on the type of game and complexity of the project. With Ryse, we wanted to go for a very emotional experience with complex and dramatic lighting, high fidelity environments, and rich characters and character animations.

So 30fps was our choice, and we believe that most developers will go for richer worlds at 30 frames per second rather than 60fps – which would call for compromises, as 60fps demands twice the amount of compute rendering speed. 30fps is a standard that is above, for example, what most cinemas use for showing films. Early demos with higher frame-rate experiences have shown that gamers and viewers have a mixed opinion about its perceived quality – for example, how 48fps cinema experiences were received. So it’s both a production design choice as well as user research.
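The "60fps demands twice the amount of compute" point is plain frame-time arithmetic, sketched here purely for illustration:

```python
# Per-frame time budget at a given frame rate: everything (simulation
# plus rendering) has to finish inside this window, every frame.

def frame_budget_ms(fps):
    return 1000.0 / fps

print(f"30fps -> {frame_budget_ms(30):.1f} ms per frame")  # 33.3
print(f"60fps -> {frame_budget_ms(60):.1f} ms per frame")  # 16.7
# Halving the budget is why 60fps roughly doubles the compute demand.
```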

Well, for cinematic reasons I can't argue with the decision to go 30fps... it worked well for a game like Naughty Dog's The Last of Us, but then again that was a methodical, slow-paced title. I don't know how well 30fps will work when a game relies on smooth combat animations and transitions that players will need to look for when pulling off complex counter-attacks and combos. Then again, this is assuming that Ryse actually features a complex combat system and isn't just a series of QTEs.

According to Crytek producer Mike Read, the game is more than just a cinematic action title that relies on button prompts and Dance Dance Revolution timing acuteness. Read stated that the game is a real game and not just a QTE-fest. Until I see some more convincing footage (and time is running short for Crytek to provide such evidence) I'll believe that this is a generic hack-and-slash with a lot of QTE moments to relieve players of having to exercise acute button combination skills.

Anyway, this news about the game being relegated to 30fps after being downgraded from 1080p to 900p is not good news for Microsoft's console heading into its November launch. Couple that with the undeniably strong rumors of Call of Duty: Ghosts being downgraded on the Xbox One from 1080p to 720p, whereas the PS4 rendition of the game will be native 1080p at 60fps, and now you have a situation where the Xbox One is not only looking second rate, but it's looking like it's coming in behind the curve in a bad way.

For gamers looking for true next-gen experiences and for a system that actually leaves something to the imagination due to having power that needs unlocking, you'll probably do yourself a favor and pick up the Wii U instead; it has way more 1080p games running at 60fps than the Xbox One.

Original here.
post #8772 of 14773
Quote:
Originally Posted by michaeltscott View Post

Ryse Confirmed At 30fps 900p For Xbox One

Original here.

This is more blog spam than news. The resolution was confirmed weeks ago. That was before the latest video came out.
post #8773 of 14773
Yeah, the resolution was confirmed but I hadn't heard any statement on framerate previously.
post #8774 of 14773
I wonder if they could do Ryse at 720p / 60fps on the Xbox One hardware if they had the time? That seems more desirable, but probably not technically possible at this point. It's happening on PS4 too - Knack is rumored to be 30fps also.

It's a shame - once I built a gaming PC and started playing the games I was playing on 360 at 60fps (Borderlands2, Mirrors Edge, Batman Arkham games, Prince of Persia games, etc.), I couldn't go back to 30. Even for Halo4 it was hard - higher frame rate makes that much of a difference, and defines "next gen" IMO.

Hopefully this is just Launch title issues, and as devs get comfortable with the new hardware, a next-gen standard of 60fps will emerge.
post #8775 of 14773
Quote:
Originally Posted by DaverJ View Post

I wonder if they could do Ryse at 720p / 60fps on the Xbox One hardware if they had the time? That seems more desirable, but probably not technically possible at this point. It's happening on PS4 too - Knack is rumored to be 30fps also.

It's a shame - once I built a gaming PC and started playing the games I was playing on 360 at 60fps (Borderlands2, Mirrors Edge, Batman Arkham games, Prince of Persia games, etc.), I couldn't go back to 30. Even for Halo4 it was hard - higher frame rate makes that much of a difference, and defines "next gen" IMO.

Hopefully this is just Launch title issues, and as devs get comfortable with the new hardware, a next-gen standard of 60fps will emerge.

Yeah, but the level of detail of Ryse is so much better than something like Borderlands 2. Even if you run Mirror's Edge on the PC at max resolution with a high frame rate, it still doesn't look as good as the upcoming Mirror's Edge looks. You look at the better textures, the higher poly counts, the better lighting, etc.
post #8776 of 14773
The proof is in the pudding, so to speak. Ryse isn't some run-and-gun twitch shooter where being able to discern an oncoming enemy from far away is desirable. We'll see how it looks and plays when it's finally reviewed.
post #8777 of 14773
Quote:
Originally Posted by DaverJ View Post

I wonder if they could do Ryse at 720p / 60fps on the Xbox One hardware if they had the time? That seems more desirable, but probably not technically possible at this point. It's happening on PS4 too - Knack is rumored to be 30fps also.

It's a shame - once I built a gaming PC and started playing the games I was playing on 360 at 60fps (Borderlands2, Mirrors Edge, Batman Arkham games, Prince of Persia games, etc.), I couldn't go back to 30. Even for Halo4 it was hard - higher frame rate makes that much of a difference, and defines "next gen" IMO.

Hopefully this is just Launch title issues, and as devs get comfortable with the new hardware, a next-gen standard of 60fps will emerge.

 

Yeah, I'm in the same boat. I had to struggle to get used to 30fps this gen, since I was a PC gamer for years; I remember when 3D cards were actually separate cards from your 2D card. :p But even then, you'd be able to run Quake 2 or whatever at 60fps in 480p.

 

15 years later, the fact that we're still getting 30fps games is kind of ridiculous. Maybe it's less of a night-and-day difference to other people, I dunno... I just can't look at it anymore, it's such an immediate turnoff. I'm getting a little too used to 120fps now even. :)

post #8778 of 14773
Quote:
Originally Posted by onlysublime View Post

Yeah, but the level of detail of Ryse is so much better than something like Borderlands 2. Even if you run Mirror's Edge on the PC at max resolution with a high frame rate, it still doesn't look as good as the upcoming Mirror's Edge looks. You look at the better textures, the higher poly counts, the better lighting, etc.

Yeah, Ryse's level of detail is looking good, and I understand their decision not to sacrifice visual quality for framerate and go with a more cinematic style - I agree it was the right decision. I just didn't know they would have to make such decisions on next-gen consoles. That seems to be how Crytek operates - push the details and visuals to the next level, and keep fingers crossed on the framerate.

Based on current PC gaming, I was hoping for 3 main improvements of the next-gen consoles over current-gen, in order of importance:
  • improved textures, lighting, more polys, etc.
  • constant 60fps framerate
  • 1080p resolution

Also improved controller, UI/interface, etc. But of those 3 main things, it looks like we are only getting the first with some of the launch titles.

I'm curious as to how the next-gen console versions of Batman Arkham Origins and Assassin's Creed 4 will compare with the PC counterparts. I know Arkham is running 1080/60 on my PC (had to dial down AA to maintain 60 in a couple areas) and looks good, so is that too much to expect from the next consoles?
post #8779 of 14773
Quote:
Originally Posted by DaverJ View Post

Yeah, Ryse's level of detail is looking good, and I understand their decision not to sacrifice visual quality for framerate and go with a more cinematic style - I agree it was the right decision. I just didn't know they would have to make such decisions on next-gen consoles. That seems to be how Crytek operates - push the details and visuals to the next level, and keep fingers crossed on the framerate.

Based on current PC gaming, I was hoping for 3 main improvements of the next-gen consoles over current-gen, in order of importance:
  • improved textures, lighting, more polys, etc.
  • constant 60fps framerate
  • 1080p resolution

Also improved controller, UI/interface, etc. But of those 3 main things, it looks like we are only getting the first with some of the launch titles.

I'm curious as to how the next-gen console versions of Batman Arkham Origins and Assassin's Creed 4 will compare with the PC counterparts. I know Arkham is running 1080/60 on my PC (had to dial down AA to maintain 60 in a couple areas) and looks good, so is that too much to expect from the next consoles?

Yeah, it is too much to expect.

I think we'll see a larger proportion of games at 60fps next gen, but the drive to keep pushing tech will always be there, and they'll do what they always do on console...sacrifice frame rate and resolution.

PC is always going to be ahead of consoles going forward, that 2-3 years where consoles were clearly more capable than a midrange PC are a thing of the past.
post #8780 of 14773
Quote:
Originally Posted by jasonstiller View Post

Suspend/resume supported at launch

http://m.neogaf.com/showthread.php?t=704413

I'm pretty sure we knew it was coming at launch but at least we know it's for sure now lol. This is going to be awesome for us folks who constantly get interrupted by wife n kids!

Is that site actually for real? Why would people be cancelling their PS4 pre-order because Suspend/Resume will be added at some point after launch? Sure it will be good to have the feature with the XBOne at launch, but I certainly wouldn't have cancelled the pre-order if it wasn't.
post #8781 of 14773
Quote:
Originally Posted by DaverJ View Post

I wonder if they could do Ryse at 720p / 60fps on the Xbox One hardware if they had the time? That seems more desirable, but probably not technically possible at this point. It's happening on PS4 too - Knack is rumored to be 30fps also.

It's a shame - once I built a gaming PC and started playing the games I was playing on 360 at 60fps (Borderlands2, Mirrors Edge, Batman Arkham games, Prince of Persia games, etc.), I couldn't go back to 30. Even for Halo4 it was hard - higher frame rate makes that much of a difference, and defines "next gen" IMO.

Hopefully this is just Launch title issues, and as devs get comfortable with the new hardware, a next-gen standard of 60fps will emerge.

For me some games are fine at 60fps and some are better at 30fps. The article mentioned something similar with movies at 48fps. I couldn't stand watching The Hobbit that way. And some games are also like that with me. Some 60fps games will actually make me feel queasy.
post #8782 of 14773
I think I said earlier in the thread that I normally can't tell the difference. The only thing that really popped out to me was The Hobbit at a higher frame rate, and I really enjoyed it. Besides that piece of entertainment, I can't say I have noticed a higher frame rate in anything else.
post #8783 of 14773
Quote:
Originally Posted by spid View Post

I think I said earlier in the thread that I normally can't tell the difference. The only thing that really popped out to me was The Hobbit at a higher frame rate, and I really enjoyed it. Besides that piece of entertainment, I can't say I have noticed a higher frame rate in anything else.

I don't understand how you can notice The Hobbit's 48fps, but not a game's 60fps... unless you've never actually seen a 60fps game before, which is almost believable given how few there were this gen.
post #8784 of 14773
Quote:
Originally Posted by bd2003 View Post

I don't understand how you can notice The Hobbit's 48fps, but not a game's 60fps... unless you've never actually seen a 60fps game before, which is almost believable given how few there were this gen.

I think they like being called Little People these days and I'm sure if they wanted to do 60 they could. They are people just like us.
post #8785 of 14773
Quote:
Originally Posted by bd2003 View Post

I don't understand how you can notice The Hobbit's 48fps, but not a game's 60fps... unless you've never actually seen a 60fps game before, which is almost believable given how few there were this gen.

The movie was different from anything I had ever seen, but the games just look like any other games.
post #8786 of 14773
Quote:
Originally Posted by spid View Post


The movie was different from anything I had ever seen, but the games just look like any other games.

 

Check this out:

 

http://frames-per-second.appspot.com/

 

You honestly don't see a difference between 60 and 30?

post #8787 of 14773
EuroGamer has a couple new articles with lots of details...

the first is from a cross-platform game developer... he especially talks about how the ultimate goal is a consistent framerate above all else. He also talks about game development history and how the consoles' designs influenced game design over the generations.

http://www.eurogamer.net/articles/digitalfoundry-the-secret-developers-what-hardware-balance-actually-means-for-game-creators

In this first piece, a seasoned multi-platform developer offers up his view on hardware balance - not just in terms of the current Xbox One/PlayStation 4 bunfight, but more importantly on how the technological make-up of both consoles will define the games we play over the next few years. If you're a game-maker that would like to contribute to the Secret Developers series, please feel free to contact us through digitalfoundry@eurogamer.net and be assured that any discussions will be dealt with in the strictest confidence.

With just weeks to go before the arrival of the PlayStation 4 and Xbox One, there seems to be a particular type of mania surrounding the technical capabilities of these two very similar machines. The raw specs reveal numbers seemingly light years apart, which clearly favour one console platform over the other, but it seems to me that at a more global level, people can't quite see the wood for the trees. Spec differences are relevant of course, but of far larger importance is the core design - the balance of the hardware - and how that defines, and limits, the "next-gen" games we will be playing over the next eight to ten years.

At this point I should probably introduce myself. I'm a games developer who has worked over the years across a variety of game genres and consoles, shipping over 35 million units in total on a range of games, including some major triple-A titles I'm sure you've played. I've worked on PlayStation 2, Xbox, PlayStation 3, Xbox 360, PC, PS Vita, Nintendo DS, iPhone, Wii U, PlayStation 4 and Xbox One. I'm currently working on a major next-gen title.

Over my time in the industry I've seen a wide variety of game engines, development approaches, console reveals and behind-the-scenes briefings from the console providers - all of which gives me a particular perspective on the current state of next-gen and how game development has adapted to suit the consoles that are delivered to us by the platform holders.

I was spurred into writing this article after reading a couple of recent quotes that caught my attention:

"For designing a good, well-balanced console you really need to be considering all the aspects of software and hardware. It's really about combining the two to achieve a good balance in terms of performance... The goal of a 'balanced' system is by definition not to be consistently bottlenecked on any one area. In general with a balanced system there should rarely be a single bottleneck over the course of any given frame." - Microsoft technical fellow Andrew Goossen

Dismissed by many as a PR explanation for technical deficiencies when compared to PlayStation 4, the reality is that balance is of crucial importance - indeed, when you are developing a game, getting to a solid frame-rate is the ultimate goal. It doesn't matter how pretty your game looks, or how many players you have on screen, if the frame-rate continually drops, it knocks the player out of the experience and back to the real world, ultimately driving them away from your game if it persists.

Maintaining this solid frame-rate drives a lot of the design and technical decisions made during the early phases of a game project. Sometimes features are cut not because they cannot be done, but because they cannot be done within the desired frame-rate.


In most games the major contributors to the frame-rate are:

  • Can you simulate all of the action that's happening on the screen - physics, animation, HUD, AI, gameplay etc?
  • Can you render all of the action that's happening on the screen - objects, people, environment, visual effects, post effects etc?

The first point relates to all of the things that are usually handled by the CPU and the second point relates to things that are traditionally processed by the GPU. Over the successive platform generations the underlying technology has changed, with each generation throwing up its own unique blend of issues:

  • Gen1: The original PlayStation had an underpowered CPU and could draw a small number of simple shaded objects.
  • Gen2: PlayStation 2 had a relatively underpowered CPU but could fill the standard-definition screen with tens of thousands of transparent triangles.
  • Gen3: Xbox 360 and PlayStation 3 had the move to high definition to contend with, but while the CPUs (especially the SPUs) were fast, the GPUs were underpowered in terms of supporting HD resolutions with the kind of effects we wanted to produce.

In all of these generations it was difficult to maintain a steady frame-rate as the amount happening on-screen would cause either the CPU or GPU to be a bottleneck and the game would drop frames. The way that most developers addressed these issues was to alter the way that games appeared, or played, to compensate for the lack of power in one area or another and maintain the all-important frame-rate.
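The CPU-or-GPU bottleneck can be made concrete: when simulation and rendering run in parallel, the frame time is set by whichever side is slower, so speeding up the other side buys nothing. A sketch with invented timings:

```python
# Frame time is gated by the slower of the two workloads; the faster
# side simply waits. All timings below are invented for illustration.

def frame_time_ms(cpu_ms, gpu_ms):
    return max(cpu_ms, gpu_ms)

print(frame_time_ms(cpu_ms=12.0, gpu_ms=30.0))  # 30.0 -- GPU-bound, CPU idles
print(frame_time_ms(cpu_ms=15.0, gpu_ms=16.0))  # 16.0 -- balanced, near 60fps
```

This is why a "balanced" machine matters more than one outsized spec: the headline number only helps until the bottleneck moves somewhere else.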

This shift started towards the end of Gen2 when developers realised that they could not simulate the world to the level of fidelity that their designers wanted, as the CPUs were not fast enough - but they could spend more time rendering it. This shift in focus can clearly be seen around 2005/2006 when games such as God of War, Fight Night Round 2 and Shadow of the Colossus arrived. These games were graphically great, but the gameplay was limited in scope and usually used tightly cropped camera positions to restrict the amount of simulation required.

Then, as we progressed into Gen3 the situation started to reverse. The move to HD took its toll on the GPU as there were now more than four times the number of pixels to render on the screen. So unless the new graphics chips were over four times faster than the previous generation, we weren't going to see any great visual improvements on the screen, other than sharper-looking objects.

Again, developers started to realise this and refined the way that games were made, which influenced the overall design. They started to understand how to get the most out of the architecture of the machines and added more layers of simulation to make the games more complicated and simulation-heavy using the CPU power, but this meant that they were very limited as to what they could draw, especially at 60fps. If you wanted high visual fidelity in your game, you had to make a drastic fundamental change to the game architecture and switch to 30fps.

Dropping a game to 30fps was seen as an admission of failure by a lot of the developers and the general gaming public at the time. If your game couldn't maintain 60fps, it reflected badly on your development team, or maybe your engine technology just wasn't up to the job. Nobody outside the industry at that time really understood the significance of the change, and what it would mean for games; they could only see that it was a sign of defeat. But was it?

Switching to 30fps doesn't necessarily mean that the game becomes much more sluggish or that there is less going on. It actually means that while the game simulation might well still be running at 60fps to maintain responsiveness, the lower frame-rate allows for extra rendering time and raises the visual quality significantly. This switch frees up a lot of titles to push the visual quality and not worry about hitting the 60fps mark. Without this change we wouldn't have hit the visual bar that we have on the final batch of Gen3 games - a level of attainment that is still remarkable if you think that the GPU powering these games was released over seven years ago. Now if you tell the gaming press, or indeed hardcore gamers, that your game runs at 30fps, nobody bats an eyelid; they all understand the trade-off and what this means for a game.
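The split described here - simulation still ticking at 60fps while rendering drops to 30 - is typically a fixed-timestep loop. A bare-bones sketch (tick counting stands in for a real clock):

```python
# Fixed-timestep loop: the simulation updates at 60Hz to keep input
# responsive, while the expensive render runs every other tick (30Hz),
# doubling the time available to draw each frame. Illustrative only.

SIM_HZ = 60
RENDER_EVERY = 2  # one render per two simulation ticks -> 30fps

def run_one_second():
    sims = renders = 0
    for tick in range(SIM_HZ):
        sims += 1                     # physics, AI, input at 60Hz
        if tick % RENDER_EVERY == 0:
            renders += 1              # draw at 30Hz
    return sims, renders

print(run_one_second())  # (60, 30)
```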

Speaking of GPUs, I remember early in the console lifecycle that Microsoft made it known that the graphics technology in the Xbox 360 was "better" than PS3's and they had the specs to prove it - something that sounds very familiar in relation to recent Xbox One/PS4 discussions. This little fact was then picked up and repeated in many articles and became part of the standard console argument that occurred at the time:

  • "PS3 is better than Xbox 360 because of the SPUs."
  • "Xbox 360 has a better graphics chip."
  • "PS3 has a better d-pad controller compared to the Xbox 360."
  • "Xbox Live is better for party chat."


The problem with these facts, taken in isolation, is that they are true but they don't paint an accurate picture of what it is like developing current-gen software.

One of the first things that you have to address when developing a game is, what is your intended target platform? If the answer to that question is "multiple", you are effectively locking yourself in to compromising certain aspects of the game to ensure that it runs well on all of them. It's no good having a game that runs well on PS3 but chugs on Xbox 360, so you have to look at the overall balance of the hardware. As a developer, you cannot be driven by the most powerful console, but rather the high middle ground that allows your game to shine and perform across multiple machines.

While one console might have a better GPU, the chances are that this performance increase will then be offset by bottlenecks in other parts of the game engine. Maybe these are related to memory transfer speeds, CPU speeds or raw connectivity bus throughputs. Ultimately it doesn't matter where the bottlenecks occur, it's just the fact that they do occur. Let's look at a quote from a studio that is well known for its highly successful cross-platform approach:

"The vast majority of our code is completely identical. Very very little bespoke code. You pick a balance point and then you tailor each one accordingly. You'll find that one upside will counter another downside..." - Alex Fry, Criterion Games

And another recent one from the statesman of video game development:

"It's almost amazing how close they are in capabilities, how common they are... And that the capabilities that they give are essentially the same." - John Carmack on the Xbox One and PS4

The key part in that statement is that they have similar capabilities. Not performance, but capabilities. I read that as Carmack acknowledging that the differences in power between the two are negligible when considering cross-platform development, that neither one of them is way ahead of the other, and that they both deliver the same type of experience to the end user with minimal compromise.

With the new consoles coming out in November, the balance has shifted again. It looks like we will have much better GPUs, as they have improved significantly in the last seven years, while the target HD resolution has shifted upwards from 720p to 1080p - a far smaller increase. Although these GPUs are not as fast on paper as the top PC cards, we do get some benefit from being able to talk directly to the GPUs with ultra-quick interconnects. But in this console generation it appears that the CPUs haven't kept pace. While they are faster than the previous generation, they are not an order of magnitude faster, which means that we might have to make compromises again in the game design to maintain frame-rate.

Both the consoles have Jaguar-based CPUs, with some cores reserved for the OS and others available for the game developers to use. These cores, on paper, are slower than previous console generations but they have some major advantages. The biggest is that they now support Out of Order Execution (OOE), which means that the CPU can reschedule work to happen while it is waiting on an operation, like a fetch from memory.

Removing these "bubbles" in the CPU pipeline, combined with removing some nasty previous-gen issues like load-hit-stores, means that the CPU's Instructions Per Cycle (IPC) count will be much higher. A higher IPC number means that the CPU is effectively doing more work for a given clock cycle, so it doesn't need to run as fast to do the same amount of work as a previous-generation CPU. But let's not kid ourselves here - both of the new consoles are effectively matching low-power CPUs with desktop-class graphics cores.
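The IPC point is just throughput arithmetic: useful work per second is roughly clock speed times instructions per cycle, so a slower, wider core can match an older, faster-clocked one that stalls a lot. The figures below are invented purely to illustrate the shape of the trade:

```python
# Throughput ~ clock (GHz) x IPC. Invented figures: an older in-order
# core that stalls on memory vs. a lower-clocked OOE core that fills
# those pipeline bubbles with useful work.

def gigainstr_per_sec(clock_ghz, ipc):
    return clock_ghz * ipc

old_in_order = gigainstr_per_sec(clock_ghz=3.2, ipc=0.5)
new_ooe      = gigainstr_per_sec(clock_ghz=1.6, ipc=1.0)
print(old_in_order, new_ooe)  # same effective work rate despite half the clock
```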

So how will all of this impact the first games for the new consoles? Well, I think that the first round of games will likely be trying to be graphically impressive (it is "next-gen" after all) but in some cases, this might be at the expense of game complexity. The initial difficulty is going to be using the CPU power effectively to prevent simulation frame drops and until studios actually work out how best to use these new machines, the games won't excel. They will need to start finding that sweet spot where they have a balanced game engine that can support the required game complexity across all target consoles. This applies equally to both Xbox One and PlayStation 4, though the balance points will be different, just as they are with 360 and PS3.

One area of growth we will probably see is in the use of GPGPU (effectively offloading CPU tasks onto the graphics core), especially in studios that haven't developed on PC before and haven't had exposure to the approach. All of the current-gen consoles have quite underpowered GPUs compared to PCs, so a lot of time and effort was spent trying to move tasks off the GPU and onto CPU (or SPUs in the case of PS3). This would then free up valuable time on the GPU to render the world. And I should point out that for all of Xbox 360's GPU advantage over PS3's RSX, at the end of the day, the balance of the hardware was still much the same at the global level. Occlusion culling, backface culling, shader patching, post-process effects - you've heard all about the process of moving graphics work from GPU to CPU on PlayStation 3, but the reality is that - yes - we did it on Xbox 360 too, despite its famously stronger graphics core.

So, where does this leave us in the short term? To summarise:

Will the overall low-level hardware speed of the console technology affect the games that are created on the consoles?

Most studios, especially third-party studios, will not be pushing the consoles that hard in their release titles. This will be due to a mix of reasons relating to time, hardware access (it typically takes over two years to make a game and we got next-gen hardware back in February) and maintaining parity between different console versions of the game.

Yes, parity matters and we do design around it. Looking back to the early days of the Xbox 360/PS3 era, one of the key advantages Microsoft had was a year's headstart, so we had more time with the development environment. Parity between SKUs increased not just because we grew familiar with the PS3's hardware, but also because we actively factored it into the design - exactly in the way Criterion's Alex Fry mentioned earlier. With next-gen consoles arriving simultaneously, that way of thinking will continue.

Will we see a lot of games using a lower framebuffer size?

Yes, we will probably see a lot of sub-1080p games (with hardware upscale) on one or both of the next-gen platforms, but this is probably because there is not enough time to learn the GPU when the development environment, and sometimes clock speeds, are changing underneath you. If a studio releases a sub-1080p game, is it because they can't make it run at 1080p? Is it because they don't possess the skills or experience in-house? Or is it a design choice to make their game run at a stable frame-rate for launch?

This choice mirrors the situation we previously had with the 60fps vs. 30fps discussion. It might not be what the company wants for the back of the box, but it is the right decision for getting the game to run at the required frame-rate. Again, it is very easy to point out this fact and extrapolate from there on the perceived 'power' of the consoles, but this doesn't take all the design decisions and the release schedule into account.

The gaming public has not yet grasped why these decisions are made and what impact they have on a finished game. People are still too focused on numbers, but as more games arrive for the consoles and people start to experience them, I think opinions will change. The actual back-buffer resolution will become far less important in discussions than the overall gaming experience itself, and quite rightly so.

So why are studios rushing out games when they know that they could do better given more time?

When it comes to console choice, most gamers will purchase based on factors such as previous ownership, the opinions of the gaming press (to some extent), which consoles their friends buy to play multiplayer games, and in some cases, which exclusives are being released (Halo, Uncharted etc). This means that studios are under a lot of pressure to release games with new consoles, as they help drive hardware sales. Also, if a studio releases a game at launch, they are likely to sell more copies, as console purchasers require games in order to show off their shiny new consoles to their friends.

So, with limited time, limited resources and limited access to development hardware before the retail consoles arrive, studios have to make a decision. Do they want their game to look good, play well and maintain a solid frame-rate? If so, compromises have to be made and screen resolution is an easy change to make that has a dramatic effect on the frame-rate (900p, for example, is only 70 per cent of the number of pixels in a 1080p screen). This will likely be the main driving reason behind the resolution choice for the launch titles and won't be any indicator of console "power" - compare Project Gotham Racing 3's sub-native presentation with its sequel, for example.
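The "70 per cent" figure above is simple arithmetic on framebuffer sizes. As a quick illustrative sketch (the resolutions are the standard 16:9 render targets, not anything specific to a particular game):

```python
def pixels(width, height):
    """Total pixels in a framebuffer of the given dimensions."""
    return width * height

full_hd = pixels(1920, 1080)  # 2,073,600 pixels
p900 = pixels(1600, 900)      # 1,440,000 pixels
p720 = pixels(1280, 720)      #   921,600 pixels

print(f"900p is {p900 / full_hd:.1%} of 1080p")  # 69.4%
print(f"720p is {p720 / full_hd:.1%} of 1080p")  # 44.4%
```

So rendering at 900p cuts the pixel workload by roughly 30 per cent, which is why resolution is such an easy lever for hitting a target frame-rate.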

As a developer I feel that I am overly critical of other people's games and this has tainted the appeal of playing new games. Instead of enjoying them as the developers hoped, I am too busy mentally pointing out issues with rendering, or physics simulation, or objects clipping through the environment. It's hard to ignore some of these things, as it is, after all, what I am trained to spot and eliminate before a game ships. But I hope with the first round of games that I will be able to see past these minor things and enjoy the games as they were intended.

However, I doubt that I will be playing a next-gen game and saying to myself, "Hmm, you can see the difference that the front-side bus speed makes on this game," or, "If only they had slightly faster memory speeds then this would have been a great game." Let's just wait for the consoles to be released, enjoy the first wave of next-gen games and see where it leads us. You might be surprised where we end up.
post #8788 of 14773
Quote:
Originally Posted by bd2003 View Post

Check this out:

http://frames-per-second.appspot.com/

You honestly dont see a diff between 60 and 30?

The slight difference is, in my opinion, not worth mentioning. Maybe it would mean more if it were something other than little balls on the screen. As this is off topic, I think we should leave it to personal preference and move on.
post #8789 of 14773
From what's happening with the next-gen consoles, it seems to me that Sony desperately wanted a big launch lead. By moving their timetable forward, they pushed the schedule forward for both companies, forcing Microsoft to respond faster than anticipated. The result is that both consoles seem rather rushed: many features aren't available at launch for the PS4 (not even sleep mode!), and many features aren't available at launch for the Xbox One either.

Eh... it means we'll be growing with the consoles. Like they say, it's not how you start. These are meant for the long term, and after a year or so the consoles should be humming. The launch games already seem pretty great as-is, so there's plenty of excitement about how things will be down the line. I remember the launch titles for the 360, and looking back now, how primitive they were compared to later 360 titles.
post #8790 of 14773
Well, it is certainly high time for new PlayStations and Xboxes.