The Official Xbox One thread... - Page 262 - AVS Forum
post #7831 of 17130 | 09-11-2013, 06:00 PM | Jeremy Anderson (AVS Special Member)
Quote:
Originally Posted by aaronwt View Post

Don't most people connect devices directly to their TV? And then most people that have a surround system only have 5.1? I went to a 7.1 system back in 2001 and I was surprised it never caught on.
Exactly. Even with the adoption rate of surround sound right now, it's still not a majority concern. The fact that they're supporting 7.1 as robustly as they are is impressive enough in its own right. Well... not if you're bd2003, I guess. I mean, I'm running 9.1 right now and love it... but I'm definitely aware that I'm in the minority. Regardless, being disappointed that a game console doesn't have support for things that don't exist yet and holding your breath for something no one has any intention of implementing... kinda' silly.

Now I'm gonna go write that strongly worded letter to Chevy because my car doesn't run on unobtanium. You guys let me know when it's time to play some Forza. :p
c.kingsley and Jayadub like this.

"Never believe any quote you read on the internet." - Abraham Lincoln
post #7832 of 17130 | 09-11-2013, 06:30 PM | Jeremy Anderson (AVS Special Member)
Quote:
Originally Posted by bd2003 View Post

You're acting as if adding more PCM channels is a huge technical challenge, when it's not. PCM is little more than a raw unencoded bitstream. No one needs to push for it, because it's going to be supported by default.
I never said it's a huge technical challenge. I'm saying no one's going to bother, because movies don't need them. Period. It's not supported by default because there's absolutely no reason to support 9.1 PCM. There is no 9.1 PCM content, nor any plan on the part of content creators to create such. So you want a game console to basically implement support for 9.1 or 11.1 discrete channels when no existing equipment supports it? And you want AVR manufacturers to then spend time implementing that for your one niche application that will eventually be made obsolete by object-oriented sound anyway, and which movies will never use? Because no offense... that's kinda' putting the chicken before the egg.
Quote:
Originally Posted by bd2003 View Post

You're overthinking it again. Changing the size and sample rate of the output stream has nothing to do with sound files on disc or in memory. I'd expect the actual sound files to be the exact same 48khz compressed ones they'd be using anyway. Maybe the soundtrack will be 96khz, but that's it. The only difference is that it's mixed at a higher sample rate, and the result is a cleaner mix with less distortion. It's going to take just a few more MB of memory to double the size of the mix buffer. The issue is the increased CPU (or GPU) overhead of mixing it, but that's not too much of an issue either.
If you're not mixing source files at the same resolution as the output, then what's the point? So you can reduce distortion above 48kHz... that no monitor can even produce? Because that's basically what you're talking about... a sample rate that lets you reduce distortion up to an inaudible frequency range. Even people into audio mastering think that's overkill. 24/48 covers you up to 24kHz and human hearing, short of people who have never heard a sound before, tends to run to about 22kHz. So again... WHY? If we were talking 16-bit vs 24-bit, maybe I'd be with ya' on this... but do you seriously see this as some kind of deficit? If so, I guess there really is no satisfying you.
Quote:
Originally Posted by bd2003 View Post

The output and transport isn't the issue, it's whether or not SHAPE is capable of getting out of the way at the system level. The 360's Dolby digital encoder couldn't. There's no reason the 360 shouldn't at least be able to support PCM 5.1 over HDMI, other than that they weren't thinking far enough ahead when they designed it. Don't you see the parallel there?
No, I really don't. It's easy to make that parallel now, but when you consider the time when the 360 was designed and released - a full year before PS3 was out or Blu-ray mattered - Dolby Digital was GENEROUS. That's especially true when you consider that the 360 didn't come with HDMI until a later revision, so it's not like you could add the support for PCM output with the later hardware revision for HDMI without completely breaking the encoding standard of the original product. They would have needed one HELL of a crystal ball to release the 360 with HDMI and 5.1 PCM support.
Quote:
Originally Posted by bd2003 View Post

Virtualizing channels is the laziest possible way to produce virtual surround directly from the console. If you're capable of producing an essay on how Atmos/OO works, you should be the first to understand that. I'm sure you're also aware how much more sophisticated it was in the late 90s on PC. But I guess it's good enough, because that's how it's been done for the past few years?
HRTF comes with its own issues, because acoustic modeling aside, not everyone's head is the same. Older implementations in PC audio let you select from a few preset models, but... how much do you see HRTF being used right now? Now compare that to how much you see technology like Dolby Headphone being used for consoles to provide surround to headphone users. Seems pretty obvious why they're going that route.
Quote:
Originally Posted by bd2003 View Post

Lol that should be their tagline:

Xbox One: It's good enough!
Then don't buy one... especially if you see it as so woefully deficient because it doesn't support formats that neither exist nor are ever planned to be implemented.
c.kingsley and Jayadub like this.

"Never believe any quote you read on the internet." - Abraham Lincoln
post #7833 of 17130 | 09-11-2013, 07:37 PM | bd2003 (AVS Addicted Member)
Quote:
Originally Posted by Jeremy Anderson View Post


I never said it's a huge technical challenge. I'm saying no one's going to bother, because movies don't need them. Period. It's not supported by default because there's absolutely no reason to support 9.1 PCM. There is no 9.1 PCM content, nor any plan on the part of content creators to create such. So you want a game console to basically implement support for 9.1 or 11.1 discrete channels when no existing equipment supports it? And you want AVR manufacturers to then spend time implementing that for your one niche application that will eventually be made obsolete by object-oriented sound anyway, and which movies will never use? Because no offense... that's kinda' putting the chicken before the egg.

 

AVRs and A/V devices are chock full of niche features that have nothing to do with movies. Multi-zone output, dual HDMI out, various DSP modes, 192kHz/24-bit support, just tons upon tons of stuff to one-up the competition in the numbers game. Movies haven't been begging for 192kHz support, and yet it's there. They'll add the higher-channel-count discrete LPCM just to say they have it, even if nothing ever uses it. I'm sure BD players will follow suit with their own matrixing on-board, and they'll output using the discrete channels, just so they can say their new player supports 11.1.
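For what it's worth, the "matrixing" I'm talking about isn't exotic. Here's a toy numpy sketch of a passive sum/difference derive with made-up coefficients (real decoders like Pro Logic II or Neo:6 layer steering logic, phase shifts and delays on top of this), just to show how little it takes to light up extra outputs:

Code:
import numpy as np

def passive_upmix(front_l, front_r, width=0.7):
    """Derive extra channels from a stereo pair by simple sum/difference
    matrixing. Coefficients are purely illustrative, not any
    manufacturer's actual algorithm."""
    center = (front_l + front_r) * 0.5       # sum signal
    surround = (front_l - front_r) * 0.5     # difference signal
    wide_l = surround * width                # extra "width/height" pair,
    wide_r = -surround * width               # fed from the difference
    return np.stack([front_l, front_r, center, surround, wide_l, wide_r])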

 

Quote:

If you're not mixing source files at the same resolution as the output, then what's the point? So you can reduce distortion above 48kHz... that no monitor can even produce? Because that's basically what you're talking about... a sample rate that lets you reduce distortion up to an inaudible frequency range. Even people into audio mastering think that's overkill. 24/48 covers you up to 24kHz and human hearing, short of people who have never heard a sound before, tends to run to about 22kHz. So again... WHY? If we were talking 16-bit vs 24-bit, maybe I'd be with ya' on this... but do you seriously see this as some kind of deficit? If so, I guess there really is no satisfying you.

 

 

No, it reduces distortion within the audible range. For a single track it doesn't make a difference; like you say, humans can't hear that high. Mix multiple tracks at the same time, and the errors at the high end of the frequency range start to compound, and it's definitely audible. Games can mix 100+ tracks at once to create the final output, and all those little errors add up. It's one of the reasons movies tend to sound more detailed than games: they're mixed at higher sample rates. At the very end they can downsample the entire mix to 48kHz to suit human hearing limits and save disc space, and the final result is much better than if they had mixed at 48kHz all the way through. The only reason I'd want it to be able to output at 96kHz is that the downsampling is just an unnecessary extra processing step, since there's no need to save disc space in a real-time mix. This isn't something that requires any future support from any industry body; it's just a higher-quality mix that everyone gets the benefit of.
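If anyone wants to poke at this themselves, here's a toy numpy/scipy sketch (not how any console's mixer actually works) that runs a pile of test tones through a cheap linear-interpolation pitch shifter, once entirely at 48kHz and once at 96kHz with a single good downsample at the end, then measures each result against the mathematically ideal 48kHz mix. The voice count, frequencies and pitch ratios are all made up for illustration; which path lands closer to the ideal is the whole argument in a nutshell.

Code:
import numpy as np
from scipy.signal import resample_poly

OUT_RATE = 48_000      # delivery sample rate
HI_RATE = 96_000       # internal mix rate for the "96 kHz" path
DUR = 1.0              # seconds per voice
N_VOICES = 100         # games routinely mix on the order of 100 voices

def shifted_voice(rate, freq, ratio):
    """A sine 'sound effect' run through a cheap linear-interpolation
    pitch shifter -- a stand-in for per-voice mixer processing."""
    src = np.sin(2 * np.pi * freq * np.arange(int(rate * DUR)) / rate)
    idx = np.arange(0, len(src) - 1, ratio)
    lo = idx.astype(int)
    frac = idx - lo
    return src[lo] * (1.0 - frac) + src[lo + 1] * frac

def mix(voices):
    n = min(len(v) for v in voices)
    return np.sum([v[:n] for v in voices], axis=0)

rng = np.random.default_rng(0)
freqs = rng.uniform(200, 10_000, N_VOICES)
ratios = rng.uniform(0.8, 1.25, N_VOICES)

# Path A: every voice processed and summed at 48 kHz from start to finish.
mix_48 = mix([shifted_voice(OUT_RATE, f, r) for f, r in zip(freqs, ratios)])

# Path B: processed and summed at 96 kHz, one polyphase downsample at the end.
mix_96 = resample_poly(
    mix([shifted_voice(HI_RATE, f, r) for f, r in zip(freqs, ratios)]), 1, 2)

# Reference: a pitch-shifted sine is just a sine at freq*ratio, so the ideal
# 48 kHz mix can be written down exactly.
n = min(len(mix_48), len(mix_96))
t = np.arange(n) / OUT_RATE
ideal = np.sum([np.sin(2 * np.pi * f * r * t) for f, r in zip(freqs, ratios)], axis=0)

for name, m in (("mixed at 48 kHz", mix_48), ("mixed at 96 kHz", mix_96)):
    err = m[:n] - ideal
    print(f"{name}: RMS error vs ideal = {np.sqrt(np.mean(err ** 2)):.4f}")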

 

Plain and simple, if you want the highest quality 7.1 48khz sound, heard on the A/V gear you're using today, you'd want to see them mixing internally at 96khz. Not that mixing at 48khz sounds terrible....it's merely "good enough."

 

Quote:
No, I really don't. It's easy to make that parallel now, but when you consider the time when the 360 was designed and released - a full year before PS3 was out or Blu-ray mattered - Dolby Digital was GENEROUS. That's especially true when you consider that the 360 didn't come with HDMI until a later revision, so it's not like you could add the support for PCM output with the later hardware revision for HDMI without completely breaking the encoding standard of the original product. They would have needed one HELL of a crystal ball to release the 360 with HDMI and 5.1 PCM support.

 

The PS3 had HDMI a year after the release of the 360. You really think they needed a crystal ball to see 12 months out? They managed to get the HDMI video working at pixel perfect 1080p, despite component supporting a max of 1080i. Even the internal video scaler in the 360 was able to support scaling to 1080p, they didn't have any trouble "breaking that standard." My guess is that they simply don't put a premium on audio quality, dolby digital was considered "good enough."

 

Quote:
HRTF comes with its own issues, because acoustic modeling aside, not everyone's head is the same. Older implementations in PC audio let you select from a few preset models, but... how much do you see HRTF being used right now? Now compare that to how much you see technology like Dolby Headphone being used for consoles to provide surround to headphone users. Seems pretty obvious why they're going that route.

 

I see HRTFs being used all the time... in Dolby Headphone. They're just doing it after the fact, and doing a worse job of it. The original Xbox supported on-board HRTFs, but both decided to skip sophisticated sound processors on the current gen, opting to do most of the work on the CPUs. Before the days of Xbox Live and PSN you didn't see many people using headsets for gaming, but now they're all over the place. So now you've got the demand (tons of headset users), and MS is back to using a dedicated sound processor... and yet they didn't take the opportunity to build HRTFs in so games could present full-blown, high-quality 3D audio instead of mediocre channel virtualization. Why? My guess is again that they simply don't care about achieving the best final sound quality, because Dolby Headphone and the like are "good enough."
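To put the HRTF point in concrete terms, here's a bare-bones numpy sketch of what proper 3D audio does per sound source: convolve the dry signal with a left/right head-related impulse response measured for that source's direction. The HRIR arrays are assumed to come from a measured set (the public MIT KEMAR or CIPIC data, for example), which is exactly where the "not everyone's head is the same" problem creeps in.

Code:
import numpy as np

def binaural_render(mono, hrir_left, hrir_right):
    """Render one mono source to headphones by convolving it with the
    head-related impulse responses (HRIRs) for its direction. A full 3D
    mix repeats this per source, picking (and interpolating) HRIR pairs
    as the source moves around the listener."""
    return np.stack([np.convolve(mono, hrir_left),
                     np.convolve(mono, hrir_right)])

# Channel virtualization (Dolby Headphone and friends) instead renders the
# finished 5.1/7.1 mix through a handful of fixed speaker-position HRIRs,
# so it can never be more precise than the speaker layout itself.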

 

 

Just like I always say, the X1's graphics will still look great to most people despite being weaker than the PS4, and I'm sure the same will go for the audio. It's just that it's not a very aspirational product when it comes to A/V quality. Keep in mind where you're posting. :p


Steam/PSN/Xbox Live: Darius510
post #7834 of 17130 | 09-11-2013, 07:45 PM | Leo_Ames (AVS Special Member)
Quote:
Originally Posted by bd2003 View Post

The PS3 had HDMI a year after the release of the 360. You really think they needed a crystal ball to see 12 months out? They managed to get the HDMI video working at pixel perfect 1080p, despite component supporting a max of 1080i. Even the internal video scaler in the 360 was able to support scaling to 1080p, they didn't have any trouble "breaking that standard." My guess is that they simply don't put a premium on audio quality, dolby digital was considered "good enough."

My 360 outputs at 1080p over component. No 1080i max that I can see.
post #7835 of 17130 | 09-11-2013, 08:16 PM | c.kingsley (AVS Special Member)
Quote:
Originally Posted by bd2003 View Post

Just like I always say, the X1's graphics will still look great to most people despite being weaker than the PS4, and I'm sure the same will go for the audio. It's just that it's not a very aspirational product when it comes to A/V quality. Keep in mind where you're posting. :p
Has there even been any info released on the PS4's sound capabilities? From what I've read their sound processor is still a black box. It sounds like you're assuming the PS4 will be better and that is your bias showing. Where was Sony at Hotchips? Where have they shed any light on their capabilities or architecture other than talking about how their GPU has more TFLOPS? They haven't. They have not been forthcoming at all. They've gained a strategic advantage in this propaganda war and no one is pressing for actual details -- and some of you just lap it up. We don't even have a finalized CPU speed for the PS4. As of right now the console is vaporware; just like the PS3 was, I might add...

As far as why the PS3 had HDMI, well, it was a Blu-ray player and HDMI was required for HDCP. That is such a rudimentary fact it shouldn't have to be explained. They didn't need a crystal ball because they knew HDMI was required for BD implementation. But hey, props to them because they gambled on BD and it paid off.
Anthony Cler likes this.
post #7836 of 17130 | 09-11-2013, 08:31 PM | TyrantII
Quote:
Originally Posted by c.kingsley View Post

Has there even been any info released on the PS4's sound capabilities? From what I've read their sound processor is still a black box. It sounds like you're assuming the PS4 will be better and that is your bias showing. Where was Sony at Hotchips? Where have they shed any light on their capabilities or architecture other than talking about how their GPU has more TFLOPS? They haven't. They have not been forthcoming at all. They've gained a strategic advantage in this propaganda war and no one is pressing for actual details -- and some of you just lap it up. We don't even have a finalized CPU speed for the PS4.

Well, we know the GPU clock is at 800MHz:

CPU is probably the same or similar to the XBone, and as BD has pointed out, the CPU is not going to be a huge deal in these consoles as they rely on it for fewer and fewer tasks. Whether or not they can get a similar GPU/CPU clock boost is up in the air, but they gain little from doing so now. Better to wait until devs might need it and the hardware is better known.
Quote:
As of right now the console is vaporware; just like the PS3 was, I might add...

Huh?
post #7837 of 17130 | 09-11-2013, 08:55 PM | bd2003 (AVS Addicted Member)
Quote:
Originally Posted by c.kingsley View Post


Has there even been any info released on the PS4's sound capabilities? From what I've read their sound processor is still a black box. It sounds like you're assuming the PS4 will be better and that is your bias showing. Where was Sony at Hotchips? Where have they shed any light on their capabilities or architecture other than talking about how their GPU has more TFLOPS? They haven't. They have not been forthcoming at all. They've gained a strategic advantage in this propaganda war and no one is pressing for actual details -- and some of you just lap it up. We don't even have a finalized CPU speed for the PS4. As of right now the console is vaporware; just like the PS3 was, I might add...

 

You'll notice I never mentioned the PS4 a single time in those last few posts, other than to say exactly what you're saying, that it's a mystery box. I'm not assuming the PS4 is better, because we don't know its capabilities, nor do we know its limits. The only concrete thing anyone seems to know about the PS4's audio is that they've got an onboard MP3 decoder capable of decoding a ton of streams simultaneously, similar to what the X1 can do with their XMA format... that's just about it. If it has the same limitations that the X1 has, I'm equally disappointed.

 

If I had to guess, I'd imagine it's a lot like the PS3 and 360 - all of the basic audio processing that the X1 offloads to SHAPE is handled by the CPU/GPU.  If they had something like SHAPE, they'd be talking about it. So while that places an additional burden on the shared resources, it may be free of any arbitrary limits imposed by fixed function hardware. 

c.kingsley likes this.

Steam/PSN/Xbox Live: Darius510
post #7838 of 17130 | 09-11-2013, 08:58 PM | c.kingsley (AVS Special Member)
Quote:
Originally Posted by TyrantII View Post

CPU is probably the same or similar to the XBone, and as BD has pointed out, the CPU is not going to be a huge deal in these consoles as they rely on it for fewer and fewer tasks. Whether or not they can get a similar GPU/CPU clock boost is up in the air, but they gain little from doing so now. Better to wait until devs might need it and the hardware is better known.
Huh?
That's great, but you spent a lot of time dodging the fact that the CPU is still an unknown. I'd hardly say the CPU isn't a huge deal; they didn't put 8-core Jaguars in them for nothing. If the CPU was so irrelevant they could have just gone single core and put a massive GPU in its place. The truth is, while the PS4 has better PowerPoint specs released, there is more final, technical information available about the XB1. Sony made all sorts of absurd claims about the PS3 that never materialized. Until they release some actual technical documentation beyond "TEH TFLPZ" I will remain skeptical about anything Sony has to say.
post #7839 of 17130 | 09-11-2013, 09:02 PM | c.kingsley (AVS Special Member)
Quote:
Originally Posted by bd2003 View Post

You'll notice I never mentioned the PS4 a single time in those last few posts, other than to say exactly what you're saying, that it's a mystery box. I'm not assuming the PS4 is better, because we don't know its capabilities, nor do we know its limits. The only concrete thing anyone seems to know about the PS4's audio is that they've got an onboard MP3 decoder capable of decoding a ton of streams simultaneously, similar to what the X1 can do with their XMA format... that's just about it. If it has the same limitations that the X1 has, I'm equally disappointed.

If I had to guess, I'd imagine it's a lot like the PS3 and 360 - all of the basic audio processing that the X1 offloads to SHAPE is handled by the CPU/GPU.  If they had something like SHAPE, they'd be talking about it. So while that places an additional burden on the shared resources, it may be free of any arbitrary limits imposed by fixed function hardware. 
I stand corrected then and I apologize if I misunderstood you. It is a mystery box and I'm glad to see we agree! From what I've seen, they are actually handling audio tasks as computations on the GPU on the PS4. I saw an article from their hardware engineer earlier where he states that, but I can't find it now.
post #7840 of 17130 | 09-11-2013, 09:33 PM | bd2003 (AVS Addicted Member)

The closest I've seen to that is Timothy Lottes (creator of FXAA/TXAA) guessing at it:

 

http://www.gameranx.com/updates/id/12324/article/ps4-a-speculative-analysis-of-sony-s-next-gen-console/

 

Quote:
Support for upto 6 Audio Streams :: HDMI supports audio, so the GPU actually outputs audio, but no PC driver gives you access. The GPU shader is in fact the ideal tool for audio processing, but on the PC you need to deal with the GPU->CPU latency wall (which can be worked around with pinned memory), but to add insult to injury the PC driver simply just copies that data back to the GPU for output adding more latency. In theory on something like a PS4 one could just mix audio on the GPU directly into the buffer being sent out on HDMI.

 

Whether or not Sony's already provided developers tools to use the GPU for audio, I have no idea. I don't doubt that Timothy knows his stuff, and he doesn't have a dog in this race... if he says a GPU is ideal for processing audio, I take him at his word. MS has a lot of good reasons to include something like SHAPE when they're so heavily focused on multitasking; it's fast, but none of the processing is all that sophisticated. The MS documentation shows that devs can override the hardware processing and implement whatever they want in software, but then they're relinquishing that hardware advantage.

 

It also seems to imply that middleware can't utilize SHAPE, and third-party devs heavily rely on middleware like FMOD to simplify cross-platform development, so a lot of games may outright ignore it.

 

Quote:
WASAPI (Windows Audio Session API) can be used for any custom, exclusively software-implemented pipeline. WASAPI provides audio endpoint functionality only. Decompression, sample-rate conversion, mixing, and digital-signal processing, as well as interactions with Durango’s audio hardware components, must be implemented by the client. WASAPI is most typically used by audio middleware solutions.

 

Given all that, it's hard for me to get too excited about what it's capable of, and if it's taken out of the equation by middleware or by devs opting for software to do more sophisticated processing... I'd think whichever platform has the most overall computational resources is going to come out ahead for audio.
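To make that concrete, this is roughly the shape of the all-software path the WASAPI note above is describing: the SRC, mixing and panning the title (or its middleware) has to do itself once the hardware path is bypassed. It's a toy Python/numpy sketch with a hypothetical voice structure, not anyone's actual engine code, and a real mixer would convert incrementally and track per-voice read positions instead of resampling whole buffers.

Code:
import numpy as np
from scipy.signal import resample_poly

def mix_block(voices, out_rate=48_000, frames=256, out_channels=8):
    """One block of a software mixer. `voices` is a hypothetical list of
    dicts holding already-decoded `samples`, their `rate`, and
    per-output-channel `gains`."""
    out = np.zeros((out_channels, frames), dtype=np.float32)
    for v in voices:
        buf = v["samples"]
        if v["rate"] != out_rate:              # sample-rate conversion
            buf = resample_poly(buf, out_rate, v["rate"])
        buf = buf[:frames]                     # one block's worth
        for ch, gain in enumerate(v["gains"][:out_channels]):
            out[ch, :len(buf)] += gain * buf   # pan and accumulate
    return out                                 # hand this to the audio endpoint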


Steam/PSN/Xbox Live: Darius510
post #7841 of 17130 | 09-11-2013, 09:52 PM | Jeremy Anderson (AVS Special Member)
Quote:
Originally Posted by bd2003 View Post

Plain and simple, if you want the highest quality 7.1 48khz sound, heard on the A/V gear you're using today, you'd want to see them mixing internally at 96khz. Not that mixing at 48khz sounds terrible....it's merely "good enough."
Y'know what? I give... I hope one day someone creates a console that satisfies your completely unrealistic and ridiculous needs for formats that don't exist and frequency ranges that no monitors can reproduce. Good luck with that. Maybe they can power it all with unicorn dreams while they're at it. God forbid your golden ears should have to hear >24kHz distortion while you're playing Call Of Duty. :rolleyes:
Quote:
Originally Posted by bd2003 View Post

The PS3 had HDMI a year after the release of the 360. You really think they needed a crystal ball to see 12 months out? They managed to get the HDMI video working at pixel perfect 1080p, despite component supporting a max of 1080i. Even the internal video scaler in the 360 was able to support scaling to 1080p, they didn't have any trouble "breaking that standard." My guess is that they simply don't put a premium on audio quality, dolby digital was considered "good enough."
First, component video supports 1080p - I know, 'cause I was running it before I went HDMI - and the 360 supported 1080p output over component with one of their first dashboard updates, with the launch consoles. That's with the original ANA scaler in the 360. The HDMI versions of the 360 used a slightly modified scaler (HANA) that did the same thing, but it didn't have to break any standard to do so... it just needed the chipset to handle TMDS. This was after the HDMI 1.3 spec was released... in June of 2006, 7 months after the 360 was released. You could argue that the original HDMI 1.0 spec allowed for 8 channels of LPCM... but no one was using it, just like no one at that time was using 1080p video. At the time (and through HDMI 1.2), HDMI was struggling and was very much a niche connection. Realtime Dolby Digital mixing and encoding was actually a pretty slick thing for them to pull off at the time the 360 was released, so saying that they "didn't put a premium on audio quality" is just poppycock. So yeah... it would've been a mean feat for Microsoft to go HDMI 7 months before the 1.3 spec was ratified or widespread support had taken hold. Just sayin'. But that doesn't change the fact that the current situation is completely different, since you're still asking them to support formats that don't currently exist and no one plans to implement... ever. And you can't seem to explain why you think that's going to happen, except MOAR CHANNELS!!!! It's good to want things, I guess.
Quote:
Originally Posted by bd2003 View Post

Just like I always say, the X1's graphics will still look great to most people despite being weaker than the PS4, and I'm sure the same will go for the audio. It's just that it's not a very aspirational product when it comes to A/V quality. Keep in mind where you're posting. :p
And again, "weaker than the PS4" is yet to be seen in practice. Only people who take spec numbers at face value and completely discount the supporting system architecture and its particular bottlenecks/efficiencies "know" that to be true... and since none of us are privy to either system's developer white papers or are actually developing on both systems, throwing statements out like that is just feeding the fanboy fire. And especially on an A/V forum, you would expect people to understand that instead of getting hung up on numerical dick-measuring.

"Never believe any quote you read on the internet." - Abraham Lincoln
post #7842 of 17130 | 09-12-2013, 04:29 AM | Jeremy Anderson (AVS Special Member)
I read every word of it... They give you a dual DPU audio processor with more power than even the flagship sound cards on the PC, and it's just "good enough". So we should just agree to disagree. I ain't mad 'atcha! ;)

"Never believe any quote you read on the internet." - Abraham Lincoln
post #7843 of 17130 | 09-12-2013, 05:14 AM | TyrantII
Quote:
Originally Posted by c.kingsley View Post

That's great, but you spent a lot of time dodging the fact that the CPU is still an unknown. I'd hardly say the CPU isn't a huge deal, they didn't put 8 core Jaguars in them for nothing. If the CPU was so irrelevant they could have just gone single core and put a massive GPU in its place. The truth is, while the PS4 has better Powerpoint specs released, there is more final, technical information available about the XB1. Sony made all sorts of absurd claims about he PS3 that never materialized. Until they release some actual technical documentation beyond "TEH TFLPZ" I will remain skeptical about anything Sony has to say.

We're missing the official clocks for the CPU and exactly what their dedicated sound chip is (some people have said it exists). First, me stating that again isn't dodging anything. Second that's hardly a laundry list of more specs.

We only got any of the real specs for the XBone a few weeks ago at Gamescom.

The CPU isn't irrelevant. It will just become more so as the generation trudges on and GPGPU makes its way into PC gaming. It's a bit of insurance right now, and since the capabilities will be there long term, it means more performance long term for the consoles.

And if you're skeptical, go play The Last of Us. No really. It's their claims come to life in a masterpiece of a videogame that everyone should play.

They were right about their tech 100%. They were totally, flat out, embarrassingly wrong about their ideas about the industry and game development; hence their 180 going into next gen and going with a much simpler and more direct architecture.
post #7844 of 17130 | 09-12-2013, 07:54 AM | Kimeran (AVS Special Member)
So it took them 8 years to get a game to take full advantage of the cell processor.....

I am not bashing the PS3 because I have one and enjoyed it for many years. However, no one can argue that the approach that Sony took was a little, shall we say, ambitious? Most of the games that were multi-platform were, in my opinion, worst on the PS3. Why? Cause no one could program for the processor effectively. It's the same reason that it is a complete waste of money for someone to build a gaming computer on the i7 when not a single game can take advantage of the hyperthreading.

Clock speeds are not everything, look at the bulldozer vs. the i5 CPU for instance...

Do clock speeds matter? YES but not as much as what everyone is making them out to be.

I am with c.kingsley as being skeptical about Sony's claims. I am also skeptical about the X1...

We simply will not know until we see benchmarks of games running on the different systems and which can keep a higher frame rate.

Trying to enjoy the simple things in life.

 

Steam: madbrayniak

post #7845 of 17130 | 09-12-2013, 09:11 AM | TyrantII
Quote:
Originally Posted by Kimeran View Post

So it took them 8 years to get a game to take full advantage of the cell processor.....

Not at all, just pointing out the crown jewel of the efforts and thought behind the design. ND has been pushing the console envelope since Uncharted 1, and while it might not be as impressive now, at the time it was beating out other developers on the same hardware and on the 360. But it also shows the problems inherent in the philosophy behind the PS3, as only Sony's 1st party studios had a grip on the tech and were willing to get the most out of it. It was that way up until the recent Tomb Raider I'd say (which is no slouch on the 360 either).
Quote:
However, no one can argue that the approach that Sony took was a little, shall we say, ambitious? Most of the games that were multi-platform were, in my opinion, worst on the PS3. Why? Cause no one could program for the processor effectively. It's the same reason that it is a complete waste of money for someone to build a gaming computer on the i7 when not a single game can take advantage of the hyperthreading.

I think that's being kind. Arrogant and presumptuous is more like it. Even worse they intentionally held back developer support before launch while allowing their ICE team to work only with first party studios (who still were a year out from release on their titles at launch). The thinking was 3rd party would be forced to step up to their level. The reality was it led to that 24-28 month drought, pitiful games from 3rd parties, and the persistent idea that "PS3 doesn't have any games" that you still hear today.
Quote:
Clock speeds are not everything, look at the bulldozer vs. the i5 CPU for instance...Do clock speeds matter? YES but not as much as what everyone is making them out to be. I am with c.kingsley as being skeptical about Sony's claims. I am also skeptical about the X1...

Bandwidth seems to be king, and no, a minor 50-100MHz difference in clocks won't mean anything. The 50% more CUs, though, is a different story and there's no way around that, nor the dedicated CUs for asynchronous compute. Still, people are very right that the layman won't see much of a difference, nor will mom care whether a game is 60fps variable or 60 locked when buying for Johnny.

But what does strike me as unusual is that there are a number of devs putting their reputations on the line and speaking up about the development differences in those communities. Just last week another dev, a Gears of War: Judgment dev, came out and said it. Rumor is Cerny wasn't kidding and time to triangle is much, much quicker on PS4 (be it the tech or the APIs or both), and performance off the bat is much better. And if it's cheaper and quicker to get games running, one might think it gives them more time to optimize and add bells and whistles as well.

There's some pull and tug there as well, as they don't want to differentiate too much or run the risk of creating a bad version of their product. But they also run the same risk by intentionally holding one version back.

Either way, it's going to be interesting.
Bazylik likes this.
post #7846 of 17130 | 09-12-2013, 09:22 AM | c.kingsley (AVS Special Member)
Quote:
Originally Posted by TyrantII View Post

But what does strike me as unusual is that there are a number of devs putting their reputations on the line and speaking up about the development differences in those communities. Just last week another dev, a Gears of War: Judgment dev, came out and said it. Rumor is Cerny wasn't kidding and time to triangle is much, much quicker on PS4 (be it the tech or the APIs or both), and performance off the bat is much better. And if it's cheaper and quicker to get games running, one might think it gives them more time to optimize and add bells and whistles as well.

There's some pull and tug there as well, as they don't want to differentiate too much or run the risk of creating a bad version of their product. But they also run the same risk by intentionally holding one version back.

Either way, it's going to be interesting.
That's grasping at straws there. There may be some actual technical developers who have made comments, but quoting the former "Creative Director" of a studio is not convincing. He has so much experience with the hardware that he says "...all next-gen AAA devs I talk to..." instead of, "in my experience...."

He works in the art department pushing pens around on a desk; he doesn't have any actionable knowledge. :rolleyes:

Here is what Carmack had to say:
Quote:
“It’s almost amazing how close they are in capabilities, how common they are,” Carmack said. “And that the capabilities that they give are essentially the same.”
post #7847 of 17130 | 09-12-2013, 09:34 AM | TyrantII
A Creative director doesn't push pens around, but regardless that's not even the point.

Someone in the industry, with other contacts in the industry, confirming what 20 or so developers have been saying on the net isn't grasping at straws. It's reporting what people are finding. Also note, this isn't that the PS4 is 50% faster or that games will be 50% faster. This is a comment on getting first-boot software up and running on the new consoles for the first time. It's also coming on the heels of info that CODG took 3 weeks and was up and running at 60FPS for first boot on PS4, while it took 4 months and was running at 15FPS on the XBone for first boot. eSRAM seemed to be the problem, as you need to heavily optimize code around it.

Take the rumors and industry comments as you will, but they're out there. I'm not just throwing out FUD.

As for Carmack, he's been a businessman first and foremost for a very long time. He's now pushing Oculus Rift and I'm sure he wants to be on the good side of both console manufacturers (make it happen MSONY!). And no, he's not even wrong. The consoles are very similar. But they do have their key differences that might mean performance differences. Gaming PC rigs are very similar as well, but they don't all run the same or have the same hardware. His comment was from months ago, before E3, and as far as I know he doesn't have anything in the pipe for next-gen consoles outside of possible OR support, which isn't game software.
post #7848 of 17130 | 09-12-2013, 09:43 AM | Kimeran (AVS Special Member)
Haven't seen this posted:

http://www.eurogamer.net/articles/digitalfoundry-xbox-one-cpu-speed-increased-in-production-hardware

Not much of a boost but it's still a boost.

On a note to PC....

With both consoles being based on x86, you have to wonder how PC gaming is even going to change with the new generation of consoles?

Will the ports be better?

Will there be more ports?

Will PC games take sales away from the consoles and make the cost of games higher?

Trying to enjoy the simple things in life.

 

Steam: madbrayniak

post #7849 of 17130 | 09-12-2013, 09:48 AM | c.kingsley (AVS Special Member)
Quote:
Originally Posted by TyrantII View Post

A Creative director doesn't push pens around, but regardless that's not even the point.
That's a sarcastic reference that people in technical work sometimes use to refer to upper management. Humor: it is supposed to elicit laughter, not robotic confusion. ;)

He's an art director, I get that. And he likes to blab on Twitter. But he's claiming to represent experiences of other people, not his own. He could claim that developers told him that the PS4 contains the nuclear launch codes, but we can't verify any of it. And, really, who cares how fast they had COD running at 60FPS? What does that prove? What matters is the end product on both consoles. Period, the end. As they say, "There are many roads to Rome."
post #7850 of 17130 | 09-12-2013, 09:50 AM | TyrantII
I think we'll see more ports going both ways. With the consoles catching up to mid-to-high-range PCs, we should also see better assets and shaders being developed and used on PCs.

If MSONY can get some really good exclusives out early (and it's looking like that may be the case) and with the relative price / performance of the new consoles I think we might see a move back to consoles once again for a few years a bit sooner this time. Then 4-5 years out a move back to PC as people pick up GPGPU based videocards among other new hardware ideas.

PC / console cycles tend to be cyclical, with the fat middle being ruled by the console during their stride and the very start and ends as the time when people look towards upgrading their PC's.
post #7851 of 17130 | 09-12-2013, 09:50 AM | c.kingsley (AVS Special Member)
Quote:
Originally Posted by Kimeran View Post

Haven't seen this posted:

http://www.eurogamer.net/articles/digitalfoundry-xbox-one-cpu-speed-increased-in-production-hardware

Not much of a boost but it's still a boost.

On a note to PC....

With both consoles being based on x86, you have to wonder how PC gaming is even going to change with the new generation of consoles?

Will the ports be better?

Will there be more ports?

Will PC games take sales away from the consoles and make the cost of games higher?
Hopefully the 8 core CPUs in these consoles will nudge multi-threaded development into the mainstream. With a few exceptions, developers have not been taking full advantage of modern CPUs and that is a shame. Gotta love seeing games peg 1 core of a 4+ core CPU...
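A sketch of the kind of change that implies, using Python's standard library purely for illustration (real engines use persistent job/task systems in C++ rather than process pools): chop a frame's CPU-heavy work into independent chunks and fan it out across every core instead of pegging one.

Code:
from concurrent.futures import ProcessPoolExecutor
import os

def simulate_chunk(entities):
    """Stand-in for CPU-heavy per-frame work (AI, physics, animation)
    on one slice of the game world."""
    return [e * e for e in entities]

def update_frame(world, workers=os.cpu_count()):
    """Split the frame's work into independent chunks and run them on
    all cores -- the shape of problem an 8-core Jaguar rewards."""
    size = max(1, len(world) // workers)
    chunks = [world[i:i + size] for i in range(0, len(world), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return [x for part in pool.map(simulate_chunk, chunks) for x in part]

if __name__ == "__main__":
    print(len(update_frame(list(range(1_000_000)))))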
post #7852 of 17130 | 09-12-2013, 09:59 AM | c.kingsley (AVS Special Member)
Quote:
Originally Posted by TyrantII View Post

I think we'll see more ports going both ways. With the consoles catching up to mid-to-high-range PCs, we should also see better assets and shaders being developed and used on PCs.

If MSONY can get some really good exclusives out early (and it's looking like that may be the case) and with the relative price / performance of the new consoles I think we might see a move back to consoles once again for a few years a bit sooner this time. Then 4-5 years out a move back to PC as people pick up GPGPU based videocards among other new hardware ideas.

PC / console cycles tend to be cyclical, with the fat middle being ruled by the console during their stride and the very start and ends as the time when people look towards upgrading their PC's.
It will be interesting to see if the consoles drive AMD's market share in the PC arena. They're sort of running away from Intel with their parallel computing strategy. Intel's on chip video is pretty horrendous for gaming at this point. Right now the discrete GPUs are faster but I think they are on the right track with unified memory structure. Eventually on PCs they're going to bottleneck moving all that data from system memory into GPU memory. There is very impressive potential for AMD and they have a huge head start in this space due to both these consoles.
post #7853 of 17130 | 09-12-2013, 09:59 AM | TyrantII
Quote:
Originally Posted by c.kingsley View Post

That's a sarcastic reference that people in technical work sometimes use to refer to upper management. Humor: it is supposed to elicit laughter, not robotic confusion. ;)

He's an art director, I get that. And he likes to blab on Twitter. But he's claiming to represent experiences of other people, not his own. He could claim that developers told him that the PS4 contains the nuclear launch codes, but we can't verify any of it. And, really, who cares how fast they had COD running at 60FPS? What does that prove? What matters is the end product on both consoles. Period, the end. As they say, "There are many roads to Rome."

Sorry, sometimes humor or sarcasm doesn't come across in text. How is it the 21st century and the internet hasn't found a way to denote that without an emoticon!? ;)

Still, he's much more than an art director. He builds and runs game companies, and he works closely with every department to bring his creative vision to fruition. Think Gabe Newell, Cliffy B, David Jaffe, Tim Schafer, or Ken Levine. Carmack even fits in there, but he's been drifting more towards the technical side of the business as time goes on (id Tech) and now is exclusively in hardware with OR after he left id.
Quote:
Originally Posted by c.kingsley View Post

Hopefully the 8 core CPUs in these consoles will nudge multi-threaded development into the mainstream. With a few exceptions, developers have not been taking full advantage of modern CPUs and that is a shame. Gotta love seeing games peg 1 core of a 4+ core CPU...

It's funny you mentioned Carmack above, because he, along with Gabe from Valve, derided multithreading for games back in the day when the PS360 were pushing it. They especially hated the PS3's Cell setup, but weren't fans of the 360's multicore processor either.

And you're right, it still hasn't caught on, at least not using the CPU super efficiently. Quite often devs will run most of their code fully on one thread and just throw stuff off to the others inefficiently when needed. It's a terrible waste of what they have.
post #7854 of 17130 | 09-12-2013, 10:21 AM | chmorgan (Member)
http://www.expertreviews.co.uk/games/1302265/xbox-one-review-hands-on

Nice reference to Forza having Dolby True HD...

XBL: octavarium1/PSN:octavarium1970
post #7855 of 17130 | 09-12-2013, 10:41 AM | michaeltscott (AVS Addicted Member)

A search of the thread for "1.75" reveals that the info was posted and discussed 9 days ago.

Mike Scott (XBL: MikeHellion, PSN: MarcHellion)

"Think of the cable company as a group of terrorist (sic)." -- hookbill
post #7856 of 17130 | 09-12-2013, 10:41 AM | bd2003 (AVS Addicted Member)
Quote:
Originally Posted by chmorgan View Post

http://www.expertreviews.co.uk/games/1302265/xbox-one-review-hands-on

Nice reference to Forza having Dolby True HD...

That's probably a misunderstanding; there's no reason they'd need to use any sort of compression.
Jeremy Anderson likes this.

Steam/PSN/Xbox Live: Darius510
post #7857 of 17130 | 09-12-2013, 11:15 AM | pcweber111 (AVS Special Member)
Well, it's possible there's more than one audio track. Uncharted had them, so it's not unprecedented. Still, you're probably right; there doesn't seem to be a need for even a lossless compressed track. Just use PCM and be done with it.

Edit: Then again, Uncharted had lossy DD and DTS to ensure backwards compatibility and uncompressed for those that could take advantage of it, so doing a TrueHD track doesn't make much sense when you can just do DD instead.
post #7858 of 17130 | 09-12-2013, 11:28 AM | bd2003 (AVS Addicted Member)
Yeah...any receiver that does TrueHD can also do PCM.

Steam/PSN/Xbox Live: Darius510
post #7859 of 17130 | 09-12-2013, 11:31 AM | Dr.Savage (Member)
Guys, I am at a total loss.

This is the weirdest time. I have reviewed websites from the completely-out-on-the-end-of-the-branch (dGPU) ones to the more reputable ones with actual people that presumably work in tech, etc.

They can't make heads or tails of what Microsoft is doing. Al is on NeoGAF saying, wait for it, wait for it (kicking the can), we are working with tech to find the best way to tell you the truth (thus looking for a way to put themselves in the best light), saying they need a few weeks... One recent post from a "credible" website is presuming maybe the chip is bigger than 363 mm2. Another saying two APUs. Another saying XB1 may use double-pumping of the SRAM storage cells, which presumably allows "SRAM to present two external ports, each capable of performing one transaction per clock cycle". I assume that is a good thing, my take being that makes the SRAM twice as effective (???). Now I am not saying this to argue some pie-in-the-sky new dGPU rumor, but I am saying this because the tech guys are so at a loss... Thus, they may be creating stuff out of thin air that maybe doesn't even exist. My main point is pointing out their complete confusion. Why should Al muddy the water if it is in fact less powerful? Just focus on games, Live, Kinect convenience, and say you got Titanfall and Halo. Boom!

Is there some NDA? Is the XB1 actually less powerful than is currently believed? More powerful? What the heck!!! I can't fathom (again, I am a novice, even less than a novice, at this tech stuff, but I have built PCs so understand general PC stuff), but again I can't fathom how the XB1 can only have this one chip that has been shown, that can host 3 OSs with an 8-core Jaguar and a 6000-to-7000-class GPU, and still show games like Ryse (plus snapping things in and out), which from videos has been described as running on XB1s, not devkits. Granted it is 30fps, but still. Something doesn't add up. Am I the only one thinking that there is more going on than shown, especially since Microsoft seems completely self-destructive with the whole NeoGAF deal? Unless they are going for the nice-guy-gets-attacked-by-weirdos angle to get sympathy. Don't get me wrong, I like Al, and I have sympathy for him with that deal, but I was going to get the XB1 anyway. Sorry, just wanted to vent; this information lockdown is getting annoying. Is there an NDA, is the XB1 weaker than presumed, stronger than presumed? And I understand the answer is most likely wait and see. :rolleyes: Al is a marketing guy; how do you consider doing the rope-a-dope with tech weirdos a good marketing strategy? That strategy only works if he fires the right hand off the rope at some point and knocks them out (Ali/Foreman)...

FACEPALM... sorry, just wanted to vent! Haha :)
ufcraig21 likes this.
post #7860 of 17130 | 09-12-2013, 11:37 AM | c.kingsley (AVS Special Member)
Quote:
Originally Posted by Dr.Savage View Post

Another saying XB1 may use double-pumping...
Now things are getting interesting. :eek:

Honestly, until they release their supposed tech-heavy explanation in the future, you should ignore 99% of that stuff. It'll just make your head explode because it is all over the map.