4k by 2k or Quad HD...lots of rumors? thoughts? - Page 3 - AVS Forum
post #61 of 3692 Old 03-25-2011, 01:27 PM
CatBus (Senior Member)
Quote:
Originally Posted by Dan Filice View Post

Since Star Wars came out in VHS, I think I've purchased 6 different versions

You didn't stop with the Cowclops edition? Shame on you!
CatBus is offline  
post #62 of 3692 Old 03-26-2011, 11:27 AM
RLBURNSIDE (Advanced Member)
Even if only for the cinema projector or PC monitor market, it makes more sense there than for home movies. Although, with disc capacity getting huge...why not? I'd love a copy of LOTR or Aliens in Quad HD. In fact, they showed some screens of Quad HD LOTR a while back and it looks incredible. If screens keep getting higher DPI, why not sit closer or get a bigger screen? I sit pretty close to my 46-inch plasma at 1080p, but I sometimes wish I could have higher rez and a slightly bigger screen, just for games and more Windows real estate. I read somewhere that 4K is the optimal resolution for 70mm film transfers.

They will have to go Quad from a pure marketing gimmick POV as well, especially for huge screen sizes where 1080p is just not enough for PC use. But for now let's get perfect 1080p displays: true blacks, good 3D, 120Hz for each eye. There's no reason why animated movies in 3D can't be rendered in Quad HD, what with CPUs getting better and better. Real-time content such as games, not movies, is where I see >1080p resolutions getting traction initially, IMHO.
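Whether those extra pixels are actually visible comes down to screen size and seating distance. A rough back-of-the-envelope check (a hedged sketch in Python; the 1-arcminute acuity figure, the 16:9 shape and the example screen sizes are assumptions, not anything from this thread):

```python
import math

def max_useful_distance(width_px, diag_in, aspect=16 / 9, acuity_arcmin=1.0):
    """Farthest viewing distance (inches) at which adjacent pixels are still
    about one arcminute apart for a viewer with 20/20 acuity."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)   # screen width from diagonal
    pixel_pitch = width_in / width_px                     # inches per pixel
    return pixel_pitch / math.tan(math.radians(acuity_arcmin / 60))

for diag in (46, 70):
    for name, width in (("1080p", 1920), ("Quad HD / 4K", 3840)):
        d = max_useful_distance(width, diag)
        print(f'{diag}" {name}: resolvable out to roughly {d / 12:.1f} ft')
```

By that estimate a 46-inch 1080p panel is already pixel-limited at normal couch distances, which is roughly the "sit closer or go bigger" point above.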
RLBURNSIDE is offline  
post #63 of 3692 Old 03-27-2011, 07:28 AM
specuvestor (AVS Special Member)
Incidentally, I just recently realized why these high-performance CPUs or GPUs cannot be in the TV package (also for purposes of deinterlacing or scaling):

How much power do you think these processors currently consume?
specuvestor is offline  
post #64 of 3692 Old 03-27-2011, 08:31 AM
Nielo TM (AVS Special Member)
The CPU and GPU inside TVs are usually low-power SoCs. They are by no means identical to the ones found in game consoles or PCs (excluding the Cell TV).

SoCs/embedded platforms are designed for a specific use with a strict set of standards. But SoCs are becoming more powerful and have started to feature specs comparable to fully fledged CPUs and GPUs. The SoC embedded within the PSP2 is comparable to a modern GPU/CPU, and the same could be said for the components within the iPad 2.

The newly developed AMD Fusion can also be embedded within Smart TVs that require a low-power, high-performance processor.
Nielo TM is offline  
post #65 of 3692 Old 03-27-2011, 09:56 PM
specuvestor (AVS Special Member)
Quote:
Originally Posted by Nielo TM View Post

The CPU and GPU inside TVs are usually low-power SoCs. They are by no means identical to the ones found in game consoles or PCs (excluding the Cell TV).

Agree. I was thinking about our last discussion on why TVs are not able to deinterlace as proficiently as you claim is possible with current algorithms, until I happened on a thread that says 2011 50" plasmas are drawing around 200W, down to as low as 149W. I think half of the 2011 plasmas are not even inverse-telecine capable; not sure about LCDs.

The Fusion APU draws 18W if I read correctly, vs 200W for high-end gaming cards, but I have yet to figure out their real-world performance (think Atom). People and regulators are concerned about power use in TVs, and since AVRs do not have this "spotlight" it may actually be an opportunity for them to incorporate better processors. But I am watching the anecdotal evidence and trying to reconcile the theory with what is being done in the real world.

Eventually SoCs will get more powerful with less power consumption as nodes go below 32nm, but I think that will not be anytime soon.

http://www.tomshardware.com/reviews/...ards,2849.html
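As a rough sanity check on those numbers (a small sketch; the wattages are just the figures quoted above, and treating TDP as if it were sustained draw is a simplification):

```python
# How much an APU-class processor would add to a TV's power budget,
# using the figures quoted above (TDP treated, crudely, as typical draw).
tv_draw_w = {"2011 50-inch plasma (high)": 200, "2011 50-inch plasma (low)": 149}
fusion_apu_w = 18
gaming_gpu_w = 200

for name, watts in tv_draw_w.items():
    print(f"{name}: +{fusion_apu_w / watts:.0%} with an 18W APU on top")

print(f"A 200W gaming card on a 200W set would be +{gaming_gpu_w / 200:.0%}")
```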
specuvestor is offline  
post #66 of 3692 Old 03-28-2011, 04:03 AM
Nielo TM (AVS Special Member)
There's no need for a powerful processor for de-interlacing and scaling; it is the quality of the algorithm that matters. BTW, the new 2011 Panasonic NeoPlasma (42GT30) has passed both 2:2 and 2:3 cadence detection.

http://www.hdtvtest.co.uk/news/panas...1103281070.htm



So that only leaves out LG plasmas.
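For anyone wondering what 2:3 (3:2 pulldown) detection actually involves, here is a minimal sketch of the idea (Python/NumPy; purely illustrative, not Panasonic's or anyone's shipping algorithm). Film telecined to 60i repeats one field in every five of the same parity, so a processor can look for that periodic near-duplicate and then weave the original frames back together:

```python
import numpy as np

def detect_23_pulldown(fields, dup_thresh=1.0):
    """fields: list of 2D arrays, the alternating top/bottom fields of 60i video.
    Returns the pulldown phase (0-4) if the 2:3 repeat pattern is found, else None."""
    # Mean absolute difference between each field and the previous field of the
    # same parity (two positions back), in float to avoid uint8 wraparound.
    diffs = np.array([
        np.mean(np.abs(fields[i].astype(np.float32) - fields[i - 2].astype(np.float32)))
        for i in range(2, len(fields))
    ])
    for phase in range(5):
        repeats = diffs[phase::5]                                 # candidate duplicate slots
        others = np.delete(diffs, np.arange(phase, len(diffs), 5))
        if repeats.size and others.size and \
           repeats.mean() < dup_thresh and repeats.mean() * 4 < others.mean():
            return phase   # locked onto the cadence; safe to do inverse telecine
    return None
```

A set that fails the 2:3 test is typically one that never locks onto that pattern (or loses the lock), so it falls back to video-mode deinterlacing and throws away half the vertical resolution of film content.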
Nielo TM is offline  
post #67 of 3692 Old 03-28-2011, 04:45 PM
specuvestor (AVS Special Member)
Quote:
Originally Posted by Nielo TM View Post

There's no need for a powerful processor for de-interlacing and scaling; it is the quality of the algorithm that matters. BTW, the new 2011 Panasonic NeoPlasma (42GT30) has passed both 2:2 and 2:3 cadence detection.

http://www.hdtvtest.co.uk/news/panas...1103281070.htm

So that only leaves out LG plasmas.

OK, timely review, because I read earlier in the Panny thread that they will defect to Sammy as it does not support 24p. So that leaves the ST series in Panny? What about LCDs?

IMHO I don't think it is as simple as a software or firmware issue, otherwise the GPU guys would long ago have taken over the TV video market, instead of the distinct camps we have: ATI (now AMD) and Nvidia on PC; MediaTek, Trident, MStar, Anchor Bay, Qdeo, Lumagen, Faroudja and other segmented players on the TV side. The market doesn't seem to suggest it is so straightforward.

PS: It would be interesting to also follow how other sets stack up in broadcast cadence detection.
specuvestor is offline  
post #68 of 3692 Old 03-28-2011, 05:35 PM
Nielo TM (AVS Special Member)
I'm not sure I understand the point regarding the GPU.

GPUs and video processors are very different. And the latest GPUs from AMD and NVIDIA have dedicated (fully programmable) video processors on board (called UVD and PureVideo respectively). So features can be added via update, but certain features are hardware-based.
Nielo TM is offline  
post #69 of 3692 Old 03-28-2011, 09:22 PM
specuvestor (AVS Special Member)
Quote:
Originally Posted by Nielo TM View Post

GPU and Video Processors are very different.

Totally agree... it is just that in a previous thread you were referring to GPUs. I had been wondering why Nvidia and ATI couldn't capture the TV VP market if their video processing is as effective as claimed.

"because the $30-40 ATI HD5450 can correctly de-interlace the majority of interlaced content"
http://www.avsforum.com/avs-vb/showt...7#post20084797

In case anyone thinks I am going off topic: I am just saying that besides larger TVs needing to be here for 4K resolution to be viable and perceivable, the VP must be powerful enough yet not power hungry. If power consumption were not an issue, the processing prowess is already available today.
specuvestor is offline  
post #70 of 3692 Old 03-29-2011, 02:58 AM
Nielo TM (AVS Special Member)
The ATI HD5450 has UVD 2.0, which is a dedicated video processor. So by purchasing the GPU, you have access to the UVD, which isn't sold separately.

The same applies to NVIDIA's PureVideo VP.

Both do use the GPU to accelerate video decoding, but that's a different story.
Nielo TM is offline  
post #71 of 3692 Old 03-29-2011, 04:12 AM
specuvestor (AVS Special Member)
Actually, I was hoping you would tell me why Nvidia and ATI are not in TVs or AVRs; I remember Nvidia did try.

With discrete graphics cards going the way of discrete sound cards, catering to just very high-end users, they sure could use this additional source of income if possible, though their focus is on mobile devices now.
specuvestor is offline  
post #72 of 3692 Old 03-29-2011, 04:27 AM
Nielo TM (AVS Special Member)
Actually, discrete graphics cards will be very much alive for years to come. Sound cards were a different story: far less complicated and easier to implement. But dedicated sound cards are still superior (especially when EQ is utilized).

IGPs of today and tomorrow will only benefit the low-end user. Mid-range and high-end users will have to opt for discrete graphics. After all, no IGP can match the HD5670 yet.

ATI was in the VP business, and there are rumors that Samsung is using ATI/AMD's patents, but I'm not sure about their current status. I guess they can't compete against the major brands since neither AMD nor NVIDIA has its own fabrication plant.


AMD's and NVIDIA's biggest fear is the rise of console gaming. But thankfully modern GPUs are capable of more than just rendering visuals. In fact, GPUs are more powerful than today's CPUs (not to mention cheaper). GPUs are also used in supercomputers due to their excellent parallel processing capability.
Nielo TM is offline  
post #73 of 3692 Old 03-29-2011, 04:44 AM
specuvestor (AVS Special Member)
Sure, they will be alive a few years down the road, just as sound cards are alive now. Sales of discrete cards have been falling the last few years, which drove ATI into AMD (and the creation of Fusion), and Nvidia into mobile.

I think most VP players are fabless (AMD does have fabs) except for Samsung LSI and maybe Sony and Panny (not sure about the Japanese companies).
specuvestor is offline  
post #74 of 3692 Old 03-29-2011, 05:00 AM
Nielo TM (AVS Special Member)
Fusion was created for a completely different reason. CPUs are excellent serial processors (as in one task at a time), while GPUs are excellent parallel processors (many tasks simultaneously). Merging the two will provide a huge performance increase.

The problem now is utilization of the GPU: because of the various APIs (OpenCL, CUDA and DirectCompute) and driver issues, only a small number of applications are utilizing the GPU (OS X excluded).

http://www.youtube.com/watch?v=QlDhe1IKyVk


Hopefully Windows 8 will provide a native GPU pipeline so developers can utilize the GPU for everyday applications (like IE9, for example).

GPUs will also boost the quality of motion interpolation, as they can process multiple frames simultaneously and more accurately.
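To see why motion interpolation maps so well onto a parallel processor, here is a toy block-matching sketch (Python/NumPy; an illustration only, not any TV's or GPU vendor's algorithm). Every block of the intermediate frame is estimated independently, which is exactly the shape of work a GPU spreads across thousands of threads:

```python
import numpy as np

def interpolate_midframe(prev, nxt, block=16, search=8):
    """Toy motion-compensated frame interpolation between two grayscale frames:
    block-matching motion estimation, then averaging each block with its best match."""
    h, w = prev.shape
    mid = prev.astype(np.float32)
    for by in range(0, h - block + 1, block):        # every block is independent,
        for bx in range(0, w - block + 1, block):    # so this maps well to GPU threads
            ref = prev[by:by + block, bx:bx + block].astype(np.float32)
            best_sad, best_match = np.inf, ref
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = nxt[y:y + block, x:x + block].astype(np.float32)
                        sad = np.abs(ref - cand).sum()   # sum of absolute differences
                        if sad < best_sad:
                            best_sad, best_match = sad, cand
            mid[by:by + block, bx:bx + block] = 0.5 * (ref + best_match)
    return mid
```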



The fall of dedicated graphics cards is due to a number of reasons and must not be compared to sound cards. At least not yet. And the decline is only temporary, because we'll be using 3D graphics for more than gaming.
Nielo TM is offline  
post #75 of 3692 Old 03-29-2011, 05:07 AM
Nielo TM (AVS Special Member)
Quote:
Originally Posted by specuvestor View Post

I think most VP players are fabless (AMD does have fabs) except for Samsung LSI and maybe Sony and Panny (not sure about the Japanese companies).

AMD uses third-party fabs to manufacture the LSIs as far as I'm aware (TSMC and GlobalFoundries).
Nielo TM is offline  
post #76 of 3692 Old 03-29-2011, 07:45 AM
Nielo TM (AVS Special Member)
Forgot to add, TVs will be incorporating 3D graphics hardware in the near future (especially when they incorporate HTML 5.0 with WebGL support). So that will boost AMD and NVIDIA revenue.

But AMD and NVIDIA may have to compete with other brands such as Imagination Technologies.
Nielo TM is offline  
post #77 of 3692 Old 03-29-2011, 06:25 PM
RLBURNSIDE (Advanced Member)
I learn something new almost every time I log on here. Incidentally, speaking of ATI vs Nvidia, I just upgraded my HTPC from an Nvidia 8800 to an AMD 6950, and the post-processing video shaders were all turned on by default, making me do a double take as everything seemed clay-like due to the denoising being a bit too aggressive.

They really need to work on these a bit more. Playback of a highly compressed video signal is not the right time to denoise, IMO; denoising should be done during re-compression as a first step, otherwise it makes it look fake/weird. I don't know what algorithm AMD uses by default in their video driver, but I'd be willing to bet it isn't as good as some of the state-of-the-art non-local means denoising algorithms available in the academic literature. I began implementing some of this on a Kinect title but the Xbox GPU was too limited. Maybe on this new ATI card I'll be able to get it working at realtime speeds.

Anyway, my point is, there is definitely a ton of improvement to be made in video quality; the question is who will get it done first and/or most efficiently for inclusion inside TVs, where this really belongs (well, I like tinkering with shader settings, but most people don't). There are a lot of Blu-rays out there with a ton of film grain and other noise, which really bugs me.
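For reference, a bare-bones non-local means denoiser looks something like this (Python/NumPy, grayscale, completely unoptimized; an illustration of the idea, not AMD's driver shader or any published implementation). Each pixel becomes a weighted average of pixels whose surrounding patches look similar, which is per-pixel, data-parallel work well suited to a GPU shader:

```python
import numpy as np

def nlm_denoise(img, patch=3, window=7, h=10.0):
    """Non-local means on a 2D grayscale image (values roughly 0-255).
    patch: half-size of the similarity patch, window: half-size of the search
    window, h: filtering strength (larger = smoother)."""
    img = img.astype(np.float32)
    pad = patch + window
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img)
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            ci, cj = i + pad, j + pad
            ref = padded[ci - patch:ci + patch + 1, cj - patch:cj + patch + 1]
            weight_sum, acc = 0.0, 0.0
            for di in range(-window, window + 1):
                for dj in range(-window, window + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - patch:ni + patch + 1, nj - patch:nj + patch + 1]
                    dist2 = np.mean((ref - cand) ** 2)   # patch similarity
                    w = np.exp(-dist2 / (h * h))         # similar patches weigh more
                    weight_sum += w
                    acc += w * padded[ni, nj]
            out[i, j] = acc / weight_sum
    return out
```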
RLBURNSIDE is offline  
post #78 of 3692 Old 03-29-2011, 07:22 PM
Nielo TM (AVS Special Member)
Yes, ATI's drivers and settings can be quite aggravating. But you'll adapt to them.
Nielo TM is offline  
post #79 of 3692 Old 03-30-2011, 12:55 AM
specuvestor (AVS Special Member)
Quote:
Originally Posted by Nielo TM View Post

Fusion was created for a completely different reason. CPUs are excellent serial processors (as in one task at a time), while GPUs are excellent parallel processors (many tasks simultaneously). Merging the two will provide a huge performance increase.

The problem now is utilization of the GPU: because of the various APIs (OpenCL, CUDA and DirectCompute) and driver issues, only a small number of applications are utilizing the GPU (OS X excluded).

Hopefully Windows 8 will provide a native GPU pipeline so developers can utilize the GPU for everyday applications (like IE9, for example).

GPUs will also boost the quality of motion interpolation, as they can process multiple frames simultaneously and more accurately.

The fall of dedicated graphics cards is due to a number of reasons and must not be compared to sound cards. At least not yet. And the decline is only temporary, because we'll be using 3D graphics for more than gaming.

IIRC the ATI/AMD merger was not a long-thought-out process; it was forced by circumstances. AMD was also faltering after the Intel Core 2 Duo release (and AMD's now-infamous 30% market share prediction). The CPU is a general-purpose MPU while the GPU is dedicated, like the now-obsolete math coprocessors of the 8088 era. They are inherently very different as they serve different purposes. It is of course optimal to have everything integrated on one die, but that is only possible with advances in node shrinkage. (Next to go will be north and south bridge integration.) There is also synergy from removing the bus transport between GPU and CPU. In any case, Intel was the one who started integrating graphics almost 10 years ago, and both AMD and ATI declared it was no threat then. The differences between sound card and graphics card market dynamics are obvious, but I assure you that unit sales of discrete graphics cards will not exceed their former glory, with desktop volumes stagnating at 200 million.

"ATI has been mentioned before as a takeover target. Unimpressive financial results during 2005 and slower-than-expected growth fueled speculation that AMD, Intel, Broadcom, and Texas Instruments were contemplating a play for the graphics company."

http://arstechnica.com/old/content/2006/05/6950.ars

There are many reasons why Macs were the primary computers for graphics (and Adobe :P ) even before OS X, which include 64-bit threading. With their architecture moving to Intel x86, we can be sure it is not a hardware, i.e. graphics card, issue. It is, as you say, a software or API issue.

MStar, the market leader, has stated very clearly that their edge is hardware. TV set makers just develop their firmware or engine layer on top.

Quote:
Originally Posted by Nielo TM View Post

AMD uses 3rd party fabs to manufacture the LSIs as far as I'm aware (TSMC and Globalfoundries).

Sorry, I keep forgetting AMD spun off their foundries to the Arab-backed GlobalFoundries in 2009 because of financial difficulties (think Core 2 Duo). ATI historically has been fabless and used TSMC and UMC as foundries.

Quote:
Originally Posted by Nielo TM View Post

Forgot to add, TVs will be incorporating 3D graphics hardware in the near future (especially when they incorporate HTML 5.0 with WebGL support). So that will boost AMD and NVIDIA revenue.

Not too sure what you mean here, but 3D video and 3D graphics are very different. Will AMD and Nvidia get into TV sets? Maybe, as things get more integrated, but power consumption is most important IMHO. So it will not be within the next 5 years, I think.
specuvestor is offline  
post #80 of 3692 Old 03-30-2011, 01:15 AM
specuvestor (AVS Special Member)
Quote:
Originally Posted by RLBURNSIDE View Post

Playback of a highly compressed video signal is not the right time to denoise, IMO; denoising should be done during re-compression as a first step, otherwise it makes it look fake/weird.

Absolutely. Even for standard VP processes, deinterlacing first and scaling later is VERY different from scaling first and deinterlacing later. Throw in compression and other legacy issues like interlaced cameras, overscan and gamma encoding and it makes for a very complicated brew. We discussed this in another thread; I think a progressive source should be the end game, i.e. simplify by removing one part of the value chain.

Artifacts will be even more obvious at Quad HD resolution or on >70" displays.
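A quick way to convince yourself the order matters is to push the same interlaced frame through both pipelines; the toy sketch below (Python/NumPy, with naive bob deinterlacing and nearest-neighbour scaling chosen purely for brevity) generally prints False, because the two orderings treat the field lines differently:

```python
import numpy as np

def bob_deinterlace(frame, keep_top=True):
    """Keep one field and repeat each of its lines (naive 'bob')."""
    field = frame[0::2] if keep_top else frame[1::2]
    return np.repeat(field, 2, axis=0)

def nn_scale(img, factor=2):
    """Nearest-neighbour upscaling by an integer factor."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

rng = np.random.default_rng(0)
interlaced = rng.integers(0, 256, size=(8, 8))   # stand-in for an interlaced frame

a = nn_scale(bob_deinterlace(interlaced))        # deinterlace, then scale
b = bob_deinterlace(nn_scale(interlaced))        # scale, then deinterlace
print("identical results?", np.array_equal(a, b))  # almost always False
```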
specuvestor is offline  
post #81 of 3692 Old 03-30-2011, 12:59 PM
barrelbelly (AVS Special Member)
Didn't Toshiba try this with SRT in an attempt to extend the life of DVD patents, after the demise of HD DVD? They purportedly super-processed 480i signals to create a totally new 960p image that could be upscaled to 1080p for near "Full HD" resolution and quality. I know they launched a few DVD players that purportedly had a weaker execution of this system in 2009. But they allegedly were holding back the big guns to include it in their new HDTVs, to create new SRT images from 1080i/p and upconvert to higher resolution (4K by 2K?). Is this the same stuff or different from 4K by 2K?
barrelbelly is offline  
post #82 of 3692 Old 03-31-2011, 08:10 AM
Nielo TM (AVS Special Member)
Quote:
Originally Posted by specuvestor View Post

IIRC the ATI/AMD merger was not a long-thought-out process; it was forced by circumstances. AMD was also faltering after the Intel Core 2 Duo release (and AMD's now-infamous 30% market share prediction). The CPU is a general-purpose MPU while the GPU is dedicated, like the now-obsolete math coprocessors of the 8088 era. They are inherently very different as they serve different purposes. It is of course optimal to have everything integrated on one die, but that is only possible with advances in node shrinkage. (Next to go will be north and south bridge integration.) There is also synergy from removing the bus transport between GPU and CPU. In any case, Intel was the one who started integrating graphics almost 10 years ago, and both AMD and ATI declared it was no threat then. The differences between sound card and graphics card market dynamics are obvious, but I assure you that unit sales of discrete graphics cards will not exceed their former glory, with desktop volumes stagnating at 200 million.

I don't understand your post above or how it relates to Fusion. Fusion isn't just a GPU embedded within the CPU die. It is much more than that, and modern GPUs are capable of more than just rendering pixels. For example, a sound card can only process sound and nothing else. It is also very simple and low-cost to implement. But modern graphics cards are different. They are no longer single-purpose devices. They can execute code once exclusive to CPUs. Firefox, IE and Chrome all utilize the GPU, and more and more applications will tap into the power of the GPU.

BTW, Intel started integrating the GPU into the CPU die just a short while ago. Before that, the GPU was integrated into the north bridge. Also, only the latest Sandy Bridge features a GPGPU.




Quote:
Originally Posted by specuvestor View Post

Not too sure what you mean here, but 3D video and 3D graphics are very different. Will AMD and Nvidia get into TV sets? Maybe, as things get more integrated, but power consumption is most important IMHO. So it will not be within the next 5 years, I think.

I didn't say anything about 3D video, and I am aware of the difference between 3D video and 3D graphics.

HTML 5.0 supports WebGL, which is now supported by Firefox and Chrome

http://webglsamples.googlecode.com/h.../aquarium.html

http://www.chromeexperiments.com/webgl

https://demos.mozilla.org/en-US/?WT....WT.mc_ev=click
WebGL is a web-based graphics library for browser-based 3D content (graphics). As Smart TVs evolve to include such features, including gaming, GPUs will make their way into TVs.
Nielo TM is offline  
post #83 of 3692 Old 03-31-2011, 10:37 AM
specuvestor (AVS Special Member)
My point is this: Fusion was not a planned thing, as you think, a natural progression to integrate GPU with CPU; it was more the offspring of a "shotgun" marriage. Put another way, if TI or even Apple had taken over ATI, would Fusion even exist? It was an afterthought flowing from the logic of the merger, but IGPs paved the way.

GPUs had long surpassed CPUs in number of transistors per die and sheer processing prowess. But neither ATI nor Nvidia can make an x86-based CPU to compete. It's not as portable as you assume. They now have a chance under the ARM mobile architecture, a space that Intel also tried and failed in with its XScale processors. Different market segments, including TV VP, will continue to have their idiosyncrasies until technology advancements can further integrate them in future.

I am sure HTML5 will be here. But are dedicated GPUs even required, or can my 15-year-old 486 run it? (No kidding, it's running my abandonware games and Firefox.)
specuvestor is offline  
post #84 of 3692 Old 03-31-2011, 11:15 AM
Nielo TM (AVS Special Member)
Quote:
Originally Posted by specuvestor View Post

My point is this: Fusion was not a planned thing, as you think, a natural progression to integrate GPU with CPU; it was more the offspring of a "shotgun" marriage. Put another way, if TI or even Apple had taken over ATI, would Fusion even exist? It was an afterthought flowing from the logic of the merger, but IGPs paved the way.

It is a natural progression to integrate GPU and CPU, as I've stated earlier. It helps to lower manufacturing costs and power consumption. Thanks to its low TDP, it can be embedded within devices that could not host a dedicated GPU die. In addition, having two types of processor, each with their unique attributes (serial and parallel), will help to boost performance. For example, you can outsource tasks that perform well on parallel processors (e.g. transcoding) to the GPU while the CPU is free to process serial tasks.

But you DON'T need AMD's Fusion to outsource tasks to the GPU. All current GPUs from AMD and NVIDIA are capable of general-purpose compute. But Fusion is ideal for ultra-thin laptops, tablets and HDTVs.

Unfortunately, Intel has never produced a respectable GPU (or IGP) and NVIDIA has never produced a CPU. But AMD has both, which allowed them to create Fusion. But Fusion is not a replacement for a dedicated GPU (as I've stated earlier).

And the new 11.3 driver is a major step forward in terms of utilization of OpenCL

"Highlights of the AMD Catalyst 11.3 Windows release includes:

New Features:

Seamless GPU Compute support

The AMD Accelerated Parallel Processing (APP) OpenCL runtime is now enabled by default within AMD Catalyst. Applications that leverage OpenCL for GPU based compute tasks will automatically benefit from the significant performance boost that this provides."
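For anyone who hasn't seen what that "GPU compute" actually looks like from an application's point of view, here is a tiny OpenCL example via the pyopencl bindings (a hedged sketch: it assumes pyopencl is installed and a working OpenCL runtime like the one described above is present, and the kernel is just an illustrative brightness scale, not a real video-processing pipeline):

```python
import numpy as np
import pyopencl as cl

# One frame's worth of pixels, as float32 luma values in 0..1
frame = np.random.rand(1920 * 1080).astype(np.float32)

ctx = cl.create_some_context()        # picks an available OpenCL device (GPU if present)
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
src = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=frame)
dst = cl.Buffer(ctx, mf.WRITE_ONLY, frame.nbytes)

# Each work-item handles one pixel: the embarrassingly parallel part
program = cl.Program(ctx, """
__kernel void brighten(__global const float *in_px, __global float *out_px) {
    int gid = get_global_id(0);
    out_px[gid] = min(in_px[gid] * 1.2f, 1.0f);
}
""").build()

program.brighten(queue, frame.shape, None, src, dst)

result = np.empty_like(frame)
cl.enqueue_copy(queue, result, dst)   # copy the processed frame back to the CPU
print("processed", result.size, "pixels on", ctx.devices[0].name)
```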


Quote:
Originally Posted by specuvestor View Post

I am sure HTML5 will be here. But are dedicated GPUs even required, or can my 15-year-old 486 run it? (No kidding, it's running my abandonware games and Firefox.)

To fully take advantage of all the features offered by HTML5, you'll need a modern GPU. WebGL at the moment doesn't require a powerful GPU, mainly due to the bandwidth limit. But that will change shortly.

Click on the links I posted. The 3D graphics are all rendered by the browser.
Nielo TM is offline  
post #85 of 3692 Old 03-31-2011, 11:21 AM
Nielo TM (AVS Special Member)
Quote:
Originally Posted by specuvestor View Post

GPUs had long surpassed CPUs in number of transistors per die and sheer processing prowess. But neither ATI nor Nvidia can make an x86-based CPU to compete. It's not as portable as you assume. They now have a chance under the ARM mobile architecture, a space that Intel also tried and failed in with its XScale processors. Different market segments, including TV VP, will continue to have their idiosyncrasies until technology advancements can further integrate them in future.

That's where Fusion comes in. It features an x86 core and a PPU (HD6000 GPU core).

But most of today's computers already contain an x86 CPU and a PPU (GPGPU core).

We're not talking about running x86 on a GPU or GPUs replacing CPUs. It is about utilizing the CPU and PPU together to optimize performance.
Nielo TM is offline  
post #86 of 3692 Old 03-31-2011, 05:01 PM
specuvestor (AVS Special Member)
Don't get me wrong, I'm not saying SoCs are not a good solution or an eventuality. I'm saying the timing is not there yet for ATI or Nvidia to take over TV VP, even with their superior processing power or APIs. I'm saying it will not happen in the next 5 years, until they can strike a good trade-off between processing prowess and power consumption, and competitors are not standing still either.

The reason I brought this up is your assumption that VP processes are perfected now, with all the GPU buzz and Fusion. However, having the best tech does not immediately mean victory, as we've seen repeatedly in the past 30 years. It's about implementation in an environment with constraints as well. I'm betting none of the major-brand TVs will be using an ATI or Nvidia solution next year either. It also remains to be seen whether the x86 PC platform or ARM-based processors will win the mobile device war. IMHO I'm gunning for the latter, because x86 inherently is not designed for low power.

PS: OpenGL didn't need a dedicated GPU to render 3D either. BTW, I'm sure HTML5 will come to pass, to Adobe's detriment, because it is anchored by Google and Apple.
Quote:
Originally Posted by Nielo TM View Post

To fully take advantage of all the features offered by HTML5, you'll need a modern GPU. WebGL at the moment doesn't require a powerful GPU, mainly due to the bandwidth limit. But that will change shortly.

specuvestor is offline  
post #87 of 3692 Old 03-31-2011, 05:15 PM
rogo (AVS Addicted Member)
I have a very tough time believing TV mfrs. that have multiple sources for all the parts they need for a TV are even contemplating going to ATI or Nvidia anytime soon. Why would they mess with what works and likely end up spending more money?

There is no difference in HDMI cables. If you can see the picture without visible dropouts or sparklies, the cable is working at 100%. No other cable will display a better version of that picture. You're simply wrong if you think there is a better digital cable than one that is already working.
rogo is offline  
post #88 of 3692 Old 03-31-2011, 05:19 PM
Nielo TM (AVS Special Member)
Quote:
Originally Posted by specuvestor View Post

Don't get me wrong, I'm not saying SoCs are not a good solution or an eventuality. I'm saying the timing is not there yet for ATI or Nvidia to take over TV VP, even with their superior processing power or APIs. I'm saying it will not happen in the next 5 years, until they can strike a good trade-off between processing prowess and power consumption, and competitors are not standing still either.

The reason I brought this up is your assumption that VP processes are perfected now, with all the GPU buzz and Fusion. However, having the best tech does not immediately mean victory, as we've seen repeatedly in the past 30 years. It's about implementation in an environment with constraints as well. I'm betting none of the major-brand TVs will be using an ATI or Nvidia solution next year either. It also remains to be seen whether the x86 PC platform or ARM-based processors will win the mobile device war. IMHO I'm gunning for the latter, because x86 inherently is not designed for low power.

AMD's and NVIDIA's VPs are different from the GPU; each is just a sub-component within the GPU die. It could be used as a stand-alone LSI, but no TV thus far has featured an AMD or NVIDIA VP.

Four years ago they may have made a difference, but the 2010 Samsung Valencia LSI is superior to ATI's and NVIDIA's VPs. The C580 I reviewed passed the HQV test with flying colors. I haven't tested the new Panasonic VP, but I'm sure it will be similar.

But when Smart TVs evolve and require powerful GPUs to render 3D graphics, ATI and NVIDIA LSIs will make their way inside HDTVs. After all, the TDP of the HD5650 is only 15-19 watts.

http://www.amd.com/us/products/noteb...00-5600.aspx#3

2-3 years from now, the HD8650/9650 will have performance comparable to today's $200-300 USD cards.
Nielo TM is offline  
post #89 of 3692 Old 03-31-2011, 05:24 PM
Nielo TM (AVS Special Member)
LG has already started to integrate games and other rich content, but it is ARM-based (as in most TVs).

And from what I gather, Samsung wants the TV to become the central platform. I wouldn't be surprised if Samsung and LG started marketing the TV as something more.

PS: AMD and NVIDIA aren't alone in the GPU business (as I've stated earlier). Someone could beat them to it. But in the PC industry, both are safe and secure.
Nielo TM is offline  
post #90 of 3692 Old 03-31-2011, 10:36 PM
specuvestor (AVS Special Member)
Everyone wants a slice of the living room. That was the theme behind the console war (the central device which everything connects to), which ended with a whimper; it's the same thing Apple TV is trying to do, SLOWLY.

Quote:
Originally Posted by Nielo TM View Post

But when Smart TVs evolve and require powerful GPUs to render 3D graphics, ATI and NVIDIA LSIs will make their way inside HDTVs. After all, the TDP of the HD5650 is only 15-19 watts.

2-3 years from now, the HD8650/9650 will have performance comparable to today's $200-300 USD cards.

Correct me if I am wrong, but TDP is the thermal envelope, not power consumption?

But what power would an HD8650/9650 draw? The past 5 years have shown a trend of power consumption going up with new processors, unlike in the Athlon II and Core 2 Duo era, suggesting less efficiency for more computing power.

FWIW I am a believer in SoCs. I think we both agree on the eventuality, but I disagree on the timing. Heck, I even bought a UART card to improve my 9600-baud US Robotics communication speed some 20 years ago, when I/O cards were also standard. Now all of that is integrated into the south bridge. In another 20 years' time I am not too sure if discrete graphics cards will still be around.

We could say plasma is better than LCD, and discrete sound cards and graphics cards are better than built-in, but they will never be mass-market again in forward projections. That's a reality as the utility curve tapers off.

Similarly, it's difficult to see beyond Quad HD; we can only perceive so much resolution. The same thing is happening in the audio space... I mean, how much better can you get beyond LOSSLESS?
specuvestor is offline  