
4k by 2k or Quad HD...lots of rumors? thoughts? - Page 89

post #2641 of 3670
Quote:
Originally Posted by sytech View Post

This is very misleading. They will be using six 4K cameras, but they are not broadcasting or streaming a 4K H.265 signal. They are only going to use the 4K cameras to get cleaner and clearer instant replay calls. So now when they have to zoom in to see if a player's foot is in bounds or not, it will not be the usual pixelated mess. Just like with HD, I suspect the NFL will be one of the first to make the switch over to 4K and then just downrez it for broadcast. It will probably be late 2015/early 2016 before we see a few channels like ESPN, HBO and a few PPV channels broadcast in 4K, but we should have 4K Blu-ray by the end of this year to hold us over.



Yeah, I know it's not being broadcast in 4K because, right now, they can't.

But it is noteworthy as it is progress towards the 4k standard.

We move forward in little increments and not all at once.


GO NINERS!!!
post #2642 of 3670
Quote:
Originally Posted by Nitro67 View Post

I have both, but it is easier to get a Blu-ray changer and put a disc in the machine than to go the HTPC route. I primarily use my HTPC for audio only; it is easier. I use Control4 to access the Blu-ray changers via the iPad.
Sony 7000ES changers have some cool features though http://www.soundandvisionmag.com/article/sony-bdp-cx7000es-blu-ray-disc-changer?page=0,1
madVR on PC processes the image in 16-bit. Another benefit is that you can bypass the menus and start playing the film instantly. No disc access times either.
I know that some of the newer Sony Blu-ray players can do it, but can the disc changer output 1080p24 with DVDs?
Quote:
Originally Posted by esdwa View Post

I use the PC for PC work and sometimes as a transfer tool, but never for playback. Having a hot shoebox booting in my living room, usually for far more than a few seconds, was never an option, but I can understand that some find it acceptable.
Well modern PCs can be almost silent, and put out very little heat with media playback, but I don't keep my HTPC in the living room - that's what long HDMI cables or HDMI over CAT6 are for.
Quote:
Originally Posted by esdwa View Post

Instead I use standalone networked media players around my house, one in every room where a display is present. Compared to a typical HD/3D-capable HTPC these are cheap, small and almost invisible, unlike most HTPCs, and all feed from one large hidden multi-disk media store. Gigabit wired LAN ensures smooth playback with instant access to my entire library from anywhere in the house. It is fast, easy and super convenient. Without spending hundreds on HTPC upgrades I can use the time to enjoy watching my library instead of spending precious time on its maintenance.
Well if you built a system that can handle Blu-ray, what do you need to upgrade it for? And I don't know what you mean about "maintenance"; I haven't had to change a thing since setting it up. Building the system would certainly be cheaper than a $1900 disc changer, and there's no reason you can't use the HTPC to serve content to media streaming boxes.
post #2643 of 3670
Quote:
Originally Posted by esdwa View Post

For the record, I don't have a BD changer either. I just understand the approach of those who do.

I use the PC for PC work and sometimes as a transfer tool, but never for playback. Having a hot shoebox booting in my living room, usually for far more than a few seconds, was never an option, but I can understand that some find it acceptable. Instead I use standalone networked media players around my house, one in every room where a display is present. Compared to a typical HD/3D-capable HTPC these are cheap, small and almost invisible, unlike most HTPCs, and all feed from one large hidden multi-disk media store. Gigabit wired LAN ensures smooth playback with instant access to my entire library from anywhere in the house. It is fast, easy and super convenient. Without spending hundreds on HTPC upgrades I can use the time to enjoy watching my library instead of spending precious time on its maintenance.

The advantage of the 400-disc changers is that you have a premier video processor. Sorry, your network player won't ever match it. Another advantage is the audio; the main reason that I went to changers was for the HD audio with Blu-ray. My changer is equal to 20TB of storage, and I have three changers. So, have you priced 60TB of online storage? Another advantage is electricity; here is the downfall of running a large array: it consumes a lot of electricity. I had a 32TB server online, and it would eat around $60 a month in electricity. So, I recently redesigned my server to be my main PC, changed the power supply, and I just turn it on as required. Before, I would run the server 24/7 and I noticed the electricity was going through the roof. Now it is cheaper. The eventual plan is to have the server come on from a touch of the iPad, via the Wake-on-LAN feature.

The HTPC was redesigned so it uses a passive power supply and SSDs. It is very silent. I can stream movies if I want to, but I have tried both; the best option is the changers. It is much easier to manage.
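For what it's worth, the Wake-on-LAN part is easy to script yourself. Here is a minimal Python sketch of the standard magic packet (6 bytes of 0xFF followed by the server's MAC address repeated 16 times, sent as a UDP broadcast); the MAC and broadcast address below are placeholders, and the server's BIOS/NIC still has to have WoL enabled.

```python
import socket

def wake_on_lan(mac: str, broadcast: str = "192.168.1.255", port: int = 9) -> None:
    """Send a standard Wake-on-LAN magic packet:
    6 x 0xFF followed by the target MAC repeated 16 times, as a UDP broadcast."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast, port))

# Placeholder MAC for the server's network card:
# wake_on_lan("00:11:22:33:44:55")
```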
post #2644 of 3670
Quote:
Originally Posted by Nitro67 View Post

The advantage of the 400-disc changers is that you have a premier video processor. Sorry, your network player won't ever match it. Another advantage is the audio; the main reason that I went to changers was for the HD audio with Blu-ray. My changer is equal to 20TB of storage, and I have three changers. So, have you priced 60TB of online storage? Another advantage is electricity; here is the downfall of running a large array: it consumes a lot of electricity. I had a 32TB server online, and it would eat around $60 a month in electricity. So, I recently redesigned my server to be my main PC, changed the power supply, and I just turn it on as required. Before, I would run the server 24/7 and I noticed the electricity was going through the roof. Now it is cheaper. The eventual plan is to have the server come on from a touch of the iPad, via the Wake-on-LAN feature.

The HTPC was redesigned so it uses a passive power supply and SSDs. It is very silent. I can stream movies if I want to, but I have tried both; the best option is the changers. It is much easier to manage.
Well in reality, most Blu-ray discs don't even approach 50GB in size, and if you want to rip the main film only, you can often shave off another 10GB or more. I have about 350 titles (full disc) and that's just over 10TB. And while you have 1200 discs in changers, I'm willing to bet that they're not all Blu-rays. (I would have to question your taste in films if they were) A DVD takes up the same slot as a Blu-ray in a changer, whereas you can store 5-10 DVDs for every Blu-ray on an HTPC, so it goes both ways as well. (and you get much better handling of DVDs with an HTPC)

But if you want to price it out like that, you would be looking at $2800 for 60TB of storage - though I am sure you could get a bulk discount on that if you looked harder, I just pulled prices off Newegg - and that would consume 16W of power. (20 drives pulling 0.8W) I expect you could get away with 40TB of storage and still have plenty of space for 1200 titles though.

Add in the rest of the HTPC hardware and you're still going to be under $4000 compared to the $6000 you have spent on changers.
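To show the working behind those numbers, here's a rough back-of-the-envelope sketch in Python. The per-drive price is just the quoted $2800 spread over 20 drives, and the idle wattage and rip sizes are the figures from the posts above - not current pricing.

```python
# Back-of-the-envelope check of the storage figures quoted above.
DRIVE_TB = 3               # capacity per drive
DRIVE_COST = 140           # approx. $ per 3TB drive (the $2800 / 20-drive figure)
DRIVE_IDLE_W = 0.8         # W per idle drive, as quoted
AVG_TITLE_TB = 10 / 350    # ~10TB for ~350 full-disc Blu-ray rips

drives = 60 // DRIVE_TB                                   # 20 drives for 60TB
print("drives needed :", drives)
print("storage cost  : $%d" % (drives * DRIVE_COST))      # ~$2800
print("idle power    : %.1f W" % (drives * DRIVE_IDLE_W)) # ~16 W
print("titles in 60TB: %d" % (60 / AVG_TITLE_TB))         # ~2100 full-disc rips
```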


I can accept that changers work out better for you, but I think they are an overpriced solution that is less than ideal, and clearly the market has shown that there's minimal demand for them. But whatever works for you.
The other advantage of an HTPC is that it's a multifunction device rather than being constrained to only playing DVD/Blu-ray. I can browse the internet on it, play games, etc. too.
post #2645 of 3670
Quote:
Originally Posted by Chronoptimist View Post

madVR on PC processes the image in 16-bit. Another benefit is that you can bypass the menus and start playing the film instantly. No disc access times either.
I have JRiver, but I found that madVR is more of a joke at this point. Lately I have been using another renderer through the custom menus. As far as comparisons go, I would rather use the Sony changers. I can do that too, but if I am watching Blu-ray, the best quality comes from the changers.
Quote:
Originally Posted by Chronoptimist View Post

I know that some of the newer Sony Blu-ray players can do it, but can the disc changer output 1080p24 with DVDs?
My TV won't do 1080p24, so for me it doesn't matter. Although that feature is in the menus, if I recall correctly.
Quote:
Originally Posted by Chronoptimist View Post

Well modern PCs can be almost silent, and put out very little heat with media playback, but I don't keep my HTPC in the living room - that's what long HDMI cables or HDMI over CAT6 are for.
You can build the HTPC to be zero decibels, but mine is around 30 decibels. The only thing that I don't have is the passive CPU cooler, which requires a new HTPC case as well. There is a distance at which HDMI maxes out, typically around 50 feet, if I recall. Yes, you can buy longer cables, but I am getting this from my installer that does high-end builds. HDMI over CAT6 is not a solution; it is more of a temporary solution. Fiber is going to be the next solution, because of bandwidth. I built my new HTPC and it has the Thunderbolt connection. The fiber optical cable hasn't been released yet.
Quote:
Originally Posted by Chronoptimist View Post

Well if you built a system that can handle Blu-ray, what do you need to upgrade it for? And I don't know what you mean about "maintenance"; I haven't had to change a thing since setting it up. Building the system would certainly be cheaper than a $1900 disc changer, and there's no reason you can't use the HTPC to serve content to media streaming boxes.
Actually, if you do a lot of Blu-ray streaming it can be costly. People tend to build their machines with cheap hard drives that last under a year. To build it right, you have to get enterprise drives, so you are looking at a drive cost of about $415 for a 4TB drive, and then add in the cost of an array card. You can see it adds up. Most people tend to under-build their system and lose complete collections of movies. Actually, I got two of the changers for $550 each last year; Sony was getting rid of them. I should have bought more.
Edited by Nitro67 - 2/4/13 at 11:13am
post #2646 of 3670
Quote:
Originally Posted by frostylou View Post

Hello all,

Quick question.
There is this brilliant theater near me called Cinetopia. All of their theaters have 4K projection. My question is this.
Will a 4K picture look even clearer, sharper, and more detailed on a smaller screen than it will on a much larger screen?
More specifically, they have an 85-foot screen in their largest theater and a small screen in what is called one of their movie parlor theaters.
Will 4K look more amazing on the smaller screen? I.e., in the way that a 1080p picture looks great on 32-inch, 42-inch and 50-inch screens, but the more you go up in screen size to the really big screens, there is less detail and less pop and it starts to look less amazing.
Thx

Ufff, think for a moment: how things look depends on the viewing distance. It might be too difficult for you to grasp this, but the truth is that if the viewing distance is kept constant with respect to the screen size, pictures on all screens will look the same. :D
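To put a number on that, here's a quick Python sketch of pixels per degree of visual angle for a 4K-wide image; scaling the screen and the viewing distance together leaves the result unchanged, which is the point being made above.

```python
import math

def pixels_per_degree(screen_width: float, distance: float, h_pixels: int = 3840) -> float:
    """Horizontal pixels per degree of visual angle for a screen of the given
    width viewed from the given distance (any unit, as long as both match)."""
    angle_deg = 2 * math.degrees(math.atan(screen_width / (2 * distance)))
    return h_pixels / angle_deg

# An 85ft-wide screen seen from two screen-widths away vs. a small screen at the
# same relative distance: identical pixel density per degree of vision.
print(pixels_per_degree(85, 170))   # big auditorium screen
print(pixels_per_degree(20, 40))    # small "parlor" screen, same ratio
```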
post #2647 of 3670
Quote:
Originally Posted by Nitro67 View Post

I have JRiver, but I found that madVR is more of a joke at this point. Lately I have been using another renderer through the custom menus. As far as comparisons go, I would rather use the Sony changers. I can do that too, but if I am watching Blu-ray, the best quality comes from the changers.
A joke at this point? You must have something misconfigured. madVR beats anything I have compared it with.
Quote:
Originally Posted by Nitro67 View Post

My TV won't do 1080p24, so for me it doesn't matter. Although that feature is in the menus, if I recall correctly.
$6000 on changers and your TV can't do 1080p24!?
Quote:
Originally Posted by Nitro67 View Post

You can build the HTPC to be zero decibels, but mine is around 30 decibels. The only thing that I don't have is the passive CPU cooler, which requires a new HTPC case as well. There is a distance at which HDMI maxes out, typically around 50 feet, if I recall. Yes, you can buy longer cables, but I am getting this from my installer that does high-end builds. HDMI over CAT6 is not a solution; it is more of a temporary solution. Fiber is going to be the next solution, because of bandwidth. I built my new HTPC and it has the Thunderbolt connection. The fiber optical cable hasn't been released yet.
I don't really recommend 100% fanless systems. Passive devices in the case are good, but they need some airflow. Buy Noctua fans if noise is a concern. (though I keep the HTPC out of the room)

There are active HDMI extenders you can use, if 50ft is not long enough. CAT5e/6 extenders can cover about 800ft and fiber extenders about 1000ft.

Thunderbolt is not a display standard. It happens to accept displayport signals, but is essentially an external connection to the computer's PCI-E bus and a complete waste of money right now. The only potentially exciting thing about Thunderbolt is being able to have something like an 11" MacBook Air dock to a full blown desktop graphics card when you bring it home and hook it up to a large display. But Thunderbolt bandwidth has already halved since its introduction.
Quote:
Originally Posted by Nitro67 View Post

Actually, if you do a lot of Blu-ray streaming it can be costly. People tend to build their machines with cheap hard drives that last less than a year. To build it right, you have to get enterprise drives, so you are looking at a drive cost of about $415 for a 4TB drive, and then add in the cost of an array card. You can see it adds up. Most people tend to under-build their system and lose complete collections of movies. Actually, I got two of the changers for $550 each last year; Sony was getting rid of them. I should have bought more.
I have never had a hard drive last less than a year - you run a full disk check when it's new, and if it passes that it is not likely to fail prematurely. Modern hard drives are extremely reliable if they are kept at a reasonable temperature. Buying 4TB drives is a massive waste of money, because many of them cost more than 2x a 3TB disk and would only save you 4W on power consumption. But yes, if you are buying enterprise drives, which are designed for 24/7 operation, you will more than double your cost and have ridiculous power consumption. (over 100W if you keep all the drives spinning, which is what enterprise drives are designed for)
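A rough sketch of the cost-per-TB and power trade-off being described; the prices and wattages below are illustrative 2013-era assumptions, not quotes from any vendor.

```python
# Illustrative (assumed) 2013-era figures: (capacity_TB, price_usd, spinning_W)
drives = {
    "3TB consumer":   (3, 140, 5.0),
    "4TB consumer":   (4, 300, 5.5),
    "4TB enterprise": (4, 415, 8.0),
}

for name, (tb, price, watts) in drives.items():
    print("%-15s $%5.0f -> $%.0f/TB, %.1f W spinning" % (name, price, price / tb, watts))

# 20 enterprise drives kept spinning 24/7, as in a "keep everything online" array:
print("array power: %.0f W" % (20 * 8.0))
```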
post #2648 of 3670
Quote:
Originally Posted by Nitro67 View Post

The advantage of the 400-disc changers is that you have a premier video processor. Sorry, your network player won't ever match it. Another advantage is the audio; the main reason that I went to changers was for the HD audio with Blu-ray. My changer is equal to 20TB of storage, and I have three changers. So, have you priced 60TB...

Congratulations. But to be honest, I am not impressed. I also do not care about your player's video processor, nor the audio, TB equivalents and everything else. They are irrelevant and may just be advantages in your mind, sold by smart product marketing. Also, you do not know what MP I use, but assuming is a privilege for everyone.
So please do not quote me on this and let's move forward. Cheers.
post #2649 of 3670
Quote:
Originally Posted by Nitro67 View Post

Yes, I was looking at the numbers last night. My room would be the perfect size.

Your room is highly unusual, let alone the wall space required for such a huge display. Hey, wait for 8K in another 5-10 years (if by then), it makes no difference to me. But if you think 4K is not coming to the U.S., you're going to be mistaken.
Quote:
Originally Posted by Nitro67 View Post

There are several broadcasters that stated that they didn't want to switch to 4K and then a few years later switch to 8K. I stated Sharp, and you state Samsung??? Sharp did it in 2012. NHK already demonstrated 8K at the Olympics in 2012.

Can you post a link relating to the 'several' broadcasters not wanting to switch to 4K? Other than ESPN, which as of last week said they would wait, I'm not aware of others. Either way, ESPN and a couple of others will not hold up 4K in the U.S.

If you really think it's more likely we'll have 8K here than 4K, you're certainly entitled to that opinion. It's everyone's right to ignore the release of 4K displays, the presence of 4K software, the imminent arrival of 4K camcorders, etc. As I said, we all have to make decisions and you've chosen to gamble on 8K.
post #2650 of 3670
Quote:
Originally Posted by frostylou View Post

Hello all,

Quick question.
There is this brilliant theater near me called Cinetopia. All of their theaters have 4K projection. My question is this.
Will a 4K picture look even clearer, sharper, and more detailed on a smaller screen than it will on a much larger screen?
More specifically, they have an 85-foot screen in their largest theater and a small screen in what is called one of their movie parlor theaters.
Will 4K look more amazing on the smaller screen? I.e., in the way that a 1080p picture looks great on 32-inch, 42-inch and 50-inch screens, but the more you go up in screen size to the really big screens, there is less detail and less pop and it starts to look less amazing.
Thx

Absolutely. This phenomenon is not restricted to 4K; you can see an increase in apparent sharpness at almost any resolution as screen size diminishes. The bottom line is that yes, 4K should look sharper on a smaller screen. OTOH, as the screen size shrinks, you'll need to sit closer to derive all the details that are present. Just keep in mind there is a difference between 'apparent sharpness' and true detail.
post #2651 of 3670
Very interesting. Thx Ken!!
post #2652 of 3670
As someone using a screen that varies from 80" 16:9 to 136" (and larger for scope), one of the interesting things I've found is how well sharpness can hold up at huge sizes. You do start to lose a certain amount of "punch" and vividness as you zoom larger and larger with a projector, but in terms of sharpness some sources seem virtually as sharp at massive sizes as they do at smaller sizes. What happens is the image looks sharp at the smaller size, but as I expand it, the image seems to retain that sharpness, and all those details that were becoming hard to register at the smaller size come into view. It's really wild. I played the latest Mission Impossible Blu-Ray for guests at 10 feet wide and the one feature they couldn't miss remarking on was how incredibly sharp and clear the image was. (It goes without saying that this is very source dependent, and the variability of source quality/detail is one reason why I vary my image size to begin with).

This isn't to say I'm not aware of the limitations of 1080p, which I have been for quite a while (I'm just comparing certain 1080p images at various sizes in my system). And on that note of 1080p limitations, it's intriguing to me how my modest experience watching the Sony 4K display/sources several times has subtly ratcheted my expectations for detail upwards. I was just checking out screen grabs of the new Bond SkyFall Blu-Ray on the Blu-Ray forum and they look like a superb transfer. But, while at one point I would have marveled at the detail available on such a source, I found myself noting the limitations in resolution. For instance, shots of the exterior of mansions looked almost "SD" when compared to the type of resolution I've seen of similar images on the Sony 4K.

Damn you march of technology. Love it. Hate it. Love it....
Edited by R Harkness - 2/3/13 at 2:04pm
post #2653 of 3670

Rich Harkness: OT, but thanks again Rich for introducing me to the idea of a hybrid screen size, to be able to get the best size for 16x9 and for 2.35 pics with one's given room. It's really been ideal for me.

post #2654 of 3670
Quote:
Originally Posted by R Harkness View Post

As someone using a screen that varies from 80" 16:9 to 136" (and larger for scope), one of the interesting things I've found is how well sharpness can hold up at huge sizes. You do start to lose a certain amount of "punch" and vividness as you zoom larger and larger with a projector, but in terms of sharpness some sources seem virtually as sharp at massive sizes as they do at smaller sizes. What happens is the image looks sharp at the smaller size, but as I expand it, the image seems to retain that sharpness, and all those details that were becoming hard to register at the smaller size come into view. It's really wild. I played the latest Mission Impossible Blu-Ray for guests at 10 feet wide and the one feature they couldn't miss remarking on was how incredibly sharp and clear the image was. (It goes without saying that this is very source dependent, and the variability of source quality/detail is one reason why I vary my image size to begin with).

This isn't to say I'm not aware of the limitations of 1080p, which I have been for quite a while (I'm just comparing certain 1080p images at various sizes in my system). And on that note of 1080p limitations, it's intriguing to me how my modest experience watching the Sony 4K display/sources several times has subtly ratcheted my expectations for detail upwards. I was just checking out screen grabs of the new Bond SkyFall Blu-Ray on the Blu-Ray forum and they look like a superb transfer. But, while at one point I would have marveled at the detail available on such a source, I found myself noting the limitations in resolution. For instance, shots of the exterior of mansions looked almost "SD" when compared to the type of resolution I've seen of similar images on the Sony 4K.

Damn you march of technology. Love it. Hate it. Love it....
Resolution has nothing to do with sharpness. Most display devices on the market today have square hard-edged pixels. It doesn’t matter how large you make them, those pixels will be just as sharp.
Increasing resolution is about displaying finer details, smoother lines and gradations, removing the grid-like pixel structure from the image. For many people this results in making the image look more natural - even if they aren't aware of why that is.


It was different when we were dealing with CRTs where larger displays typically had a much harder time of putting out as sharp an image when you scaled them up, but it doesn't make a difference with flat panels and digital projectors - though with some projectors, the optics might get worse if you are using a zoom lens to increase the size, but that has nothing to do with resolution.
post #2655 of 3670
Quote:
Originally Posted by Chronoptimist View Post

Resolution has nothing to do with sharpness.

I don't see why you feel the need for this reply. I was responding to Ken's discussion of sharpness and image size, and it should be apparent that neither of us is confusing resolution with sharpness. Ken explicitly notes the difference, and I explained that with a good transfer sharpness can look pretty consistent between a smaller and larger image size, but what you start to appreciate at the larger sizes is resolution of the detail that is harder to appreciate at smaller image sizes. I could hardly have bothered to make such a point if I were saying (or even implying) that sharpness and resolution were the same thing.

That out of the way, I think you only muddy the issue when you start with: "Resolution has nothing to do with sharpness."
Because that is patently wrong. It would be more precise to say that in terms of image quality parameters, "resolution is distinct from sharpness" just as one would point out "contrast is distinct from sharpness." However, when one is considering what bears upon our perception of image sharpness, you have to know that contrast is intimately linked to our perception of image sharpness. Which is why a number of methods of increasing perceptual image sharpness (for the same source/resolution) tend to rely on manipulating image contrast (generally, locally, etc). And that increasing image sharpness can aid our ability to resolve fine detail - in other words, in perceptual terms (the end that really matters to us), increased sharpness can increase resolution of detail (detail that would otherwise for us go unresolved or less resolved).
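For anyone who wants to see the "manipulating local contrast" idea in code, the classic example is an unsharp mask: add back an amplified copy of the difference between the image and a blurred version of itself. This is just a minimal NumPy/SciPy sketch of that general technique, not what any particular display or video processor actually does.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image: np.ndarray, sigma: float = 2.0, amount: float = 0.5) -> np.ndarray:
    """Boost apparent sharpness by amplifying local contrast around edges.
    `image` is expected to be float data in the range [0, 1]."""
    blurred = gaussian_filter(image, sigma)
    detail = image - blurred              # high-frequency / local-contrast component
    return np.clip(image + amount * detail, 0.0, 1.0)

# A soft horizontal ramp: the edge transitions look "crisper" after processing
# even though no new detail (resolution) has been added.
ramp = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
sharpened = unsharp_mask(ramp)
```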

If you are talking about only the detail that is contained in the source and resolvable by the display, with no regard to how "sharpness" and "resolution" relate to our perceptual system, then you are not in the same discussion that Ken and I are having. We are including in this discussion our perception of image quality.

Cheers,
Edited by R Harkness - 2/3/13 at 11:54pm
post #2656 of 3670
Quote:
Originally Posted by Chronoptimist View Post

A joke at this point? You must have something misconfigured. madVR beats anything I have compared it with.
Nope! I get my configuration set up directly from Matt at JRiver. I currently use a different renderer that has better quality than madVR.
Quote:
Originally Posted by Chronoptimist View Post

$6000 on changers and your TV can't do 1080p24!?
I have about $3k in changers, but my Sony XBR2 70" doesn't do 1080p24. No need to replace it until it dies.

Quote:
Originally Posted by Chronoptimist View Post

I don't really recommend 100% fanless systems. Passive devices in the case are good, but they need some airflow. Buy Noctua fans if noise is a concern. (though I keep the HTPC out of the room)

Just using the stock fan on Ivy Bridge, and it works fine. The Kingwin Stryker passive power supply is great! There are only two cases that I found that would work for a passive system.
The cost is $$ for the case, and then you have to use their components. One case is from Austria, so it is hard to obtain here. Currently I am using an old Lian Li case and it works fine. I only use two SSDs, but this is the quietest PC that I have ever owned.
Quote:
Originally Posted by Chronoptimist View Post

There are active HDMI extenders you can use, if 50ft is not long enough. CAT5e/6 extenders can cover about 800ft and fiber extenders about 1000ft.
You can buy a lot of things, but what works is not what you expect. You can buy HDMI cables up to 125 feet, but my installer has problems with HDMI switching at 50 feet. Typically, the standard now is CAT6A, which has higher bandwidth than CAT6. My audio/video installer just installed a system where the HDMI switch was a Key Digital, which is very high end. Actually, it was the first HDMI switch to do 4K video. What he is looking for is reliability of the system.
Quote:
Originally Posted by Chronoptimist View Post

Thunderbolt is not a display standard. It happens to accept displayport signals, but is essentially an external connection to the computer's PCI-E bus and a complete waste of money right now. The only potentially exciting thing about Thunderbolt is being able to have something like an 11" MacBook Air dock to a full blown desktop graphics card when you bring it home and hook it up to a large display. But Thunderbolt bandwidth has already halved since its introduction.

Thunderbolt is being put on a lot of devices, but it seems you are not following the correct companies. The rumor is that it will be on the back of the new Apple iPanel TV. Savant Systems is in the process of adopting it; you have to dig into their brochures to see it. There are several devices that it is being put on. Here is a new cable that just came out: http://appleinsider.com/articles/13/01/08/cornings-thunderbolt-and-usb-optical-cables-transmit-data-over-hundreds-of-feet
The Areca Arc-8050 is a nice, fast Thunderbolt box, etc.
Quote:
Originally Posted by Chronoptimist View Post

I have never had a hard drive last less than a year - you run a full disk check when it's new, and if it passes that it is not likely to fail prematurely. Modern hard drives are extremely reliable if they are kept at a reasonable temperature. Buying 4TB drives is a massive waste of money, because many of them cost more than 2x a 3TB disk and would only save you 4W on power consumption. But yes, if you are buying enterprise drives, which are designed for 24/7 operation, you will more than double your cost and have ridiculous power consumption. (over 100W if you keep all the drives spinning, which is what enterprise drives are designed for)

I had a 3U rackmount server with 16 (2TB) Hitachi drives, 4 fans, and a Zippy power supply. The biggest power consumption was from the power supply, not the hard drives. I have been in computers since 1982. If you plan to use a RAID card, it won't run on cheap drives; you literally have to match the drives to the Areca RAID card, but it seems you don't know that. I switched to an ATX power supply and it works fine at less cost. Changed the case to a Chenbro, and it works great.
post #2657 of 3670
Quote:
Originally Posted by Ken Ross View Post

Your room is highly unusual, let alone the wall space required for such a huge display. Hey, wait for 8K in another 5-10 years (if by then), it makes no difference to me. But if you think 4K is not coming to the U.S., you're going to be mistaken.

Haven't you read about NHK? Here is an article on NHK and the research that was developed in the past. NHK started in 1995. http://www.sidmembers.org/idonline/article.cfm?year=2012&issue=12&file=art6
The USA always wanted 4K, but Japan wanted to skip 4K and go to 8K. I think that 4K would have had a better chance in 2010 if we didn't know about 8K; then people would have upgraded to 8K later, in 2020. The majority of the public will only replace the TV when it fails.
Quote:
Originally Posted by Ken Ross View Post

Can you post a link relating to the 'several' broadcasters not wanting to switch to 4K? Other than ESPN, which as of last week said they would wait, I'm not aware of others. Either way, ESPN and a couple of others will not hold up 4K in the U.S.
It was posted in this thread, but it was around Amsterdam that several broadcasters stated that they wanted to skip 4K and go to 8K, primarily because of the costs involved. A broadcaster is spending millions, but if there is no way to transmit to a large audience, then they won't take the gamble. 4K will be fine in Europe because of the satellites; the USA is going to be limited till 2016. There is no media format based on 4K, and you need a standard format, not Sony streaming a few movies.
Quote:
Originally Posted by Ken Ross View Post

If you really think it's more likely we'll have 8K here than 4K, you're certainly entitled to that opinion. It's everyone's right to ignore the release of 4K displays, the presence of 4K software, the imminent arrival of 4K camcorders, etc. As I said, we all have to make decisions and you've chosen to gamble on 8K.

I might do a 4K computer monitor, but that is different. I jumped in early on HD DVD and Blu-ray when there was only a handful of movies, so I think waiting is the best idea. A year ago I was hoping that 4K would be here, but everything seems to be delayed now.
post #2658 of 3670
Look--if they already compress the tar out of 1080p there is no way that they're going to go to 8K--they can't even do 4K without compressing it SO MUCH that it won't be worth a hoot--but 8K--c'mon--join the real world!
post #2659 of 3670
Quote:
Originally Posted by esdwa View Post

Congratulations. But to be honest, I am not impressed. I also do not care about your player's video processor, nor the audio, TB equivalents and everything else. They are irrelevant and may just be advantages in your mind, sold by smart product marketing. Also, you do not know what MP I use, but assuming is a privilege for everyone.
So please do not quote me on this and let's move forward. Cheers.

I am really not impressed with this reply. All the media players compress the formats. I compared the video processor chips, and they all compress the formats. So, you are taking a BD disc that is already compressed and making it smaller to stream? This is like when my home automation company told me to compress all my CDs to MP3 so they could play on their player. So, I cut them out of the picture and use JRiver. Now I can listen to uncompressed music.
Quote:
Originally Posted by esdwa View Post

So please do not quote me on this and let's move forward.

Last time I looked, I have freedom of speech. So, I can quote whom I want. :D
post #2660 of 3670
Here is a good article on HEVC and the satellite industry's answer on 4K and 8K.

http://www.satelliteprome.com/tech-features/hevc-the-satellite-industrys-answer-to-supporting-ultrahd-video-delivery/
post #2661 of 3670
Quote:
Originally Posted by Nitro67 View Post

Nope! I get my configuration set up directly from Matt at JRiver. I currently use a different renderer that has better quality than madVR.
There's nothing even remotely comparable to madVR on the PC. (or otherwise, in my experience)
Quote:
Originally Posted by Nitro67 View Post

You can buy a lot of things, but what works is not what you expect. You can buy HDMI cables up to 125 feet, but my installer has problems with HDMI switching at 50 feet. Typically, the standard now is CAT6A, which has higher bandwidth than CAT6. My audio/video installer just installed a system where the HDMI switch was a Key Digital, which is very high end. Actually, it was the first HDMI switch to do 4K video. What he is looking for is reliability of the system.
With active extensions, rather than just long cables, length has no impact on things like HDMI switching.
Quote:
Originally Posted by Nitro67 View Post

Thunderbolt is being put on a lot of devices, but it seems you are not following the correct companies. The rumor is that it will be on the back of the new Apple iPanel TV. Savant Systems is in the process of adopting it; you have to dig into their brochures to see it. There are several devices that it is being put on. Here is a new cable that just came out: http://appleinsider.com/articles/13/01/08/cornings-thunderbolt-and-usb-optical-cables-transmit-data-over-hundreds-of-feet
Thunderbolt is just DisplayPort plus data and power, as far as video is concerned. It makes zero sense to put that on the back of a TV when HDMI has won out over DisplayPort - even Apple has conceded that fact by finally adding HDMI ports to their latest notebooks, 5+ years late. The only reason for Apple to do that, would be if they intend on using it as a large monitor, so you only need to hook up one cable and can also have external storage connected. (because that's about all anyone is doing with Thunderbolt just now)
Quote:
Originally Posted by Nitro67 View Post

I had a 3U rackmount server with 16 (2TB) Hitachi drives, 4 fans, and a Zippy power supply. The biggest power consumption was from the power supply, not the hard drives.
Modern power supplies are over 90% efficient now. Even if you assume 80% efficiency, that goes from 16W to 20W power consumption. A "1000W" rating on a power supply is the maximum load it can handle, not its operating power consumption, for example.
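The arithmetic there, for anyone following along: wall draw is just the DC load divided by the supply's efficiency at that load. A trivial sketch:

```python
def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """Power drawn at the wall for a given DC load and PSU efficiency."""
    return dc_load_w / efficiency

print(wall_draw(16, 0.80))  # 20.0 W  - the 80% case mentioned above
print(wall_draw(16, 0.90))  # ~17.8 W - a modern 90%+ efficient supply
```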
Quote:
Originally Posted by Nitro67 View Post

If you plan to use a RAID card, it won't run on cheap drives; you literally have to match the drives to the Areca RAID card, but it seems you don't know that.
Sounds like you are talking about SAS vs SATA, though I was under the impression they were mostly interchangeable on RAID cards these days. It also sounds like you are speccing for a 24/7 operation datacenter, not a home media streaming setup.
post #2662 of 3670
Quote:
Originally Posted by Chronoptimist View Post

There's nothing even remotely comparable to madVR on the PC. (or otherwise, in my experience)
Currently madVR freezes up on DVDs, but Matt, the creator of JRiver, recommended trying the Enhanced Video Renderer; it is in the custom menu of JRiver. No freezes or crashes.
Quote:
Originally Posted by Chronoptimist View Post

With active extensions, rather than just long cables, length has no impact on things like HDMI switching.
Key Digital is like an industrial switch; it starts around $4500 and is priced on up from there. http://www.keydigital.com/items.asp?ItemCode=kd4x4&Company=KEY It uses baluns. Actually, his company was doing 4K video installs in early 2012.
Quote:
Originally Posted by Chronoptimist View Post

Thunderbolt is just DisplayPort plus data and power, as far as video is concerned. It makes zero sense to put that on the back of a TV when HDMI has won out over DisplayPort - even Apple has conceded that fact by finally adding HDMI ports to their latest notebooks, 5+ years late. The only reason for Apple to do that, would be if they intend on using it as a large monitor, so you only need to hook up one cable and can also have external storage connected. (because that's about all anyone is doing with Thunderbolt just now)

Apple has not conceded, but they are putting both on. Here are the latest Apple specs for the back of the Mac Mini. See, you have both... http://www.apple.com/mac-mini/specs.html
The latest rumour from the media is that Apple is putting Thunderbolt on the iPanel, or whatever they call their TV.

Quote:
Originally Posted by Chronoptimist View Post

Modern power supplies are over 90% efficient now. Even if you assume 80% efficiency, that goes from 16W to 20W power consumption. A "1000W" rating on a power supply is the maximum load it can handle, not its operating power consumption, for example.
Sounds like you are talking about SAS vs SATA, though I was under the impression they were mostly interchangeable on RAID cards these days. It also sounds like you are speccing for a 24/7 operation datacenter, not a home media streaming setup.
No, when you buy a RAID card you need to watch the model numbers. SATA or SAS doesn't matter that much anymore. I use Areca because I can use it on Mac OS X or Windows. Go to the Areca support section, and you will see the HDD compatibility list.
For example, I had some Samsung drives in the past, and Areca wouldn't work with those drives. I got a good deal on the Hitachi drives, so I went with them. Anyway, match it up with the model number, and it works fine.

Nitro
post #2663 of 3670
Is this the 4k by 2k thread or is this the HTPC forum? I'm confused.

I agree with Ken Ross that 4k will be used in the medium to long term and that 8k might not ever happen. I don't know why some posters think one broadcaster in Japan will drive the standard for the world. Japan has a debt problem and a demographic problem and I can assure you that 8k video plus 22.2 channel audio will not be on the government's top priority list over the next 10 years.

4k just won't be that hard to do in the next few years. The combination of 4k monitors, the HEVC codec and Blu-ray discs will cement 4k as the standard in the medium to long term. Other than movies, people can view the photos from their high megapixel cameras on their new 8MP(4k) displays.
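For reference, the pixel counts behind the "8MP (4K)" figure, worked out quickly:

```python
# Pixel counts for common display resolutions vs. still-camera megapixels.
resolutions = {
    "1080p":    (1920, 1080),
    "UHD '4K'": (3840, 2160),
    "UHD '8K'": (7680, 4320),
}
for name, (w, h) in resolutions.items():
    print("%-9s %4d x %4d = %4.1f MP" % (name, w, h, w * h / 1e6))
# So a typical 16-24MP still camera out-resolves even a UHD '4K' display.
```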
post #2664 of 3670
Quote:
Originally Posted by Nitro67 View Post

Haven't you read about NHK? Here is an article on NHK and the research that was developed in the past. NHK started in 1995. http://www.sidmembers.org/idonline/article.cfm?year=2012&issue=12&file=art6
The USA always wanted 4K, but Japan wanted to skip 4K and go to 8K.

Is this all supposed to convince us that 8K is where the U.S. is going as opposed to 4K? It does nothing of the sort.
post #2665 of 3670
Quote:
Originally Posted by mfogarty5 View Post

Is this the 4k by 2k thread or is this the HTPC forum? I'm confused.

I agree with Ken Ross that 4k will be used in the medium to long term and that 8k might not ever happen. I don't know why some posters think one broadcaster in Japan will drive the standard for the world. Japan has a debt problem and a demographic problem and I can assure you that 8k video plus 22.2 channel audio will not be on the government's top priority list over the next 10 years.

4k just won't be that hard to do in the next few years. The combination of 4k monitors, the HEVC codec and Blu-ray discs will cement 4k as the standard in the medium to long term. Other than movies, people can view the photos from their high megapixel cameras on their new 8MP(4k) displays.

I guess you missed the Olympics in 2012... http://www.bbc.co.uk/blogs/researchanddevelopment/2012/08/the-olympics-in-super-hi-visio.shtml

Here is the audio standardization paper. Actually it is for both UHDTV1 and UHDTV2... 4K and 8K.

http://www.nhk.or.jp/strl/publica/bt/en/fe0045-6.pdf

If you use Google and want to find more papers from NHK, go to advanced search. Here are your keywords: nhk.or.jp/strl/publica/bt/en

Dolby has developed Atmos 25.3, but it hasn't set a standard for home audio.
post #2666 of 3670
Quote:
Originally Posted by Ken Ross View Post

Is this all supposed to convince us that 8K is where the U.S. is going as opposed to 4K? It does nothing of the sort.

Only a teaser of the article... Do some research on it. You'd be surprised. I gave you the keywords.

Who invented 4K? Well, here are the first 4K computer monitors. http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors
IBM won't state when it started, but probably the early 1990s. Oh, the patents were sold to the Japanese later.
post #2667 of 3670
No Nitro, I'll be surprised if 4K doesn't get a foothold in the relatively near future. I'll be even more surprised if 8K supplants 4K, rendering 4K 'still-born'.

Your 'arguments' are far from convincing. :)
post #2668 of 3670
The first 4K panel, at Berlin 2007: an 80" LED.

post #2669 of 3670
Quote:
Originally Posted by mfogarty5 View Post

Is this the 4k by 2k thread or is this the HTPC forum? I'm confused.

I agree with Ken Ross that 4k will be used in the medium to long term and that 8k might not ever happen. I don't know why some posters think one broadcaster in Japan will drive the standard for the world. Japan has a debt problem and a demographic problem and I can assure you that 8k video plus 22.2 channel audio will not be on the government's top priority list over the next 10 years.

4k just won't be that hard to do in the next few years. The combination of 4k monitors, the HEVC codec and Blu-ray discs will cement 4k as the standard in the medium to long term. Other than movies, people can view the photos from their high megapixel cameras on their new 8MP(4k) displays.

I agree. Barring something unexpected (like a war between China and Japan), 4K will be the new standard. Unlike OLED, there's nothing standing in the way of mass production at reasonable prices.
post #2670 of 3670
I have been in this thread since it began, and was pro 4K. The problem is not the 4K TV, but that there is no media or broadcasting in the USA.
DirecTV has stated 2016. EchoStar could be earlier, but it would be around 2015. You do have H.265, but it is the devices that are holding you up.
http://broadcastengineering.com/video-transport/itu-approves-h265-video-codec This article states that the new devices won't get here until mid-2014 onwards. There is currently no standard format for media, so people have speculated on Blu-ray or HVD. Last year, NHK stated it would take 3 years for the Holographic Versatile Disc to get here, so you are looking at 2016.

Then of course, you might want it, but you ask your wife and she wants to spend it on a new car. So, you still don't get your new toys.