
Resolutions of embedded video on the internet

post #1 of 8
Thread Starter 
I do hope this is the proper sub-forum for this topic. If not, direct me elsewhere.

I recently set up a new laptop to integrate with my HDTV. The laptop is 1366x768 and my Pioneer plasma is native 1080p. I keep the laptop at 1366x768 when sending video to the plasma. I am simply trying to determine the best way to push video to the plasma ("best" meaning highest image quality). A friend and I were discussing/debating this, which resulted in more questions.

I watch a lot of internet video on my plasma. So when I'm watching an embedded video on a web page (e.g., YouTube, Vimeo, QuickTime), what happens to that signal before and after it passes to the plasma?

If the video is not blown up to full screen, I think the image size is maintained, right? It's not really scaled because it doesn't need to be. Now, if it's blown up to full screen (which is what I do):

1. Is the laptop scaling/stretching the signal to fit the screen,
2. Is the QuickTime video player doing the scaling, or
3. Is it just sending the 480p (for ex.) signal to my plasma and letting the plasma scale it?

My Pioneer has an above-average scaler, and I find that it usually performs better than my other devices, so I would like it to do the scaling.

Hoping someone knowledgeable about both PCs and HDTVs can comment.
post #2 of 8
Quote:


The laptop is 1366x768 and my Pioneer plasma is native 1080p. I keep the laptop at 1366x768 when sending video to the plasma. I am simply trying to determine the best way to push video to the plasma ("best" meaning highest image quality).

If you want the best quality, set your video card to ONLY output to the tv. Outputting to 2 different displays at different resolutions is hard enough for a desktop video card, let alone a laptop one.

Quote:


If the video is not blown up to full screen, I think the image size is maintained, right?

Should be.

Quote:


1. Is the laptop scaling/stretching the signal to fit the screen,

Depends on the video player you are using.

Quote:


2. Is the QuickTime video player doing the scaling, or

If using QuickTime or VLC, the player does the scaling. If using the stock Windows Media Player, Windows itself is. However, if you are using a codec pack like Sharky's, FFDShow will be doing the scaling, which I recommend because it does a really good job.
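A quick way to see where a given clip starts out, regardless of which component scales it afterward, is to check its native resolution. A minimal Python sketch, assuming ffprobe (part of the ffmpeg package) is installed and on the PATH; the filename is just a placeholder:

Code:
import subprocess

def native_resolution(path):
    # Ask ffprobe for the width/height of the first video stream.
    out = subprocess.check_output([
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=width,height",
        "-of", "csv=p=0",
        path,
    ], text=True)
    width, height = out.strip().split(",")
    return int(width), int(height)

# Placeholder filename; substitute a clip saved from the site in question.
print(native_resolution("downloaded_clip.mp4"))

Knowing the source size tells you how far each scaler in the chain (player, Windows, or the display) has to stretch the picture.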

Quote:


3. Is it just sending the 480p (for ex.) signal to my plasma and letting the plasma scale it?

If the video is full screen, then it's sending the video at whatever resolution the video card is set to output.
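To put rough numbers on that, here is a minimal sketch using the resolutions discussed in this thread (720x480 as the 480p example, 1366x768 laptop output, 1920x1080 panel); widths only, assuming the image is stretched to the full screen width:

Code:
SOURCE = (720, 480)     # 480p clip, the example used in this thread
DESKTOP = (1366, 768)   # resolution the laptop's video card is set to output
PANEL = (1920, 1080)    # native resolution of the plasma

def scale_factor(src, dst):
    # Linear (width-based) upscale factor from one resolution to the next.
    return dst[0] / src[0]

player_scale = scale_factor(SOURCE, DESKTOP)  # player/Windows fills the desktop
tv_scale = scale_factor(DESKTOP, PANEL)       # plasma scales the desktop to its panel
direct_scale = scale_factor(SOURCE, PANEL)    # hypothetical single-stage scale

print(f"player/OS upscale : {player_scale:.2f}x")
print(f"TV upscale        : {tv_scale:.2f}x")
print(f"chained total     : {player_scale * tv_scale:.2f}x")
print(f"single-stage total: {direct_scale:.2f}x")

The total magnification comes out the same either way (about 2.67x); the difference is whether the picture is resampled once or twice, and by which scaler.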
post #3 of 8
Suggest you try measuring the various resolutions using an optical comparator with a reticle, along with resolution-burst patterns. Suggested this technique in the Blu-ray Software forum, ideally using numerous resolution/frequency burst patterns stored on a test disc. (A link I provided in that thread shows patterns on the S-M test disc and the thread outlines how to convert frequencies into resolutions.)

With a PC already linked to a display, resolution bursts, such as those on AVS member dr1394's site, or elsewhere, could be downloaded and used instead of an optical disc. Ran the terms (optical comparators reticle) on Google Shopping (low-to-high pricing) and found loupes for ~$6 and up. An Edmund Optics engineer recommended a costlier 9X model that accepts their 27mm reticles. The scaling of computer settings and display resolution could be worked out once the hardware/software is at hand. No doubt skill at measuring the effective resolutions of the finest details seen on screen would increase quickly. -- John
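As a rough illustration of what such a measurement could be turned into (this is plain arithmetic, not the pattern-comparison method described above; the division pitch and screen width below are hypothetical placeholders):

Code:
RETICLE_DIVISION_MM = 0.1   # hypothetical ruling pitch of the reticle
SCREEN_WIDTH_MM = 1280.0    # hypothetical active picture width; measure your own set

def equivalent_lines_per_width(detail_divisions):
    # Treat the finest resolved detail (e.g. a hair) as one "line" and ask how
    # many details of that width would fit across the picture width.
    detail_mm = detail_divisions * RETICLE_DIVISION_MM
    return SCREEN_WIDTH_MM / detail_mm

# Example: a detail spanning 7 reticle divisions (0.7 mm with this pitch)
print(round(equivalent_lines_per_width(7)))   # ~1829 lines per picture width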
post #4 of 8
Thread Starter 
Quote:
Originally Posted by sitlet

If you want the best quality, set your video card to ONLY output to the tv. Outputting to 2 different displays at different resolutions is hard enough for a desktop video card, let alone a laptop one.

What do you mean by "hard enough"? Will the image sent to my TV improve by only outputting to the TV? Or do you mean it would just make my laptop run slower with 2 outputs?

Here is a link to my Digital Storm laptop. It has a dual graphics card configuration, wherein the Intel is an integrated graphics card (built into the processor) and the NVIDIA is a standalone card. When outputting to both displays, wouldn't the NVIDIA card handle the TV output, because it's rated for higher resolution, and the Intel handle the laptop output?
Quote:
Originally Posted by sitlet

If using QuickTime or VLC, the player does the scaling. If using the stock Windows Media Player, Windows itself is. However, if you are using a codec pack like Sharky's, FFDShow will be doing the scaling, which I recommend because it does a really good job.

Good info. What video player does YouTube use? Also, I watch a lot of NBA feeds, which use the Silverlight player. Since this is a Microsoft video player, will Windows scale it just like WMP?
Quote:
Originally Posted by sitlet

If the video is full screen, then it's sending the video at whatever resolution the video card is set to output.

Ok. So if I am playing a 480p YouTube vid, is there any way at all to offload the scaling (480p-->1080p) to the plasma? From your post, it seems I could accomplish this by setting the laptop to 720x480?

Thanks for your insight.
post #5 of 8
Thread Starter 
Quote:
Originally Posted by John Mason

Suggest you try measuring the various resolutions using an optical comparator with a reticle, along with resolution-burst patterns. Suggested this technique in the Blu-ray Software forum, ideally using numerous resolution/frequency burst patterns stored on a test disc. (A link I provided in that thread shows patterns on the S-M test disc and the thread outlines how to convert frequencies into resolutions.)

The scaling of computer settings and display resolution could be worked out once the hardware/software is at hand. No doubt skill at measuring the effective resolutions of the finest details seen on screen would increase quickly. -- John

Yes, John, I read through your discussion on the S&M HD Benchmark thread back when you posted it. I certainly think a loupe and reticle would be the definitive way of measuring the effective resolution (i.e., the resulting resolution on the HDTV). However, for the moment, I'm primarily interested in getting the settings right on my Windows 7 PC so that I can achieve the highest quality on my HDTV. I've never used a loupe/reticle device before, so how easy would it really be to count the effective lines of resolution of a 480p YT vid upscaled to 1080p? I feel like some signals/internet feeds are so poor to begin with that even with a magnification device, it would be difficult to determine.
post #6 of 8
^^^Yeah, the S-M disc creators outline potential difficulties, too (at my link). But if you see differences between 480p and the best-resolution 1080i/p sources, it should be possible to measure it (max effective resolution). The oft-repeated garbage-in/garbage-out. -- John

EDIT:
Quote:


I've never used a loupe/reticle device before, so how easy would it really be to count the effective lines of resolution of a 480p YT vid upscaled to 1080p? I feel like some signals/internet feeds are so poor to begin with that even with a magnification device, it would be difficult to determine.

The link above to a 9X comparator and fine-ruled reticle shows what you'd see with the lens against a direct-view screen (sans image). YouTube etc. videos are indeed often fuzzy. Scaled to 1080p by your display, the finest visible details remain fuzzy; they're just larger on a screen meant for consumer motion video. While a detail, say a human hair, is fuzzy, placing the reticle over it gives you a reference measurement: how many reticle-line divisions wide.

That reference measurement could be compared with test patterns until a similar or identical width is found. (Referring here to fine vertically oriented details--a hair--to measure maximum effective horizontal resolution.) The horizontal rule in the reticle (linked above) would be turned 90 degrees to measure vertical resolution detail, then compared with vertical-resolution bursts. Here are the vertical and horizontal bursts on the S-M test Blu-ray. My earlier link to the S-M test disc thread discussion mentions how to convert the frequencies given to lines of resolution.

For a 1080p display, of course, the 37.09-MHz vertical-lines test pattern (2nd row down, far right) should have 1920 B&W lines across your screen width--perhaps confirmed by counting lines per inch and calculating per screen width. But as the S-M test disc developers point out, that's not the purpose of their resolution-burst patterns. Some years back one AVSer mentioned he used a pro computer graphics program to create his own lined patterns to measure resolution. Also, an upscaled 480p image measurement on a 1080p display could be compared with similar measurements but wouldn't represent the original 480p resolution. Ideally, this technique would be used to compare mostly 1080 image details, factoring in upscaling if other capture/production resolutions are used.

But a measurement such as this could provide the equivalent (test pattern) maximum horizontal resolution, just for comparison purposes, of what might appear to be the finest detail on a YouTube image (for similar upscaled comparisons). A Blu-ray movie, even with all the resolution limitations mentioned in the S-M disc discussion, could be similarly measured for what appear to be some of the finest details--ditto for live or recorded 1080i broadcasts--although live programs would need frame grabbing, DVRing, etc. for measurements. -- John
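For reference, the frequency-to-lines conversion mentioned above works out as in the sketch below, assuming the 74.25-MHz luma sample rate and 1920 active samples per line used for 1080-line HD (the S-M disc thread covers the caveats):

Code:
ACTIVE_SAMPLES = 1920      # active pixels per line for 1080i/p
SAMPLE_RATE_MHZ = 74.25    # luma sample rate for 1080-line HD

def burst_to_lines(freq_mhz):
    # One cycle of a burst = one black + one white line, i.e. 2 "lines";
    # lines per picture width = 2 * frequency * active line time.
    active_line_us = ACTIVE_SAMPLES / SAMPLE_RATE_MHZ   # about 25.86 microseconds
    return 2 * freq_mhz * active_line_us

print(round(burst_to_lines(37.09)))   # ~1918, i.e. essentially the full 1920 lines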
post #7 of 8
Quote:
Or do you mean it would just make my laptop run slower with 2 outputs?
Yes. Especially if you are running two different resolutions, the video card has to work twice as hard.

Quote:
What video player does YouTube use?
Flash Player, which outputs the video at whatever resolution the video card is set to.

Quote:
Ok. So if I am playing a 480p YouTube vid, is there any way at all to offload the scaling (480p-->1080p) to the plasma? From your post, it seems I could accomplish this by setting the laptop to 720x480?
Yes, to let the TV do the scaling, you would have to set the computer to output 720x480, but then you would be changing the resolution every time you want to watch a different video. But honestly, YouTube has some of the worst quality of all the streaming sites. Letting your TV scale, or letting the computer scale, isn't going to make any difference. It's still going to look like garbage.
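If you do experiment with a 720x480 output mode, here is a minimal Windows-only sketch to confirm what the video card is currently sending (and therefore what the plasma has to scale from). It reads the primary display's resolution via the Win32 GetSystemMetrics call, so it assumes the TV is set as the primary display:

Code:
import ctypes

SM_CXSCREEN, SM_CYSCREEN = 0, 1
user32 = ctypes.windll.user32

width = user32.GetSystemMetrics(SM_CXSCREEN)
height = user32.GetSystemMetrics(SM_CYSCREEN)

if (width, height) == (720, 480):
    print("Outputting 720x480: the plasma does the scaling up to 1080p.")
else:
    print(f"Outputting {width}x{height}: the player/PC scales to this size first.")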
post #8 of 8
Thread Starter 
Quote:
Originally Posted by sitlet

Yes, to let the TV do the scaling, you would have to set the computer to output 720x480, but then you would be changing the resolution every time you want to watch a different video. But honestly, YouTube has some of the worst quality of all the streaming sites. Letting your TV scale, or letting the computer scale, isn't going to make any difference. It's still going to look like garbage.

Totally agree. Thanks for the info. It really is impossible to make some YT vids look even "OK."

Also, I was watching March Madness this weekend in Extended Desktop mode with my laptop set to 1366x768. When I hit the info pane button on my plasma, it said "1080i." Was it actually detecting the broadcast resolution of the game and not the output resolution of the laptop?