AVS Forum
Status: not open for further replies.
1 - 12 of 12 Posts

Registered · 63 Posts · Discussion Starter · #1
Hi guys, this isn't my usual territory since I use an LCD projector, but I'm curious and want to learn more about CRTs.


Am I wrong in saying that a CRT has a native resolution?


If not, can somebody explain how it can be calculated?


thank you
 

Premium Member · 17,215 Posts
They have a maximum resolution spec, but the more applicable number is the number of horizontal lines which can be individually resolved.


This number of lines depends both on the designed capability of the projector model and on the quality of the installation and setup. It starts with a nice tight beam spot: a good tight focus will yield more resolved lines.


So then you look for the sweet spot... if you feed your CRT too few horizontal lines (say 480), then you'll see scanline gaps. If you feed it too many (say 1000; unless you've got a nice 9" setup... Art can see scanlines at 1080p on his G90s), then you'll get overlap, giving a softer image.
 

Registered · 63 Posts · Discussion Starter · #3
Can I calculate the resolution by dividing the vertical dimension of the projected field by the diameter of the beam spot?

That way both overlap and scanline gaps would be avoided...


E.g.:

At the lens the vertical field dimension is 120 mm and the beam spot is 0.2 mm, so the resolution is 120 / 0.2 = 600 lines.


I hope I was clear :eek:



What do you think?
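The arithmetic in this post can be sketched in a few lines of Python (the 120 mm raster height and 0.2 mm spot size are the example figures assumed above, not measured specs):

```python
def max_resolvable_lines(raster_height_mm: float, spot_diameter_mm: float) -> int:
    """Lines that fit without overlap = raster height / beam spot diameter."""
    return round(raster_height_mm / spot_diameter_mm)

# Example values from the post above (assumptions, not measurements):
print(max_resolvable_lines(120.0, 0.2))  # -> 600 lines
```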
 

Registered · 838 Posts
This, IMHO, is one of the major benefits of CRT projectors - they do not have native resolutions, only maximum resolutions. This allows one to actually see different resolutions. Digital displays, with fixed formats, are forced to scale every input image to their native resolution. Often this is some strange non-ATSC resolution like 1388 x 768, which introduces scaling artifacts into the image. Thus, if you feed a CRT 480i, 480p, 720p or 1080i - you see that resolution (so long as you do not exceed the CRT's maximum resolution).


While the industry has decided to call virtually anything that can display 720p or above an "HDTV" - people with digital displays using non-ATSC native resolutions (i.e. the vast majority of them), have not actually seen any of the HDTV resolutions, but rather a scaled version of the resolution.


As digital, fixed-format displays move to 1920 x 1080p, they will finally be capable of displaying the "full" resolution of the HDTV signal (at least a de-interlaced version of 1080i and an upscaled version of 720p).


Ed
 

Registered · 2,041 Posts
Regarding the lenses: an HD10GT series lens can resolve 12 line pairs per millimeter, i.e. 24 lines per mm, which works out to 2880 resolvable vertical lines if the phosphor raster height is 120 mm...

Roland
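Roland's figure works out as follows; a small hypothetical helper (the 12 lp/mm and 120 mm raster height are taken from his post):

```python
def lens_resolvable_lines(line_pairs_per_mm: float, raster_height_mm: float) -> int:
    """One line pair = one dark + one light line, so lines/mm = 2 * lp/mm."""
    return round(line_pairs_per_mm * 2 * raster_height_mm)

print(lens_resolvable_lines(12, 120))  # -> 2880 lines
```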
 

Registered · 63 Posts · Discussion Starter · #8
Thank you guys, I'm happy about your interest in furthering my education! :D


In any field it is important to clarify the meaning of the terms used.

So, by native resolution I meant the maximum you can reach while avoiding any artifacts. So, edsuski, in my mind maximum equals native; I don't know if that matches the usual technical usage...


I think any device has a physical limit. It is easy for anybody to find it when the device is a DLP or an LCD; it is harder when the device is a CRT projector.


I know that a direct-view TV CRT has a front panel with a phosphor layer arranged in cells (each cell contains three kinds of fluorescent material, one each for R, G, and B). In this case the physical limit is the cell size. Many TV CRTs also have a metallic shadow mask, so the limit is the mask grid.


In a CRT projector I think there are neither phosphor cells nor masks, since the image is made by three separate tubes.

Anyway, a limit must exist; where is it? An engineer who designs a projector must know that limit objectively, not subjectively.



I found this page: http://www.*********.com/performanceratings.html - scroll down...




bye and sorry ;)
 

Registered · 63 Posts · Discussion Starter · #9
Quote:
Originally posted by Clarence
I understand what you're saying, but how do you accurately measure 0.2mm?


I think it'd be easier to work in the opposite direction...

For example, say my projector can nicely resolve 960 lines without overlap.

So 120mm / 960 lines = 0.125mm per line
but how can you count the lines on the screen?


The cathode ray beam spot size would be a known measurement...

let me search...
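Clarence's reverse calculation from the quote above is easy to sketch (again, the 120 mm raster height and the 960-line figure are his example numbers, not measured values):

```python
def line_pitch_mm(raster_height_mm: float, resolved_lines: int) -> float:
    """Vertical pitch per scanline when `resolved_lines` fill the raster."""
    return raster_height_mm / resolved_lines

print(line_pitch_mm(120.0, 960))  # -> 0.125 mm per line
```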
 

Registered · 50 Posts
Counting the lines is simple.


If the projector is scanning below its ability (lower resolution) you will see the lines on the screen with dark gaps between them - easy to measure.

You can also send a known image from a computer and measure the gap between known features in the image (for higher resolutions).


You shouldn't really get too worried about the exact minimum electron beam spot size (unless this is just for a debate/bet) - there are many factors which determine the best image overall, and spot size (which is affected by focus type, ES/EM; tube age; scanning frequency; contrast setting; etc.) is just one.


The main point for any debate on CRT resolution is that it is truly FLEXIBLE. (As is the aspect ratio, unlike digital).


Also note - the projector must be able to scan at a high enough horizontal frequency, or the higher resolution won't be available. Almost all CRTs worth over $25 will scan high enough for HDTV (1080i or 720p).
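As a rough check of that last point, the required horizontal scan frequency is just total lines per frame times frames per second. Using the standard SMPTE totals (750 total lines per frame for 720p, 1125 for 1080-line formats; 1080i60 draws 30 full frames per second):

```python
def hscan_khz(total_lines_per_frame: int, frames_per_second: float) -> float:
    """Horizontal scan frequency in kHz = total lines drawn per second / 1000."""
    return total_lines_per_frame * frames_per_second / 1000.0

print(hscan_khz(750, 60))   # 720p60  -> 45.0 kHz
print(hscan_khz(1125, 30))  # 1080i60 -> 33.75 kHz
```

Any CRT whose horizontal scan range covers roughly 34-45 kHz can therefore sync to both common HDTV formats, which matches the claim above.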
 

Registered · 317 Posts
Quote:
Originally posted by Clarence


So then you look for the sweet spot... if you feed your CRT too few horizontal lines (say 480), then you'll see scanline gaps. If you feed it too many (say 1000; unless you've got a nice 9" setup... Art can see scanlines at 1080p on his G90s), then you'll get overlap, giving a softer image.
Clarence, are there known sweet spots for CRTs, without straining my eyes and going through iterations of geometry/convergence settings? Any general guidelines by brand/model of CRT? (E.g. I have a Barco 1208/2.) I'm trying to save tweak time to spend on other HT issues.
 