Originally Posted by scowl
All the more reason to have the widest range of values to digitally sample these analog signals with.
Yes - assuming that clipping stuff outside the display range won't adversely impact stuff within the display range - but that isn't always the case.
I always thought black shouldn't be visible on a properly set-up display. What should black look like? :confused:
I agree - black should be just that: nothing should appear blacker (hence PLUGE uses sub-black to help you align black level correctly). However, clipping stuff you can't see can create artefacts that you can see.
Are you saying that analog equipment will have problems if a signal from a digital device is at zero volts instead of something slightly higher?
What I'm saying is that if you agree on 0V as black level, and have an analogue signal generated in the real world that has passed through real cables, DAs, EQs etc., then you wouldn't be surprised to see some transient excursions below 0V in a 'real world' signal, particularly on sharp edge transitions.
Clipping these excursions rather than preserving them will introduce further artefacts. Thinking about the signal in the frequency domain, clipping adds square-wave-like frequency components - components that wouldn't otherwise be in the signal - and these become particularly noticeable when the clipped signal is converted back to analogue (or filtered in the digital domain in a DVE or similar).
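A quick sketch of that frequency-domain point, using a pure sine as a stand-in for a video signal (all the numbers here are arbitrary, purely illustrative): hard-clip the sine and energy appears at harmonics that simply weren't there before.

```python
import math

# Toy demonstration: hard-clipping a pure sine adds harmonics that
# weren't in the original signal (the "square-wave frequency
# components" mentioned above). We measure the 3rd harmonic with a
# naive DFT before and after clipping. Signal length and clip level
# are arbitrary choices for this sketch.
N = 1024
sine = [math.sin(2 * math.pi * 5 * n / N) for n in range(N)]  # 5 cycles
clipped = [max(-0.7, min(0.7, s)) for s in sine]              # clip at +/-0.7

def dft_mag(x, k):
    """Magnitude of DFT bin k, normalised by N (naive, O(N) per bin)."""
    re = sum(x[n] * math.cos(2 * math.pi * k * n / N) for n in range(N))
    im = sum(x[n] * math.sin(2 * math.pi * k * n / N) for n in range(N))
    return math.hypot(re, im) / N

# Fundamental is in bin 5, so the 3rd harmonic is in bin 15:
print(dft_mag(sine, 15))     # essentially zero - a pure sine has no 3rd harmonic
print(dft_mag(clipped, 15))  # clearly non-zero after clipping
```

The same thing happens when you clip a video transient: the sharp corner you create is made of high-frequency components, and whatever filters the signal afterwards has to deal with them.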
Just as you allow headroom in an analogue audio signal when you digitise it (to avoid clipping and the square-wave distortion it creates on peaks), you sort of need to do the same with digital video.
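A minimal sketch of what that headroom looks like in practice. Digital video in the style of ITU-R BT.601 puts 8-bit black at code 16 and white at code 235, leaving codes below and above for excursions; the function here is my own illustrative mapping, not the standard's exact formula.

```python
# Sketch: 8-bit "studio range" quantisation in the BT.601 style,
# where black maps to code 16 and white to code 235. Codes below 16
# and above 235 give footroom/headroom, so transient under/overshoots
# are preserved instead of being clipped at black and white.
def quantise_601(v):
    """Map a normalised level (0.0 = black, 1.0 = white) to an 8-bit
    code, preserving excursions below black and above white."""
    code = round(16 + v * (235 - 16))
    # Clamp only at the absolute 8-bit limits, not at black/white:
    return max(0, min(255, code))

print(quantise_601(0.0))    # black -> 16
print(quantise_601(1.0))    # white -> 235
print(quantise_601(-0.05))  # a small undershoot survives as a sub-black code
```

Only a gross excursion hits the hard limits at 0 and 255, so the 'real world' wobbles around black and white pass through the digital domain intact.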
Even if your camera's black and white levels are set perfectly at the CCD or tube (this stuff dates back to broadcast tubed cameras), the analogue signal will be filtered as it passes through a cable or RF channel. This 'softens' sharp edges and creates transients that overshoot and undershoot.
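A toy illustration of that softening, assuming a simple truncated-sinc low-pass filter standing in for the cable/RF channel (the filter length and cutoff are arbitrary choices for this sketch): even a perfectly 'legal' 0-to-1 step comes out the other side with transients above white and below black.

```python
import math

# Sketch: low-pass filtering a sharp edge produces over/undershoot
# (ringing), so a clean 0..1 step no longer stays within 0..1.
def sinc(x):
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

# Ideal step: black then white
step = [0.0] * 64 + [1.0] * 64

# Truncated-sinc low-pass filter, cutoff at 1/4 of the sample rate
taps = [0.5 * sinc(0.5 * (k - 16)) for k in range(33)]

filtered = []
for n in range(len(step)):
    acc = 0.0
    for k, h in enumerate(taps):
        i = n - k + 16                 # centre the filter on sample n
        if 0 <= i < len(step):
            acc += h * step[i]
    filtered.append(acc)

print(min(filtered), max(filtered))   # dips below 0 and peaks above 1
```

So the excursions aren't a fault in the camera - they're what band-limiting does to any sharp transition, and they're exactly the part a hard clip at black and white would destroy.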
Say you have the output of an analogue remote OB arriving over an analogue microwave circuit, and you use a digital frame store to synchronise it to local syncs (synchronisers were one of the early applications of digital video in the mid-70s) before outputting a composite analogue signal for use elsewhere. You wouldn't be surprised to find that the source video signal wasn't a nice, clean, perfect 'nothing below black, nothing above white' signal.
If you clipped this analogue signal when digitising it, so that no information was preserved below standard black or above standard white level, you'd clip the transients (spikes, if you will, or imperfect edges), and in many cases this would introduce ringing when you converted back to analogue.
I have dim recollections of seeing this demo-ed in the 80s when I was learning about digital video.