The tricky part here is that you are fiddling with both ends of the signal chain, so you may be causing problems at one end that you are then trying to band-aid at the other end.
Normally for people with commercial players I'd recommend you keep the commercial player set at its factory defaults (except possibly for having to change 0 IRE vs 7.5 IRE for analog video output). The factory default settings for most modern DVD players really are pretty likely to produce the best signal that player is capable of -- particularly if you pick a "picture mode" on the player described as doing the LEAST to the image. This is quite unlike the case with TVs where the default factory settings are likely to be truly awful.
Then, trusting that your player is already doing the best it can, you now make ALL calibration adjustments at the display end.
What you are doing, on the other hand, is adjusting your CRT to show its full dynamic range and then separately adjusting your computer-based player to calibrate the signal from the DVD against that CRT range.
Just to give you an idea of what might go wrong here, suppose the 100% ("white") signal level on your CRT happens to be driving the CRT a bit too hard, so that you are getting beam focus "blooming" and perhaps even modest levels of geometry distortion as the CRT's power supply tries to keep up. Or perhaps the CRT will start to show color shift if you drive the phosphors that hard for very long.
Now when you play Avia test charts via your computer, you will see such problems and will turn down the top end of the computer player's output range so as to eliminate them -- effectively driving the CRT at less than the 100% you previously set up. But that means you have just compressed the dynamic range of the signal coming out of your player. That may in fact be the best path to image perfection, but another possibility is to adjust the top end of your CRT to a lower light output level and leave the player at its wider dynamic range, so that "white" from the player is a higher voltage but only drives the display at, say, 95% of its maximum light output.
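To make the trade-off concrete, here is a minimal Python sketch comparing the two approaches. The numbers are hypothetical, and it assumes a simple 2.2 gamma law that your actual CRT may not follow -- it's just to show how both routes arrive at the same peak light output by different means:

```python
GAMMA = 2.2            # assumed display gamma (your CRT may differ)
TARGET = 0.95          # we only want ~95% of the CRT's maximum light output

def light_output(signal, display_gain):
    """Relative light output for a 0..1 signal on a simple gamma-law display."""
    return display_gain * signal ** GAMMA

# Strategy A: compress the player's output range; display left at maximum.
# "White" leaves the player at a reduced level, so all the intermediate
# steps get squeezed into a narrower voltage range on the cable.
player_white_a = TARGET ** (1 / GAMMA)   # signal level yielding 95% light
print("A: white signal %.3f -> light %.3f"
      % (player_white_a, light_output(player_white_a, 1.0)))

# Strategy B: keep the player at full range; turn the display's top end down.
# "White" is still the highest voltage the player produces, but the CRT is
# only asked for 95% of its maximum light.
print("B: white signal 1.000 -> light %.3f" % light_output(1.0, TARGET))
```

Both strategies land at the same peak light level; the difference is that in strategy A every level on the cable is squeezed into a narrower voltage range, which is exactly the dynamic range compression described above.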
Since neither your player nor display is dependably calibrated to begin with, you might need to play around to find which combo works best.
But on most TVs, for example, you would be very badly served if you set up the TV to show "white" from the DVD as the maximum possible light output level.
A "safer" starting point would be to set the CRT such that the 100% signal level generates an image which is just beyond the point where you perceive it as "gray". I.e., "white" but only just barely so -- as opposed to maximal white. This would be the equivalent on a home TV of lowering Contrast with the intention of watching the TV in a dimmed room.
Ideally you would need a video signal scope and a light sensor to do both ends of this calibration. First making sure your computer-based player is sending the best signal it can, with proper linearity of the gray steps, and then adjusting your CRT to accurately reflect that signal as steps of light output.
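As a rough illustration of what "proper linearity of the gray steps" means in numbers, here is a quick sketch that prints the relative light output you would expect at each step of a gray ramp, assuming an idealized 2.2 gamma law (the actual target depends on the standard you calibrate to):

```python
GAMMA = 2.2   # assumed idealized gamma; real calibration targets vary
STEPS = 11    # a 0..100% ramp in 10% increments

for i in range(STEPS):
    signal = i / (STEPS - 1)
    print("step %3d%%  expected relative light %.4f"
          % (round(signal * 100), signal ** GAMMA))
```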
I don't know enough about the nature of your computer-based setup to know whether it has dependable defaults. But if it does, then you might also try leaving the player end at the default settings and doing all calibration adjustments at the display end, just as if you were using a commercial player.
Also note that if both your computer-based player and your display are equally happy with analog video signals at 0 IRE and 7.5 IRE black levels, then after you calibrate you will be unable to distinguish, on the basis of any test pattern or image, exactly which voltage you ended up using. The images will look identical once properly calibrated each way.
In particular, any given test pattern off a DVD can't know what voltage standard you are using on the analog video cable. So a test chart that identifies "Black" as "0", for example, doesn't necessarily mean that the signal for black on the analog video cable is 0 IRE (voltage). It could just as easily be 7.5 IRE (voltage).
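For a back-of-the-envelope sense of the voltages involved: in NTSC-style analog video, 100 IRE above blanking is about 714 mV, so 1 IRE is roughly 7.14 mV. A tiny sketch (the helper function is just for illustration):

```python
MV_PER_IRE = 714.3 / 100   # ~7.14 mV per IRE for NTSC-style analog video

def black_voltage_mv(setup_ire):
    # Voltage the player's output stage emits for "black" (hypothetical helper)
    return setup_ire * MV_PER_IRE

for setup in (0.0, 7.5):
    print("black with %.1f IRE setup: %.1f mV" % (setup, black_voltage_mv(setup)))

# Either way, the disc stores black as the same digital code; only the
# player's output stage decides which voltage that code becomes.
```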
So when you say you have set up your component connections to use 0 IRE, you can only know this for sure if your test image generator is driven by the 0 vs 7.5 IRE selection, and thus changes according to which voltage you select for output -- which won't be the case for any DVD calibration disc.
The only problem to watch out for is that not all players and not all displays work equally well with both voltages. The player or display might clip BTB data at one voltage and not the other for example, or the display might not have enough calibration range to be adjusted properly at one voltage when it works fine with the other.
The most important thing is to pick an output voltage standard that allows you to get the display into proper calibration at both ends of the gray scale -- both blacks and whites. If only one voltage level allows for that, then that's the one you've got to use, even if it means you lose BTB data, unless you can find a setting in your display that lets you use the other.
Having found a voltage that lets you calibrate blacks and whites, then you should also check that BTB data and Peak White data is getting through properly. If it isn't there may be some setting on either the player or display which enables it -- without forcing you to use a voltage level that won't calibrate.
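If you want a mental model of what that clipping looks like in the digital domain: standard 8-bit video puts reference black at level 16 and reference white at 235, so BTB lives in levels 1-15 and peak white headroom sits above 235. A toy sketch (the pipeline function is hypothetical, standing in for whichever stage in the player or display does the clipping):

```python
def clipped_pipeline(level):
    """A hypothetical player/display stage that clips to the 16..235 range."""
    return max(16, min(235, level))

# Levels 1-15 are blacker-than-black; levels above 235 are peak-white headroom.
for level in (4, 16, 100, 235, 250):
    out = clipped_pipeline(level)
    note = "clipped!" if out != level else "passed"
    print("in %3d -> out %3d  (%s)" % (level, out, note))
```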
The black levels test chart in the THX Optimizer on some commercial DVDs is OK for double-checking that BTB data is being passed, but it shouldn't be used for actually setting calibration levels, because it is not as dependably accurate as what you'll get off of Avia or DVE. On some commercial DVDs the THX images have been massaged as part of the final editing process that created the DVD, and thus inaccurate levels are introduced.