"Blacker than black" and "enhanced" black are two completely different things. Blacker than black refers to signals that are encoded lower than nominal black on the disc. For example, a PLUGE pattern has a blacker than black bar to help you find the proper mapping between disc black level and monitor black level. This bar *should* be reproduced by the player no matter what the "enhanced black" setting is. Unfortunately, in some cases players won't reproduce blacker than black signals if the "enhanced" black switch is turned on (which makes no sense at all), and some players won't reproduce blacker than black no matter what (which makes even less sense).
As Stacey has pointed out, many films actually have blacker than black information encoded in the dark areas of the image. If those blacker than black areas are clipped too hard, the result can look artificial: shadow detail gets crushed and dark gradients turn blotchy instead of shading off smoothly, especially on a CRT monitor.
"Enhanced" black is just a switch that changes the nominal black level output by the player from 7.5 IRE to 0 IRE. Some players have the control labelled "0 IRE" and "7.5 IRE," and some players have it labelled "darker" and "lighter". Whatever the labels are, it's easy to figure out which is 7.5 IRE and which is 0. Whichever one produces the brighter image when you switch between them is 7.5 IRE.
As mentioned more than once in this thread, you can choose 0 IRE or 7.5 IRE nominal black level, and it really shouldn't make any difference to the picture as long as you calibrate the monitor and then leave the switch alone from then on. I disagree with VE on the "dynamic range" argument. Yes, going from 0 to 7.5 IRE compresses the dynamic range, which would be an issue on analog recording and RF transmission equipment, but it's a complete non-issue for DVD. When you are in 7.5 IRE mode, it just changes the bias voltages applied at the end of the D/A process. The bits stay the same all the way up to and into the DAC. And the cables that move the image from the DVD player to the TV have so much bandwidth that the "dynamic range compression" is immaterial.
If you have blacker than black data on the disc, and "enhanced" black is turned ON, the blacker than black areas will dip below 0 IRE, down to maybe -2 or -3 IRE at most. If you have "enhanced" black turned OFF, the blacker than black areas will dip below 7.5 IRE, down to 5.5 IRE perhaps, but not all the way down to 0.
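To see where those numbers come from, here's a quick back-of-the-envelope sketch. It assumes standard 8-bit video levels (code 16 = nominal black, code 235 = nominal white) and a blacker-than-black code of 11, a few steps below black; the exact codes on a given disc will vary, so treat this as an illustration rather than a spec:

```python
# Sketch: where 8-bit video codes land in IRE under each setup level.
# Assumes standard 8-bit studio range: code 16 = black, code 235 = white.

def code_to_ire(code, setup=0.0):
    """Map an 8-bit luma code to IRE, given the player's black setup level."""
    return setup + (code - 16) * (100.0 - setup) / (235 - 16)

# Nominal black:
print(code_to_ire(16, setup=0.0))   # 0.0 IRE ("enhanced" black on)
print(code_to_ire(16, setup=7.5))   # 7.5 IRE ("enhanced" black off)

# A hypothetical blacker-than-black code, five steps below black:
print(round(code_to_ire(11, setup=0.0), 2))   # -2.28 IRE
print(round(code_to_ire(11, setup=7.5), 2))   # 5.39 IRE
```

Notice the same disc data lands around -2 IRE in one mode and around 5.5 IRE in the other, which is exactly the behavior described above: the bits don't change, only the analog mapping does.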
So if the S&V disc says to turn "blacker than black" ON, that makes perfect sense. Most DVD players don't even have this control, but certainly if yours does you should turn it on when you're calibrating. And then when you're done calibrating, you should leave it on, unless it's a Toshiba 6200 or 9200. In that case, you should get a new player. Just kidding. :)
But "enhanced" black or "darker" or "0 IRE" or whatever you want to call it is pretty much a non-issue, except for matching the DVD player's black level to the rest of your equipment.
Hmmm... there is one issue that could make a difference. If the signal is going to go through another analog to digital conversion after leaving the DVD player, then you *could* potentially get slightly better results with the player set for 0 IRE. For example if you have a digital display (plasma, LCD, DLP, etc.), or if you are outputting an interlaced signal into a deinterlacer (iScan, Omega One, Faroudja), or outputting an interlaced signal into a TV with an internal deinterlacer (pretty much all modern HDTVs), then the expanded range you get with 0 IRE might make a tiny difference. Whether it's a visible difference is debatable.
The thing is, any time the signal has to be redigitized, the A/D converter has to be calibrated to a particular range of voltages. It's going to map that range internally into (most likely) an 8-bit value. And probably the range of voltages the designers will choose will be from slightly under 0 IRE to slightly over 100 IRE. Any voltage under or over that range will be clipped and information will be lost.
So if your DVD player is only outputting 7.5 IRE (5.5 IRE with blacker than black) to 100 IRE, then some of the lower end of the possible range will be wasted. And since the signal is going to be quantized to an 8-bit number, that could make a difference. On the other hand, if you set the player to 0 IRE, and its blacker than black content in fact dips down to -3 IRE, the digitizer next in the chain might just clip off all the lowest blacks.
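Here's a rough illustration of that trade-off. The ADC range here (-2 to 102 IRE across 256 codes) is a made-up but plausible design choice, not taken from any particular product:

```python
# Rough illustration of how much of an 8-bit A/D's range a signal uses.
# Hypothetical ADC calibrated from -2 IRE to 102 IRE across 256 codes.

ADC_LOW, ADC_HIGH = -2.0, 102.0
CODES = 256

def codes_used(sig_low, sig_high):
    """Approximate number of ADC codes a signal range occupies."""
    # Anything outside the ADC's calibrated range is simply clipped.
    sig_low = max(sig_low, ADC_LOW)
    sig_high = min(sig_high, ADC_HIGH)
    return round((sig_high - sig_low) / (ADC_HIGH - ADC_LOW) * CODES)

# 0 IRE setup: signal spans roughly -3 to 100 IRE; the bottom gets clipped
print(codes_used(-3.0, 100.0))  # 251 codes, some below-black lost to clipping

# 7.5 IRE setup: signal spans roughly 5.5 to 100 IRE; nothing clips,
# but part of the ADC's range sits unused below the signal
print(codes_used(5.5, 100.0))   # 233 codes

# And a 120 IRE white peak into this ADC is simply clipped at 102.
```

Either way you're within a handful of codes out of 256, which is why the quantization difference is likely invisible while the clipping very much isn't.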
And on the high range, some DVD players have their white level set at 108 or higher, and then they go higher than that when there are peaks on the disc. Stacey has seen discs where the whitest parts of the image are 120 IRE, on a good player. On a player that already has a white level that is nominally at 108, the end result would be a lot higher than any digitizing system would be designed to handle.
I've seen this personally on my Sony VW10HT LCD projector. Feed it a signal that is too low or too high, and the lowest and highest parts get clipped. Similarly, I have an iScan Pro, and it will clip anything that is a little below 0 and anything much higher than about 115-120.
The answer to all of this is to carefully check (with test patterns) to make sure none of the image (including blacker than black) is being clipped, and if necessary use the image controls on the player (if any) to adjust the black level and/or white level to avoid clipping.
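If you want to formalize what that test-pattern check is looking for, here's a small sketch. The idea: on a linear ramp pattern, clipping shows up as a pile-up of samples stuck at one level. The ramp, the chain's clip points, and the thresholds below are all invented for illustration; in practice you'd eyeball this on a waveform monitor or the screen itself:

```python
# Sketch of a clipping check: measured IRE samples from a ramp test
# pattern should spread out evenly; clipping piles them up at one level.

def looks_clipped(samples, level, tolerance=0.25, max_share=0.02):
    """Heuristic: flag clipping if an unusually large fraction of
    samples sits at (or within tolerance of) a single level."""
    near = sum(1 for s in samples if abs(s - level) <= tolerance)
    return near / len(samples) > max_share

# A linear ramp from -3 to 120 IRE, hard-clipped by the chain at 0 and 110
chain_min, chain_max = 0.0, 110.0
ramp = [min(max(-3.0 + i * 0.5, chain_min), chain_max) for i in range(247)]

print(looks_clipped(ramp, chain_min))  # True: blacker-than-black is lost
print(looks_clipped(ramp, chain_max))  # True: peak whites are lost
print(looks_clipped(ramp, 50.0))       # False: midrange is intact
```

Once you know which end is clipping, the fix is the one described above: nudge the player's black level and/or white level controls until the plateaus disappear.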
If you have a progressive player going straight into a CRT television, whether direct-view or projector, it's highly unlikely that any extra digitizing is going on. The signal path is typically analog from the back of the DVD player all the way to the electron guns.
In the end, I very much doubt that even with the D/A conversion the quantization difference between 7.5 IRE and 0 IRE black level is visible. The clipping issue, though, can be very visible. If you have a digital display or a deinterlacer in the signal path, it's important to check for clipping and do something to eliminate it if it's there.