Last night I watched Casablanca again. It was the first time I had watched it using the ATI player on my NEC XG135LC, and I kept marveling at how wonderful it looked. Black level, shadow detail, overall detail: everything looked great.
But even with the wonderfully sharp transfer, I couldn't see any sign of edge enhancement (EE). It didn't have an edgy "video" look. As much as anything I've seen on video, it looked like film. I kept thinking "this is what it's all about!"
My question is, why does this particular transfer look like this and not others? MGM is not known for consistently excellent transfers (indeed, it's been severely criticized for some of its efforts). Other studios that have a better reputation for their transfers have done transfers that have engendered complaints about EE. This despite the fact that they will sometimes specifically deny the use of EE. An example was in the recent Home Theater Forum chat with Peter Staddon, in which he specifically stated that no EE was used on Die Hard With a Vengeance, and that problems people were seeing were in the source elements.
So what is the cause of EE? It can't be studio specific, because I've seen EE AND a lack of it from the same studio. I don't know if it's machine specific either, because it's reasonable to assume that studios use the same transfer/compression devices on all their films (or maybe it isn't!).
Is it operator specific? Possibly. Are different films from the same studio transferred by different operators? What about Peter Staddon specifically stating no EE was employed?
Is it film-specific? Maybe some films are just more amenable to transfer and compression without EE than others? It truly is an interesting question.
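For what it's worth, the mechanism behind the artifact itself is well understood, even if the question of *who or what applies it* isn't: EE is essentially an unsharp mask, where a blurred copy of the picture is subtracted from the original and the difference is added back. The overshoot that produces past a sharp edge is the halo people complain about. Here's a minimal 1-D sketch of that idea (my own toy illustration, not anyone's actual telecine or encoder filter, which would be 2-D and far more sophisticated):

```python
# Toy illustration of edge enhancement as an unsharp mask:
# sharpen a step edge and observe the overshoot ("halo") on each side.

def box_blur(signal, radius=2):
    """Simple box blur; stands in for the low-pass stage of an unsharp mask."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - radius)
        hi = min(len(signal), i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, amount=1.0, radius=2):
    """sharpened = original + amount * (original - blurred)."""
    blurred = box_blur(signal, radius)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

# A hard step edge, like the boundary between a dark suit and a bright wall
# (16 and 235 are the standard 8-bit video black and white levels).
edge = [16] * 10 + [235] * 10
sharpened = unsharp_mask(edge, amount=1.5)

# Values dip below 16 just before the edge and shoot above 235 just after
# it; on screen, that undershoot/overshoot is the visible EE halo.
print(min(sharpened) < 16, max(sharpened) > 235)
```

The `amount` knob is the interesting part for this discussion: at zero there's no EE at all, and it takes very little to make halos visible on a big screen, which is why an operator-level (or per-title preset) explanation seems plausible to me.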
[This message has been edited by RobertR (edited 08-06-2001).]