Except that beyond a certain point, the increased bitrate might not produce a noticeable improvement.
JPEGs might prove a useful analogy here (though I'm no audio expert, so this doesn't necessarily hold). Most people can tell the difference between a JPEG saved at quality 40 and one saved at quality 80 (40 is more compressed, so more artifacts, etc.). Above 80 or 85, though, the improvements are hard to see, while the file size increases dramatically.
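The JPEG side of the analogy is easy to check yourself. Here's a rough sketch using Pillow (assuming you have it installed): save the same image at a few quality settings and compare the file sizes. The image itself is a made-up gradient, just so the encoder has something to compress.

```python
from io import BytesIO
from PIL import Image

# Hypothetical test image: a smooth gradient so the JPEG
# encoder has real content to work with.
img = Image.new("RGB", (256, 256))
img.putdata([(x, y, (x + y) % 256)
             for y in range(256) for x in range(256)])

# Encode the same image at three quality settings and record
# the size of each in-memory JPEG.
sizes = {}
for q in (40, 80, 95):
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=q)
    sizes[q] = buf.tell()

print(sizes)
```

Typically the jump from 40 to 80 costs far fewer bytes than what 95 adds on top, even though 95 is the step you're least likely to notice visually, which is the diminishing-returns point being made here.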
So, the question is whether DTS at 1.5 Mbps sits at a similar sweet spot: bitrates below it are noticeably worse, but bitrates above it might not sound any better, even if the bitrate is substantially increased.
Without knowing which is playing, can the average person really pick out the difference between DTS-HD MA, DTS-HD HR, the DTS-HD core alone, or even DTS over S/PDIF at 1.5 Mbps? Peterjcat raised an interesting point here.