The problem is, who knows whether what you can achieve with your DD encoder @ 448 kbps matches what a mastering house achieves with its DD encoder @ 448 kbps. Normally you would expect no difference if the number is the same, but that is entirely up in the air unless you are using the very same encoder hardware and software... and how could you guarantee that?
Hence, I would just use the highest available bitrate for any lossless-to-lossy conversion. If you are dropping the lossless soundtrack down to DVD-quality audio, then you are most certainly changing the sound of the material (and it is hard to gauge whether that change is identical to the one made when the DVD soundtrack was produced for the DVD release). If you are deliberately changing the sound, that breaks the test altogether.
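If you do end up transcoding, something like ffmpeg can handle the lossless-to-lossy step at the encoder's top rate. A minimal sketch, assuming a hypothetical input file name and ffmpeg's built-in AC-3 encoder (640 kbps is the maximum the AC-3 spec allows; 448 kbps is the usual DVD ceiling):

```shell
# Encode the first audio stream to AC-3 (Dolby Digital) at 640 kbps,
# the spec's maximum, rather than the 448 kbps DVD-era rate.
# "movie_truehd.mkv" is a placeholder for your demuxed source.
ffmpeg -i movie_truehd.mkv -map 0:a:0 -c:a ac3 -b:a 640k movie_dd640.ac3
```

That still won't match a mastering house's hardware encoder bit-for-bit, of course; it just minimizes how much the lossy step changes the sound.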
Why not just use the lossy tracks that are already on the Blu-ray disc as the reference? Those are the ones that are arguably "transparent" in the first place, and they won't carry the Cinavia watermark (as I understand it, this thread is saying Cinavia is only applied to the lossless track, but maybe I am wrong).