This has been gone over a few times. It comes down to this:
- Decoding from any format to LPCM has to happen somewhere. The decoding algorithm is specified, licensed, and should produce identical output on any machine that runs it. So, as far as sound quality goes, it should not matter whether the player or the receiver does the decode to LPCM (see the first sketch after this list).
- However, only the player has the additional information needed to mix extra content (secondary audio such as menu sounds or picture-in-picture commentary) into the stream as it decodes to LPCM. So if you want that content, you must decode in the player; that meta-info isn't sent to the receiver and is therefore lost if you decode there (second sketch below).
- There is a set of people who insist the jitter on the HDMI link is sufficient to make LPCM sound worse than the encoded bitstream. The claim is that when the receiver does the decoding there is less opportunity for jitter to muck things up, since the receiver gets the encoded packets and must decode them itself. I suppose in theory this is possible, but I don't entirely buy it. I figure that if there were enough jitter to mess up the LPCM, it would mess up the encoded stream too, and rather than worse sound you'd just get dropouts: the checksums on the encoded data (assuming there are checksums) wouldn't match and the receiver would throw those frames away (third sketch below). But I haven't studied how the lossless codecs are carried over HDMI closely enough to know for sure; it'd be interesting to hear from someone who has.
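To illustrate the first point, here's a minimal sketch of how you'd verify the "decoding is deterministic" claim: dump the decoded LPCM from two different boxes and compare hashes. The file names are placeholders, and this assumes both decoders follow the spec correctly.

```python
# Hypothetical check: decode the same track to LPCM on two machines and
# compare the raw PCM bytes. A lossless decode is deterministic, so the
# hashes should match. File names are placeholders for illustration.
import hashlib

def pcm_hash(path: str) -> str:
    """Return the SHA-256 of a raw PCM dump."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

if pcm_hash("decoded_on_player.pcm") == pcm_hash("decoded_on_receiver.pcm"):
    print("Bit-identical LPCM -- it doesn't matter which box decoded it")
else:
    print("Decoders disagree (that would point to a broken decoder, not to who decoded)")
```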
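For the second point, this is roughly what "mixing in extra content" means in practice: the player decodes both the main track and the secondary track to LPCM and sums them per sample before sending the result out. The sample values and the 0.5 gain are made up for illustration; the real mixing rules come from the disc's metadata, which is exactly the part the receiver never sees.

```python
# Sketch of secondary-audio mixing in the player. Values and gain are
# illustrative only; the actual mix levels come from disc metadata.
def mix(main: list[int], secondary: list[int], secondary_gain: float = 0.5) -> list[int]:
    out = []
    for m, s in zip(main, secondary):
        v = int(m + secondary_gain * s)
        # Clamp to the 16-bit signed range so the sum can't wrap around.
        out.append(max(-32768, min(32767, v)))
    return out

main_pcm = [1000, -2000, 3000, 4000]      # decoded main-feature samples
secondary_pcm = [500, 500, -500, -500]    # decoded secondary-audio samples
print(mix(main_pcm, secondary_pcm))       # [1250, -1750, 2750, 3750]
```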
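And for the third point, here's the dropout argument as code. This is not the actual HDMI transport (as I said, I haven't verified whether the bitstream frames carry checksums); it just assumes each encoded frame is protected by a CRC, in which case a bit flip makes the receiver discard the whole frame rather than play subtly "worse" sound.

```python
# Assumed model only: a per-frame CRC on the encoded bitstream. If jitter
# flips a bit, the CRC fails and the frame is dropped -- a dropout, not
# degraded audio.
import zlib

def deliver(frame: bytes, crc: int) -> bytes | None:
    """Return the frame if its CRC-32 matches, otherwise drop it."""
    if zlib.crc32(frame) == crc:
        return frame
    return None  # receiver mutes/skips the damaged frame

frame = b"encoded audio frame payload"
crc = zlib.crc32(frame)

corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]  # single bit flipped in transit
print(deliver(frame, crc) is not None)      # True  -- clean frame plays
print(deliver(corrupted, crc) is not None)  # False -- damaged frame is dropped
```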
Summary - your PS3 can decode just fine, so why not let it? Further, by letting the player decode you get the mixed-in content that you can't get if you decode in the receiver, so why decode there? Bitstreaming only has the possibility of delivering less information, and no possibility of better sound or more information; so unless you just like seeing the DD-HD or DTS-MA lights on your receiver, or your player can't decode one or the other to LPCM, I don't see a reason to send them to the receiver.