Copy of my reply to a similar question elsewhere:
The outputs of the video decoder blocks on the SoCs are bit-accurate for H.264 and VC-1, meaning the decoded video quality at that point is exactly the same for all players.
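If you want to convince yourself of what "bit-accurate" means in practice, here's a rough sketch of how you could check it on a PC. It assumes ffmpeg is on your path, and "clip.h264" is just a stand-in name for a raw H.264 elementary stream; any spec-conformant decoder has to produce byte-identical frames for the same stream, so the hashes should match no matter which decoder produced them.

```python
# Sketch: decode a stream to raw YUV and hash the bytes. Identical hashes
# from two conformant H.264 decoders means their outputs are bit-accurate.
import hashlib
import subprocess

def decoded_hash(stream_path: str) -> str:
    # Decode to raw planar YUV on stdout and hash the entire output.
    proc = subprocess.run(
        ["ffmpeg", "-loglevel", "error", "-i", stream_path,
         "-f", "rawvideo", "-pix_fmt", "yuv420p", "-"],
        stdout=subprocess.PIPE, check=True)
    return hashlib.sha256(proc.stdout).hexdigest()

if __name__ == "__main__":
    print(decoded_hash("clip.h264"))  # hypothetical test clip
```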
So, the differences between players are in the post-processing of the video, such as scaling, deinterlacing, edge enhancement, noise reduction, color correction, etc. This is where the "art" comes in.
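To make that concrete, here's an illustrative sketch of the kind of chain I mean, a scaler followed by a simple unsharp-mask style edge enhancement. The filters here are my own toy choices, not any particular player's proprietary algorithms.

```python
# Toy post-processing chain: upscale, then edge enhance.
import numpy as np
from scipy.ndimage import zoom, gaussian_filter

def upscale(luma: np.ndarray, factor: float = 1.5) -> np.ndarray:
    # Cubic spline scaler; real players use proprietary polyphase scalers.
    return zoom(luma, factor, order=3)

def edge_enhance(luma: np.ndarray, amount: float = 0.5) -> np.ndarray:
    # Unsharp mask: add back a scaled high-pass of the image.
    blurred = gaussian_filter(luma, sigma=1.0)
    return np.clip(luma + amount * (luma - blurred), 0.0, 1.0)

frame = np.random.default_rng(1).random((480, 720))   # stand-in decoded luma plane
processed = edge_enhance(upscale(frame))               # where the "art" happens
```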
Some player manufacturers (and consumers) may also tweak various settings (such as brightness, contrast, hue, saturation, sharpness, edge enhancement, 2D/3D noise reduction, gamma, color conversion, etc.) to achieve the specific "look" they want for the video. The most accurate picture, although possibly not the most personally pleasing one, comes when all controls of that type are bypassed or zeroed out.
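As a toy example of why "bypassed or zeroed out" is the most accurate, here's a hypothetical picture-control stage. The parameter names and math are my own simplification, but the point is that with neutral settings it is an exact pass-through of the decoder's bit-accurate output.

```python
# Hypothetical picture-control stage; neutral settings are an identity.
import numpy as np

def picture_controls(frame: np.ndarray, brightness: float = 0.0,
                     contrast: float = 1.0, saturation: float = 1.0) -> np.ndarray:
    # frame: H x W x 3 YCbCr, floats in [0, 1]; luma in channel 0.
    out = frame.astype(np.float64).copy()
    out[..., 0] = np.clip(out[..., 0] * contrast + brightness, 0.0, 1.0)        # luma
    out[..., 1:] = np.clip((out[..., 1:] - 0.5) * saturation + 0.5, 0.0, 1.0)   # chroma
    return out

frame = np.random.default_rng(0).random((4, 4, 3))
assert np.allclose(picture_controls(frame), frame)  # neutral settings change nothing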
"Despite the video decoders outputting the same video quality can some players give a slightly better image due to internal circuitry, better power supply, etc. or is all of that pretty much irrelevant with digital?"
Once the power supply is of sufficient quality to reliably power the player, and a good PC board design is used, those factors are largely irrelevant with digital.
However, they could still affect the drive electronics, since those contain analog circuitry. As more of this analog front-end circuitry gets integrated onto the main SoC decoder, power supply and board quality become more important.
These things can also affect the analog audio and video outputs, however: the audio and video DACs need nice, clean power, since any power supply noise is coupled pretty much directly onto the analog outputs.
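A back-of-the-envelope example of why that matters, with the ripple level and coupling factor being purely assumed numbers for illustration:

```python
# Rough sketch: supply ripple coupling onto an audio DAC output limits SNR.
import numpy as np

fs = 48_000                                        # sample rate, Hz
t = np.arange(fs) / fs                             # one second of samples
signal = 1.0 * np.sin(2 * np.pi * 1_000 * t)       # 1 V peak, 1 kHz test tone
ripple = 0.01 * np.sin(2 * np.pi * 100 * t)        # 10 mV of 100 Hz supply ripple
coupling = 0.1                                     # assume 10% of ripple reaches the output

noise = coupling * ripple
snr_db = 10 * np.log10(np.mean(signal**2) / np.mean(noise**2))
print(f"SNR limited to about {snr_db:.0f} dB by supply ripple alone")
```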
The quality of the HDMI output is important, as it could be considered analog due to the frequencies and levels involved. We have seen differences between HDMI outputs in their ability to drive various cables, cable lengths, and some HDMI switchers without introducing sparkles in the video.
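A bit of rough arithmetic shows why even a tiny bit-error rate on a marginal HDMI link shows up as visible sparkles; the BER value here is just an assumed figure for illustration.

```python
# Rough sketch: bit errors per second on a 1080p60 HDMI (TMDS) link.
pixel_clock = 148.5e6            # 1080p60 pixel clock, Hz
bits_per_pixel = 3 * 10          # three TMDS channels, 10 bits per channel (8b/10b)
link_bits_per_second = pixel_clock * bits_per_pixel

ber = 1e-9                       # assumed bit-error rate on a marginal cable
errors_per_second = link_bits_per_second * ber
print(f"{errors_per_second:.0f} bit errors per second -> visible sparkles")
```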