Some of that info was okay, but some of it is somewhat misleading in that it keeps leaning on the most extreme examples.
I agree with most of the general statements in those documents. Calibration certainly can make the image look more realistic if the color is FAR enough off in your TV's default modes and your "by eye calibration" is faulty enough, but of course only within the limits of how realistic the content you are watching is (and some content certainly is fairly realistic looking).
One article states that film looks the most realistic (and that may be true), but film has largely been abandoned; most movies have been shot digitally since around 2009, and very little film is used anymore.
If you put in a disc like Tree of Life, which is near-reference-level 35mm film for the REALISM factor, you will see realistic images and skin tones because of the neutrality of the natural D65 lighting and the great care taken to keep the skin tones realistic. The main causes of lost realism are how the cinematographers and others decided to shoot the lighting and color biases, the overall quality of the equipment and the skill of the people involved, and the director's intent or the cinematographer and editors trying to add an original flavor to the color. Although FILM does add to the realistic look, it certainly isn't the only factor and probably not even the main one. There are also issues like pixel fill, sharpness, false edges, ringing, and the COLOR of the LIGHTING in a SCENE, etc.

It's true that director's intent often affects it as well, but in most cases I am not happy with the EXTREMES these directors often attempt with color to add flavor (sometimes it works, certainly, but I think most directors are color blind and add too much yellow to movies). TOL is an example where the director was going for a purist, realistic look and only added flavor to the camera tones when he thought it was needed (which was rarely), and there are other examples beyond TOL that I won't bother naming. That is smart, good camera work: you limit the amount of color bias. Movies these days sometimes corrupt the entire visual experience in the name of director's intent; the color in some movies has become nearly unbearable to me.
Calibration doesn't always increase the punch or POP factor, but it certainly can with some devices; it just depends on the PRE-CAL versus POST-CAL state and on the presets versus the post-calibrated user or ISF modes. Punch is largely about contrast. Many calibrations will inevitably trade some contrast for more accurate color, since the more inaccurate modes often have higher contrast on a set while the most accurate modes have lower contrast (sometimes it's close). This is one of many reasons some people may not immediately like a calibrated image on some TVs compared to how they had it set before. Most people, once they have gotten used to a really well-calibrated image, will generally start preferring it.
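To make that contrast trade-off concrete, here is a minimal sketch in Python (the luminance numbers are made up purely for illustration, not measurements from any particular set) showing how the on/off contrast ratio is computed from a white and a black reading, and why a more accurate mode with a lower peak white can measure as less punchy even if the black level also improves a bit:

```python
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """On/off contrast ratio = peak white luminance / black level."""
    return white_nits / black_nits

# Hypothetical readings, purely for illustration:
vivid_preset = contrast_ratio(white_nits=380.0, black_nits=0.040)  # bright, inaccurate preset
calibrated = contrast_ratio(white_nits=120.0, black_nits=0.018)    # accurate post-cal mode

print(f"Vivid-style preset: {vivid_preset:,.0f}:1")  # ~9,500:1
print(f"Calibrated mode:    {calibrated:,.0f}:1")    # ~6,700:1
```

The calibrated mode can measure noticeably lower on that one number even while the color is far more accurate, which is exactly why a fresh calibration can look "flat" at first to someone used to the preset.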
The reason sets do not ship with a perfectly calibrated mode is NOT that they want them to look VIVID on the showroom floor. That makes little sense considering there are VIVID modes and more accurate modes, so they can have both. There are devices that come much closer to D65 and Rec 709 out of the box, and accuracy at a given price point has improved a lot compared to even 5 years ago. The problem is manufacturer variance, the cost involved, and how accurate they can even get it anyway. Even if they could get it as close to perfect as possible, regular TVs are not nearly perfect, so they cannot really claim a perfect calibration on an imperfect set. You need a human to work within those imperfections; the manufacturers would have to over-optimize the design process, which increases cost, and overly constrain their manufacturing processes, which are mostly (almost entirely, actually) outside the US. So there are a lot of reasons this problem exists. Most of today's sets have uneven saturation tracking, so even a set with a good gamut can still be partly off as far as Rec 709 goes. We are seeing pre-calibrated modes getting closer to D65 in lower-end equipment, but it is a slow progression.
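As a rough illustration of what "closer to D65" means in numbers, here is a small Python sketch (the measured white point below is hypothetical; the formulas are the standard CIE 1931 xy to CIE 1976 u'v' conversion) that computes how far a preset's white point sits from the D65 target of x = 0.3127, y = 0.3290. Calibrators typically look for a delta u'v' down in the low thousandths, though the exact tolerance quoted varies:

```python
import math

def xy_to_uv_prime(x: float, y: float) -> tuple[float, float]:
    """Convert CIE 1931 xy chromaticity to CIE 1976 u'v'."""
    denom = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / denom, 9.0 * y / denom

def delta_uv_prime(xy_a: tuple, xy_b: tuple) -> float:
    """Euclidean distance between two chromaticities in u'v' space."""
    ua, va = xy_to_uv_prime(*xy_a)
    ub, vb = xy_to_uv_prime(*xy_b)
    return math.hypot(ua - ub, va - vb)

D65 = (0.3127, 0.3290)           # Rec. 709 white point target
measured_white = (0.302, 0.340)  # hypothetical off-target preset white

print(f"delta u'v' from D65: {delta_uv_prime(measured_white, D65):.4f}")  # ~0.012
```

The same kind of check applies to the saturation tracking mentioned above: the intermediate (e.g., 25/50/75%) saturation points of each color should land near their Rec 709 targets, not just the 100% gamut points, and that is exactly where the unevenness shows up.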
Many cheaper sets lack proper post-manufacturing testing by the right experts; the same happens with projectors. We often see 1:1 pixel mapping issues, color and gamma errors, and things that should never have made it into the end product. It's all about money, or a lack of margins, just as with any product. TVs are often built from cross-brand parts and put together in an almost kit-like manner, so it's often a jumbled, messy job from which they try to squeeze profit in one of the toughest businesses in the world for a manufacturer to make money. They'd make a lot more manufacturing smartphones or something similar; TV manufacturing is just too competitive.
Some projectors, like the Sony HW30, actually have fairly accurate out-of-the-box modes, as do some high-end TVs.