Have any of the upcoming TVs themselves stated explicit support for 1080p48? IOW, is it soon to be considered a requirement to be listed on TVs, Blu-ray players, etc.?
Speaking of resolution effects.
If any of you get a chance to view the remastered (HD) version of the Star Trek (TOS), it's a little humorous.
They opted to redo the ships/planets/space in CGI. I really would love to see what the models looked like.
But get this: the make-up is hysterically bad. You can see the off-and-on shaving shadow on the actors' faces (depending on what time of day a scene was filmed). And everything just looks much more toyish than normal. Like most TV shows of the era (when did this tip to mostly digital---10 years ago?), it was shot on 35mm film, but they were never expecting it to be seen at higher than the SD quality of the day.
Really great to see in its own way. Big plus, and kind of a funny minus.
Is that really true? Wow. I had thought there were 1080p60 sets made.
Perhaps this is appropriate to bring up here.
When my TV has its motion handling turned up, the SOE (soap opera effect) is pronounced.
When I pause the video feed, the frozen picture I see has the same level of SOE. So it's not related to frame rate per se in this case; it's the motion handling / interpolative effects: the drawing of each image is altered, and the effect is visible even one frame at a time.
I'm not saying that boosted native frame rate can't cause SOE or an SOE-like effect. I'm just giving an observation.
No, I don't think that's quite right.
The soap opera effect isn't caused by fluid motion per se, though in the case of TVs it's often caused by motion processing. Before motion processing, it was the "shot on video" look of a soap opera compared to things shot on film. Motion per se isn't the culprit in TVs; it's a side effect of the motion processing. On a non-motion-processing CRT, if you look at a stationary background in a scene of a soap opera (or, today, at any scene on an SOE-riddled TV), it doesn't suddenly lose its stark look in comparison to the objects moving in the foreground.
The whole image is uniformly "SOE'd".
When I watch a movie on my TV and see it totally SOE'd up, if I hit pause on the feed, the SOE remains. It doesn't suddenly look like film again. If you can't believe me on that, then don't. Instead, go back to the CRT days, when video vs. film was still evident:
The stark look of the stationary items in a video-shot scene (like one from a soap opera) retains the same starkness as the moving items. They don't suddenly take on a film-like quality. Even the still stuff looks like video. As with a soap opera.
No, I'm saying that frame rate is not the cause of SOE, at least I can't see how given my observations.
AFAICT, there are two direct causes of SOE:
Motion interpolation (and other motion algorithms) does *something* to the images themselves to enhance their look as if they were shot on video. Even *if* the source feed of the movie Looper on HBO is put on pause (at my DVR), there's something going on at the TV with each of those duplicate, identical frames sent to it that keeps the picture consistently SOE'd. Otherwise things would be non-uniform between moving and stationary objects (in screen space).
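To illustrate the point: here's a minimal sketch of what an interpolation stage alone would do, assuming the simplest possible scheme (plain linear blending between consecutive frames; real TVs use motion-compensated interpolation and I'm not claiming this is any particular set's pipeline). Note that blending two *identical* frames returns the same frame unchanged, which is why a paused picture still looking SOE'd suggests other per-frame processing (sharpening, noise reduction, etc.) is in play alongside the interpolation:

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, t):
    """Synthesize an in-between frame at fraction t in [0, 1] by linear
    blending. A stand-in for a TV's far more sophisticated
    motion-compensated interpolation."""
    return ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

# A paused feed sends duplicate frames: interpolation alone changes nothing,
# so any SOE visible on the frozen picture must come from some other
# per-frame processing stage in the same pipeline.
frame = np.full((4, 4, 3), 128, dtype=np.uint8)  # hypothetical gray frame
mid = interpolate_frames(frame, frame, 0.5)
assert np.array_equal(mid, frame)
```

This is only a toy model, but it captures the logic of the observation above: frame-to-frame synthesis can't explain a uniformly "video-looking" still image by itself.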
So I suppose to boil it down, my guesses are:
Not convinced you're reading this right. He said "much more lifelike"; he never said "video-like" or anything akin to what we mean when we say SOE. Video [of yore---not the carefully mastered stuff of movies today] is different in that we get something people refer to as "too real", but I think that's only because conventional film is so "not real". The overly real effect of HFR is different from what we've been calling SOE.
I'm fairly convinced that all the calculations, audience reactions, and metrics are going to take some time to home in on what we consider inarguable reasons for things. HFR/theater, HFR/TV, 4K, pulse, etc., just haven't been around long enough IMO.