Originally Posted by madshi
Last time I checked, the Radiance was still using a Gennum/Sigma chipset, which didn't support Anime cadences at the time. But that was quite a while ago. Could be they are using a chip now with Anime cadence support. If not (and the Lumagen website still only claims support for typical movie cadences), you're probably seeing video mode deinterlacing for many Anime DVDs, which is "ok", but not as good as it could be.
Well, it does better than anything I've seen on the PC side, at least anything real-time. AviSynth with offline processing can do pretty well, but you have to pick the correct setup ahead of time, and even then I'm not sure it's actually better; plus you have to recompress the video in the process.
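For anyone following along, here's a rough sketch of the idea a film-cadence detector is built on (this is just the concept, not madshi's, AviSynth's, or Lumagen's actual algorithm, and the function names are my own): 24fps film carried in a 60i stream repeats fields in a 3:2 pattern, and the detector looks for fields that exactly match a field two positions back.

```python
# Toy model of 3:2 pulldown and the repeat-field signature a cadence
# detector hunts for. Not any real product's algorithm -- just the idea.

def telecine_32(frames):
    """Expand progressive film frames into a 3:2 interlaced field
    sequence: every frame emits its top ("T") and bottom ("B") fields,
    and every other frame repeats its top field once."""
    fields = []
    for i, f in enumerate(frames):
        fields += [("T", f), ("B", f)]
        if i % 2 == 0:
            fields.append(("T", f))  # the repeated field
    return fields

def detect_repeats(fields):
    """Return positions where a field exactly repeats the field two
    slots earlier -- the telltale pattern a cadence detector locks onto
    before reassembling the original progressive frames."""
    return [i for i in range(2, len(fields))
            if fields[i] == fields[i - 2]]

seq = telecine_32(["A", "B", "C", "D"])
print(len(seq))              # 10 fields for 4 frames (3+2+3+2)
print(detect_repeats(seq))   # [2, 7]
```

Anime complicates this because studios animate "on twos" or "on threes" and mix cadences within a scene, so the repeat pattern isn't a steady 3:2, and a detector that only knows the movie cadence falls back to video-mode deinterlacing.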
In any case, of course an external VP is quite comfortable, less pain than an HTPC. But then, you also lose playback flexibility.
I'm not sure what flexibility I'm giving up; I haven't run into any limitations yet. In fact, IMO I get more flexibility this way (I'll elaborate below).
And what's getting on my nerves is that the newest Radiance models now have a 4K HDMI output, but only up to 30fps (and no word on bit depth), and they don't even have 4K HDMI inputs yet. So basically, if you buy a Radiance today, you already know you'll have to replace or upgrade it again soon. And the upgrades are not going to be cheap.
FWIW, I don't think the "real" 4K Radiances are out yet. I keep hearing rumors; I expect there to be an update to the XE with 4K input and output. I'm not sure what the point of the 204x series was, actually. I'm guessing the 4K output was "free" to add to that architecture, but they clearly don't have the FPGA/video-processing horsepower for 4K input yet. Besides, as Nev mentions, they came out before HDMI 2.0 was released, so they couldn't support 4Kp60.
I once had a Radiance myself, but I sold it because it had only HDMI 1.1 ports (no lossless audio support) and upgrading it would have been too expensive. I suppose there'll be a new Radiance with 4K input next year. Then one with 4Kp60 input/output the year after. Then, a year later, 4Kp60 with 3D support. And a year after that, 4Kp60 with 3D and DeepColor.
I expect there will be a 4K Radiance XE. Maybe they'll announce it at CEDIA this week, maybe at CES (just guessing), and I'd expect it to support the full set of HDMI 2.0 features.
No thanks. I'd much rather upgrade a budget GPU PCIe card once a year, which is really cheap and always comes with the latest HDMI version.
That's great and all, but that's not really a "fair" comparison, though I'm sorry I derailed this thread into a PC vs. external video processor debate. The point of my post was more to elicit a response from the OP about which settings they tweaked in VLC, since out of the box it's just as terrible as I remember it being.
It is my aim to simplify life for HTPC users, at least in the long run. There's really no reason why deinterlacing should be so painful to get right with an HTPC. I should be able to fix that by doing all the deinterlacing work with my own algorithms, and I believe I can do better than some of the algorithms used in hardware video-processing chips / the Radiance. But in the end I'm only a single developer and it's only a hobby project, so it's going to take time...
Don't get me wrong: I think what you're doing is great, and you've done some amazing things, but in the end you're very limited in what you can do with a custom video renderer on the PC. And really, this is one of my biggest problems with PCs for playback. madVR is awesome, but it's a custom video renderer that needs application support to be used, and that seriously limits things.
For example, I can measure my display and generate a 3D LUT for madVR to use (I've played with this), which gives perfectly calibrated video output. The problem is that it only works in apps that can use madVR. If I go to watch TV, I'm not getting a calibrated image. If I watch a Blu-ray disc (I'm not going to waste my time ripping Netflix rental discs just so I can watch them), that's not calibrated either. So my ripped discs can look great, but everything else doesn't get the benefits.
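To illustrate what a 3D LUT actually does at playback time, here's a minimal sketch: a lattice (17 points per axis is a common size; madVR's internals and these function names are not from the source, just my illustration) maps measured source RGB to corrected RGB, and values between lattice points are trilinearly interpolated.

```python
# Generic sketch of 3D LUT color correction: a 17x17x17 lattice maps
# source RGB to calibrated RGB; in-between colors are trilinearly
# interpolated. Not madVR's actual code, just the standard technique.

def identity_lut(size=17):
    """Build an identity LUT: lut[r][g][b] -> (r, g, b), all in 0..1.
    A real LUT would hold measured/corrected values instead."""
    step = 1.0 / (size - 1)
    return [[[(r * step, g * step, b * step)
              for b in range(size)]
             for g in range(size)]
            for r in range(size)]

def apply_lut(lut, r, g, b):
    """Look up (r, g, b) in 0..1 with trilinear interpolation."""
    size = len(lut)
    def locate(x):
        x = min(max(x, 0.0), 1.0) * (size - 1)
        i = min(int(x), size - 2)   # lower lattice index
        return i, x - i             # index and fractional offset
    ri, rf = locate(r)
    gi, gf = locate(g)
    bi, bf = locate(b)
    out = [0.0, 0.0, 0.0]
    # Blend the 8 surrounding lattice points by their weights.
    for dr, wr in ((0, 1 - rf), (1, rf)):
        for dg, wg in ((0, 1 - gf), (1, gf)):
            for db, wb in ((0, 1 - bf), (1, bf)):
                c = lut[ri + dr][gi + dg][bi + db]
                w = wr * wg * wb
                for k in range(3):
                    out[k] += w * c[k]
    return tuple(out)

lut = identity_lut()
print(apply_lut(lut, 0.5, 0.25, 0.8))  # identity LUT returns the input
```

The point of the complaint above is that this correction has to run somewhere in the playback chain; madVR runs it per-pixel on the GPU, but only inside applications that host madVR, whereas the Radiance applies it to every HDMI input.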
I've got a constant-height setup in my HT, so I need to be able to horizontally squeeze non-scope content and vertically stretch scope content. Well, that doesn't work (last I tried) in the commercial BD player software. madVR can help (I think) in the software that supports it, but again, that's only a portion of my content.
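For readers unfamiliar with constant-height setups, the scaling math looks roughly like this (a sketch assuming a fixed 1.33x horizontal anamorphic lens, which is my assumption about the setup; the function name is hypothetical): scope content letterboxed in a 16:9 frame gets stretched vertically to fill the panel and the lens restores the width, while 16:9 content gets squeezed horizontally so the lens doesn't fatten it.

```python
# Sketch of constant-height geometry with a fixed 1.33x anamorphic lens.
# Assumption: 1920x1080 panel, lens expands the image horizontally 4/3x.

LENS_STRETCH = 4 / 3  # typical horizontal expansion of an anamorphic lens

def pre_lens_size(active_ar, panel_w=1920, panel_h=1080):
    """Return the (width, height) the active image should be scaled to
    on the panel, before the lens, for a given active aspect ratio."""
    if active_ar > panel_w / panel_h:
        # Scope: stretch the letterboxed image vertically to fill the
        # panel; the lens widens it back to the correct geometry.
        return panel_w, panel_h
    else:
        # 16:9 (and narrower): squeeze horizontally so the lens's
        # expansion restores the correct shape (pillarboxed on screen).
        return round(panel_w / LENS_STRETCH), panel_h

print(pre_lens_size(2.39))    # (1920, 1080) -- scope fills the panel
print(pre_lens_size(16 / 9))  # (1440, 1080) -- 16:9 squeezed by 4/3
```

This is exactly the resizing the commercial BD players won't do, which is why it has to happen in madVR, the Radiance, or the projector itself.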
There's a theme here: the PC playback environment is fragmented. madVR is great when you can use it, but you can't always use it. That means you really can't take advantage of the CMS features, for example, since doing so would result in different video output for content from the same box, which makes calibration later down the chain near impossible.
In contrast, a Radiance can do everything for any source that's connected. I calibrate my projector using the Radiance's built-in test patterns (and if I had a less well-calibrated projector than my Planar 8150, I could use the 125-point auto CMS with ChromaPure), then calibrate each input (and each resolution, at that). I can connect any player, any device, and have a calibrated picture with full aspect-ratio controls and any other adjustments I want.
I've got an Xbox 360, a SageTV HD300, a Pioneer BDP-51FD, and an OpenELEC PC all connected (and I keep thinking about adding a Roku), all calibrated, and all with robust aspect-ratio controls. Yes, it's expensive; no, it's not for everyone (not even for most); but yes, it is upgradable.
And if tomorrow something super awesome is announced at CEDIA, or some killer app is released that runs on a PC, or some great new projector comes out, I can integrate it into my system without having to worry about whether the display is (or can be) well calibrated out of the box, whether the aspect-ratio controls work at 1080p or with 3D enabled, whether the new player device/software can be calibrated, or whether madVR works with it. That's what I call flexibility. I've passed on projectors before (almost got a JVC RS2 at one point, and a TruVue Vango at another) for lack of CMS or aspect-ratio controls, but that's no longer an issue.
Originally Posted by Nevcairiel
They may not go for the software update, if they are really greedy though.
It's my understanding that HDMI 2.0 will require new HDMI chips, and that it can't be done as a software update.