Originally Posted by Livin
If you call not reading over 420 posts lazy, sure... I just call it having a life.
Hi, Livin. It seems you have a healthy dose of skepticism about the Darblet. Nothing wrong with that. I'm sure there are others who share it (myself included). However, I would hope it would be possible to want to see some real-world results and have your questions answered, without becoming confrontational or abrasive.
There are many pages with many posts of irrelevant chit-chat within them. I scanned the thread and read many of the posts - including the reviews I did find. The reviews are very much "it looked better" - yeah, that's useful.
I have to disagree with you there. I have read the 420 posts, and my sense was completely different from yours. What I saw were fairly detailed comments on the specific ways in which it looked better (or worse), depending on the source material and the output displays, as well as the ranges of settings that impacted those results.
Does your $20 card have the ability to perform USM-type processing on 8 MB of image data in real time? If so, then you may be correct.
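For anyone unfamiliar with USM (unsharp masking): the idea is to subtract a blurred copy of the image from the original and add the difference back, which exaggerates local contrast at edges. Here's a minimal 1-D sketch in Python; the box blur, radius, and amount are illustrative choices of mine, not anything from the Darbee process:

```python
import numpy as np

def box_blur(signal, radius=2):
    """Simple box blur: convolve with a normalized flat kernel."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(signal, kernel, mode="same")

def unsharp_mask(signal, radius=2, amount=1.0):
    """Classic USM: add back the difference between signal and blurred copy."""
    blurred = box_blur(signal, radius)
    return signal + amount * (signal - blurred)

# A hard edge: flat dark region jumping to a flat bright region.
edge = np.array([0.0] * 8 + [1.0] * 8)
sharpened = unsharp_mask(edge, radius=2, amount=1.0)
```

Note that the sharpened edge overshoots above 1.0 and undershoots below 0.0 on either side of the transition - that halo/ringing is exactly the artifact that naive sharpeners introduce and that a more careful process has to suppress.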
It's not impossible to perform DarbeeVision-type processing on video data using PCs. Non-real-time render farms have been doing that sort of work on some films for years. The question is what you can do in <17 ms.
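The <17 ms figure is just the frame period at 60 Hz - the arithmetic below is mine, not anything from DarbeeVision, but it shows where the budget comes from:

```python
# Back-of-envelope: the real-time processing budget per frame at 60 Hz.
# All processing for one frame must finish before the next frame arrives.
frames_per_second = 60
budget_ms = 1000.0 / frames_per_second  # time available per frame, in ms

print(f"Per-frame budget at {frames_per_second} fps: {budget_ms:.2f} ms")
```

That works out to about 16.7 ms per frame, which is why "<17 ms" is the bar any real-time 60 Hz processor has to clear.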
Anyone who thinks patents make something good - or better than a similar process or device - is obviously NOT an engineer or scientist.
True. But I suspect the point that RonF was trying to make was that the DVP was not just another "sharpener", similar to those that came before it, but had some unique capabilities that set it apart, and warranted Patent protection.
And that "Whitepaper" is an advertisement, NOT a whitepaper. An engineer would laugh (I did) reading it. I know what a whitepaper is, and that is NOT EVEN CLOSE.
I have to agree with you that the WP had a distinctive advertising flavor to it. But I have to cut Paul a bit of slack, because it's not all that easy to try and describe the results of an imaging process in a way that folks can comprehend. He did wax lyrical a bit though.
Looking at every post on the Darbee site and in this thread - and since Darbee has not published ANY TECHNICAL DATA on the process, or even a layman's-terms explanation to dispute my own conclusions
Did you read the Patent filing? There's a significant amount of tech detail there.
- I'm concluding that a $20 video card from NVidia or ATI can do exactly what this does, and likely MUCH more such as different types of deinterlacing, denoise, etc.
Well, as far as the "much more", it's well known that it can do all that. It's even possible that using the GPUs on some video cards, the folks at DarbeeVision could come up with a DirectShow filter that incorporated their processing. I just don't know if it could be done in real-time. [see Addendum below.]
Maybe not, I don't claim to know the breakdown of withs/withouts... I'm one with, and I know several thousand (from several forums) who do. Actually the comparison of a video card to the Darblet is exactly accurate... a full PC (which can still be had for less $ than the Darblet) can do 10,000x more than the Darblet... the Darblet is actually LESS CAPABLE (features/functions) than a video card - the Darblet is only a video processor.
You're missing the point that not all folks have HTPCs; in fact, most don't. They have a collection of independent outboard components with interconnects. The DVP-5000 module fits well into those configurations, while an HTPC would not. People can easily add a DVP-5000 to their system; the same cannot be said of HTPCs.
Adding processing on top of processing is not a challenge at all... you've provided ZERO data on what your device does to the frames, contrast, etc.
Again, check the Patent filing.
I also love how the ONLY professional review you post on your forum does not even exist... DVn Wide Screen Review Feb 2012_gray.pdf ... I did a search for it on the web and it does not exist anywhere.
It might be worth looking again. The file does exist, because I just downloaded it. It's really more of a pre-review, from the New Equipment section of the February WSR (a CES report). However, Gary Reber has been provided with a test unit (about 4 weeks ago), and I expect we'll see a full review in WSR in the coming months.
I have no idea what your background is other than what is posted on LinkedIn... but none of your posted work experience points to, or even hints at, display technology.
Perhaps that's because Larry is the COO, not the CTO. He's taking the time to communicate with members of the AVS Forum, gather their suggestions and complaints, and provide as much information and as many answers as he can. I'd say he's doing a good job of it.
With all of this and the huge lack of evidence that this device is anything but a different incarnation of what has existed for many years in cheap video cards...
?? If what you're referring to are peaking circuits that boost HF content, then you are incorrect.
- I advise people looking for "better PQ" to just get a cheap HTPC and tweak to your heart's desire.
With all due respect, that kind of advice is not very valuable. Beyond the many issues inherent in HTPCs that could cause a lot of grief for those unfamiliar with them, "tweaking to their heart's desire" is not a very effective way to spend one's time. You indicated above that you "have a life" and no time for reading hundreds of comments. Fair enough. But that's completely at odds with your recommendation to spend a lot of time tweaking an HTPC. Many (most?) folks would rather just plug something in and dial in the best setting.
Paul has spent literally years testing various algorithms and methodologies, in numerous combinations, to come up with a solution that a) provides enhancement capabilities while minimizing negative artifacts (a big win, unique to the Darbee process), and b) uses lightweight carefully-tuned algorithms that are implementable economically in existing real-time silicon. Even if someone stumbled across some tweaks with their HTPC and ffdshow to provide somewhat similar results, the number of hours required would greatly outweigh the costs of the stand-alone processor from DarbeeVision. And my guess is that while similar sharpening results may be obtainable, they would not be without increased artifacting (that the DVn avoids) that would negate the value of the increased detail.
I'll conclude by saying I'm open to being wrong... but I've seen no evidence to make it so.
That's cool, but why put yourself in that position in the first place? It's easy enough to ask questions and seek clarification without making bold, evidence-free proclamations and having to eat crow later.
If someone wants to provide a sample I'll do a side-by-side comparison of the Darblet vs ATI/AMD and NVidia... on a 65" 1080p screen.
Or, you could just order a unit, try it, and then return it for a refund if your suspicions proved to be correct. And then provide some solid information based on real testing, vs. uninformed speculation. Just a thought.
~~ Addendum ~~
I checked the DarbeeVision site, and found that the Darbee transform algorithms WILL run on some NVidia GPUs:
DVP™ processing on a GPU
DARBEE™ Visual Presence™ - GPU SOFTWARE
GPU firmware for Nvidia CUDA enabled GPUs.
Performance = "Knobless," automatic 1080iHD/60 processing in real time, 1080p/24 in real time, 1080pHD/60 gracefully dropping frames.
Note, however, that it can't maintain real time with 1080p/60 without dropping frames, while the DVP-5000 can, because it has a dedicated FPGA.
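The pixel-rate arithmetic makes it clear why the GPU version keeps up with 1080i/60 and 1080p/24 but drops frames at 1080p/60. These are my own back-of-envelope numbers, not figures from DarbeeVision or NVidia:

```python
# Rough pixel-throughput comparison for the three modes the GPU
# software lists (illustrative arithmetic only).
width, height = 1920, 1080

# 1080p/60: every frame carries the full 1080 lines, 60 times a second.
p60_pixels_per_sec = width * height * 60

# 1080i/60: each field carries only half the lines (540), so the
# interlaced stream demands roughly half the pixel rate of 1080p/60.
i60_pixels_per_sec = width * (height // 2) * 60

# 1080p/24: full frames, but only 24 of them per second.
p24_pixels_per_sec = width * height * 24

print(f"1080p/60: {p60_pixels_per_sec / 1e6:.1f} Mpx/s")
print(f"1080i/60: {i60_pixels_per_sec / 1e6:.1f} Mpx/s")
print(f"1080p/24: {p24_pixels_per_sec / 1e6:.1f} Mpx/s")
```

1080p/60 works out to roughly 124 million pixels per second - about double the load of either mode the GPU handles gracefully - which is consistent with it being the one case that needs the dedicated FPGA to stay real-time.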