Status
Not open for further replies.
1 - 20 of 61 Posts

· Registered
Joined
·
5,004 Posts
Discussion Starter · #1 ·
I've found that any magazine that reviews a single product in a single article always says "it's great," unless there's an advertisement for that manufacturer in the same pages, in which case it says "it's fantastic." Useless.


However, I believe magazine reviews that compare four or more products are reliable. That was how I picked my current DVD player, and it's worked reliably for several years now.

I've been rummaging through stores looking for a magazine with another four-player DVD comparison, and haven't found any.


What I did find was this
http://www.hometheaterhifi.com/cgi-bin/shootout.cgi

It has an October 2004 shootout, as well as a query/search feature by price and brand. Nifty. They even did a review of my existing player, which according to them failed all kinds of things.
 

· Registered
Joined
·
1,103 Posts
Everyone that's a regular reader of this (DVD) forum is aware of the October Secrets DVD shootout. There are two current threads on the topic -- each having several pages worth of discussion.


One thread is called: "URL of latest Secrets DVD shootout?". The other is called: "Newest Secrets Benchmark is up".


The short story is Denon seems to be at the top of the list (again).


I agree that comparisons seem more reliable than a single review. But, I would still look at the ratings of each player compared to how each is advertised on the site performing the rating.


Finally, it seems that every player fails some part of the Secrets testing. It's a little frustrating that the "layman" has a difficult time determining how big an issue any given test failure will be in everyday use. (From your comment, it sounds like you're more satisfied with your player than the Benchmark might lead you to believe.)
 

· Registered
Joined
·
549 Posts
There is also the matter of score weighting. My preferences are somewhat different from the weights used in the score calculation. For example, I could not care less about things like poor Layer Change, Responsiveness, Recovery Time, Image Cropping, or Sync Subtitle to Frames -- I weigh them zero. On the other hand, things like Video Levels, Blacker-than-Black, and YC Delay I value the most. I think it's the same with everyone: each person has his own set of preferences.
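The re-weighting idea above is easy to sketch. This is only an illustration -- the test names, scores, and weights below are made up, and the actual Secrets scoring formula may differ:

```python
# Re-rank benchmark results using personal weights instead of the published
# ones. All test names, scores, and weights here are hypothetical examples.

def weighted_score(scores, weights):
    """Weighted average of per-test scores (0-100); zero-weight tests are ignored."""
    total_weight = sum(weights.get(test, 0) for test in scores)
    if total_weight == 0:
        return 0.0
    return sum(s * weights.get(test, 0) for test, s in scores.items()) / total_weight

# Personal weights: zero out the tests I don't care about,
# emphasize the ones I do.
my_weights = {
    "Video Levels": 3, "Blacker-than-Black": 3, "YC Delay": 3,
    "Layer Change": 0, "Responsiveness": 0, "Image Cropping": 0,
}

# Hypothetical per-test results for two players.
players = {
    "Player A": {"Video Levels": 90, "Blacker-than-Black": 100, "YC Delay": 80,
                 "Layer Change": 40, "Responsiveness": 50, "Image Cropping": 100},
    "Player B": {"Video Levels": 70, "Blacker-than-Black": 60, "YC Delay": 100,
                 "Layer Change": 100, "Responsiveness": 100, "Image Cropping": 100},
}

# Print the players ranked by my personal weighting.
for name, scores in sorted(players.items(),
                           key=lambda kv: weighted_score(kv[1], my_weights),
                           reverse=True):
    print(f"{name}: {weighted_score(scores, my_weights):.1f}")
```

With these weights, a player that aces Layer Change and Responsiveness but stumbles on Video Levels drops in the ranking, which is the point: the same raw test data supports different orderings depending on whose preferences set the weights.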


Having said that, the Secrets shootout is still the best source of objective info, and I sincerely admire it and the people who do it.
 

· Registered
Joined
·
5,004 Posts
Discussion Starter · #5 ·
GreggPenn:


I started writing this thread because I was about to ask if anyone had seen a magazine article comparing four products, so I could go in search of that magazine. But at the last moment, halfway through typing the opening post, I went out and did some more searching, which caused me to edit, edit, edit.


You're right that I'm content with my current player: although several things about the user interface could be better, it seems to play well. But I rarely use it on my 10' screen because until this week I didn't have speakers for it -- somewhat of a drawback for movies.

(I've been using a PC with 5.1 speakers)

My current player is the JVC XV-S60, worth about $180 CDN when I bought it a few years ago.

I have a need for two DVD players in my home at the moment, so I'm buying the 3910 next week.


Although I've seen aliasing on some DVD players and projector combinations, I rarely see the other popular artifacts.



There's this quote from http://www.hometheaterhifi.com/volum...e-10-2000.html
Quote:
We also want to warn you: if you have a progressive-scan DVD player that you are happy with, you probably should not read this report. With some of these artifacts, you are much better off not knowing they are there, because once you start to notice them, they’ll drive you nuts, and then you will inevitably want to replace your player. We’re quite serious here – it’s happened to us.
 

· Registered
Joined
·
271 Posts
Quote:
Originally posted by GreggPenn
But, I would still look at the ratings of each player compared to how each is advertised on the site performing the rating.
Gregg, this is the second time I've heard this from you with respect to advertising and test results on the Secrets Web site. I discussed possible ways to establish bias here: http://www.avsforum.com/avs-vb/showt...76#post4470776


So far, no one has come forward and shown that the tests are somehow biased in favor of Denon. Have you run the same tests and come up with different results? Seeing as how the tests are measurable, quantifiable, and repeatable -- the player either meets the spec or it doesn't -- if you have different results tell us. There are a lot of people who would rather see a non-Denon player perform well on the tests and claim the top spot.


--Mike
 

· Registered
Joined
·
15,715 Posts
The Arcam notwithstanding, I would have been shocked if the Denon 3910 hadn't easily smoked the rest of that particular batch of players by any and all rights. Something would have been horribly wrong.
 

· Registered
Joined
·
1,866 Posts
Not trying to side with either Gregg or Kris (and Secrets), but I suppose any magazine or website could be shown to be biased in favor of a certain brand or machine depending on which tests they decided to include and, more importantly, which tests they decided not to include.


If a certain player failed several tests, those tests could very easily be left out of the final analysis so that any player could be "outstanding." I find this happening a lot in those glossy print magazines, whereas "Secrets" holds the players to some seemingly comprehensive tests. Are there tests that "Secrets" does not include which would make the Denons do poorly? I have no idea. But they were the first guys to identify the "Chroma Bug" and started publicizing which manufacturers' players exhibited it -- Denon included.


Finally, what do you think the percentage is of people who purchase DVD players and actually visit the "Secrets" website and base their buying decision on those reports? It would probably be very, very small. (I was "shocked" to learn that only about 4-7% of consumers buy their DVDs over the Internet. I had been buying mine off the Internet for years. But all those other poor suckers paying those high B&M prices...:p .)
 

· Registered
Joined
·
17,407 Posts
megamii


I think that had to do with correcting problems that the secrets site uncovered in their players.


It's my understanding that most manufacturers aren't all that interested in fixing anything (CUE, for example) and that Denon showed an interest.


One thing I wish there were an objective way of testing: after you're done with the test equipment, how does the picture actually compare? Maybe that kind of viewing is too subjective and doesn't lend itself to ratings.
 

· Premium Member
Joined
·
14,757 Posts
Quote:
Speaking of bias, did one of the members of Secrets (I believe his name is Stacey Spears) work with Denon on improving its DVD players?
Denon asked Stacey for his help, along with Faroudja, on the macroblocking issue. Stacey was no longer working with Secrets at that point, though.


We would be delighted to help any DVD manufacturer improve its design, though. We have helped Denon and Krell before. The whole point of the benchmark is to improve DVD performance by pointing out shortcomings to the manufacturers. The fact that people can access the results is just a plus. We have also gotten away from referring to it as a "shootout," though that label persists because of the scoring system. We had to rank the players for our search function, so it would be easier to see which player did better at a given price point or with a given chipset.

Quote:
I agree that comparisons seem more reliable than a single review. But, I would still look at the ratings of each player compared to how each is advertised on the site performing the rating.
If you really think there is bias, that is fine. We got the same kind of comments when the Panasonic line was mopping the floor with everyone too. But those comments are a bit out of line, since there is absolutely no way you could prove it: I run the same tests on every player every time. I also can't think of a test I could add that would put Denon at a disadvantage. I will be adding some new tests shortly, but I have no doubt that Denon will do well. Their engineers have taken a liking to our benchmark and do their best to engineer their players to do well in it. They see it as a valuable resource. I wish all the other manufacturers were as receptive.
 

· Registered
Joined
·
2,374 Posts
In my humble opinion, all these shoot-outs, going by numbers, prove very little in the real world of watching movies and video in one's home. It would be like judging audio amps "by the numbers." I have always preferred valve (tube) amps and pre-amps. Today's top-rated solid state equipment (judging by the numbers) would smoke this valve equipment. But oh, the sound is so wonderful when heard through tubes over great speakers. The only way to judge DVD players is to view (and listen). Connect four DVD players to equal displays, play equal material over them, and view and listen to them. That is judging in the real world and what really counts in the very end. And, of course, reliability in anything we buy comes first. Do not buy any equipment by the numbers from others. Go out and view and listen for yourselves. You can use "numbers" as a guide but, in the end, let your ears and eyes be the final judge.
 

· Registered
Joined
·
408 Posts
To some average consumers,


the deinterlacer or the chroma bug probably won't get much attention if it's masked well. They probably think the red streaks and smears are compression artifacts or something, which isn't true. I used to think the red streaks I saw in some MPEG-1 CG videos in video games were compression artifacts.


It was very visible when watching on a non-calibrated TV, where the contrast is way over the top, red is severely pushed, etc. After turning off the red push in the service menu, the flesh tones looked much more accurate and the red streaks weren't as severe, but they were still noticeable. It wasn't until later, when the color temperature was adjusted to 6500K, that I noticed the red push on the same CGI was mild.


I couldn't believe that calibrating a TV could make the chroma bug less visible.


And now there are DVD players with Faroudja chips that mask the chroma bug so well that I have a tough time telling them apart.


To a hardcore videophile, this may be a problem. But average consumers will not notice it.


Me, I am okay with a player that fails the chroma bug and some deinterlacer tests, and that doesn't have Faroudja, for now. When Blu-ray and HD-DVD players hit the market, that's when I'll get more serious about video performance, whether the chroma bug is fixed properly, etc...
 

· Premium Member
Joined
·
14,757 Posts
I think our tests relate completely to the onscreen image. Almost every drawback that you will see is covered in our tests. Contrast levels, black levels, Y/C Delay, frequency response, de-interlacer hiccups, CUE problems: they all relate directly to what you will see on a display.


I guess it is really a question of what you are looking for: accuracy, or something that looks good to you. We state specifically in our guide to the benchmark that most DVD players will look great the majority of the time. Personally, I want a DVD player that is displaying the image the way it SHOULD be displayed so that I can distinguish what is an issue with the player and what is an issue with the disc. It's one of those transparency things. The same could be said of the display as well.


But I also don't think our benchmark is a definitive guide to buying a DVD player. Video is the most important aspect of a DVD player, but there is more in the package. Audio, features, usability: these are all very important, and most of this is ignored in the benchmark. But that is because the benchmark is a guide for manufacturers to recognize and address issues in the video playback chain. Besides, you can't write a definitive guide to features, since everyone's preferences are different.
 

· Registered
Joined
·
1,156 Posts
Quote:
Originally posted by Kris Deering
Personally, I want a DVD player that is displaying the image the way it SHOULD be displayed so that I can distinguish what is an issue with the player and what is an issue with the disc. It's one of those transparency things. The same could be said of the display as well.
Are you saying this as "Kris Deering, DVD player analyst" or "Kris Deering, movie fanatic"? I think there's a distinction between the two, since the movie fanatic will focus more on the movie than on video quirks, barring blatant and obvious things like green halos outlining everyone and everything, massive pixelation issues, etc. The CUE isn't noticeable to a lot of people, nor is edge enhancement. I've been in the presence of a guy who couldn't see EE in a Star Trek DVD we were watching and who declined my offer to point it out, since it would detract from his movie viewing experience if he were looking for it.


I imagine when analyzing DVD video performance each pixel would be scrutinized but what about when you're watching a movie on a Friday night with the 'better half'? Or is it true that those who really wouldn't care most likely aren't reading this post since they are not browsing this forum?


Peace...
 

· Registered
Joined
·
1,103 Posts
Quote:
Originally posted by GreggPenn
I agree that comparisons seem more reliable than a single review. But, I would still look at the ratings of each player compared to how each is advertised on the site performing the rating.
My comment above was intended to be a general guideline. My message was to look at advertising on ANY given review site when evaluating reviews. Because I referenced "Secrets", I'm guessing it sounds more like I was targeting my comments directly at them. In this case, I was not.


When I posted specific comments in another thread about the great Denon performance @ Secrets, there were two basic responses. One said Denon performed well because they pay attention to "Secrets" feedback. The other response was to get my own measuring equipment and see if I get different results. Though this is a valid suggestion, it is unhelpful and impractical. It would be less expensive to buy (and return) all the products and demo them -- hoping to visually confirm/deny the results of the tests. In fact, I'm guessing that some "videoheads" here might do just that. My post hoped to solicit feedback from those types of sources who might confirm/deny the superiority of Denon's DVD players.


By reading posts in this forum, I have to say there are mixed reviews. Kris concluded by saying their tests aren't definitive. With this in mind, asking for additional feedback seems very reasonable and thorough.


Finally, some explanation is certainly present on the "Secrets" website regarding the problems being tested for. However, it is sometimes difficult to understand and often leaves the reader wondering how, when, and how often errors will be seen by the viewer. Others (including Kris) have suggested that overall perceptions could be added and audio results could be included. There is always room for feedback, and there is always room for improving the way facts are presented. There is certainly PLENTY of room for practical explanation of what is being seen, compared, and heard during comparison tests. Sure, this can be subjective, but that's not always bad.


gp
 

· Registered
Joined
·
2,374 Posts
It just seems, at times, that many persons here are more interested in the equipment than in the content they are viewing. Of course, good equipment is a means to an end, the end being viewing and listening to a movie or whatever. I am still using two-channel stereo (with valve equipment) and still enjoying it very much. If you have audio equipment that is capable of depth, you will hear all you care to on a recording or a soundtrack. Again, all these numbers (5.1, 7.1, etc.) mean very little to me. Content is the most important thing. I still watch old episodes of "Gunsmoke" (in black and white) on the Westerns Channel. Of course, to each his own.
 

· Premium Member
Joined
·
14,757 Posts
Quote:
Are you saying this as "Kris Deering, DVD player analyst" or "Kris Deering, movie fanatic"?
Both, of course. The whole idea of a transparent system is so you can enjoy the movie without the hiccups or player-induced artifacts. Granted, DVD authoring contributes more than its own share.


But lets think about this a second. WHAT IS THE POINT OF REVIEWING ANYTHING IF IT DOESN'T MATTER ANYWAYS??? If you think that in the end it only matters what you think, then WHY THE HELL DO YOU CARE WHAT I WRITE OR WHAT THESE BOARDS HAVE TO SAY ABOUT ANYTHING?????


Don't take the caps as yelling; I am just adding emphasis. The tests we created are geared toward artifacts that we saw in the early stages of progressive DVD players and known standards per the DVD spec. People who truly care about what they are dropping their hard-earned cash on deserve to know the pluses and minuses of the player they are buying. How are they supposed to know that certain $8K players are nothing more than re-hashed $400 players with a better case?? The dealer sure as hell isn't going to tell them.


If you are not sensitive to some of the artifacts we test for, IGNORE THEM!! Just look at the ones that are important to you. I can't think of a single thing that people look for in a player's picture that isn't covered in the benchmark!?! You want a sharp picture? Look for a player with a flat response that doesn't roll off in the upper end.


As for what our tests are really looking for, it is clearly laid out in the benchmark. For those that don't know where, here is the link:

Guide to the DVD Benchmark


For those that want to learn more about progressive scan and what it is and does, here is a link:

Progressive Scan


Again, I don't think subjective tests have a place in this format; those are reserved for our writeups.
 

· Premium Member
Joined
·
10,599 Posts
Quote:
Originally posted by Kris Deering
.... I can't think of a single thing that people look for in a players picture that isn't covered in the benchmark!?! ....
I would add...


Chroma frequency response, vertical filtering, memorizing the last playback position of discs, all performance measured for both 4:3 & 16:9 displays & sources, and PAL to NTSC conversion, to name a few. This is not to say the current tests are flawed. I would just like to see more.
 

· Premium Member
Joined
·
14,757 Posts
We are adding chroma frequency response and vertical response. Memorizing the last playback position has nothing to do with picture quality; that is an extra feature that is a personal plus but means nothing in terms of video performance.


I don't know what you are referring to with 4:3 or 16:9 displays; the tests apply to both. I don't measure the player's scaling ability in terms of letterboxing, though.


PAL to NTSC conversion is mentioned quite a bit in our reviews, though we don't have a test specifically for it. Since 99.999999999999% of the DVD players that I get in for review are not region free, and since I don't have any discs with PAL test patterns other than DVE, I don't really have a means of testing this. Plus, since this is a frame rate conversion as well as a scaling issue, how am I supposed to test it?? I usually will pop in DVE and do some subjective tests but, again, I don't like doing that in regard to the benchmark.


One other note, most consumer displays, with the exception of CRT, do PAL natively anyways. I went to CEDIA this year and I think I saw one manufacturer showcasing an analog display, Samsung. They had some direct view CRTs. NOT ONE SINGLE RPTV that was CRT based, but I may have missed some at Mitsubishi. I will try and include more information on whether a player can convert PAL to NTSC and some info on whether I thought it did a good job, but I don't think this will be added to the list of tests.
 