
Official OPPO BDP-105 Owner's Thread - Page 96

post #2851 of 10056
Quote:
Originally Posted by Retina4711 View Post

I'm thinking about owning a BDP-105 in the near future and am currently checking the net thoroughly about this player.
One question I could not find an answer to is whether the Oppo can play 5.1 FLAC music files like these from 2L:

http://www.klicktrack.com/2l/search?c=FLAC+96kHz+24bit+5.1+surround

thx in advance !

Yeah, no problem playing 5.1 FLACs, they play like any other FLAC file. I have Piano Improvisations (5.1 surround) from 2L and it hasn't presented any problems, and I have several other 96/24 5.1 FLACs from other sources that haven't had any problems either. I use HDMI 2 on the 105 going to my pre/pro. All my FLACs are streamed via Ethernet to the 105.

Quote:
Originally Posted by Bob Pariseau View Post


Quite a few of the posters here are happily using multi-channel FLAC files, so I expect some will pop up with answers if you need more details. Right now they are likely making popcorn to enjoy the lip-sync debate.

Ahh, the great lip sync debacle, series XX, continues to infinity and beyond. Haven't they heard of "READ MY LIPS"!
post #2852 of 10056
Quote:
Originally Posted by wmcclain View Post

Bob P answers the "Why does playback stop when I turn off the TV?" question from time to time. Something to do with a naughty AVR. I forget the details, but he'll be around as soon as he wakes up. Night owl.
Hold on a second there - who gave Bob permission to sleep???
post #2853 of 10056
Quote:
Originally Posted by Bob Pariseau View Post

Playback is Paused during an HDMI handshake to avoid missing content you wouldn't be able to hear or see anyway due to muting imposed until copy protection decides it is happy again.
--Bob

Like I said, I think that's letting the implementers off too lightly. There is no possibility of HDCP on CD-DA, and the sound isn't being routed out HDMI-1 in any case, so IMHO it is unreasonable for HDCP handshaking on HDMI-1 output to interfere with playback of CDs over the analog output. As another poster pointed out, other similar equipment does not have this problem.

It's not like I'm sending mine back over this. The easy work-around is to make sure your display is already off before you listen to something.
post #2854 of 10056
Quote:
Originally Posted by oldav View Post

Yeah, no problem playing 5.1 FLACs, they play like any other FLAC file. [...]

I would like to try 5.1 FLAC via Ethernet to the Oppo. How are you guys ripping the 2L material to 5.1 FLAC? My 2L material is on Blu-ray discs. Is there a ripping tool that can handle this?

Thanks
post #2855 of 10056
Jeff,
But have you turned HDMI Audio OFF? The player has no way of knowing the audio component of the HDMI output is not in use.
--Bob
post #2856 of 10056
Quote:
Originally Posted by delkat View Post

I would like to try the 5.1 FLAC via ethernet to the Oppo. How are you guys ripping the 2L material to 5.1 FLAC? My 2L material is on BluRay discs. Is there a ripping tool that can handle this?

Thanks

I download my FLAC files directly from 2L as well as other sites. Some may be WAV files, and there are many converters out there that will easily convert WAV to FLAC; I use dBpoweramp for most of my conversions. I am not familiar with FLAC files on a BD, but it could be as simple as locating the files on the disc and copying them to your media server. There are others here much more tech savvy than me who I am sure could answer your question more directly.
post #2857 of 10056
Quote:
Originally Posted by Bob Pariseau View Post

First of all, welcome to AVS and to this thread!

Now that said, PLAYBACK SHOULD NOT STOP. It should PAUSE and then resume in a couple seconds. [...] NOTE: There is a known bug where SACD playback can Stop due to an HDMI handshake in certain configurations, but that should not be affecting CD playback.

The stops were with SACDs if I recall correctly. The pauses were with redbook. I'll confirm this later if it's repeatable.
Quote:
Originally Posted by Bob Pariseau View Post

Come to think of it, your issue with the player STOPPING when this happens could be due to enabling HDMI CEC in the OPPO. I.e., it is getting a Stop command from either the TV or the AVR.

You can disable HDMI CEC in the OPPO's own Setup menu (Setup > Device Setup).
--Bob

Not sure what CEC is, but I'll look into it in the manual or elsewhere in the thread. As for my configuration, it's as simple as could be: HDMI from the Oppo to the TV, and analog (RCA) out to an integrated amp (not an AV product).

And in closing thanks for the welcome and the very detailed response. Hopefully it is useful for others as well.
post #2858 of 10056
Quote:
Originally Posted by bgunn View Post

The stops were with SACDs if I recall correctly. The pauses were with redbook. I'll confirm this later if it's repeatable.

This is a known issue with SACD; please report your experience to Oppo.
HDMI handshakes may crash SACD playback. You might find different results comparing the player set to PCM output vs. DSD output, and it may also be specific to particular hardware and setups. If you have both HDMI outputs connected, try removing HDMI 1 and just using HDMI 2.
post #2859 of 10056
Quote:
Originally Posted by HaroldKumar View Post

It worked into an Emotiva UPA-1. I cut open a USB cord and there were black, red, green, and white wires.

Spliced the black and red into the 12v trigger cable supplied with the amp. Voila!


Hi, this is a dumb question, but: USB black --> Emotiva black? And USB red --> Emotiva red?
Thank you in advance.
I am planning to do this with an Emotiva UPA-700 and UPA-200, no preamp in the middle.
I wonder how the UPA-1 sounds with the BDP-105?
Did you have to make any special adjustments to the Oppo?
Regards
Kris
post #2860 of 10056
Hi. I just received my OPPO 105 and have a couple of questions. I won't be able to hook it up to my AV system for a couple of days, but I did connect it to a TV to do a software update and test that all types of discs played and all inputs worked. Do I need to connect it to an amp in order to have it "burn in", or can I just load a CD on repeat without it connected to anything? Also, is there a way of disabling inputs that are not in use? It would be nice if pressing Input cycled only through active inputs instead of all of them. Thanks in advance for any advice.
post #2861 of 10056
Quote:
Originally Posted by Jeff Baker View Post

I'm surprised you didn't mention the USB input, as it's the best. Optical and coaxial both have clock-domain problems that cannot be completely eliminated, thanks to the awful SPDIF protocol. The higher time-domain distortion when using SPDIF with the BDP-105 was noted in some magazine reviews. Of the three you listed, HDMI is clearly the best because it doesn't have the clock-recovery issue, uses TMDS signalling, and is superior in all the other ways you would expect from a protocol designed 20 years after SPDIF.
Quote:
Originally Posted by Bob Pariseau View Post


Thanks everyone

I actually didn't mention USB because of the limitations of the device I was using to feed the Oppo (Apple TV). I had a feeling it might be HDMI. I was really trying to decide between HDMI and SPDIF.
post #2862 of 10056
Hi;
I purchased the Oppo 105 as soon as it became available. My setup runs the Oppo out of HDMI 1 into my Onkyo TX-NR 809 receiver, then HDMI out of the receiver into the TV (Sharp LC-60LE632U) for video. Should I run the HDMI out of the Oppo straight into the TV? If I do, what is done about the audio for the receiver?

I was also trying to run HDMI 2 out of the Oppo into my receiver for DVD-Audio and SACD playback, because I have several discs and love the DSD sound. Doing this, I had no sound out of HDMI 1 using the HDMI A/V Split setting; I only got audio when I switched to the Dual Display HDMI setting. I got two different responses from Oppo when I talked to them: one was that by using the Dual Display setting I am not getting the best audio possible, the other that it was OK to use it. I want the best audio on HDMI 2 and the best video/audio on HDMI 1.

I bought the Onkyo for its DSD decoding capabilities. By using the HDMI 2 output instead of going analog out of the Oppo, am I shortchanging the sound? Should I not be concerned about the DSD decoding in the Onkyo and just run analog cables? I believe the Onkyo does its DSD decoding over HDMI, and I'm not sure that applies to the analog inputs. Does the analog quality outweigh the HDMI DSD method?

If I go strictly HDMI, I'd like the best possible video on HDMI 1 and the best audio on HDMI 2, but to use the A/V Split setting I would have to pull the HDMI connection out of the back of the Oppo and move it whenever I switch between Blu-ray and SACD, which is inconvenient. The Oppo tech said to just leave it in HDMI 2, and that the video improvement would not warrant the inconvenience, since the only differences on HDMI 1 are Contrast Enhancement, Noise Reduction, and 4K capability.

I recently sold my home and moved into a new one. My basement man cave in the new home will not be finished until I get more funds, and unfortunately I had to leave much of my home theater, as well as my old PSB Stratus speakers, as part of the sale of my old house. My current living-room setup: Onkyo TX-NR 809, Wharfedale Diamond 8.4 fronts, Wharfedale Diamond 10 CS center, two JBL ES250 12" subs, in-ceiling surrounds my wife made me install for aesthetics (Dayton Audio Ultra series, I think), an Acoustic Research AR-1 turntable, and now the Oppo 105.

What should my suggested setup and settings on the Oppo be? I'm not ready to run separate amps off the Oppo, but should I go analog out for SACD/DVD-Audio playback? Since HDMI 1 will carry the Blu-ray video/audio, if I go analog, what do you use for the visuals of the SACD menus and the DVD-Audio on-screen functions? The Oppo is capable of running HDMI 1 out and analog at the same time, right?

Lastly, the audio/video sync issue I have had with the Oppo from day one is very concerning to me. I would have thought this would have been resolved prior to its release.

Be easy on me, I am not that astute at audio/video, but I love it! Thanks... Mark
post #2863 of 10056
Quote:
Originally Posted by FlatRocky View Post

Hi, this is a dumb question, but: USB black --> Emotiva black? And USB red --> Emotiva red? [...]

It kind of depends on the cable, but for my cable it was:
USB black -> Emotiva black
and USB red -> Emotiva yellow (the trigger cable had no red wire)

In other cases...
1. Black to black (in some cases, if there's no black, the "black" will be bare unjacketed wire).

2. Red to red, or sometimes whatever is brightest. For instance, when I cut open one of the Emotiva-supplied trigger cables it had a black and a yellow wire; in that case yellow = red.
The USB cord I used had four wires: black, green, white, and red. In this case, definitely use red!
Edited by HaroldKumar - 1/16/13 at 12:26pm
post #2864 of 10056
Quote:
Originally Posted by Torqdog View Post

Monoprice RedMere HDMI cables... I was quite surprised to find that indeed there was a slight improvement, with black levels and contrast being most noteworthy. There is now a definite 3D effect that was not noticed before.

I'm at a loss to understand why and what it is that is going on, but my eyes ain't lyin to me.

Uh, yes, your eyes ARE lying to you. That's why we use meters to measure video displays... because meters cannot be fooled. Eyes are SO EASILY and SO COMPLETELY fooled, we can NEVER trust what we see. All you have to do is look online for optical illusions... they ALL fool our eyes. One of them even makes a series of dots completely disappear from our vision even though they remain displayed on the monitor. I ordered a few RedMere cables last week just because I needed some cables for relatives (teen with an XBox and composite video connection, etc.). I kept 1 for myself just to see what was going on with the new cable. Last night I used my $14,000 chroma meter and a freshly calibrated light meter (used for black levels as calibration meters typically aren't very good with black levels) to measure a plasma display with the RedMere cable plus a free cable that came in the box with an HD-DVD player (so old, it's not even a "high-speed" cable) and a "King Kong" high-end HDMI cable that will remain nameless. None of them changed the black level. None of them changed the contrast ratio. And just to be "complete" I ran a full set of grayscale/gamma measurements and CMS measurements. Everything was within the repeatability range of making measurements with a plasma display -- differences were below the threshold of being detectable to human vision. Patterns were generated by a high-end signal generator at 1080p60.

10 different cable manufacturers have admitted (to me personally, some "on the record", some "off the record"), that their HDMI cables, no matter how expensive or inexpensive, do not make any difference in digital video image quality. This includes some cable manufacturers who initially said their cables improved digital video image quality but within 6-12 months changed their minds and agreed that there was no detectable difference. In my experience, this has never happened before (any cable manufacturer admitting their cables did not "perform better" than cheaper or less-well-designed cables). One of these manufacturers even decided not to manufacture an HDMI cable after their first run of prototypes produced no differences in video image quality... much to their surprise. When cable manufacturers tell you that a cable they sell doesn't actually improve image quality, you know you have something very well-researched (by the cable manufacturer). In fact, the very best made HDMI cables have only 1 advantage over a free HDMI cable or over a $4 Monoprice cable... the expensive and carefully designed cable will likely operate without errors over longer cable runs (like 50 ft vs. 20 ft) than less well-designed cables -- and that's it. [Except for music playback... video is unaffected by the cable, movie sound does not change enough to worry about, but music playback is affected by the quality of the cable design and music playback is the only reason to spend more on an HDMI cable than Monoprice's low cost HDMI cables.] If you use analog or coax or USB for music, the HDMI cable you use for movies doesn't really matter at all. I'll happily use a $4 Monoprice cable or an expensive model and am confident that both cables deliver the same video image quality. When an HDMI cable does produce an error due to the cable run being too long, there are only 2 possible outcomes... 
total loss of video or "sparklies" where you get random pixels that are darker or lighter than they should be. The bright ones are more obvious and result in the "sparklies" name given to the effect when random pixels over the entire screen flash brightly for 1 frame at a time.

I don't like posting "contrary" opinions, but this thing (HDMI cable vs video image quality) is backed-up with measurements (which I've been doing since 2007 when I was very surprised to find that no HDMI cables made video images look better or worse) and statements from cable manufacturers which just doesn't happen unless they are certain.
post #2865 of 10056
Quote:
Originally Posted by Doug Blackburn View Post

Uh, yes, your eyes ARE lying to you. That's why we use meters to measure video displays... because meters cannot be fooled. [...]

Great post.
post #2866 of 10056
Quote:
Originally Posted by Jeff Baker View Post

Like I said, I think that's letting the implementers off too lightly. There is no possibility of HDCP on CD-DA, and the sound isn't being routed out HDMI-1 in any case, so IMHO it is unreasonable for HDCP handshaking on HDMI-1 output to interfere with playback of CDs over the analog output. As another poster pointed out, other similar equipment does not have this problem.

It's not like I'm sending mine back over this. The easy work-around is to make sure your display is already off before you listen to something.

... or do not turn on the display while you are listening to a CD, or something...
This problem occurs in the main processor and has nothing to do with the so-called "HDMI handshake". The way the main processor manages (or has to manage) the different data streams in real time does not work as it should; it may also come down to the software running on that processor (the firmware).
You can simulate the same thing with a computer: play music on it and suddenly open a heavy application, or do something else that loads the processor heavily. You will experience the same phenomenon, dropouts in the music streaming to your audio board... The processor needs time to reschedule its processes to do the tasks. This is normal, but the software writers/designers can prevent such dropouts from happening. I just think Oppo has a little problem right here...
Edited by Coris - 1/16/13 at 2:38pm
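The computer analogy above can be sketched as a toy simulation. None of the numbers below are specific to the OPPO's firmware; they just illustrate how a stalled processor empties a playback buffer:

```python
# Toy model of a streaming-audio buffer: the player drains samples at a
# fixed rate while the processor refills it. If the processor stalls
# (an HDMI event, a heavy task), the buffer empties and playback drops out.

def simulate(buffer_size, fill_per_tick, drain_per_tick, stall_ticks):
    """Return the tick at which the buffer underruns, or None if it never does."""
    level = buffer_size                      # start with a full buffer
    for tick in range(1000):
        refill = 0 if tick in stall_ticks else fill_per_tick
        level = min(buffer_size, level + refill) - drain_per_tick
        if level < 0:
            return tick                      # dropout: nothing left to play
    return None

# Steady state: refill keeps up with playback, no dropout.
print(simulate(10, 4, 4, stall_ticks=set()))            # None
# A five-tick stall drains the 10-sample buffer partway through.
print(simulate(10, 4, 4, stall_ticks={3, 4, 5, 6, 7}))  # 4
```

Whether the real firmware pauses deliberately during a handshake or genuinely starves its buffer, this is the shape of the symptom either way.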
post #2867 of 10056
Quote:
Originally Posted by russ_777 View Post

Great post.

except for the can of worms opened up with this.....
Quote:
[Except for music playback... video is unaffected by the cable, movie sound does not change enough to worry about, but music playback is affected by the quality of the cable design and music playback is the only reason to spend more on an HDMI cable than Monoprice's low cost HDMI cables.]
post #2868 of 10056
I'm considering the 105 to replace a number of other components and I wanted to see what you guys think.

Here's what I'm thinking the 105 can replace:
1. Panasonic DMP-BD605 blu-ray
2. Logitech Squeezebox Touch
3. Cambridge Audio DAC

I like the Panasonic better than the other (cheap) BD players I've had because the DVD upscaling is pretty good (I have a 32" Samsung LCD). I'm assuming the 105 would surpass the Panasonic.

I have my music files in FLAC/ALAC/AAC/mp3 stored on a QNAP NAS, which has a Twonky server installed but not used currently. From what I can tell, using BubbleUPnP as a control for Twonky would basically duplicate the functionality of the Squeezebox. Any opinions on this? Also, I currently use the Squeezebox over my wireless, will I need to run a wire for the Oppo (ie, how good is the wireless function)? I only have a few files that are high res (downloads from HDTracks).

I suspect the DAC in the Oppo is better than the Cambridge (mine is the DacMagic 100). I'm not totally sure the DAC is making a huge difference in my system since my amp is a sub-$300 Yamaha stereo receiver, but I may eventually upgrade that.

Any insights are greatly appreciated!
post #2869 of 10056
Quote:
Originally Posted by HaroldKumar View Post

except for the can of worms opened up with this.....

A non-defective cable has a BER of 10^-12. That's less than 1 bit error per CD.

Once you get the data to the receiver it's reclocked.

End of story.
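For anyone who wants to check that arithmetic, here is the back-of-the-envelope version. The 74-minute disc length and the 10^-12 BER figure are the assumptions; everything else is Red Book math:

```python
# Back-of-the-envelope check: expected bit errors in one CD's worth of
# audio sent over a link with a bit error rate (BER) of 1e-12.
SECONDS = 74 * 60          # assumed 74-minute Red Book disc
RATE = 44_100              # samples per second, per channel
CHANNELS = 2
BITS_PER_SAMPLE = 16
BER = 1e-12                # non-defective link, figure quoted above

total_bits = SECONDS * RATE * CHANNELS * BITS_PER_SAMPLE
expected_errors = total_bits * BER

print(total_bits)          # 6265728000 bits on a full disc
print(expected_errors)     # ~0.0063, i.e. one bit error every ~160 discs
```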
post #2870 of 10056
Quote:
Originally Posted by johsti View Post

Does the Oppo 105 upsample audio or play it natively? I plan to use the usb dac with my computer and wanted to make sure 44.1 is played natively at 44.1, 96 is played natively at 96, etc. etc. I understand the bit processing is done at 32 bits and I'm fine with that.

Any and all input is appreciated.

This is NOT really what you want... you may THINK you want it, but you really don't. You want the music played at the highest even multiple of the original recording rate... for some systems, that means 44.1 plays back at 88.2 and 48 plays back at 96, while in systems with 192 kHz DACs you'd want to play 44.1 back at 176.4 and 48 at 192.

The reason for this is the filter that has to be used with lower sample rates... these brick-wall filters (at 22,050 Hz for a 44.1 kHz sample rate) create phase problems that propagate well down into the 1000s of Hz, and possibly into the 100s of Hz. The higher the sampling frequency, the gentler and less intrusive the filtering can be, which removes all or nearly all of the phase problems caused by the brick-wall filtering needed for lower sample rates. For that reason alone, you're better off with higher sample rates as long as you maintain even multiples... playing 44.1 at 96 is not the best choice, for example.

Some of the playback software (like JRiver Media Center for Windows) allows you to specify the playback sample rate for each input sample rate... a really good feature IMO.
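The even-multiple rule described above is easy to express in code. This is just an illustration of the rule, with a hypothetical 192 kHz DAC ceiling as the default:

```python
# Highest power-of-two multiple of the source rate that the DAC accepts,
# per the even-multiple rule: stay in the 44.1k or 48k family, never
# cross between them (e.g. never play 44.1k material at 96k).

def best_playback_rate(source_rate, dac_max=192_000):
    rate = source_rate
    while rate * 2 <= dac_max:
        rate *= 2
    return rate

print(best_playback_rate(44_100))                  # 176400, not 192000
print(best_playback_rate(48_000))                  # 192000
print(best_playback_rate(44_100, dac_max=96_000))  # 88200
```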
post #2871 of 10056
Quote:
Originally Posted by HaroldKumar View Post

I think XBMC http://xbmc.org/ really stands out as the best one, with very active development. It can run on multiple platforms.

Hmmm, didn't think about that! I'm testing it out for serving up concert videos and it does work just fine (so far) with the BDP-105. Didn't think to try it on Linux as a DMC - good idea - thanks!

Styln
post #2872 of 10056
Quote:
Originally Posted by Doug Blackburn View Post

[Except for music playback... video is unaffected by the cable, movie sound does not change enough to worry about, but music playback is affected by the quality of the cable design and music playback is the only reason to spend more on an HDMI cable than Monoprice's low cost HDMI cables.]

How so? It's still digital. If the video doesn't change with cable, and it's at a higher bandwidth, what would make the equally digital audio signal care what cable is used so long as the bits arrive intact?
post #2873 of 10056
Quote:
Originally Posted by ehlarson View Post

A non-defective cable has a BER 10^-12. That's less than 1 bit per CD.

Once you get the data to the receiver it's reclocked.

End of story.

Plus which, errors are corrected.
post #2874 of 10056
Quote:
Originally Posted by Doug Blackburn View Post

Uh, yes, your eyes ARE lying to you. That's why we use meters to measure video displays... because meters cannot be fooled. [...]
This may come as a shock, but I totally agree. I finally had a chance to do some A/B testing using the Spears & Munsil disc and I could NOT see any difference between the cables.

It was a damned good bottle of red though.
post #2875 of 10056
Quote:
Originally Posted by jimshowalter View Post

Plus which, errors are corrected.

I think the argument is that if the data is PCM, a poor cable can introduce jitter, especially on longer runs. I gather HDMI has an even bigger jitter problem than SPDIF. Some think jitter is audible, some don't.

One of the reasons I like the OPPO 105, with its audiophile analog outs, is that I can keep all the digital stuff inside the OPPO, and it gives me the bass management that a straight DAC lacks. There's no SPDIF or HDMI carrying any audio.
Edited by HaroldKumar - 1/16/13 at 9:51pm
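For a sense of scale on the jitter question: the worst-case amplitude error from a clock timing error is the signal's peak slew rate times the jitter. The sketch below is illustrative arithmetic only, not a measurement of any cable or player:

```python
import math

def jitter_error_lsb(freq_hz, jitter_s, bits=16):
    """Worst-case amplitude error, in LSBs, that a sampling-clock timing
    error of jitter_s seconds can cause on a full-scale sine wave."""
    slew = 2 * math.pi * freq_hz    # peak slew rate with full scale = 1.0
    error = slew * jitter_s         # worst-case amplitude error
    lsb = 1 / (2 ** (bits - 1))     # one LSB relative to full scale
    return error / lsb

# 1 ns of jitter on a full-scale 10 kHz tone, 16-bit:
print(round(jitter_error_lsb(10_000, 1e-9), 1))  # 2.1 LSB
```

Whether an error of a couple of LSBs is audible is exactly the part people disagree about; the arithmetic only bounds its size.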
post #2876 of 10056
Out of the box, here is my two-minute review... and yes, I am echoing what most of you have said.

Packaging: excellent, it was like Christmas all over again and this time I was the kid.
Setup: pretty easy, although the firmware was way too old (I know, I should have updated it first). Straight out of the box we should not have major issues like lip sync over HDMI In making it unwatchable.
Connection: BDP-105, HDMI 1 out to a Panasonic TH-58PH10UKA at 720p, audio RCA to a Celeste 4070 power amp and B&W 804s.

Playing CDs was a charm. It did lack a bit of bass extension, but I am convinced that with a few hours of burn-in the sound will improve; I am comparing to my trusted Audio-gd DAC 3DV. I never believed in burn-in before getting this DAC, but I was forced to admit that the sound changed radically after 200 hours. Volume control is really good with the variable output, and I can confirm this is a great setup, skipping a preamp stage altogether. I will try the XLR out in a few days.

Playing Blu-rays, the sound level for some reason is really low compared to CDs. It is not as crisp, lacks extension, and is less engaging. This is odd because I am using the exact same setup as for CDs; any clues why the sound level is lower? I did flash the second January firmware. The colors are simply amazing, but I am unfortunately still getting some stuttering. I guess it's because the panel can't process 24p; would it be better if I use the PAL setting, since my panel can process both?

Playing HDMI In from an external OTA tuner, the picture is again amazing: colors deeper and less noise. This was a great surprise. The audio sync, once the latest firmware is applied, is significantly better but not perfect; a little more work and this one will disappear from the punch list. Again, the audio has the same issue as with Blu-rays: it's lower and less engaging. Any clues?
post #2877 of 10056
Quote:
Originally Posted by james57 View Post

I am unfortunately still getting some stuttering. I guess it's because the panel can't process 24p; would it be better if I use the PAL setting, since my panel can process both?

That's a 1366 x 768 panel, and the specs say it accepts 1080/60i, 50i, 24p, 24sF, 25p, 30p, but you should try different output resolutions from the player to see what works best for you.

With 1080p24 OFF, 1080p output with TV System = NTSC will give 60 Hz, while PAL will give 50 Hz. If you are in North America you probably want NTSC.

-Bill
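For readers wondering why 24p material can stutter on a 60 Hz path at all: 60 is not an even multiple of 24, so frames must be repeated in an uneven 3:2 cadence, which shows up as judder. A minimal sketch of that cadence (the frame values are just illustrative placeholders):

```python
def pulldown_32(frames):
    """Map 24 fps film frames onto 60 Hz by repeating them in a 3:2 cadence.

    Every pair of film frames becomes 5 display refreshes (3 + 2),
    so 24 frames/s * 5/2 = 60 refreshes/s. The uneven repeat pattern
    is what reads as judder on panels that can't refresh at a
    multiple of 24.
    """
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out
```

For example, four film frames become ten refreshes held for 3, 2, 3, 2 ticks each; a display that could run at 48 or 72 Hz would instead hold every frame equally long, which is why true 24p support looks smoother.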
post #2878 of 10056
Hi Bill, thanks for the support. My panel is indeed a native 1366x768 and will accept 1080i and 720p at 60 or 50 Hz via HDMI, but not all the others, as those relate to the D-sub 15-pin input. Yes, I am in Canada and use the NTSC format, but I was wondering which conversion is better: 1080p24 to 720p60 or 720p50?
post #2879 of 10056
Quote:
Originally Posted by james57 View Post

Hi Bill, thanks for the support. My panel is indeed a native 1366x768 and will accept 1080i and 720p at 60 or 50 Hz via HDMI, but not all the others, as those relate to the D-sub 15-pin input. Yes, I am in Canada and use the NTSC format, but I was wondering which conversion is better: 1080p24 to 720p60 or 720p50?

Unknown; you'll have to try it. It would be really rare for a North American user to select 50 Hz output for anything other than a 50 Hz source.

-Bill
post #2880 of 10056
Quote:
Originally Posted by Doug Blackburn View Post

This is NOT really what you want... you may THINK you want it, but you really don't. You want the music played at the highest even multiple of the original recording rate... for some systems, that means 44.1 plays back at 88.2 and 48 plays back at 96, while in systems with 192 kHz DACs, you'd want to play 44.1 back at 176.4 and 48 at 192.

The reason for this is the filter that has to be used with lower sample rates... these brick-wall filters (at 22,050 Hz for the 44.1 kHz sample rate) create phase problems that propagate well down into the thousands of Hz and possibly into the hundreds of Hz. The higher the sampling frequency, the gentler and less intrusive the filtering can be, which removes all or nearly all of the phase problems caused by the brick-wall filtering needed for lower sample rates. For that reason alone, you're better off with higher sample rates as long as you maintain even multiples... playing 44.1 at 96 is not the best choice, for example.

Some of the playback software (like JRiver Media Center for Windows) allows you to specify the playback sample rate for each input sample rate... a really good feature IMO.

Great post.

I just bought J River and I really like it.
I set up the upsampling in even multiples and it does sound better. Not huge, but noticeable.

I still cannot wrap my head around the difference in sound between USB Kernel Streaming and USB WASAPI (Event mode).
Kernel Streaming is very revealing and the most natural sounding.
WASAPI is hyper-detailed and very bright. I find it compelling, but it changes the mix significantly. Vocals are lifted to a point where they do not sound quite right.

DMR streaming is very good, on the bright side but not as bright as USB WASAPI.

Scratches head.

- Rich
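The even-multiple rate selection Doug describes (and that JRiver's per-input-rate setting makes configurable) boils down to picking the highest supported integer multiple of the source rate, so the 44.1k and 48k families never cross. A minimal sketch; the list of DAC-supported rates is an assumption, not a spec for any particular hardware:

```python
def pick_output_rate(source_rate,
                     supported_rates=(44100, 48000, 88200, 96000,
                                      176400, 192000)):
    """Return the highest supported rate that is an integer multiple
    of the source rate. Keeping the 44.1k and 48k families separate
    means the resampler never does a non-integer-ratio conversion
    such as 44.1k -> 96k."""
    candidates = [r for r in supported_rates if r % source_rate == 0]
    return max(candidates) if candidates else source_rate
```

So 44.1 kHz material maps to 176.4 kHz and 48 kHz material to 192 kHz, which matches the "even multiples" setup Rich describes above.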