AVS › AVS Forum › Audio › DIY Speakers and Subs › MiniDSP

MiniDSP - Page 40

post #1171 of 2293
*Disclaimer* This post may be hazardous to the enjoyment of your MiniDSP. Continue reading at your own risk */Disclaimer*

Today I did some testing of my balanced MiniDSP 2x4 and found what I believe to be some fundamental issues with it that make it potentially problematic for LFE equalization duty if your receiver has a strong voltage output. Specifically, the MiniDSP is likely to act as a signal-level bottleneck, clipping the LFE signal in the playback chain of people listening at reference.

Some things I learned today:

1) The “balanced” output and input of the balanced MiniDSP maxes out at ~4.6VAC (RMS). Specifically this is the point where increases of the voltage on the input are not met with corresponding increases in voltage on the output.
2) The “unbalanced” (RCA) output of the balanced MiniDSP is at effectively half the voltage level of the “balanced”. This is a loss of ~6dB from the input.
3) The subwoofer output from my THX certified Pioneer Elite SC-05 can exceed 4.6VAC (RMS) under a lot of circumstances.
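Those dB figures follow directly from the voltage ratios. A quick Python sanity check of point 2, using the numbers above:

```python
import math

def db_from_ratio(v_out, v_in):
    """Voltage gain in dB: 20 * log10(Vout / Vin)."""
    return 20 * math.log10(v_out / v_in)

# Point 2: the RCA output at half the balanced voltage is a ~6 dB drop
print(round(db_from_ratio(2.3, 4.6), 2))  # -6.02
```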

I used a True RMS multimeter for all my measurements. I used a 40Hz sine wave as the basis of my testing. It was at -0.01dB from full scale. I used both .wav files and 1.5Mbit DTS 5.1 playback via HDMI from my HTPC. The receiver was set at 0dB (reference) and the SW output level trim was set to 0dB. All speakers were set to small and crossed at 80Hz. Additional sound processing was disabled (DPL IIx, etc). Here are the voltages I measured:

DTS 5.1
channel(s) 40Hz sine wave was in – measured VAC (RMS)
LFE - 7.87
Left - 2.76
L&R - 5.5
LFE+L&R - 9.23
All 5.1 - 9.47

Wave playback
channel(s) 40Hz sine wave was in – measured VAC (RMS)
L - 1.703
L&R - 3.43

Leaving the master volume at 0dB, backing the SW output level trim in my Pioneer down to -10dB (the lowest possible setting), and measuring the worst-case DTS scenario again (the 40Hz sine wave in all 6 channels), I got 4.44VAC (RMS). This tells me two things. First, the Pioneer is probably clipping in the first 0dB SW level test, since the extra 10dB should give a voltage gain of ~3.16x and the measurements only show ~2.13x. Second, that measured 4.44V is basically at the I/O limit of the balanced MiniDSP. This means that at reference, even with the SW output channel trim turned down to -10dB, it's possible for content to push the input of the MiniDSP right to the limit. Further, if you apply any signal boost with filters, the MiniDSP doesn't have the headroom to output an unclipped signal without decreasing the levels with the plugin software.
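The clipping inference in that first point is easy to verify numerically. A small Python sketch using the measurements above, assuming the output should scale linearly with the trim setting:

```python
import math

expected_gain = 10 ** (10 / 20)   # a +10 dB trim change should be ~3.16x in voltage
measured_gain = 9.47 / 4.44       # all-channel DTS tone: 0 dB trim vs -10 dB trim
shortfall_db = 20 * math.log10(expected_gain / measured_gain)
print(round(expected_gain, 2), round(measured_gain, 2), round(shortfall_db, 2))
# 3.16 2.13 3.42  -> the 0 dB measurement falls ~3.4 dB short of linear
```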

So, at reference you can just barely keep things from clipping by decreasing the SW output as much as possible and turning the MiniDSP down internally. For playback above reference there's no solution that I can see short of inserting a voltage divider ahead of the MiniDSP on the input side.
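For anyone considering the voltage-divider route, sizing one is straightforward. A sketch with illustrative (not tested) resistor values; a real design should also account for the source and load impedances around the divider:

```python
import math

def divider_attenuation_db(r_series, r_shunt):
    """Attenuation of a two-resistor divider: Vout = Vin * Rshunt / (Rseries + Rshunt)."""
    return 20 * math.log10(r_shunt / (r_series + r_shunt))

# Equal resistors halve the voltage (-6 dB): a ~9.47 Vrms worst case becomes ~4.7 Vrms
print(round(divider_attenuation_db(10_000, 10_000), 2))  # -6.02
```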
Edited by Stereodude - 12/16/12 at 3:27pm
post #1172 of 2293
Quote:
Originally Posted by Stereodude View Post

*Disclaimer* This post may be hazardous to the enjoyment of your MiniDSP. Continue reading at your own risk */Disclaimer*
Today I did some testing of my balanced MiniDSP 2x4 and found what I believe to be some fundamental issues with it that make it potentially problematic for LFE equalization duty if your receiver has a strong voltage output. Specifically the MiniDSP has a good probability of acting as a signal level restriction or clipping the LFE signal in the playback chain of people listening at reference.
So, at reference you can just barely keep things from clipping by decreasing the SW output as much as possible and turning the MiniDSP down internally. For playback above reference there's no solution that I can see short of inserting a voltage divider ahead of the MiniDSP on the input side.

I believe this is what I have observed, and could not put in such expert terms...clipping is a nightmare with the DSP in my signal chain, even with jumpers set for 2v.

The solution then is the balanced version?
post #1173 of 2293
Anybody order the new nanoDIGI 2x8K yet? $175. You feed it a digital signal instead of analogue, say from a digital Squeezebox or similar; the nanoDIGI does the processing and provides 8 channels of output to external DAC(s). Does this mean that if you want to use it as an active crossover you'd need two separate external DACs to provide the analogue output signal to the preamp/amps?

Would the regular analog miniDSP be a better choice for active crossover for a 2 way +subs system?

http://www.minidsp.com/products/minidspkits/nanodigi-2x8-k

post #1174 of 2293
Quote:
Originally Posted by mfrey0118 View Post

I believe this is what I have observed, and could not put in such expert terms...clipping is a nightmare with the DSP in my signal chain, even with jumpers set for 2v.
The solution then is the balanced version?
This is the balanced version. My guess is that the unbalanced version probably behaves the same as the balanced one when using RCA outputs.
post #1175 of 2293
So is this only an issue when using RCA outputs? Are you using Phoenix/RCA adapters? What about using Phoenix/XLR cables?
post #1176 of 2293
Quote:
Originally Posted by brian6751 View Post

So is this only an issue when using RCA outputs? Are you using Phoenix/RCA adapters? What about using Phoenix/XLR cables?
The input was fed using a Phoenix/RCA adapter. The output was measured using Phoenix/XLR & Phoenix/RCA adapters. Feeding it with an XLR wouldn't change anything.
post #1177 of 2293
So, I did some additional testing with an o-scope today. It's actually a little worse than I concluded yesterday.

Here's what I learned today (input jumpers were at 2.0V like yesterday):

1) The output of the Balanced MiniDSP is a true balanced output
2) The input of the MiniDSP clips with sine waves above -6 on the input meters in the GUI. This correlates to going above 4.6-4.7VAC (RMS).
3) The input meters in the GUI seem to be an RMS measurement, not peak amplitude. This means the numbers in the GUI keep increasing as the input continues to clip worse and worse while the peak amplitude is not increasing.
4) The MiniDSP is pretty much flat down to 2Hz. There's less than 1dB of output attenuation at 2Hz.
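Point 3 is easy to demonstrate numerically: clip a sine at a fixed level and the peak stops moving while the RMS keeps climbing toward the square-wave value. A Python sketch:

```python
import math

def clipped_sine_stats(amplitude, clip_level, n=100_000):
    """Sample one cycle of a sine, clip it symmetrically, return (peak, rms)."""
    samples = []
    for i in range(n):
        v = amplitude * math.sin(2 * math.pi * i / n)
        samples.append(max(-clip_level, min(clip_level, v)))
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    return peak, rms

# Drive harder and harder into a fixed clip level: peak is stuck, RMS keeps rising
for amp in (1.0, 2.0, 4.0):
    peak, rms = clipped_sine_stats(amp, clip_level=1.0)
    print(round(peak, 3), round(rms, 3))
```

So an RMS-style meter happily reads higher and higher while the waveform is already flat-topped, which is consistent with the GUI behavior described above.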

Here are some scope plots. The input is on top. The mathematical summation of the +and - outputs is on the bottom.

Max I/O 40Hz (without clipping):



40Hz w/ some severe clipping:



Here's what the GUI reports of the input levels during the above severe clipping:



-2 nice... rolleyes.gif

I'll post more later when I have more time.
Edited by Stereodude - 12/17/12 at 4:29pm
post #1178 of 2293
What does all this mean for us that are using the MiniDSP?

Should I stop? Because this really doesn't sound good. (Newbie alert)
Sorry for the lack of my understanding on this stuff.
This information is a little overwhelming for me.

Any input would be appreciated.

With that being said, here go my stupid questions:
If I lower the "input Gain" on the MiniDSP, does that help reduce clipping?
If I lower the output LFE on my Receiver, does that help with reducing the clipping?

Thanks
post #1179 of 2293
Since the miniDSP is only rated at 2v input, doesn't it only make sense that you're clipping it since your voltage is obviously far above that threshold?
post #1180 of 2293
Quote:
Originally Posted by lukeamdman View Post

Since the miniDSP is only rated at 2v input, doesn't it only make sense that you're clipping it since your voltage is obviously far above that threshold?
Maybe, maybe not... The SW output from a receiver is way over 2VAC (RMS) at reference. Which means the MiniDSP isn't a suitable EQ if it's really limited to 2VAC.
post #1181 of 2293
Quote:
Originally Posted by Stereodude View Post

Maybe, maybe not... The SW output from a receiver is way over 2VAC (RMS) at reference. Which means the MiniDSP isn't a suitable EQ if it's really limited to 2VAC.

It's a simple fix, just lower the SW output level on the receiver.

If you know the limit is 2v, and within that 2v limit the miniDSP operates as designed, you're wasting your time with tests that exceed 2v. We already know it was only designed for up to 2v...
post #1182 of 2293
Quote:
Originally Posted by lukeamdman View Post

It's a simple fix, just lower the SW output level on the receiver.

If you know the limit is 2v, and within that 2v limit the miniDSP operates as designed, you're wasting your time with tests that exceed 2v. We already know it was only designed for up to 2v...

How do you lower the SW output on the receiver? Is this as simple as lowering the LFE output?
post #1183 of 2293
Quote:
Originally Posted by lukeamdman View Post

It's a simple fix, just lower the SW output level on the receiver.
If you know the limit is 2v, and within that 2v limit the miniDSP operates as designed, you're wasting your time with tests that exceed 2v. We already know it was only designed for up to 2v...
Thanks for not paying any attention before posting! rolleyes.gif

I don't know about yours, but my receiver outputs 4.44V RMS at reference from the subwoofer channel while playing a DTS 5.1 encoded track with 40Hz sine waves in all 6 channels, with the subwoofer channel output level trim in the receiver all the way down at -10dB. So, it's impossible to comply with the 2V "spec" of the MiniDSP.

Further, a 2V RMS sine wave fed into the MiniDSP only yields an input signal of -21 on the input meters, so clearly 2Vrms is not intended to be the max input level regardless of what their webpage says.


Edited by Stereodude - 12/17/12 at 2:25pm
post #1184 of 2293
Quote:
Originally Posted by Freniata View Post

Is this as simple as lowering the LFE output?
Of course not. He's just spouting off uninformed nonsense because he didn't bother to read the last several posts.

One of the issues is that the "meters" in the MiniDSP GUI shouldn't reflect an RMS type of measurement (area under the curve); they should reflect peak amplitude. There's no good reason why it reports the level of an input signal as -6 when the input is saturated and any further increase in the input level will result in clipping.

Here are the highly misleading levels reported by the GUI right at the threshold of clipping.



confused.gif
Edited by Stereodude - 12/17/12 at 4:10pm
post #1185 of 2293
Quote:
Originally Posted by Freniata View Post

If I lower the "input Gain" on the MiniDSP, does that help reduce clipping?
No. I have some scope plots that show this that I will post later.
Quote:
If I lower the output LFE on my Receiver, does that help with reducing the clipping?
Yes, but you can only lower it so far. With my receiver playing at reference I can only just get it under the point where the MiniDSP starts to clip the input with the SW out at -10dB.
post #1186 of 2293
Quote:
Originally Posted by Stereodude View Post

So, I did some additional testing with an o-scope today. It's actually a little worse than I concluded yesterday.

How can I effectively test the output voltages of my SW or preamp channels from my receiver? I obviously wasn't doing it correctly, as the numbers maxed out at about 0.1V.

I'm having mixed success; it just seems I have to turn the SW out up quite a bit to get similar levels as without the miniDSP in the chain. I was able to play and get a flat FR to about 15 Hz, but it required a +12 dB shelf filter starting at 25 Hz and a special 3 dB on top of that at about 19 Hz....obviously, I can't get very loud at all with that set up lol.
post #1187 of 2293
Quote:
Originally Posted by Stereodude View Post

Quote:
Originally Posted by Freniata View Post

If I lower the "input Gain" on the MiniDSP, does that help reduce clipping?
No. I have some scope plots that show this that I will post later.
Quote:
If I lower the output LFE on my Receiver, does that help with reducing the clipping?
Yes, but you can only lower it so far. With my receiver playing at reference I can only just get it under the point where the MiniDSP starts to clip the input with the SW out at -10dB.

thanks for your responses.
post #1188 of 2293
Quote:
Originally Posted by kcnitro07 View Post

How can I effectively test the output voltages of my SW or preamp channels from my receiver? I obviously wasn't doing it correctly, as the numbers maxed out at about 0.1V.
A multimeter with True RMS capability can measure the RMS voltage, but it can't directly tell you when you're clipping. Keep in mind RMS voltage is basically an area-under-the-curve integration, so even when you're clipping, the RMS voltage keeps going up until you end up with a square wave. You need a scope to see the waveform to determine clipping, or an audio analyzer that can measure distortion plus a multimeter to find when clipping starts. Alternatively, you can use my strategy posted here to find the point of clipping with only a meter, multiple recorded data points, and some math.
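That meter-only strategy can be sketched in code: sweep the level trim in known dB steps, measure the RMS voltage at each step, and flag the first point where the output stops tracking the expected linear gain. The trim/voltage data below are hypothetical:

```python
import math

def find_clipping_onset(trim_db, measured_vrms, tolerance_db=0.2):
    """Given trim settings (dB) and measured RMS voltages, return the first trim
    setting where the output falls short of linear tracking by > tolerance_db."""
    ref_db, ref_v = trim_db[0], measured_vrms[0]
    for t, v in zip(trim_db[1:], measured_vrms[1:]):
        expected = ref_v * 10 ** ((t - ref_db) / 20)
        if 20 * math.log10(expected / v) > tolerance_db:
            return t
    return None

# Hypothetical sweep: the output compresses near the top of the range
trims = [-10, -8, -6, -4, -2, 0]
volts = [1.0, 1.26, 1.58, 2.0, 2.4, 2.6]
print(find_clipping_onset(trims, volts))  # -2
```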

For example, I found my Pioneer SC-05 clips the SW output with some of the DTS encoded test tones at reference with the SW level trims at 0. Here's the -0.01dBFS 40Hz signal in all 6 channels.



So, the receiver clips its output, drives that into the MiniDSP, which clips its input (and output), and by the time the signal gets to the SW's amp you've got a signal clipped twice over with crazy amounts of distortion.
Edited by Stereodude - 12/17/12 at 5:55pm
post #1189 of 2293
Quote:
Originally Posted by Stereodude View Post

I don't know about yours, but my receiver outputs 4.44V RMS at reference from the subwoofer channel while playing a DTS 5.1 encoded track with 40Hz sine waves in all 6 channels, with the subwoofer channel output level trim in the receiver all the way down at -10dB.

Something seems strange here. Forgetting about the MiniDSP entirely, and assuming you're going straight into a power amp, that's enough voltage to drive just about any power amp hard into clipping. For example, the Crown XLS Drivecore amps are specified for 1.4 VRMS input for rated output power. I assume that's with the pots set for max gain.

Is this measurement based on having performed an Audyssey cal with the pots of your sub amp set to well below maximum gain?
post #1190 of 2293
Quote:
Originally Posted by Freniata View Post

If I lower the "input Gain" on the MiniDSP, does that help reduce clipping?
As promised, here are a screenshot and a scope capture showing it doesn't help.

Here's the GUI showing the input level at -5 (channel 2):




Here's the input and output waveform:




Same clipping just at a lower output level. This suggests that the input is clipping. FWIW, the output of the MiniDSP may also be clipping when the input and output gain are maxed, but it's impossible to know exactly what's clipping and to what extent.
Edited by Stereodude - 12/17/12 at 6:04pm
post #1191 of 2293
Quote:
Originally Posted by rock_bottom View Post

Something seems strange here. Forgetting about the MiniDSP entirely, and assuming you're going straight into a power amp, that's enough voltage to drive just about any power amp hard into clipping. For example, the Crown XLS Drivecore amps are specified for 1.4 VRMS input for rated output power. I assume that's with the pots set for max gain.
Yes, the receiver can easily drive any pro amp into hard clipping (on the amp's output). My receiver's output is clean up to ~7.4Vrms (~21.3Vpp). If you try to push it louder it starts clipping the output.

Scope plot:


Quote:
Is this measurement based on having performed an Audyssey cal with the pots of your sub amp set to well below maximum gain?
It's at a somewhat arbitrary level of 0dB on the sub channel with no equalization or processing in the receiver since it's an easily measured comparison point.
post #1192 of 2293
Here's input vs. output showing low frequency roll off (or the lack of it)

20Hz:


5Hz:


2Hz:



By my calculations, 2Hz is down 0.39dB from 20Hz, and 5Hz is down 0.12dB.
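If that residual roll-off comes from a single AC-coupling pole (an assumption; the actual input topology isn't known), the implied corner frequency can be backed out from one measurement:

```python
import math

def corner_from_attenuation(f_hz, atten_db):
    """For a first-order high-pass, |H| = f / sqrt(f^2 + fc^2).
    Solve for the corner fc given the measured attenuation at f."""
    h = 10 ** (atten_db / 20)          # atten_db is negative
    return f_hz * math.sqrt(1 / h**2 - 1)

# -0.39 dB at 2 Hz implies a corner somewhere down around 0.6 Hz
print(round(corner_from_attenuation(2.0, -0.39), 2))
```

The 5Hz measurement implies a slightly different corner, which is expected given measurement resolution and that the reference point here is 20Hz rather than the true passband.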
post #1193 of 2293
Quote:
Originally Posted by Stereodude View Post

Of course not. He's just spouting off uninformed nonsense because he didn't bother to read the last several posts.
One of the issues is that the "meters" in the MiniDSP GUI shouldn't reflect an RMS type of measurement (area under the curve); they should reflect peak amplitude. There's no good reason why it reports the level of an input signal as -6 when the input is saturated and any further increase in the input level will result in clipping.
Here are the highly misleading levels reported by the GUI right at the threshold of clipping.

confused.gif

Chill.

I simply stated that if you operate the miniDSP within the manufacturer's specifications you won't have a problem, and if you want to reduce the input voltage going to it, you simply lower the output level of the receiver. How is this "uninformed nonsense"?

I don't think a receiver outputting over 4V at reference with the sub output at -10dB is normal. Something is strange with that.

If you want to talk along the lines of uninformed nonsense, this statement is a lot closer: "*Disclaimer* This post may be hazardous to the enjoyment of your MiniDSP. Continue reading at your own risk */Disclaimer*"

In reality, you seem to have confirmed some really good things about the miniDSP, by showing that it's fully balanced, has an almost completely flat response down to about 2Hz, and yes, if you operate it within the published specs, it works just fine.
post #1194 of 2293
Quote:
Originally Posted by lukeamdman View Post

I don't think a receiver outputting over 4v at reference with the sub output at -10db is normal. Something is strange with that.
You have measurements of other receivers that show something different? My Pioneer Elite 84TXSi outputs 5.02Vrms under the same conditions, so its output seems a little "hotter" than my SC-05, though the 6-channel 40Hz DTS 5.1 file results in some sort of internal clipping in the 84TXSi that doesn't happen on the SC-05.
Quote:
If you want to talk along the lines of uninformed nonsense, this statement is a lot closer: "*Disclaimer* This post may be hazardous to the enjoyment of your MiniDSP. Continue reading at your own risk */Disclaimer*"
Right... Uninformed... rolleyes.gif I've provided more information about the actual operation and capabilities of the MiniDSP than anyone else I've ever seen. Clearly though I'm uninformed and making baseless claims. wink.gif The input meters are fine and working normally. Reporting back RMS-style measurements is expected. Everyone would expect to have the input at the very limit when the input meter reads -6. And, of course, there should be 15dB of headroom left on the table in the input stage when feeding in the specified 2Vrms max input voltage.
Quote:
In reality, you seemed to have confirmed some really good things about the miniDSP, by showing that it's fully balanced, has as almost completely flat response to about 2hz, and yes, if you operate it within the published specs, it works just fine.
However, many people aren't operating it within the published specs because it's not a good match for the LFE playback chain of people who listen around reference if they have a receiver with a strong voltage output.

IMHO, the MiniDSP should have a 3rd input gain setting that offers 2x the voltage input range of the current balanced model (~8Vrms). The output level should also be adjustable (like the input) and capable of outputting a signal that's 6dB hotter relative to the max input. Please note I'm not suggesting that it take an 8Vrms input signal and output 16Vrms. What I'm suggesting is something like this: a full-scale input of 8Vrms translates to an output of 4Vrms, with the ability to drive the output to 8Vrms, so there is an extra 6dB of output capability to accommodate boost via filters.
Edited by Stereodude - 12/17/12 at 9:06pm
post #1195 of 2293
There's still something strange here. Let's work back from the acoustic side to the preamp SW outputs. For a desired acoustic output, one must have these conditions:
  1. The subs themselves must have the desired acoustic output capability (reference or higher, say).
  2. The subwoofer power amps must not clip when providing the subs with the power required to reach the desired acoustic output per the above.
  3. The input sensitivity of the subwoofer power amps must not exceed the maximum output capability of the MiniDSP (assuming no clipping on either its input or output side).

Let's take a hypothetical example of subs that have the required acoustic output, driven by Crown XLS amps with enough output power to drive the subs to this level. Since the input sensitivity of the Crown XLS is 1.4VRMS, there is no problem in this configuration with the MiniDSP in the nominal case.

It's not relevant what the maximum possible receiver SW output voltage is, unless it's lower than the power amp input sensitivity. For preamp outputs using op-amps with +/-15Volt supplies, these can typically put out more than 12 Volts peak before clipping. But they never need to, because the power amp has given up the ghost long before this.

While it may be true that the initial setup of receivers gives some unusual SW output levels, what matters is the behavior after it's set up with acoustical measurements for nominally flat response through the SW crossover region.
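rock_bottom's stage-by-stage reasoning can be checked numerically. A sketch using the ~4.6Vrms MiniDSP limit and the 1.4Vrms Crown XLS sensitivity quoted in this thread; the boost figure in the example is illustrative:

```python
def check_chain(receiver_vrms, minidsp_limit_vrms=4.6, boost_db=0.0,
                amp_sensitivity_vrms=1.4):
    """Flag overload at each stage of receiver -> MiniDSP -> power amp."""
    issues = []
    if receiver_vrms > minidsp_limit_vrms:
        issues.append("MiniDSP input clips")
    out_vrms = min(receiver_vrms, minidsp_limit_vrms) * 10 ** (boost_db / 20)
    if out_vrms > minidsp_limit_vrms:
        issues.append("MiniDSP output clips")
        out_vrms = minidsp_limit_vrms
    if out_vrms > amp_sensitivity_vrms:
        issues.append("power amp driven past its rated-input level")
    return issues

# Worst-case DTS figure from earlier in the thread, plus a modest 3 dB of EQ boost:
print(check_chain(4.44, boost_db=3.0))
```

Which illustrates both points of view: the amp stage runs out of input headroom well before the MiniDSP does, but any boost still clips the MiniDSP output on this worst-case signal.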
post #1196 of 2293
Quote:
Originally Posted by Stereodude View Post


IMHO, the MiniDSP should have a 3rd input gain setting that offers 2x the voltage input range of the current balanced model (~8Vrms). The output level should also be adjustable (like the input) and capable of outputting a signal that's 6dB hotter relative to the max input. Please note I'm not suggesting that it take a 8Vrms input signal and output 16Vrms. What I'm suggesting is something like this: A full scale input of 8Vrms translates to an output of 4Vrms with the ability to drive the output to 8Vrms so there is an extra 6dB of output capability to accommodate boost via filters.

Its name is the 8x8.
post #1197 of 2293
Quote:
Originally Posted by rock_bottom View Post

Let's take a hypothetical example of subs that have the required acoustic output, driven by Crown XLS amps with enough output power to drive the subs to this level. Since the input sensitivity of the Crown XLS is 1.4VRMS, there is no problem in this configuration with the MiniDSP in the nominal case.
That's not necessarily true as I keep stressing.
Quote:
It's not relevant what the maximum possible receiver SW output voltage is, unless it's lower than the power amp input sensitivity. For preamp outputs using op-amps with +/-15Volt supplies, these can typically put out more than 12 Volts peak before clipping. But they never need to, because the power amp has given up the ghost long before this.
I'm not sure it's entirely irrelevant but the conversation has been focused on the maximum amplitude the receiver can generate at the minimum SW output setting from the receiver at reference on the master volume and the maximum input signal the Balanced 2x4 MiniDSP can handle. What thread have you been reading?

If you're after reference level playback, some receivers when the SW output is set to the minimum possible level (as in you can't turn down the SW level trim any more) have less than 0.5dB of headroom from clipping the input of the balanced MiniDSP. Any boost applied to the signal in the MiniDSP is going to cause clipping on the output.
Quote:
While it may be true that the initial setup of receivers gives some unusual SW output levels, what matters is the behavior after it's set up with acoustical measurements for nominally flat response through the SW crossover region.
*sigh* What I've been saying over and over, yet many somehow keep missing, is that for some people it's not possible to avoid the MiniDSP mangling the SW signal with clipping if these three conditions are true:

1) You listen at reference (or louder)
2) Your receiver has a strong voltage output
3) You're applying any sort of signal boost in the MiniDSP

Edit: clarification
Edited by Stereodude - 12/18/12 at 7:09am
post #1198 of 2293
I had trouble making all of my stuff play together nicely as well. If you're using sealed subs with a Linkwitz transform it's hell to get right.
The sub channel on the AVR had to be at -6, all of the other channels between 5 and 6dB below that. The input and outputs of the minidsp had to be at -4 and -11, my power amp had to have the gain switch at +38dB (down 6dB) and the gain knob at 3/4. Anything else from this combination and either the minidsp clips at reference, or the power amp shuts off from overloading an input, which was all very odd to me since it didn't match up with the math, but you can't argue theory with reality.
post #1199 of 2293
what about using something like a cleanbox pro between the avr and the minidsp? couldnt you use that to control the input voltage to the minidsp?
post #1200 of 2293
Taking a step back for a minute, I realize this could be confusing because we're talking about the maximum signal output at the minimum subwoofer setting.

So, here's the process of what I'm discussing:

1) Set the SW channel as low as it goes (-10dB on my Pioneer SC-05)
2) Set the receiver to 0dB (reference)
3) Play a DTS file that contains the worst possible audio content in terms of generating a maximum amplitude LFE signal with all the speakers set to small. This is a track that has a -0.01dBFS 40Hz test tone in all 6 channels (5.1).

This generates a signal on the SW output that has the highest possible amplitude (worst case) while the receiver is set to deliver the lowest possible SW signal at reference.

To put this in the obligatory car analogy, think of this as the max speed a car can hit in first gear. Sure, the car can go faster in higher gears, but if you want to know how fast the car can go in the lowest gear you've got to run the engine up to the redline. The DTS track being played is akin to running the engine up to the redline. The SW channel being set to -10dB is akin to having the car in first gear.

In the case of my Pioneer Elite SC-05 this is what you get on the output under that condition:



The output is 4.48Vrms / 13.1Vpp.

Based on my testing, this (below) is the maximum signal the MiniDSP can take without clipping the input:



The input level is 4.76Vrms / 13.5Vpp

This means that the worst case DTS content played at reference on the master volume (with the SW level trim at -10dB) has only 0.52dB of headroom before clipping the MiniDSP input. So, if you push past reference you will clip the input of the MiniDSP. If you're playing at reference and apply some signal gain in the MiniDSP (filters that boost, LT, shelf filters, etc) you will probably clip the output. This last scenario may be avoidable by playing with the input / output gains in the MiniDSP. I will test this later today.
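For the record, that headroom figure is just the dB ratio of the two measured RMS voltages:

```python
import math

receiver_out_vrms = 4.48   # worst-case DTS tone at reference, SW trim at -10 dB
minidsp_max_vrms = 4.76    # measured clipping threshold of the balanced input
headroom_db = 20 * math.log10(minidsp_max_vrms / receiver_out_vrms)
print(round(headroom_db, 2))  # 0.53
```

which agrees with the ~0.52dB quoted above to within rounding of the measured voltages.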