Originally Posted by 3ll3d00d
I don't think you can say that is "best practice" as the right answer depends on the input sensitivity of whatever is downstream of the minidsp, what levels you are aiming to produce, whether the system is for film or music, the voltage produced by the prepro (& whether they are XLR or RCA).
Thanks for responding--all fair points. Perhaps you or someone else can help me find the answers to the questions you've raised, and let me know whether the workflow I've outlined would actually constitute best practice for my particular setup.
input sensitivity of whatever is downstream of the minidsp
How would I determine this? I have an SVS PB-2000, which contains a Sledge STA-500D DSP amplifier. Reading the manual and googling for "input sensitivity" hasn't turned up anything about the voltage the sub expects. I believe I understand how I could use a multimeter to take a reading directly off the AVR output, and again off the output of the miniDSP, but I don't understand how (or whether) you can measure what the sub is "expecting". This should be documented somewhere, no?
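For what it's worth, here's the arithmetic I think applies once I do have voltage figures in hand. The 2.0 Vrms sensitivity and 0.5 Vrms AVR output below are placeholder numbers, not anything I've confirmed for the PB-2000 or my AVR--please correct me if the approach itself is wrong:

```python
import math

def db_relative(v: float, v_ref: float) -> float:
    """Express a voltage ratio in decibels (20 * log10)."""
    return 20 * math.log10(v / v_ref)

# Placeholder numbers -- NOT verified specs for the PB-2000 plate amp or my AVR.
avr_output_vrms = 0.5        # hypothetical AVR sub-out voltage at reference level
sub_input_sensitivity = 2.0  # hypothetical voltage needed for full output from the sub

headroom_db = db_relative(sub_input_sensitivity, avr_output_vrms)
print(f"Headroom before the sub input reaches full scale: {headroom_db:.1f} dB")
```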
what levels you are aiming to produce
Peak, or calibration? My aim is to calibrate to 75dB, though truth be told my day-to-day listening tends to be lower. When calibrated to 75dB, I tend to run my AVR MV between -10 and 0.
whether the system is for film or music
System has to perform double-duty, with the edge given to music.
the voltage produced by the prepro (& whether they are XLR or RCA)
RCA. And perhaps this is where I'm not "getting it" yet. Over the past few pages, it seems as if, even when your AVR produces more than 0.9V, it is better to dial it down and set the miniDSP to 0.9V rather than set it to 2.0V, because of the way the 2.0V setting drops the miniDSP's output. But perhaps I'm misreading it.
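To put numbers on that (assuming the 0.9V and 2.0V figures really are the nominal full-scale input levels of the two miniDSP settings, which I haven't verified), the straight voltage-ratio math would be:

```python
import math

# Nominal full-scale input levels of the two miniDSP sensitivity settings,
# as described in the earlier posts (I haven't verified these myself).
low_setting_vrms = 0.9
high_setting_vrms = 2.0

# With the 2.0V setting selected, the same input signal sits this many dB
# further below full scale, which is why the output appears to drop.
difference_db = 20 * math.log10(high_setting_vrms / low_setting_vrms)
print(f"Difference between the 0.9V and 2.0V settings: {difference_db:.1f} dB")
```

If that's right, picking the 2.0V setting leaves the same AVR output sitting roughly 7 dB further below full scale, which would explain the drop in output people are describing. Is that the correct way to read it?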