Originally Posted by moonhawk
So is 100 msec considered a good waterfall decay time?
The RT-60 is the usual measurement used to assess the spectral decay. The RT-60 is the time it takes for reverberation to drop 60 dB. A "good" RT-60 is usually considered 300 ms or less.
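If you're curious how that number is actually derived, here's a rough Python sketch of the standard Schroeder backward-integration method for estimating RT-60 from a measured impulse response. The `ir` array and `fs` sample rate are placeholders for your own measurement, and I'm not claiming this is exactly what XTZ does internally:

```python
import numpy as np

def rt60_from_impulse(ir, fs):
    """Estimate RT-60 from an impulse response via Schroeder
    backward integration, fitting the -5 to -25 dB portion of
    the decay (a T20) and extrapolating to a full 60 dB drop."""
    # Schroeder curve: backward-integrated squared IR, in dB
    energy = np.cumsum(ir[::-1] ** 2)[::-1]
    edc = 10 * np.log10(energy / energy[0] + 1e-12)

    # Fit a straight line to the -5 dB .. -25 dB decay region
    t = np.arange(len(ir)) / fs
    mask = (edc <= -5) & (edc >= -25)
    slope, intercept = np.polyfit(t[mask], edc[mask], 1)

    # Time for the fitted line to fall 60 dB (slope is negative)
    return -60.0 / slope
```

Fitting a 20 dB slice and extrapolating is standard practice, since the tail of a real in-room measurement is usually buried in the noise floor long before it reaches -60 dB.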
XTZ measures the RT-60 on the full-range measurement, but only in the range from 125 Hz up. However, the spectral decay in the bass range can be seen on the FR graphs, in the upper right corner (and by clicking on it to swap it with the FR graph). Here is an example:
Note the scale on the right in dB. The color represents the SPL: red is 10 dB down, yellow is 20 dB down, cyan is 30 dB down, etc. In the above graph, there is a room mode at ~32 Hz. This mode resonates out to about 225 ms, but the RT-60 would still be well under 300 ms. That is very good response in this range.
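For reference, a waterfall/spectral decay chart like this is basically a stack of FFTs taken from progressively later start points in the impulse response. Here's a bare-bones Python sketch of that idea; the window choice and slice spacing are my own simplifications, and real analyzers use more careful apodizing windows:

```python
import numpy as np

def spectral_decay(ir, fs, n_slices=60, step_ms=5.0, win_len=8192):
    """Crude cumulative-spectral-decay sketch: FFT the impulse
    response from progressively later start times, so each slice
    shows what energy is still ringing at that time offset."""
    step = int(step_ms * fs / 1000)
    window = np.hanning(win_len)
    slices = []
    for k in range(n_slices):
        seg = ir[k * step : k * step + win_len]
        seg = np.pad(seg, (0, win_len - len(seg)))   # zero-pad the tail
        mag = np.abs(np.fft.rfft(seg * window))
        slices.append(20 * np.log10(mag + 1e-12))    # magnitude in dB (uncalibrated)
    freqs = np.fft.rfftfreq(win_len, 1 / fs)
    return freqs, np.array(slices)  # rows = time slices, cols = freq bins
```

Plot each row against `freqs` and you get the familiar waterfall, with later rows showing which frequencies (like that 32 Hz mode) are still ringing.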
Here is a chart of the pre-Audyssey response in my room:
In the spectral decay chart in the upper right, you can see the prolonged resonances, with red and yellow out past 100 ms at 20 and 40 Hz, and cyan and blue all the way across the chart.
Here is the post-Audyssey response:
Note the reduction in spectral decay times in the 20 and 40 Hz ranges. This is the effect of the frequency- and time-domain filters in the Audyssey algorithm. IME, this time-domain correction is just as important as the frequency-domain improvement seen in the FR charts.