This afternoon, just for shits & giggles, I thought I'd experiment a little to see if I could hear any difference between crossover points of 200Hz and 60Hz. I took a few quality recordings, selected an appropriate track and played about 30 seconds of it with a 200Hz XO and then again at 60Hz. One more time at 200Hz and one more time at 60Hz. I could not discern any difference. I grabbed another CD and used another track and repeated the same process. Again, I could not tell the slightest difference in any aspect of what I heard.
Is this normal and to be expected?
I, for one, fully expected an obvious difference, either in the way of 60Hz being way too low or 200Hz being way too high. But I found it shocking that a change of nearly an octave and three-quarters (200Hz is about 1.74 octaves above 60Hz) yielded no sonic difference at all. I've been using a crossover setting of 90Hz since the new subs arrived, but as a result of this experiment, I decided to increase it to 100Hz. Why? If I can't hear a difference, I may as well let the woofers of my mains concentrate on their midrange duties and let the subs sweat through their bass duties.
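For anyone who wants to sanity-check that span with their own numbers, here's a throwaway back-of-the-envelope sketch (nothing specific to any particular crossover, just the standard octaves-between-two-frequencies math):

```python
import math

def octave_span(f_low_hz: float, f_high_hz: float) -> float:
    """Number of octaves between two frequencies."""
    return math.log2(f_high_hz / f_low_hz)

# The span I was actually A/B-ing: 60Hz up to 200Hz.
print(f"60 Hz -> 200 Hz: {octave_span(60, 200):.2f} octaves")   # ~1.74
# The much smaller tweak I ended up making: 90Hz up to 100Hz.
print(f"90 Hz -> 100 Hz: {octave_span(90, 100):.2f} octaves")   # ~0.15
```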