It is actually quite tricky to measure this accurately.
The reason is that the power depends on the impedance of the load (i.e.
the speaker), and for a real speaker that impedance can be VERY
frequency dependent.
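To get a feel for how much this matters, here is a quick Python sketch with a completely made-up (but plausible-looking) impedance curve: a bass resonance bump plus a rising inductive term. All the numbers are illustrative only, not from any real driver.

    import numpy as np

    # Toy speaker impedance: DC resistance + a bass resonance bump
    # + a rising inductive term. Values are invented for illustration;
    # a real speaker's impedance curve is measured, not modeled this simply.
    Re = 6.0        # voice-coil DC resistance, Ohm
    Le = 0.5e-3     # voice-coil inductance, henries
    f0, Rmax = 50.0, 30.0   # resonance frequency (Hz) and bump height (Ohm)

    def impedance(f):
        """Rough magnitude of the speaker impedance at frequency f (Hz)."""
        resonance = Rmax / (1 + ((f - f0) / 10.0) ** 2)   # Lorentzian bump
        inductive = 2 * np.pi * f * Le
        return Re + resonance + inductive

    # For a fixed 10 Vrms drive, the apparent power V^2/|Z| swings a lot:
    for f in (20, 50, 200, 1000, 10000):
        Z = impedance(f)
        print(f"{f:6d} Hz  |Z| = {Z:5.1f} Ohm  V^2/|Z| = {100 / Z:5.1f} W")

With this toy curve the "power" at a fixed drive voltage varies by roughly a factor of four across the audio band, which is exactly why a single wattage number is ambiguous for a real speaker load.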
Another problem is that the "maximum power" is not really that
interesting. The distortion increases with power, and at some point the
amplifier will start clipping. This is not really a big problem initially
(at least not if the amplifier is clipping "gently"), but sooner or later
you will reach a point where the music sounds terrible and -worse- you
risk destroying your speakers: a clipping amplifier outputs flat,
DC-like waveform tops that heat up the voice coil.
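To see where the extra heating comes from, compare the RMS voltage of a clean sine with that of a hard-clipped one: as the tops flatten toward DC-like plateaus, the RMS (and hence the heat dissipated in the voice coil) climbs toward the square-wave limit even though the peak voltage stays fixed. A quick sketch, where the 20 V peak and the 3x overdrive are just example numbers:

    import numpy as np

    t = np.linspace(0, 1, 100_000, endpoint=False)
    peak = 20.0   # supply-limited peak output voltage, volts (illustrative)

    clean = peak * np.sin(2 * np.pi * 50 * t)    # amp within its limits
    clipped = np.clip(3 * clean, -peak, peak)    # driven 3x too hard

    for name, v in (("clean", clean), ("clipped", clipped)):
        vrms = np.sqrt(np.mean(v ** 2))
        # heating in the voice coil scales with Vrms^2 / R
        print(f"{name:8s} Vrms = {vrms:5.2f} V -> {vrms**2 / 8:5.1f} W into 8 Ohm")

At the same peak voltage the clipped waveform dumps nearly twice the heat into the voice coil.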
Hence, measuring the power of an amp feeding a speaker in "real time" is
somewhat tricky; the only way to do it is to measure both the
voltage and the current from the amp, but that is NOT something I would
recommend unless you know what you are doing (voltage is not the
problem, but measuring the current is).
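If you do have safely captured waveforms (say from a scope with a differential voltage probe and a clamp-on current probe), the calculation itself is easy: real power is just the time average of v(t)*i(t). A sketch with made-up sample data:

    import numpy as np

    fs = 48_000                     # sample rate, Hz
    t = np.arange(fs) / fs          # one second of samples

    # Made-up captured waveforms: a 50 Hz tone with the current lagging
    # the voltage by 0.4 rad, as it would into a partly reactive speaker.
    v = 14.1 * np.sin(2 * np.pi * 50 * t)         # volts
    i = 1.76 * np.sin(2 * np.pi * 50 * t - 0.4)   # amps

    p = v * i                       # instantaneous power, watts
    vrms = np.sqrt(np.mean(v ** 2))
    irms = np.sqrt(np.mean(i ** 2))
    print(f"real (average) power: {p.mean():.1f} W")
    print(f"apparent power:       {vrms * irms:.1f} W")

Note that into a reactive load the real power comes out lower than the naive Vrms x Irms product, which is another reason simply multiplying meter readings misleads.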
The result will also depend on how long you are averaging (i.e. the
"time window"), since peak power can be MUCH higher than the median power
for real music.
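A toy example of the window effect: a mostly quiet signal with short loud bursts (crudely mimicking the high crest factor of real music) reports wildly different "power" depending on the window length. The burst levels and timings below are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    fs = 48_000
    p = np.full(10 * fs, 2.0)                  # 2 W quiet baseline, 10 s long
    for start in rng.integers(0, 9 * fs, size=20):
        p[start:start + fs // 100] = 200.0     # 10 ms bursts at 200 W

    # Sliding-window average via a cumulative sum (fast even for big windows).
    c = np.cumsum(np.concatenate(([0.0], p)))
    for window_ms in (1, 100, 10_000):
        n = int(fs * window_ms / 1000)
        avg = (c[n:] - c[:-n]) / n
        print(f"{window_ms:6d} ms window: max average power = {avg.max():6.1f} W")

The same signal reads 200 W with a 1 ms window but only a few watts with a 10 s window, so any "measured power" for music is meaningless without stating the time window.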
Anyway, the standardized test is to measure the voltage across an 8 Ohm power resistor at 1% distortion and then calculate the power from P = V^2/R. This is what the wattage rating of an amp means.
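In code the bench procedure boils down to a one-liner; the 28.3 V reading below is just an example of what you might measure at the 1% THD point:

    R = 8.0     # dummy-load resistance, Ohm
    V = 28.3    # example RMS voltage across the load at 1% THD, volts

    P = V ** 2 / R
    print(f"rated power: {P:.0f} W into {R:.0f} Ohm")   # about 100 W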
Ref: f95toli