Q. The cables joining my rear speakers and subwoofer to the rest of the system are currently hidden under the edges of the living-room carpet, along the walls. We plan to install hardwood floors, which will have to be applied directly to the concrete slab beneath, but we don’t want the wires to be visible. Any suggestions on hiding the wiring and on how that should affect the choice of cables?
A. Your options depend to some extent on how the room is built. It may be possible, for instance, to poke a couple of small holes at each end of the side walls and fish the cables through, or a dropped ceiling may permit stringing the wires overhead. If these approaches are not feasible, the task of hiding the wires becomes harder but not necessarily impossible.
In the first place, you might consider placing the subwoofer up front rather than in the back. Its position is not really critical, so you can put it pretty much wherever the wiring permits. As for the rear speakers, they probably will not have to handle large amounts of power, so it may be possible to use fairly light wire, which would be easier to hide than heavy-gauge cables.
If you are contemplating some sort of baseboard at the bottom of the walls, that might serve to cover the wires. Some types of molding have space behind them; if not, placing the baseboard off the floor (or out from the wall) by the thickness of the wire would probably not be noticeable but would enable you to tuck the cables into the gap. There are also very flat ribbon cables available that might fit under a baseboard or even be glued directly to the wall and concealed with a coat of paint.
Q. I am using an old 110-watt-per-channel receiver to power my subwoofer. Is there any way to “strap” the outputs of the two amplifier channels to yield a significant increase in power?
A. Strapping, or bridging, is a technique for combining the two channels of a stereo amplifier to produce a single mono signal of higher power, typically somewhat higher than the sum of the two channels operating normally. It may seem an attractive way to power a mono subwoofer, but it works only if the amplifier was specifically designed to operate that way. The instruction manual should tell you how to go about it if you can; if it doesn’t mention bridging, don’t try it.
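As a rough sketch of why bridging raises power (my arithmetic, not the column’s): bridging drives the speaker with the two channels out of phase, doubling the voltage swing across the load. Since power rises with the square of voltage, the theoretical gain into the same impedance is fourfold; in practice each channel then “sees” half the load impedance, so real amplifiers deliver considerably less than that ideal.

```python
# Idealized bridging arithmetic (illustrative numbers, not a measured spec).
def power_watts(voltage_rms, impedance_ohms):
    """Power dissipated in a resistive load: P = V^2 / R."""
    return voltage_rms ** 2 / impedance_ohms

stereo_channel = power_watts(28.3, 8)       # ~100 W per channel into 8 ohms
bridged_ideal = power_watts(2 * 28.3, 8)    # doubled voltage: ~400 W in theory
print(round(stereo_channel), round(bridged_ideal))
```

The fourfold figure is an upper bound; current limits and the halved effective load keep real bridged output well below it.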
I doubt there would be much benefit in your case anyway. An amplifier that puts out more than 100 watts should be more than powerful enough even for a quite insensitive subwoofer, and doubling the power would yield only a barely audible increase of 3 dB in volume.
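The 3-dB figure follows from the standard decibel formula for a power ratio, which can be checked in a line or two:

```python
import math

# Level change in decibels for a given power ratio: 10 * log10(ratio).
# Doubling power gives 10 * log10(2), about 3 dB -- a barely audible step.
def db_gain(power_ratio):
    return 10 * math.log10(power_ratio)

print(round(db_gain(2), 2))   # ~3.01 dB
```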
Q. My CD player and cassette deck both have variable-level output controls, so I have eliminated the need for a preamplifier by feeding both components directly to the input of the power amp. To change the source, I simply unplug the power-amp cable from one component and attach it to the other. Would the level control on a preamplifier be sufficiently superior to the output controls I am using now to make adding the extra component worthwhile?
A. A simple level control, whether inserted in the output stages of a CD player or included in a preamplifier, is a pretty benign thing and unlikely to cause signal degradation. The setup you are now using would therefore not be improved significantly by the use of fixed-level outputs and a preamp.
I suspect, however, that your power amplifier has its own input-level controls. If not, you would be likely to get some very unpleasant—and potentially damaging—noises when you switch cables between your CD player and cassette deck (unless you shut off the power before changing sources). If your amp does have an input control, the path with the fewest circuit elements would be from the player’s fixed outputs to the amplifier’s inputs, with the playing level controlled at the amplifier. This setup might mean some sacrifice in convenience, though, and any audible improvement it might provide (very unlikely) would be negligible.
Q. Almost all speaker-sensitivity ratings I have seen are given in dB SPL (decibels of sound-pressure level) measured at 1 meter with 1 watt input. In STEREO REVIEW’s speaker reviews, however, the measurement is made with a 2.83-volt input. What, if anything, is the standard?
A. One-watt/one-meter is one of those easily remembered formulas that ad writers like, and that’s probably why it is the closest thing to an accepted standard in speaker specifications. Unfortunately, it’s a little misleading.
The power flowing from an amplifier into a loudspeaker at any given moment is determined by the voltage being produced by the amp and by the impedance of the speaker. Impedance is a complex thing, however, and though most speakers have a “nominal” impedance expressed by a single number, usually 8 ohms, in reality this rating is only an average. Almost all speakers have impedances that vary significantly with frequency—much higher at some frequencies, lower at others—so the power in the circuit varies all over the place even if the amplifier’s output voltage remains constant. But a loudspeaker with flat frequency response will produce the same output at all frequencies for a given input voltage, even if the impedance, and thus the power, changes radically with frequency. Consequently, the sensible thing is to specify sensitivity for a standard input voltage rather than a standard input power, which would tend to give different outputs at different frequencies.
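To make that concrete (my illustrative numbers, not a measurement from the column): hold the amplifier’s output at a constant 2.83 volts and watch the delivered power swing with the speaker’s impedance at each frequency.

```python
# Power into a resistive load at a fixed drive voltage: P = V^2 / Z.
# At a constant 2.83 V, a speaker whose impedance dips to 4 ohms draws
# 2 W, while a 16-ohm peak draws only 0.5 W -- yet a flat-response
# speaker produces the same acoustic output at both frequencies.
def power_watts(voltage_rms, impedance_ohms):
    return voltage_rms ** 2 / impedance_ohms

for z in (4, 8, 16, 32):
    print(f"{z:2d} ohms -> {power_watts(2.83, z):.2f} W")
```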
If specification writers could break themselves of the 1-watt/1-meter chant and specify a standard voltage instead, the result would be more consistent sensitivity ratings that would take into account both overall impedance differences between speakers and frequency effects. The natural voltage to specify, although admittedly this is arbitrary, is one that would produce 1 watt into a purely resistive load of 8 ohms: that is, 2.83 volts. In most cases, the “1 watt” you see on a specification sheet is just a catchy shorthand rendering of “2.83 volts.”
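Where the 2.83-volt figure comes from is simple arithmetic: solve P = V²/R for the voltage that dissipates 1 watt in an 8-ohm resistor.

```python
import math

# Voltage needed to dissipate a given power in a resistive load:
# from P = V^2 / R, V = sqrt(P * R). For 1 W into 8 ohms this is
# sqrt(8), about 2.83 V -- the figure used in sensitivity measurements.
def voltage_for_power(power_watts, impedance_ohms):
    return math.sqrt(power_watts * impedance_ohms)

print(round(voltage_for_power(1, 8), 2))   # ~2.83 V
```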
Source: Stereo Review (Jan. 1991) BY IAN G. MASTERS