|Sep 1st 2011, 19:45|
Blocking Gain Compression; Rationale For Our Test Method.
For our blocking gain compression tests, the ARRL uses a signal analyzer with a narrow bandwidth (down to 5 Hz at times) to dig the desired signal out of phase noise, observing a 1 dB drop in the desired signal's level at the receiver's audio output to determine the blocking level. While a strong nearby signal will likely create strong noise in the speaker audio (caused by reciprocal mixing) and mask the desired signal by ear, there are a number of very valid reasons for the test to be done the way it is. Some think, “No one has a receiver with a 5 Hz bandwidth, so the blocking gain compression dynamic range figures we report are inflated!” and insist that the correct method is to increase the blocking signal until the noise rises, then report that level as a noise-limited version of the measurement. This is not the correct method! Please allow me to explain here in this forum.
First off, we must remember that we are measuring gain compression. The measurement is made with a desired-signal test tone as far above the receiver's noise floor as possible, with AGC off, so that the measurement is made in the linear range of the receiver. For this reason, it isn't useful or appropriate to report the noise-limited value: the level one would measure as noise limited is very much a function of the level of the blocking signal, as well as of the receiver's reciprocal-mixing noise. If the desired test-signal level were different, so too would be the value of the noise-limited measurement.
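To illustrate why the narrow bandwidth matters, here is a rough sketch of how much reciprocal-mixing noise a blocker deposits in the measurement bandwidth. The phase-noise and level figures are hypothetical, chosen only for illustration, not measurements of any particular receiver:

```python
import math

def reciprocal_mixing_noise_dbm(p_blocker_dbm, phase_noise_dbc_hz, bw_hz):
    """Noise power the blocking signal deposits on channel via LO phase noise.

    p_blocker_dbm      -- blocking-signal level (dBm), hypothetical
    phase_noise_dbc_hz -- LO phase noise at the test offset (dBc/Hz), hypothetical
    bw_hz              -- measurement bandwidth (Hz)
    """
    return p_blocker_dbm + phase_noise_dbc_hz + 10 * math.log10(bw_hz)

# A -20 dBm blocker against -130 dBc/Hz phase noise (hypothetical numbers):
wide = reciprocal_mixing_noise_dbm(-20, -130, 2400)  # typical 2.4 kHz SSB filter
narrow = reciprocal_mixing_noise_dbm(-20, -130, 5)   # 5 Hz analyzer bandwidth

print(f"2.4 kHz bandwidth: {wide:.1f} dBm of reciprocal-mixing noise")
print(f"5 Hz bandwidth:    {narrow:.1f} dBm")
print(f"Narrowing the bandwidth buys {wide - narrow:.1f} dB of headroom")
```

With these numbers, the 5 Hz bandwidth sees roughly 27 dB less reciprocal-mixing noise than a 2.4 kHz filter would, which is what lets the analyzer follow the desired signal down to an honest 1 dB compression point.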
This noise-limited measurement is relative at best and, for stronger signal levels, misleading at worst. A noise-limited measurement tells you nothing about the level at which non-linear gain compression will affect stronger signals, while a narrow-bandwidth measurement of gain compression does. It is our opinion that if a measurement of blocking gain compression dynamic range is noise limited, the measurement really could not be made, and the result should simply be reported as "noise limited."
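A quick sketch of why the noise-limited figure is tied to the particular test level used (again with hypothetical phase-noise and level numbers): the blocker level at which reciprocal-mixing noise reaches the desired signal moves one-for-one with the desired signal's level, while the true compression point does not move at all:

```python
import math

def noise_limited_blocker_dbm(desired_dbm, phase_noise_dbc_hz, bw_hz):
    """Blocker level at which reciprocal-mixing noise reaches the desired
    signal's level -- roughly the point a 'noise-limited' test would report.
    All input levels here are hypothetical illustration values."""
    return desired_dbm - phase_noise_dbc_hz - 10 * math.log10(bw_hz)

# Same receiver (hypothetical -130 dBc/Hz phase noise, 500 Hz bandwidth),
# two different desired-signal test levels:
weak = noise_limited_blocker_dbm(-110, -130, 500)
strong = noise_limited_blocker_dbm(-97, -130, 500)

print(f"desired at -110 dBm: 'noise limited' at {weak:.0f} dBm blocker")
print(f"desired at  -97 dBm: 'noise limited' at {strong:.0f} dBm blocker")
```

Raising the test tone 13 dB shifts the "noise-limited" answer by exactly 13 dB, so the figure characterizes the test setup as much as the receiver.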
Another reason to dig the gain compression result out of the noise is that gain compression in a receiver occurs at all signal levels. If a strong off-channel signal is causing gain compression at the level of the test signal, it will also cause gain compression of stronger signals received with AGC active (although AGC would mask some of the effect). At those stronger levels, the noise that made the test "noise limited" would have very little effect: the desired signal would still be compressed by the strong off-channel signals, but it would not be obscured by noise. The noise-limited value has no meaning at anything other than the specific test level being used, so getting the "real" gain compression number by using a narrow bandwidth provides a result that can be applied to ALL signal levels.
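That compression applies equally at all levels can be seen in the standard textbook desensitization model of a memoryless third-order nonlinearity — a simplified sketch with made-up coefficients, not the ARRL's test procedure. The small-signal gain reduction depends only on the blocker's amplitude; the desired signal's own level never enters:

```python
import math

def small_signal_gain(a1, a3, blocker_amp):
    """Small-signal gain of y = a1*x + a3*x**3 with a strong blocker present.

    Classic desensitization result: the third-order term shifts the gain by
    (3/2)*a3*B**2, where B is the blocker amplitude. The desired signal's
    own amplitude does not appear, so the dB compression is the same for
    weak and strong desired signals alike."""
    return a1 + 1.5 * a3 * blocker_amp ** 2

a1, a3 = 10.0, -0.2                         # hypothetical front-end coefficients
g_clean = small_signal_gain(a1, a3, 0.0)    # no blocker
g_blocked = small_signal_gain(a1, a3, 1.9)  # strong blocker present

compression_db = 20 * math.log10(g_clean / g_blocked)
print(f"gain compression: {compression_db:.2f} dB at ANY desired-signal level")
```

With these particular coefficients the blocker produces about 1 dB of compression, and that same 1 dB applies whether the desired signal is near the noise floor or 60 dB above it.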
The ARRL Lab’s older test method used an audio power meter at the receiver output. We increased the blocking signal while watching for the desired signal to drop by 1 dB; when noise appeared instead, we reported the level at which the noise increased by 1 dB. This worked, after a fashion, when the noise was either minimal or severe. But when the noise was at about the same level as the signal, it was entirely possible for the noise to be increasing while the signal was being lost to gain compression, with a net meter change of 0 dB, even though, by ear, the test engineer could tell that the desired signal was simply disappearing. In one or two instances, ARRL saw the noise increase, and then saw the receiver go into gain compression on the noise, which never did increase by a whole dB. While some may state this is the method to use, this is simply a bad test.
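The "net 0 dB" failure mode is easy to see with a little decibel arithmetic, since an audio power meter sums signal and noise power. Assuming signal and noise start at equal levels (a hypothetical but representative case):

```python
import math

def total_power_db(signal_db, noise_db):
    """What a broadband audio power meter reads: signal and noise add as powers."""
    return 10 * math.log10(10 ** (signal_db / 10) + 10 ** (noise_db / 10))

before = total_power_db(0.0, 0.0)   # signal and noise at equal (0 dB) levels
after = total_power_db(-1.0, 1.0)   # signal compressed 1 dB, noise up 1 dB

print(f"meter change: {after - before:+.2f} dB")
```

The meter moves barely a tenth of a dB while the desired signal has actually dropped a full dB — exactly the ambiguity the narrow-bandwidth analyzer method avoids.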
More important, one cannot make a reliable, accurate measurement of a receiver's gain compression with the AGC on! The desired test level is well above the noise floor, but below the level at which the receiver would become non-linear with AGC off. When AGC is active, a large change in signal level results in only a small change in receiver output. The net effect of AGC is to significantly reduce the 1 dB change being measured: a 1 dB change that would be seen in the AGC-off test might appear as only a 0.1 dB change in output, requiring the strong off-channel signal in the blocking gain compression test to be increased significantly to obtain an actual 1.0 dB change in receiver output. In that case, the test is not so much a measurement of gain compression as a measurement of the flatness of the AGC. Also, receiver AGC can be designed many different ways. Some manufacturers allow 10 to 20 dB of headroom before AGC takes effect; the level of the desired signal test tone in the blocking gain compression test is in that region, so it is probable that the receiver AGC would be affecting the level of the test tone. Other receiver designs have AGC start only a few dB above the receiver's noise floor, in which case the desired test tone would be well into the receiver's AGC range. In conclusion, testing blocking gain compression with AGC on is simply not going to give a valid result.
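The arithmetic of the AGC-on problem, using the hypothetical 10-to-1 figure from the paragraph above (actual AGC behavior varies widely by design):

```python
# Under AGC, only a fraction of any input-level change reaches the output.
# Hypothetical AGC slope: a 10 dB input change yields a 1 dB output change.
agc_output_db_per_input_db = 0.1

# 1 dB of real gain compression shows up as only a fraction of a dB:
output_drop_seen = 1.0 * agc_output_db_per_input_db
print(f"1 dB of real compression appears as {output_drop_seen:.1f} dB at the output")

# So to see a full 1 dB drop at the output, the receiver must actually be
# compressed by far more than 1 dB:
compression_needed = 1.0 / agc_output_db_per_input_db
print(f"need about {compression_needed:.0f} dB of real compression to read 1 dB")
```

By the time the meter finally shows a 1 dB drop, the measurement is characterizing the AGC's flatness rather than the onset of non-linearity.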
Note: Our new blocking gain compression dynamic range (BGC DR) test method was first reported with the Yaesu FT-2000, in February 2007 QST.
Bob Allison, WB1GCM
ARRL Test Engineer