LNBF Frequency Stability - Blind Scan


Titanium (AI6US), Meadow Vista, Northern California - May 23, 2013
I received an email about PLL LNB stability and blind scan accuracy. The user wanted to know how the LNB stability specification was referenced and why some blind-scanned transponder frequencies matched the published values while others differed. Here is my reply; I thought it might be of interest to other FTA hobbyists.

- - - - - - -

The stability specification provides the stability of the IF over the operational temperature range. The time to reach a stable operational temperature depends on ambient temperature. A crystal provides the reference; it is typically within the +/-50 kHz range and is not user adjustable. The LNB has been confirmed on the factory test bench to output a signal within +/-50 kHz of the closed-source calibration signal over a 10 minute test period at 25°C ambient.
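A quick back-of-the-envelope sketch of how that +/-50 kHz tolerance shows up at the IF. These are my own illustrative numbers, not from any datasheet: I assume a standard 10750 MHz Ku-band LO and the 12145 MHz carrier mentioned later in this thread.

```python
# Sketch: how a +/-50 kHz LO tolerance shows up at the IF.
# Assumed values, not from a datasheet.
LO_NOMINAL_KHZ = 10_750_000.0   # standard Ku-band LO (assumed)
DOWNLINK_KHZ   = 12_145_000.0   # example carrier from this thread
LO_TOL_KHZ     = 50.0           # factory tolerance from the spec above

if_nominal_khz = DOWNLINK_KHZ - LO_NOMINAL_KHZ   # 1,395,000 kHz ideal IF

# The crystal tolerance moves the LO, and the IF moves by the same
# amount in the opposite direction:
print(f"IF window: {if_nominal_khz - LO_TOL_KHZ:.0f} "
      f"to {if_nominal_khz + LO_TOL_KHZ:.0f} kHz")
```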

There are many variables when attempting to calibrate a consumer system using live satellite downlink signals. The logging process analyzes the incoming signal and splits the detected carrier bandwidth in half to establish the center frequency. Downlink frequencies are more often than not centered on the rounded frequencies published on public sites. The calculated center of the carrier signal can be affected by signal processing errors introduced by interference (terrestrial, adjacent satellite, cross-pol, etc.), hardware variances, and software processing. To accurately reference the downlink frequency, both the LNB and the PCI card would need to be externally referenced to the same source. One would need to qualify both the LNB stability/offset and the tuner LO stability/offset. I would not consider the results of a blind scan an accurate reflection of minute frequency stability; they would likely only indicate major hardware stability or signal value changes.
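To illustrate why both would need a common external reference: the scan only ever sees the sum of the two LO errors, so there is no way to separate them from the logged frequencies alone. Both error values below are invented for the example.

```python
# A blind scan sees only the *sum* of the LNB and tuner LO errors,
# so the two can't be separated without a common external reference.
lnb_shift_khz   = +35.0   # shift contributed by the LNB crystal (assumed)
tuner_shift_khz = -12.0   # shift contributed by the PCI tuner (assumed)

logged_shift_khz = lnb_shift_khz + tuner_shift_khz
print(f"Carrier logs {logged_shift_khz:+.1f} kHz from its true frequency")
# Any pair of errors summing to +23 kHz (e.g. +50/-27, 0/+23) logs
# identically, which is why each LO must be qualified separately.
```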

Center frequency variances that differ from carrier to carrier (i.e., not a universal, systematic offset) relative to what the downlink service provider documents would more likely indicate interference or processing errors than a hardware issue.

As a hobbyist aligning a system with live satellite signals, I would suggest operating your system for 5 minutes after sunset, then blind scanning and referencing a confirmed downlink of a narrow SCPC service. You will need to know the exact downlink frequency (not the rounded value reported by Lyngsat or similar). This information is often published on the downlink service's website or can be found by referencing link budget parameter agreements. It is best to test on a carrier (SR 1000 or lower) with a high FEC, high SNR, and as near perfect a BER as possible. After several hours of run time, run this test blind scan again and compare the logged frequency of this carrier.

Is the logged carrier center frequency exactly the same as the service's downlink frequency? Did the center frequency of the carrier shift between the tests? If so, how much was the shift? Were there changes in the SNR or BER indicating a change in error-free processing that could point to other factors when evaluating the performance of the system hardware?

If the frequency remained within +/-50 kHz between the two tests with no significant change in SNR/BER, this should be noted as the offset of your system (LNB, card, and software processing). Adjust the LO to calibrate for any systematic offset.

I typically find the operational drift to be less than 20 kHz total during similar testing. If you find that the center carrier frequency remained within a 20 kHz range but is repeatedly 100 kHz high, then you can assume that the hardware and software processing should be calibrated with a -100 kHz offset. You might even use the results of this testing to further flowchart which component or processing step is responsible for introducing the offset.
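Here is a minimal sketch of that two-scan comparison. All readings are invented examples, and the sign convention assumes the software reports downlink frequency as measured IF plus the LO setting, so a carrier that logs high is corrected by lowering the LO setting.

```python
# Sketch of the warm-up comparison described above; example readings only.
# Assumes: reported downlink = measured IF + LO setting entered in software.
TRUE_DOWNLINK_KHZ = 12_145_000.0   # confirmed operator downlink (example)
LO_SETTING_KHZ    = 10_750_000.0   # LO value entered in software (assumed)

scan_after_warmup = 12_145_100.0   # blind scan ~5 min after sunset (example)
scan_after_hours  = 12_145_095.0   # blind scan after several hours (example)

drift_khz  = scan_after_hours - scan_after_warmup   # stability between runs
offset_khz = (scan_after_warmup + scan_after_hours) / 2 - TRUE_DOWNLINK_KHZ

print(f"Drift between tests: {drift_khz:+.0f} kHz")   # -5 kHz: within spec
print(f"Systematic offset:   {offset_khz:+.0f} kHz")  # roughly +100 kHz

# A carrier repeatedly logging ~100 kHz high is corrected by lowering the
# LO setting by that amount (the -100 kHz calibration mentioned above):
calibrated_lo_khz = LO_SETTING_KHZ - offset_khz
print(f"Calibrated LO: {calibrated_lo_khz:.0f} kHz")
```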

You will find variances between blindscan logged transponder frequencies, but if you know the baseline of your system, you will be able to identify why center frequency logging may be reported differently.
 
Good and thorough explanation of a common issue. It explains why I see 12145 V 19999 on 103 instead of 12145 V 20000.
 
Symbol Rate determination is a software calculation during signal processing and is not affected by LO stability or interference. As long as the DVB signal is error-free enough to lock, the software will analyze and calculate the carrier center frequency, Symbol Rate, and FEC. If the modulation is compatible, the transponder will be logged.
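A small sketch of why an LO offset doesn't touch the symbol rate: the scan derives SR from the occupied bandwidth of the carrier, and a frequency shift slides the whole carrier without changing its width. The 0.35 roll-off is an assumption (common for DVB-S); the carrier edges are made-up numbers chosen to match the 20000 SR example above.

```python
# SR is derived from carrier width; an LO shift moves both edges equally.
ROLLOFF = 0.35   # assumed DVB-S roll-off factor

def symbol_rate_ksps(occupied_bw_khz: float) -> float:
    """Estimate symbol rate from measured occupied bandwidth."""
    return occupied_bw_khz / (1.0 + ROLLOFF)

edge_lo, edge_hi = 12_131_500.0, 12_158_500.0   # example edges, kHz
lo_offset_khz = 100.0                           # example LO error

print(symbol_rate_ksps(edge_hi - edge_lo))      # 20000.0 kS/s
print(symbol_rate_ksps((edge_hi + lo_offset_khz)
                       - (edge_lo + lo_offset_khz)))  # identical SR
```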

On consumer STBs, as long as the parameters are close, the receiver will lock within a coded range. Compare this to the automatic fine tuning function of an FM broadcast radio. If the parameter isn't quite exact, the STB will search within a range to attempt to lock.
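A toy model of that search-within-a-range behavior, purely as an illustration: the step size, search range, pull-in range, and the `try_lock` stub are all hypothetical stand-ins for whatever a given STB's firmware actually does.

```python
# Toy model of an STB stepping outward from the tuned frequency until
# the demodulator locks. All parameters are hypothetical.
ACTUAL_CARRIER_KHZ = 12_145_100.0   # where the carrier really sits (example)

def try_lock(freq_khz: float, pull_in_khz: float = 300.0) -> bool:
    """Stub demodulator: locks if tuned within its pull-in range."""
    return abs(freq_khz - ACTUAL_CARRIER_KHZ) <= pull_in_khz

def search_lock(tuned_khz: float, step_khz: float = 250.0,
                max_range_khz: float = 2000.0) -> float | None:
    """Step outward from the tuned frequency, alternating above/below."""
    offsets = [0.0]
    k = step_khz
    while k <= max_range_khz:
        offsets += [+k, -k]     # widen the net on each pass
        k += step_khz
    for off in offsets:
        if try_lock(tuned_khz + off):
            return tuned_khz + off
    return None                 # no lock inside the coded range

print(search_lock(12_144_500.0))   # locks at 12145000.0 despite 600 kHz error
```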

Often the receiver will lock a marginal transponder faster if the parameters are optimized. One might find that "detuning" the parameter settings allows an STB to lock a transponder that has interference or processing errors.

BTW... Some receivers are better at logging accurate signal parameters than others. :)
 