Satellite signal levels on the ground


fluke281
New Member · Original poster · Oct 20, 2006
I know this sounds strange, but I wanted to understand the actual power levels we are working with before deciding on a dish size. There are a lot of recommendations on the net but not much discussion of the levels involved. I have a book called "Satellite Television" by H. Benoit from 1999 and did some investigating. Here is what I have found:

Lyngsat and other sites express the power from the directional antenna on a bird as EIRP; this is the signal aimed at the footprint area. Typically, the transponder power is 100 watts or less for FTA. With an antenna gain of 200 to 250 times (23-24 dB), the EIRP is about 20,000 to 25,000 watts, or roughly 44 dBW (dBW is referenced to one watt); published figures often run up to 50 dBW.

To get the power per square meter on the ground, you divide the EIRP by the area it is spread over at geostationary distance (4*pi*d^2, with d around 38,000 km for a slant path), which gives only about -119 dBW/m^2 over North America. I thought it was worse than that. This level corresponds to roughly 10^-12 W/m^2 (a picowatt per square meter); converting to milliwatts, it is 10^-9 mW/m^2 (a nanowatt per square meter).

Assuming a dish with one square meter of aperture and an efficiency of 50%, we collect about 5 x 10^-10 mW, or roughly -93 dBm. With an LNB gain of 60 dB, that becomes about 5 x 10^-4 mW, or -33 dBm. Most satellite receivers have a sensitivity of about -65 to -25 dBm (dBm is referenced to one milliwatt); sometimes the minus signs are left off in the specs. This figure now makes sense.
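To make that chain checkable, here is a minimal Python sketch of the same back-of-the-envelope budget. All inputs (100 W transponder, ~24 dB antenna gain, 38,000 km slant range, 1 m^2 dish at 50% efficiency, 60 dB LNB gain) are round-number assumptions for illustration, not figures for any particular satellite:

```python
# Back-of-the-envelope Ku-band link budget with assumed round numbers.
import math

def dbw(p_watts):
    """Power in dB relative to one watt."""
    return 10 * math.log10(p_watts)

def dbm(p_milliwatts):
    """Power in dB relative to one milliwatt."""
    return 10 * math.log10(p_milliwatts)

tx_power_w = 100.0          # transponder output (assumed)
antenna_gain = 250.0        # ~24 dB spot-beam gain (assumed)
eirp_w = tx_power_w * antenna_gain
print(f"EIRP: {eirp_w:.0f} W = {dbw(eirp_w):.1f} dBW")

slant_range_m = 38e6        # slant path to a mid-latitude site (assumed)
flux_w_m2 = eirp_w / (4 * math.pi * slant_range_m**2)
print(f"Flux density: {flux_w_m2:.2e} W/m^2 = {dbw(flux_w_m2):.1f} dBW/m^2")

dish_area_m2 = 1.0          # aperture (assumed)
efficiency = 0.5            # typical offset dish efficiency (assumed)
captured_mw = flux_w_m2 * dish_area_m2 * efficiency * 1e3
print(f"At the feed: {captured_mw:.2e} mW = {dbm(captured_mw):.1f} dBm")

lnb_gain_db = 60.0          # LNB conversion gain (assumed)
out_dbm = dbm(captured_mw) + lnb_gain_db
print(f"After the LNB: {out_dbm:.1f} dBm")   # lands near -32 dBm
```

Running it gives about 44 dBW of EIRP, -119 dBW/m^2 of flux, and roughly -32 dBm after the LNB, consistent with the rounded figures above.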

I have neglected losses in the cable, the roughly 1.5 dB of atmospheric loss, and rain fade. Satellite boxes are also subject to interference from sources in their downconverted IF range. I did find a reference from a PowerPoint presentation stating that satellite signals generally arrive at less than 100 picowatts, which is several orders of magnitude below terrestrial receivers, which get their digital signals at one to 100 microwatts.
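For what it's worth, here is that comparison done in dBm; the 100 pW and 1-100 microwatt figures are taken as quoted from that presentation, not measured:

```python
# Compare the quoted satellite and terrestrial signal levels in dBm.
import math

def dbm_from_watts(p_w):
    """Convert watts to dBm (dB relative to one milliwatt)."""
    return 10 * math.log10(p_w * 1e3)

sat_w = 100e-12                       # < 100 picowatts, as quoted
terr_low_w, terr_high_w = 1e-6, 100e-6  # 1 to 100 microwatts, as quoted
print(f"Satellite:   {dbm_from_watts(sat_w):6.1f} dBm")   # -70 dBm
print(f"Terrestrial: {dbm_from_watts(terr_low_w):6.1f} to "
      f"{dbm_from_watts(terr_high_w):6.1f} dBm")           # -30 to -10 dBm
gap_db = dbm_from_watts(terr_low_w) - dbm_from_watts(sat_w)
print(f"Gap: at least {gap_db:.0f} dB (four orders of magnitude)")
```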

This teaches me that having 70 dB of gain on an LNB is important, as well as a low noise figure. Increasing dish size from 80 cm to 120 cm only gives you about 3.5 dB of gain, but it certainly cuts down adjacent satellite interference. I really noticed this at 125°W, where a 100 cm dish would not receive the HD channels but a 120 cm dish would do the job, though the larger one is much more difficult to tune and harder to manipulate. The Avenger LNBs have higher output level, so I have no need for one of those 20 dB inline amplifiers.
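A quick sketch of how parabolic dish gain scales with diameter; the 12 GHz frequency and 50% efficiency are assumptions, but the 80-to-120 cm step works out to about 3.5 dB regardless of either value:

```python
# Parabolic dish gain scales with aperture area, i.e. diameter squared.
import math

def dish_gain_db(diameter_m, freq_hz=12e9, efficiency=0.5):
    """Gain of a parabolic reflector: G = eta * (pi * D / lambda)^2."""
    wavelength = 3e8 / freq_hz
    return 10 * math.log10(efficiency * (math.pi * diameter_m / wavelength)**2)

for d in (0.8, 1.0, 1.2):
    print(f"{d:.1f} m dish: {dish_gain_db(d):.1f} dBi")

# The step from 0.8 m to 1.2 m is 20*log10(1.2/0.8) ~= 3.5 dB,
# independent of frequency and efficiency.
print(f"Step: {dish_gain_db(1.2) - dish_gain_db(0.8):.1f} dB")
```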

I would appreciate any comments.
 
You don't need all that gobbledegook. 3 dB of signal gain is a DOUBLING of signal! You say 3.5 dB? That's a very significant gain.

Here are the rules for happy FTA viewing:

Want Ku band? Then you want a 1.2 meter (4 ft) dish.

Want C band? Then you want a 10 foot dish, mostly because of S2 signals, and because 10-footers "see" a tighter beam, so they don't suffer signal bleedover from sats that are only 2 degrees apart (see the beamwidth sketch below).

Satellites don't always (hardly ever, in fact) use full power on their transponders. They dial down the signal levels for various reasons.
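On the "tighter beam" point above: the half-power beamwidth of a dish is roughly 70*lambda/D degrees (the 70-degree constant is the usual rule-of-thumb approximation), so a short sketch shows why a 10-footer at C band fits inside the 2-degree satellite spacing while smaller dishes do not:

```python
# Approximate -3 dB beamwidth of a parabolic dish: ~70 * lambda / D degrees.
import math

def beamwidth_deg(diameter_m, freq_hz):
    """Rule-of-thumb half-power beamwidth in degrees."""
    wavelength = 3e8 / freq_hz
    return 70 * wavelength / diameter_m

# C band (~4 GHz downlink) on common dish sizes; sats sit ~2 degrees apart.
for d_ft in (6, 8, 10):
    d_m = d_ft * 0.3048
    print(f"{d_ft} ft dish at C band: {beamwidth_deg(d_m, 4e9):.1f} deg")

# Ku band (~12 GHz) on a 1.2 m dish for comparison.
print(f"1.2 m dish at Ku band: {beamwidth_deg(1.2, 12e9):.1f} deg")
```

The 10-footer comes out around 1.7 degrees wide, while 6 and 8 foot dishes are wider than the 2-degree spacing, which is exactly the bleedover described above.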
 
