RG6 GHz limit


Edwin95

SatelliteGuys Family
Original poster
Oct 17, 2019
37
9
Los Angeles
Was there an RG6 GHz limit on MPEG-2 receivers back on August 30, 2002? Were all Dish Network receivers MPEG-2 as of August 30, 2002?
 
It has nothing to do with MPEG-2 or not.

From the dish to the receiver, legacy equipment topped out around 1,500 MHz, "Dish Pro" at about 2,150 MHz, and DPZ closer to 3 GHz.

The switch from Legacy to Dish Pro started around 2002; DPZ was introduced with the Hopper line of receivers.

Legacy and 1st-generation DP receivers were MPEG-2, and no SD receiver was ever MPEG-4. HD receivers, starting with the 211 and the 222/722/922, were all MPEG-4.

The 411/811/921/942 (and a few others, I am sure) were MPEG-2 HD.

There was also a change from QPSK (quadrature PSK) to 8PSK in there at some point, before the MPEG-4 change (example: 301 -> 311).

Also, why the specific date?
 
I think we all get confused from time to time: "Oh, that's right! It's the Band Stacking vs. the Frequency Stacking--NO WAIT, I meant Channel Stacking CSS vs. BSS. DUH, how could I make a mistake like that?"

I can't tell you how often I've gotten things confused when someone says to me, "No, you meant . . ." Oh, yeah, that applies to that, not this. That's right. Yeah, right, what you said. I've made my goofs on this forum far more times than once. :)
 
As stated, MPEG or any other codec is irrelevant to the cables used for Dish systems.

In answer to your first question: No.

In answer to your second question: No.

In 2002 there was at least one HD STB, the 6000 or something like that, I think, and perhaps a follow-up to that model, but no HD DVR for Dish until a couple of years later. In fact, the DishPlayer 921 HD DVR may have been the very first SD/HD DVR, and I think it came out in 2004.

While it is generally true that the higher the frequency, the less far it travels, for a variety of reasons, the real heart of the matter is the limit on how long an RG6 cable can be while still reliably carrying the RF, depending on its use. Is that what you were asking?

For example, I think Dish won't recommend anything beyond 250 feet, IIRC, for DishPro, and I think the old Legacy stuff had a 150-foot limit from Dish. (It may, indeed, work beyond these official limits; Dish just wants a length it can firmly stand behind before recommending other solutions.) The higher bands on a cable from the reflector are climbing toward 3 GHz these days, but the lower frequencies may travel farther along that same RG6, and the environment also has some effect.

RG6 can be rated for different maximum frequencies, so it is a matter of RG6 quality and the maximum frequency it is rated for. I've seen RG6 rated to 2,150 MHz and other RG6 rated to 3 GHz. That can make all the difference in the world if you need to use the higher RF bands. So, always check the maximum frequency rating of any RG6 cable.
 
Last edited:
It has nothing to do with MPEG-2 or not.

From the dish to the receiver, legacy equipment topped out around 1,500 MHz, "Dish Pro" at about 2,150 MHz, and DPZ closer to 3 GHz.

The switch from Legacy to Dish Pro started around 2002; DPZ was introduced with the Hopper line of receivers.

Legacy and 1st-generation DP receivers were MPEG-2, and no SD receiver was ever MPEG-4. HD receivers, starting with the 211 and the 222/722/922, were all MPEG-4.

The 411/811/921/942 (and a few others, I am sure) were MPEG-2 HD.

There was also a change from QPSK (quadrature PSK) to 8PSK in there at some point, before the MPEG-4 change (example: 301 -> 311).

Also, why the specific date?
When I got my first install in 2000 with what is now legacy equipment, the cable installed outside (at the reflector) was labeled 3 GHz. DishPro became available less than two years later, so Dish was already future-proofing by 2000. My aunt and uncle had an install a year-plus earlier, in 1998, and got lower-frequency (below 2,150 MHz) cable.


AFAIK, modulation schemes such as 8PSK are not relevant to the cabling; they are hardware-dependent in the "receivers," such as a DVR or non-DVR STB, but not in the clients like Joeys. 8PSK merely allows more data to be transmitted on the very same RF without any change in bandwidth. This let Dish add more channels per Xpndr without an increase in bandwidth, so the only NEW thing needed was hardware for demodulating 8PSK at/in the STB, not new LNBFs or IF cabling. The same was true when Dish implemented Turbo Coding for all HD channels: what mattered was the hardware/firmware at/in the STB, not the reflector or the cabling.
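A tiny sketch of why 8PSK carries more data in the same bandwidth: each symbol encodes log2(M) bits, so going from 4 phase states (QPSK) to 8 (8PSK) raises raw capacity by 50% at the same symbol rate. (Illustrative only; real throughput also depends on the FEC coding in use.)

```python
import math

def bits_per_symbol(constellation_points: int) -> int:
    """Bits carried by each transmitted symbol in M-ary PSK."""
    return int(math.log2(constellation_points))

qpsk = bits_per_symbol(4)  # QPSK: 4 phase states -> 2 bits/symbol
psk8 = bits_per_symbol(8)  # 8PSK: 8 phase states -> 3 bits/symbol

# Same symbol rate, same RF bandwidth, 50% more raw bits per symbol:
print(f"QPSK {qpsk} bits/symbol, 8PSK {psk8} bits/symbol")
```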


The upgrade to DishPro tech was about the implementation of Band Stacking on the IF. This allowed Dish to eliminate having to use one separate cable for each polarity of the Xpndrs.
You might remember how there had to be one cable for Odd and another for Even from the reflector, and then into a diplexer (I forget the official part name) where a single cable emerged to connect to the STB.
There are only 16 DBS frequencies/Xpndrs, but if we transmit on the same frequency at the opposite polarity, we get a net result of 32 Xpndrs, considered polarity Left and Right for DBS. DishPro allows both of these polarities to be carried on the same, single IF cable, using the original legacy band up to 1,500 MHz for one polarity and then "stacking" a 2nd band up to 2,150 MHz for the other polarity. This meant only ONE cable was needed to directly feed a single STB. In the case of some early DishPro two-tuner DVRs, a splitter-like part was used to feed the two separate tuners on those models, or an installer could run an unused IF cable from the reflector to the 2nd DVR tuner if that part was not available, which I think it was not until later.
But DishPro's Band Stacking brought other benefits, such as better external DishPro switching technology that allowed for efficient installs supporting more TVs per home at an economical cost.
The legacy tech required for my original 4-TV install was COSTLY, and required the cranky SW64 with a whole lot of cables to and from the switch. But I coveted my inefficient and moody SW64 for giving me Dish on 4 TVs all those years.

I did see photos around 2001, IIRC, from CES of Dish having wired together TWO SW64s to support more than 4 TVs, but, thankfully, DishPro was soon released and made the SW64 obsolete. I have 6 TVs today elegantly connected to Dish through two H3s. Who would have thought it possible?
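The band-stacking idea above can be sketched in a few lines. The band edges follow the numbers in the post (up to 1,500 MHz for one polarity, up to 2,150 MHz for the stacked one), and the polarity-to-band assignment is an illustrative assumption, not the exact DishPro spec:

```python
# Illustrative IF bands (MHz) on the single DishPro cable; the edges
# follow the post's description, not an official specification.
LEGACY_BAND = (950, 1500)    # one polarity rides the original legacy band
STACKED_BAND = (1500, 2150)  # the other polarity is "stacked" above it

def if_band(polarity: str) -> tuple:
    """Return the IF range (MHz) assumed to carry a given polarity."""
    bands = {"right": LEGACY_BAND, "left": STACKED_BAND}
    if polarity not in bands:
        raise ValueError(f"unknown polarity: {polarity}")
    return bands[polarity]

# Both polarities share one cable, separated only by IF band:
for pol in ("right", "left"):
    lo, hi = if_band(pol)
    print(f"{pol:>5} polarity -> {lo}-{hi} MHz")
```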
 
For example, I think Dish won't recommend anything beyond 250 feet, IIRC, for DishPro, and I think the old Legacy stuff was a 150 feet limit by Dish (of course it may, indeed work beyond these official limits stated by Dish; they just want to find a length they can firmly stand behind before they recommend other solutions), but the higher bands on a Dish cable from the reflector is climbing towards 3GHz these days. However, the lower frequencies may travel further along that same RG6 and the environment does also have some effect, as well.
The reason for that is line noise, or dBm. At the Dish, there is approximately -30 to -32 dBm of line noise. Direct uses dBm at its lowest value to point their dishes in, while Dish uses Signal Strength, at which point dBm is typically at its lowest.
Every 10' of cable adds -1 dBm, and every connector adds -1.5 dBm, so a barreled line adds -3 dBm in total.
If you look at the splitters we use, the dBm value at each port is notated on the splitter.
[Attached images: splitter labels showing the per-port loss values]

You can do the math for 4-Way splitters/Solo Hubs

-50 dBm is the cutoff where you can expect profound signal loss. Therefore, 200' is the limit.
A good tech will check the signal behind the receiver, not just for signal strength but to gauge the line noise and whether it's accurate or excessively high, to determine bad cable or connections between the Dish and the receiver.
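Taking the rule-of-thumb numbers in this post at face value (roughly -30 dBm at the dish, about 1 dB lost per 10 feet of cable, and a -50 dBm cutoff), here is a quick sketch of where the 200-foot figure comes from; note the dB-vs-dBm units are debated further down the thread:

```python
START_DBM = -30.0    # approximate level at the dish (per this post)
CUTOFF_DBM = -50.0   # level where profound signal loss is expected
LOSS_PER_10FT = 1.0  # dB of loss per 10 feet of RG6 (rule of thumb)

def max_run_feet() -> float:
    """Longest cable run before the level falls to the cutoff."""
    headroom_db = START_DBM - CUTOFF_DBM       # 20 dB of margin to spend
    return headroom_db / LOSS_PER_10FT * 10.0

print(max_run_feet())  # 200.0
```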
 
Last edited:
The reason for that is line noise or dBm. At the Dish, there is approximately 30-32 dBm of line noise. Direct uses dBm's at their lowest value to point their dishes in, while Dish uses Signal Strength, at which point dBm's are typically at their lowest.
Every 10' of cable adds 1 dBm. every connector adds 1.5 dBm's so a barreled line adds 3 dBm, in total.
If you look at the splitters we use, it's notated on the splitters, the dBm value at each port.
[Attached images: splitter labels showing the per-port loss values]
You can do the math for 4-Way splitters/Solo Hubs

50 dBm is the cutoff where you can expect profound signal loss. Therefore, 200' is the limit.
A good tech will check signal behind the receiver, not just for signal strength but to gauge the line noise and if it's accurate or excessively high to determine bad cable or connections between the Dish and the receiver.
This post makes no sense to me. dBm is a specific level that needs a number to go with it, as in XX dBm, which is a value referenced to 1 milliwatt (0 dBm). A connector or cable doesn't have dBm loss; it simply has dB loss. Which, by the way, is about 0.8 dB per 10 ft of RG-6 at 1450 MHz, while connector loss is very small, more like under 0.2 dB per connector at 2150 MHz. Two good-quality, properly installed F connectors and a barrel adapter might be around 0.5 dB of loss at 2150 MHz.

What is 30-32 dBm of line noise at the dish? An LNBF puts out roughly -30 dBm to -35 dBm of signal level per transponder right at the LNBF connector. 30 dBm would be 1 watt of power; are you missing a minus sign?

I don't know what is meant by "Direct uses dBm's at their lowest value to point their dishes in." DirecTV uses peak signal strength from an actual locked receiver in the AIM meter, plus a slightly degraded mistuned level that is balanced on both sides of peak; then you park the dish in the middle of those mistuned points. This provides the most accurate boresight for the dish, as the early DirecTV dishes had to work over 20 degrees of satellite arc and five orbital slots, with a few of the satellites only 2 degrees apart.
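Using the per-unit figures from this post (about 0.8 dB per 10 ft of RG-6 at 1450 MHz and under 0.2 dB per connector at 2150 MHz), a short sketch of a run's total loss; the 100-foot run with two connectors below is a made-up example:

```python
def run_loss_db(feet: float, connectors: int,
                db_per_10ft: float = 0.8, db_per_conn: float = 0.2) -> float:
    """Total cable-plus-connector loss in dB for one RG-6 run."""
    return (feet / 10.0) * db_per_10ft + connectors * db_per_conn

# Hypothetical 100 ft run with an F connector at each end:
print(f"{run_loss_db(100, 2):.1f} dB")  # 8.4 dB
```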
 
This post makes no sense to me. dBm is a specific level that needs a number to go with it, as in XX dBm, which is a value referenced to 1 milliwatt (0 dBm). A connector or cable doesn't have dBm loss; it simply has dB loss. Which, by the way, is about 0.8 dB per 10 ft of RG-6 at 1450 MHz, while connector loss is very small, more like under 0.2 dB per connector at 2150 MHz. Two good-quality, properly installed F connectors and a barrel adapter might be around 0.5 dB of loss at 2150 MHz.

What is 30-32 dBm of line noise at the dish? An LNBF puts out roughly -30 dBm to -35 dBm of signal level per transponder right at the LNBF connector. 30 dBm would be 1 watt of power; are you missing a minus sign?

I don't know what is meant by "Direct uses dBm's at their lowest value to point their dishes in." DirecTV uses peak signal strength from an actual locked receiver in the AIM meter, plus a slightly degraded mistuned level that is balanced on both sides of peak; then you park the dish in the middle of those mistuned points. This provides the most accurate boresight for the dish, as the early DirecTV dishes had to work over 20 degrees of satellite arc and five orbital slots, with a few of the satellites only 2 degrees apart.
Yep, missing a minus sign. Dammit!!
I was still on Coffee cup #1.

As for Direct, I never installed a DTV dish, but a friend of mine works for them, and that's just how he explained it to me.
 
The reason for that is line noise, or dBm. At the Dish, there is approximately -30 to -32 dBm of line noise. Direct uses dBm at its lowest value to point their dishes in, while Dish uses Signal Strength, at which point dBm is typically at its lowest.
Every 10' of cable adds -1 dBm, and every connector adds -1.5 dBm, so a barreled line adds -3 dBm in total.
If you look at the splitters we use, the dBm value at each port is notated on the splitter.
[Attached images: splitter labels showing the per-port loss values]
You can do the math for 4-Way splitters/Solo Hubs

-50 dBm is the cutoff where you can expect profound signal loss. Therefore, 200' is the limit.
A good tech will check the signal behind the receiver, not just for signal strength but to gauge the line noise and whether it's accurate or excessively high, to determine bad cable or connections between the Dish and the receiver.
Thanks for the clarification that 200 feet is what Dish will stand behind. I think I saw others reporting cable runs longer than 200 feet working (in some cases) and confused that with the number 250 feet, which I had seen some time ago. My IIRCs are dusty at times :). Thanks, again, for the correction.
 
Yep, missing a minus sign. Dammit!!
I was still on Coffee cup #1.

As for Direct, I never installed A DTV Dish but a friend of mine works for them and that's just how he explained it to me
. . . And DTV uses Channel Stacking (DTV SWM technology), which is incompatible with Dish's Band Stacking as far as what is output at the LNBF and the switching technology. IMHO, both have advantages and disadvantages, so I don't feel one is necessarily better than the other; it depends on the vision of either DBS company and whether some compatibility with "off-the-shelf" rebranded consumer-type splitters and DAs early in the chain is considered highly desirable.
 
