Quite frankly, the numbers on that chart, and on other charts like it, are absolutely meaningless, as I have been trying to get across for several years now. Unfortunately, people don't want to hear that.
Dish uses statistical multiplexing (StatMuxing) on their transponders. On 110W, for example, HBO/9456 is muxed with TNT-HD and HD-PPV/9468. On 148W, HBO/9440 is muxed with ShowtimeHD/9430.
Now, generally speaking, the bitrates on 148W/9440 are higher for HBO-HD than on 110W/9456. BUT if it is overnight, after Dish has used the HD-PPV channel for a sporting event and no movies are being shown (only a black DishHD PPV slate on 9468), then 110W/9456 is generally higher, since the static slate takes up virtually no bitrate.
Why is that? Because ShowtimeHD's East Coast feed on 135W is also StatMuxed on the MegaPipe C-Band feed, so it can reach video bitrates in the 16Mbps range (though rarely). If Showtime 9430 is taking a higher rate on 148W, then HBO-HD 9440 is clamped down harder (which is pointless, as Dish really doesn't need to do this, but I digress). Thus 110W/9456, with a black slate on 9468 sharing the mux with TNT-HD, can carry HBO-HD at a higher bitrate than 148W/9440.
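The tradeoff described above is easy to sketch. This is a toy proportional-share model, not Dish's actual allocator; the pipe size and the per-channel demand numbers are made up purely for illustration:

```python
# Toy statistical-multiplexer model: channels share one fixed transponder
# pipe, and each channel's share scales with the complexity ("demand") of
# its current content. Pipe size and demand figures are invented.

PIPE_MBPS = 30.0  # hypothetical total video capacity of one transponder

def statmux(demands, pipe=PIPE_MBPS):
    """Split the pipe proportionally to each channel's instantaneous demand."""
    total = sum(demands.values())
    return {ch: pipe * d / total for ch, d in demands.items()}

# HD-PPV carrying a sporting event: it eats into HBO-HD's share.
busy = statmux({"HBO-HD": 12.0, "TNT-HD": 10.0, "HD-PPV": 14.0})

# Overnight: HD-PPV is a static black slate needing almost nothing,
# so HBO-HD's rate rises even though the pipe is the same size.
slate = statmux({"HBO-HD": 12.0, "TNT-HD": 10.0, "HD-PPV": 0.5})

print(busy["HBO-HD"])   # lower share while PPV is busy
print(slate["HBO-HD"])  # higher share while PPV shows a slate
```

The point is just that a channel's measured rate moves with what its mux-mates are doing at that moment, not with anything intrinsic to the channel itself.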
Thus, if you want to compare one event at one time, such as Beverly Hills Cop and BHC 2 this past month, 110W/9456 had a slightly higher bitrate than 148W/9440 on the first showing.
However, that is not typical as HD-PPV/9468 is usually showing a movie or sporting event.
Also consider Showtime's showings of Sheena. They came in on 148W/9430, depending on the showing, at 16.30Mbps, 15.60Mbps, and 14.80Mbps video bitrate. But then again, the first showing on 110W/9460 came in at 15.30Mbps. So against the 14.80Mbps showing of Sheena on 148W/9430, 110W/9460 was also higher.
So it depends on the showing, on what is being statmuxed on Showtime's East HD distribution feed on 135W, and on what Dish is statmuxing on their transponders at the same time.
However, all this really means nothing, as the source varies wildly per title. Showtime just aired I Love Trouble on 148W/9430 at a rate of 8.65Mbps.
So which number do you want for ShowtimeHD on 148W/9430? 16.30Mbps for Sheena or 8.65Mbps for I Love Trouble?
And to make this even worse, these are average bitrates. Some would argue that we should actually look at peak rates, since most of the time the encoder does not need the maximum rate to hold picture quality.
For example, Star Wars 3 - Sith was roughly 9.15Mbps on HBO via C-Band delivery on 127W. The Island was 13.73Mbps. Add in the 384Kbps for audio and The Island comes to just over 14.1Mbps total, pushing the limit of the 14.25Mbps HBO-HD C-Band distribution pipe.
However, AVSForum is filled with people talking about how bad The Island looked at a roughly 50% higher bitrate (which it did) and nothing but praise for SW3 at 4.5Mbps less. Now, SW3 is 2.35:1 OAR and a very dark film, so of course the bitrate needed was lower. The Island is very bright, and HBO cropped it to full screen.
Even setting aside the cropping, which did people prefer in terms of PQ? SW3, at 4.5Mbps less.
I had a meter put on the output of the IRD used for Universal HD at the headend of the local cable company earlier this year, when we got into a discussion on the forums about how Universal HD looked via D* and E*. Over a 24-hour period, the average TOTAL bitrate was 11.9067Mbps and the peak was 15.4220Mbps. Which figure do you want in the one-number chart?
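The gap between those two figures is easy to see once you have samples. The values below are invented stand-ins for a meter log, not the actual capture:

```python
# Average vs. peak of a bitrate log. These samples are hypothetical
# stand-ins for a 24-hour meter capture, not real measurements.

samples_mbps = [8.4, 9.1, 8.7, 12.9, 15.4, 11.2, 9.8, 13.6]

average = sum(samples_mbps) / len(samples_mbps)
peak = max(samples_mbps)

print(f"average: {average:.2f} Mbps")  # the number a chart usually shows
print(f"peak:    {peak:.2f} Mbps")     # the number the pipe must accommodate
```

Both numbers are "true," and a one-number chart silently picks one of them for you.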
When you consider that most movies on UHD average only in the 8-9Mbps video range, it's clear the TV reruns such as Medical Examiner were pulling that number up. So do you want a number based on the TV shows or on the movies?
With Fox last year, you saw 9Mbps on 24 and other shows, while American Idol and football were over 13Mbps. Which one number do you want to use for Fox? Do you average them? That dilutes the reality as well.
Beginning to see the issues with one number, even if it's an average over 7 different nights on Fox? The number is much different for the 2-hour premiere of 24 than for the 2-hour premiere of American Idol, and very different again from the 2-hour, 480-line 16:9 America's Most Wanted/Cops block on Saturday night.
Which brings us to HDNET Movies, with its constant, roughly 17.50Mbps video stream in their distribution feed. To begin with, HDNET is not telecined in its distribution (though Dish has started to telecine it after the fact), so it must transmit 30 frames per second instead of 24 like HBO and Showtime. Those extra frames cost bitrate off the top: at the same bits per frame, HDNET's 17.50Mbps at 30fps is really only worth about 14Mbps (17.50 × 24/30) at 24fps, which helps answer the question of why some titles look better on Showtime. Of course, HDNET no longer has that kind of bitrate on Dish either.
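For what it's worth, the frame-rate haircut is a one-line calculation. Note the asymmetry: going from 24fps up to 30fps is +25% more frames, but the reduction coming back the other way is 20% (24/30 = 0.8):

```python
# Equivalent 24fps bitrate for a 30fps stream, holding bits-per-frame
# constant. 30fps carries 25% more frames than 24fps, which works out
# to a 20% bitrate reduction when scaling back down (24/30 = 0.8).

def equivalent_rate(rate_mbps, src_fps=30, dst_fps=24):
    """Scale a bitrate by the frame-count ratio, keeping bits per frame fixed."""
    return rate_mbps * dst_fps / src_fps

print(equivalent_rate(17.50))  # 14.0: HDNET's 24fps-equivalent rate in Mbps
```

So the 17.50Mbps headline number and a 24fps channel's rate are not directly comparable.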
But the bigger question is how HDNET gets a constant video bitrate of roughly 17.50Mbps for distribution when the transfers all come from different sources at different rates. The most common way would be to pad the mux with null streams, but of course that isn't done in satellite distribution. HDNET has refused for years to answer that question about their chain, so one can only assume they are running their video through a processor to get the constant bitrate, regardless of the original transfer rate. That would also explain why some transfers on HDNET look soft even with the higher bitrate: they are not really that high a bitrate.
In fact, the Blade Runner airings on UHD and HDNET were virtually identical, even at DRAMATICALLY different bitrates.
Another example of this (though done after the initial distribution) is in Canada, where Movie Central has one of the widest variations of rates I have seen in HD distribution, running from 4.75Mbps (yes, you read that right) to over 14Mbps in their video stream: almost a 10-point spread.
Shaw Cable takes that signal and re-encodes it to a fairly consistent 11-12Mbps, which explains why some Movie Central material looks so bad: the title originated at a much lower bitrate than the rate now being shown. BEV takes the Movie Central feed, converts it to 720p, and gives it a constant 11.67Mbps video bitrate, again explaining why some titles look so bad (they were actually much lower at origination).
And before I forget: animation takes a MUCH LOWER bitrate to look good than regular film does. Shrek can look much better at a lower rate than a live-action film.
Black and white can also look good at much lower bitrates. Good Night, and Good Luck runs at a very low bitrate, in the 7Mbps range, for example. Look at the low bitrates for Sin City, Kill Bill, and Pulp Fiction as well.
And if you really want to get confused, consider that average bitrates on the HBO-HD East distribution on 127W are about 0.25-0.35Mbps lower than a year ago on the same titles, even though their C-Band transponder is still set to 14.25Mbps.
So again, one-figure rates based on one clip or one title, and not on the same material, are really meaningless.
But if you really want a generalization: 148W is usually higher than the 110W feeds, though as shown, not always; and as SW3 and The Island show, a 50% higher bitrate might actually look worse.
Edited to add Dish channel numbers for clarity.