The problem has been known ever since more channels began using DD5.1. The real issue here is DD2.0 vs. DD5.1. If you study how the DD audio is encoded in the mux, you will see what is going on. It is purely a technical issue, and it has a simple solution! Yes, very simple.
But first of all, all of you need to get over the idea that this is some sinister conspiracy on the part of Dish to annoy its subscribers, or that it is done by the advertiser intentionally to make some spots louder than others (except in certain cases). It's not. As I said, it is a purely technical issue with a very simple solution. BUT, unfortunately, the simple solution must be accomplished at the signal source where the audio is muxed to MPEG-2. This means it must be done there and anywhere spots are inserted into the stream. That could mean that Dish holds some culpability in this as well, but not all of it.
So here's the exact cause of the issue. If any of you really want to see which spots are louder than others, look at the metadata in the Dolby Digital audio. You will see that the issue is only present when you have DD5.1 enabled. In a receiver with only DD2.0, the problem will never show up. The reason is that the DD metadata is being switched by the originating station between DD2.0 and DD5.1. IN MY OPINION, this is a huge mistake. However, the problem comes with some caveats. Follow me on this for a moment.
Two different commercial producers have different capabilities. One produces a spot for General Motors in DD5.1 and the other produces a spot for Ford in DD2.0. There is no FCC regulation that requires one over the other. Now, when each of these spots airs in a break, the DD2.0 spot will automatically sound louder than the spot with DD5.1 audio. Note that both sets of audio are ALWAYS included in the mux, but when a producer delivers only DD2.0, the DD5.1 channels will be silent. The beauty of the Dolby system is that DD is switched automatically: if both sets are present, DD2.0 and DD5.1, the DD5.1 will be the default unless the receiver is only capable of DD2.0, in which case that audio is used. If you have only a DD2.0 receiver, you'll never hear the lower-volume DD5.1 and will have your audio adjusted for DD2.0 all the time. The reason you may hear the difference is that your receiver is switching automatically between DD5.1 (lower average volume but wider dynamic range) and DD2.0 (higher average volume with less dynamic range).
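The switching I just described can be sketched in a few lines. This is purely illustrative: the `SpotAudio` type, the `receiver_level` helper, and the -18/-24 dB figures are all made up by me to model the behavior, not real broadcast measurements or any actual decoder API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SpotAudio:
    name: str
    dd20_level_db: float            # average level of the stereo mix
    dd51_level_db: Optional[float]  # None = 5.1 channels absent/silent

def receiver_level(spot, supports_51=True):
    """The level the viewer hears: a 5.1-capable receiver defaults
    to the 5.1 mix whenever it is flagged in the metadata."""
    if supports_51 and spot.dd51_level_db is not None:
        return spot.dd51_level_db  # 5.1 wins when present
    return spot.dd20_level_db

# Illustrative levels only: the 5.1 mix averages lower than the 2.0 mix.
gm   = SpotAudio("GM spot",   dd20_level_db=-18.0, dd51_level_db=-24.0)
ford = SpotAudio("Ford spot", dd20_level_db=-18.0, dd51_level_db=None)

# On a 5.1 receiver the Ford spot plays 6 dB hotter than the GM spot:
print(receiver_level(gm), receiver_level(ford))                # -24.0 -18.0
# On a 2.0-only receiver both play from the same 2.0 mix:
print(receiver_level(gm, False), receiver_level(ford, False))  # -18.0 -18.0
```

Same break, same receiver settings, yet the 2.0-only spot jumps out, exactly the complaint people post about.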
The simple solution is a legal requirement that ALL commercials aired on any HD channel MUST be in HD and contain both DD2.0 and DD5.1 mixes. Of course, the downside is that broadcasters would immediately lose advertisers in droves, namely those who do not have spots ready to go with a DD5.1/DD2.0 mux in the digital audio stream. The SD/HD issue is simple enough to handle with upconversion; the audio is not so easy to upconvert. However, Dolby does have a boxed answer. Unfortunately it is not cheap, but it does level-match the loudness between DD2.0 and DD5.1. When I last spoke with Dolby engineers, two years ago, they said the stations and production companies were resisting due to the cost. As a TV spot producer, I'd have to side with the stations on this, because it would seem to cost less to go ahead and add a DD5.1 soundtrack with matched levels during spot production than to pay for a box to generate the mix. Furthermore, the box from Dolby doesn't really address the entire issue; it only helps when stations are switching the metadata in the stream. So, in reality, the only way I can see this technical issue being resolved is to require all programming that airs on an HD channel to be in HD with DD5.1/DD2.0, then forbid the stations from switching the metadata on and off DD5.1 and rely on the consumer's receiver to work the way the Dolby chip is designed.
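For those curious what "leveling the loudness" means in practice, here is a toy sketch of the idea, assuming a plain RMS measurement and a made-up -24 dB target. Real Dolby gear measures dialog loudness via metadata, not raw RMS, so treat every number and function here as my own illustration, not Dolby's method.

```python
import math

def rms_dbfs(samples):
    """Average (RMS) level in dB relative to full scale (0 dB = max)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

def level_match(samples, target_dbfs=-24.0):
    """Apply one static gain so the clip's average level hits the target.
    (Toy stand-in: real leveling gear measures dialog loudness, not RMS.)"""
    gain = 10 ** ((target_dbfs - rms_dbfs(samples)) / 20)
    return [s * gain for s in samples]

# A "hot" DD2.0-style spot, illustrative tone only (one second at 8 kHz).
spot = [0.7 * math.sin(2 * math.pi * 300 * t / 8000) for t in range(8000)]
leveled = level_match(spot)
print(round(rms_dbfs(spot), 1), round(rms_dbfs(leveled), 1))  # -6.1 -24.0
```

Simple as this looks per clip, doing it live across splice points is where the expensive hardware comes in.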
I also spoke with an engineer at Echostar about it, and they were working on a long-range method to resolve the issue in the VIP receivers using a form of AI in the software. This was a long-range R&D effort, and frankly, I don't hold much hope for it since he also said it wasn't a high priority. I recall he said it was on the same R&D priority as doing something clever with the black side bars when 4:3 AR programs appear on the HD channels.
I hope I explained the issue as I understand it. I interviewed several Dolby engineers on this, one CBS network executive, and one local station broadcast engineer, as well as two engineers from Echostar, and trust me, everyone is aware of the problem. Everyone understands that the best solution is to require all programming to carry the full mux of channels and never have the audio switch during the broadcast.
One final comment about sound volume: there are some rare cases where a commercial will be produced with unusually loud volume in mono or even stereo that sounds louder than others. This is usually done when the producer orders up a high degree of audio volume compression to create the illusion of yelling, due to script creativity. All good commercial producers know that annoying works in advertising, so that is why they resort to it. If they feel the product can't motivate people on its own merits, then they will resort to annoying to get the job done. Sorry, but that's just what they do. Loud is annoying, right?
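That compression trick is worth seeing in numbers: squash the swings above a threshold, then add makeup gain so the peaks never exceed the old ceiling, yet the average level, which is what your ear tracks, jumps several dB. A minimal sketch, with all thresholds, ratios, and amplitudes invented by me for illustration:

```python
import math

def rms_dbfs(samples):
    """Average (RMS) level in dB relative to full scale."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

def compress(samples, threshold=0.25, ratio=10.0):
    """Static compressor sketch: swings above the threshold are divided
    by `ratio`, then makeup gain restores the original peak level."""
    def squash(s):
        a = abs(s)
        out = a if a <= threshold else threshold + (a - threshold) / ratio
        return math.copysign(out, s)
    squashed = [squash(s) for s in samples]
    makeup = max(abs(s) for s in samples) / max(abs(s) for s in squashed)
    return [s * makeup for s in squashed]

# Toy "spot audio": mostly moderate level with brief full-scale peaks.
speech = ([0.25 * math.sin(2 * math.pi * 200 * t / 8000) for t in range(7200)]
          + [1.00 * math.sin(2 * math.pi * 200 * t / 8000) for t in range(800)])
hot = compress(speech)

# Same peak ceiling, but the average is several dB louder: "yelling".
print(round(rms_dbfs(hot) - rms_dbfs(speech), 1))
```

Nothing clips, nothing breaks any peak-level rule, and the spot still reads as shouting, which is exactly why producers reach for it.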