Trouble re: Dolby TrueHD bitstream from HD-A2. Help !!

bhelms
Original poster
Hi folks! I tried to search this topic and came up empty.

I am trying unsuccessfully to get my HD-A2 to output Dolby TrueHD to my Sony 5300 receiver, which should be auto-detected. In spite of the "Dolby TrueHD" logo on the instructions and box, it now appears that the lowly HD-A2 cannot output the Dolby TrueHD bitstream over HDMI.

Regardless of the settings I choose in the A2 menus, the only thing I see in the receiver's display (it appears at start-up) is "Linear PCM: 48KHz". There is a "Multi-channel Decoding" light that also comes on, but that's it. The "flag" for whatever signal type is being received is absent. When my ViP211 is the source, the receiver correctly lights up the "Dolby Digital" flag (and also shows "Dolby Digital 3/2.1" at startup, which refers to my configuration), and that Multi-channel light is on.

The manual for the A2 says (pg. 53), regarding the "Digital Output HDMI" Auto selection: "...When you play a disk recorded in Dolby Digital Plus, Dolby TrueHD, or DTS-HD, PCM is output (core only for DTS-HD). When you play a disk recorded in Dolby Digital or DTS format, digital audio signal (bitstream audio) is output..." The chart on pg. 60 then shows that for all HD DVD audio options in Digital Output HDMI Auto mode, the output is "Depend on HDMI Receiver".

I guess I am to take from this that the A2 cannot output Dolby TrueHD bitstream over HDMI !! Am I correct in that assessment? (Hoping no, but not optimistic!)

I'm finally catching on to all the subtleties of my mid-price-range surround sound system. My recent upgrade should allow the latest in reproduction capabilities. But my $97 HD DVD player is apparently not going to be part of the equation. I'm still very happy with that unit for what I paid for it. But it led me to a sizeable (and growing!) collection of HD DVDs, and now I want to make sure I can get everything out of them they have to offer.

If the A2 cannot output Dolby TrueHD, is there any other model in the Toshiba line that can? How about an A20? (I think I can still get one of those new at my local BB for $120.)

Tks for any help/suggestions on this situation...!

EDIT - Well, I got my answer! Only the second-generation XA2 or the third-generation A35 will bitstream out over HDMI for decoding in the Sony receiver. Now on the hunt for one of those...
 
The A2 should be capable of outputting the decoded TrueHD stream as uncompressed PCM. There is literally no difference between decoding in the receiver and decoding in the player. If the bits are different, it's not a lossless codec.
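A quick way to see why, sketched in Python with zlib standing in for TrueHD (zlib is not the real codec, but the lossless round-trip guarantee being illustrated is the same):

```python
import zlib

# Pretend these bytes are the studio's PCM master.
original = bytes(range(256)) * 1000

# Lossless "encode" (mastering) and "decode" -- the decode step is the same
# algorithm whether it runs in the player or in the receiver.
compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

assert restored == original  # bit-for-bit identical, by definition of lossless
print(f"{len(original)} bytes in, {len(compressed)} compressed, round-trip exact")
```

If player-side and receiver-side decoding could yield different bits, the codec wouldn't be lossless.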
 
I am trying unsuccessfully to get my HD-A2 to output Dolby TrueHD to my Sony 5300 receiver...
A Toshiba player with a Sony receiver? That will never work...:D:D:D
I guess I am to take from this that the A2 cannot output Dolby TrueHD bitstream over HDMI !!
Read this
UltimateAVmag.com: Toshiba HD-A2 HD DVD player
...HD DVD players will convert these higher resolution audio formats to multichannel PCM and transmit it on an HDMI connection along with the video. For you to hear this, your AVR or pre-pro must be capable of playing multichannel PCM audio received via HDMI.

Diogen.
 
As John stated, the A2 will not output a bitstream of the Dolby TrueHD or DTS-HD Master Audio codecs. The A2 will internally decode DD+ or Dolby TrueHD and output it as PCM to your receiver. Technically, there should be no difference between decoding in the receiver and decoding in the player. As for DTS-HD (HR or MA), the A2 will decode the DTS core (1.5 Mb/s).
 
Tks, all. I guess I'm still a bit confused about the "lossless" concept. The A2 does indeed decode TrueHD internally and output PCM over HDMI, and as I noted the Sony is reporting "Linear PCM: 48KHz" being received, plus the "Multi-Channel Decoding" light is on. (HDMI is the only connection between the units.) I thought this PCM was compressed, however, i.e. that something is lost (?). Would the lossless bitstream sent to the receiver result in a better end product? If the decoded PCM is the same as the bitstream, then why would we need the bitstream output at all? I guess that's what's still confusing me...

Anyway, I picked up an A35 yesterday. The bitstreaming over HDMI and 1080p capabilities are its advantages (tho' the reviews aren't impressed by the 1080p/24fps capabilities of this model). So I can do some comparison now. Since both the A2 and A35 will send the TrueHD as PCM but the A35 also has the bitstream capability, there must be some difference. What am I missing?? The A35 also gives me the multichannel analog outputs for further experimentation.

I'm not too concerned about not having the DTS-HD MA bitstreaming capability in either player. (I don't think that format was ever part of the HD-DVD suite, was it?)

Tks all for your replies. I recognize all three of you as some of our resident A/V "heavy hitters" and I appreciate the time you took to help educate me!

(PS, Diogen - Sony vs. Toshiba - perhaps Sony made a couple of concessions to the loser. The Sony let me rename one of the HDMI inputs "HD-DVD" without any protest! To be sure, tho', I'll also be in the BD camp before long...)
 
Decoding a soundtrack (any soundtrack, lossless or lossy) IS converting to PCM.

Decoding can be done in the player or in the receiver. The decoding process is identical regardless where it is performed.
Talking about the transfer of the bits from the player to the receiver/amp means entering audiophile territory (i.e. where math doesn't always work...:)).

I'm not an audiophile (by a couple miles) and what follows below is what I picked up from reading AVS...
The key word here is "jitter" - this is something that at least has something resembling logic in the explanation;
everything else is snake oil (in my non-audiophile opinion).

Reproducing sound = creating a certain amount of air pressure at a certain time. If the timing gets screwed - it is called jitter.
The timing can get screwed when you decode in the player and send analog to the receiver, i.e. timing information isn't included in the stream.
When sending bitstreamed sound (i.e. for it to be decoded in the receiver) there is no chance for jitter to be "born"...
Hence, if you are an audiophile and your equipment and setup are top notch, you have fewer chances to encounter jitter when sound is bitstreamed to the receiver and decoded there...

Diogen.
 
OK, tks for that! What it says is that there is an advantage to sending the bitstream to the receiver, and that answers my question. I'm guessing the closer the D/A conversion takes place to where the analog amplification is done, the better, and fewer conversions are better. Then from what you're saying, where there is some distance involved it's safer to send the bitstream over that distance vs. a decoded signal...

Tks again all for the education !!
 
OK, tks for that! What it says is that there is an advantage to sending the bitstream to the receiver, and that answers my question.

No, there isn't. All of the codecs are deterministic -- which means that a given input should produce the same output on every decoder.

For lossless codecs, the data is identical from head-end (production) to tail-end (your house).

I'm guessing the closer the D/A conversion takes place to where the analog amplification is done, the better, and fewer conversions are better.

It's irrelevant.

Then from what you're saying, where there is some distance involved it's safer to send the bitstream over that distance vs. a decoded signal...

No, he didn't say that at all.
 
I'm not an audiophile (by a couple miles) and what follows below is what I picked up from reading AVS... The key word here is "jitter" - this is something that at least has something resembling logic in the explanation; everything else is snake oil (in my non-audiophile opinion).
Reproducing sound = creating a certain amount of air pressure at a certain time. If the timing gets screwed - it is called jitter.

True in digital. The analogue equivalents are wow and flutter (think analog tape and LP).

The timing can get screwed when you decode in the player and send analog to the receiver, i.e. timing information isn't included in the stream.

It's a separate issue from this.

When sending bitstreamed sound (i.e. for it to be decoded in the receiver) there is no chance for jitter to be "born"...

Decidedly not true. Jitter is a timing error of the samples, and it can occur anywhere in the chain.

Transferring a bitstream is, in and of itself, jitter-free, as it is packetized and reassembled. However, jitter prior to conversion can and does happen.
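To put a rough scale on that, a back-of-the-envelope sketch in Python (illustrative numbers only, not measurements of anyone's gear): the worst-case sample error caused by clock jitter is roughly the signal's slew rate times the timing error.

```python
import math

# Worst-case error from a clock-timing error dt on a full-scale sine:
# error ~= slew rate * dt = 2*pi*f*A*dt  (illustrative figures below)
freq = 1_000.0   # 1 kHz test tone
amplitude = 1.0  # full scale

for jitter in (1e-9, 10e-9):  # 1 ns and 10 ns of clock jitter
    err = 2 * math.pi * freq * amplitude * jitter
    print(f"{jitter * 1e9:4.0f} ns jitter -> worst-case error "
          f"{20 * math.log10(err / amplitude):6.1f} dBFS")
```

Even 10 ns of jitter works out to errors around -84 dBFS at 1 kHz, which gives a sense of how small this effect is compared to the lossy-vs-lossless differences being discussed.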

Hence, if you are an audiophile and your equipment and setup are top notch, you have fewer chances to encounter jitter when sound is bitstreamed to the receiver and decoded there...

Meridian's been doing it wrong for years then, by decoding in their players and sending the PCM out to the processor.

Audiophiles don't use receivers ;)
 
I stumbled onto the following article that does a great job explaining the various formats in use, and the contrast between BD and HD DVD as to their deployment. Perhaps you all already know this, but it's great basic info for a novice like me:

High-Def FAQ: Blu-ray and HD DVD Audio Explained | High-Def Digest

I'm still looking for something definitive explaining why Dolby TrueHD and DTS-HD MA are superior if bitstreamed from the player to the receiver or pre-pro vs. decoded in the player and then sent as PCM or even analog audio. True, there could be a difference in the quality of the converter/decoder chips used in the player vs. what's in the receiver/pre-pro. Is there more to it than that? Does the "lossless" bitstream entering the receiver/pre-pro get converted directly to analog audio, instead of first being converted to a "lossy" PCM that would otherwise be sent and then decoded? If so, then I think that's the answer!

I did finally get the TrueHD working in my receiver last evening. I had the two settings in the A35 correct, but I did not realize I also needed to set the audio in the movie itself to enable the TrueHD option. The hint that did it actually came from the review of the A2, the link provided by Diogen above (Tks for that!). I was watching "Happy Feet" at the time. As soon as I made the change from DD+ to TrueHD, the receiver reported "Dolby TrueHD" briefly in the display and the "TrueHD" light came on. And I heard almost instantly what seemed to be more "clarity" (more high-end??) and a widening/expanding of the whole soundfield, as "virtual" speakers might do. Maybe it was just my imagination, but I sensed a difference and I liked it! I left it in that mode for the duration, but now I will do some A/B comparison, and also compare the bitstreamed signal to the linear PCM signal of the A2.

If anyone can point me to more resources similar to the above, please do so. We "pseudo audiophiles" really want to know, too...!

Tks and have a great weekend.

(It will be mostly rainy here, so I know what I'll be doing! I discovered that the Matrix Trilogy are all in TrueHD as are 300, Troy, and Bourne Supremacy so I have plenty of stuff to experience all over again! Maybe Batman Returns will even arrive today - can't wait for that one based on the review that considers it "reference quality"...!)
 
bhelms, I just finished reading all the replies you got. I kept looking for someone to ask if you had set the disc to TrueHD. Did the TrueHD light come on on the A-35, and did it go out and not come back on? I don't think I have ever seen the TrueHD indicator light on mine; guess I will have to check it out more closely.
 
Dirtydan - There is no TrueHD indicator on the A35. It does have an HDMI indicator that will light, I believe, when the "handshake" over HDMI is acknowledged, but that does not give any indication of what audio format is being sent, the "lossless" bitstream or other PCM.

The TrueHD light to which I referred is on the Sony receiver. From what I can tell, that one will only light when it is receiving the bitstream representing the lossless TrueHD track on the disk. (If it were receiving a DTS-HD MA bitstream, there is a separate light for that.) I only got that TrueHD light to come on on the receiver after I had correctly set both settings in the A35 and the TrueHD setting in the movie audio set-up, as you suggested. As soon as the unit completely reset itself, the display on the Sony briefly showed "Dolby TrueHD" and that smaller light came on. Then the display went to other information, as it always does, but the small light remained illuminated throughout the movie until after the credits. (It switched back to DD when the selection menu reappeared.) That same display always showed "Linear PCM: 48KHz" for any setting with the A2. For the A35 it initially said "Dolby Digital +", and the separate small light for DD+ also came on and remained on. So the receiver is sensing different forms of digital input. I'm just trying to understand the difference.

From the replies above, every decoded digital signal, including one designated as "lossless", is some form of PCM, but apparently at varying bitrates. The link I provided does point out that the other renderings are "lossy" in terms of what material is not provided by the person engineering the mix, but that the "lossless" bitstream should be the full audio track from the film maker.

Indeed, both players can decode the TrueHD bitstream internally, but what then is the output result? I contend that the 5.1 (or 7.1 for BD) analog signals should, or at least could, be the D/A-converted lossless signal that can be sent to a receiver or pre-amp, whatever, and the A35 has the 5.1 analog outputs. (Those signals would be subject to whatever noise or interference degradation could be picked up in audio cables, but otherwise should be "whole".)

But the PCM signal that's being sent over HDMI from the A2 or A35 that's not the "lossless" bitstream - how is that different from the "higher quality" signal? Apparently there is a bitrate difference, such that the TrueHD bitstream (and even the DD+ bitstream, IIRC) requires the higher-bandwidth HDMI 1.3 protocol. If there is no difference, why have both? That's my dilemma, and if the answer is contained in the posts above then I am still confused. No audiophile here, and perhaps not one capable of understanding this technology either...!
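For scale, some back-of-the-envelope arithmetic on the rates involved (nominal spec figures, not measurements from either player):

```python
# Nominal data rates -- published spec figures, not player measurements.
channels, sample_rate, bit_depth = 6, 48_000, 24  # 5.1 PCM at 48 kHz / 24-bit

pcm_rate = channels * sample_rate * bit_depth  # bits per second
print(f"Uncompressed 5.1 PCM: {pcm_rate / 1e6:.1f} Mb/s")  # ~6.9 Mb/s

# For comparison (per the posts above and the published specs):
#   DTS core:      1.5 Mb/s        (lossy)
#   Dolby TrueHD:  up to ~18 Mb/s  (lossless, variable rate)
# HDMI has carried 8-channel 192 kHz / 24-bit PCM (~37 Mb/s) since version
# 1.0, so multichannel PCM fits on the A2's pre-1.3 link; what HDMI 1.3
# added was the ability to pass the TrueHD/DTS-HD bitstream formats as-is.
```

So bit-wise, the decoded PCM of a TrueHD track carries everything the bitstream does; the bitstream option mainly moves the decode step into the receiver.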

But the quest continues...
 
Decoding, no matter where done, no matter how transferred, CANNOT turn a lossless soundtrack into a lossy one.

As discussed, it could be possible to hear differences depending on who does the decoding and how it is transported - but the difference is orders of magnitude smaller than lossy vs. lossless.

The very fact that some people can't hear the difference between a lossless and a lossy track (me, for example, DD+ vs. TrueHD; but some audio pros as well) shows the level of differences we are actually talking about...

Diogen.
 
diogen:

Knowing that I've got the same bits that were output from the mixing console is a nice warm fuzzy feeling.

I'm not going to argue the "this that or the other" on who can hear what.
 
bhelms, one reason for having both bitstream and PCM is that if you want to use the PiP feature on some discs, bitstream can't handle it.
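A schematic sketch of why (hypothetical helper functions, not a real player API): the secondary PiP audio can only be mixed with the main track after both are decoded, so the HDMI output has to be PCM rather than the original bitstream.

```python
# Hypothetical sketch, not a real player API: PiP forces player-side decoding
# because mixing happens on PCM samples, and the mixed result is PCM too.

def decode_to_pcm(track):  # stand-in for the TrueHD/DD+ decoder
    return track["samples"]

def mix(main, secondary, gain=0.5):
    return [m + gain * s for m, s in zip(main, secondary)]

main_track = {"codec": "TrueHD", "samples": [0.10, 0.20, 0.30]}
pip_track = {"codec": "DD+", "samples": [0.00, 0.05, 0.10]}

output = mix(decode_to_pcm(main_track), decode_to_pcm(pip_track))
print(output)  # this mixed PCM is what goes out over HDMI
```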
Dan
 
The A2 should be capable of outputting the decoded TrueHD stream as uncompressed PCM. There is literally no difference between decoding in the receiver and decoding in the player. If the bits are different, it's not a lossless codec.

The A2 only supports up to 5.1 sound for Dolby and the core for DTS. A definite limitation. Its bitstream support is unclear on whether 7.1 or 5.1 is supported. Since this is one of the features mentioned for the more advanced A-35, I take it that it does not support it. None of the HD-DVD discs I have show 7.1 soundtracks.
 
The A2 only supports up to 5.1 sound for Dolby and the core for DTS. A definite limitation. Its bitstream support is unclear on whether 7.1 or 5.1 is supported. Since this is one of the features mentioned for the more advanced A-35, I take it that it does not support it. None of the HD-DVD discs I have show 7.1 soundtracks.

The A2 is not HDMI 1.3, so it cannot send the bitstreams for TrueHD, DTS-HD, or DTS-HD Master Audio.

The best option is internal decoding, with the decoded audio being sent out the HDMI port as PCM.

Cheers,
 
The A2 only supports up to 5.1 sound for Dolby and the core for DTS. A definite limitation. Its bitstream support is unclear on whether 7.1 or 5.1 is supported. Since this is one of the features mentioned for the more advanced A-35, I take it that it does not support it. None of the HD-DVD discs I have show 7.1 soundtracks.

There are maybe 10 or so HD DVDs that have 7.1 tracks. Most of those are DTS-HD HR or DTS-HD MA. You cannot hear the 7.1 of those with an A1, XA1, A2, A20, A3 or A30, period. The Dolby demo disc has Dolby TrueHD 7.1. All of the Toshiba, Onkyo, Integra and Venturer players will internally decode that and send it to a receiver capable of accepting 7.1 PCM over HDMI.
 
