The ANSWER to last night's HD Poll

yeah...I believe it drifted. :D


Seems like all these threads drift. I think all the off-topic discussion should be deleted so we can get back on topic. Scott, when is Dish going to put the rest of the channels in true MPEG-4, or at least some more of them? Any idea? I'm sure all new HD channels will be MPEG-4 anyway!
 
Anyway, back on topic.

The Duke/Maryland game on ESPN2 HD is worse than the NBA game on ESPN HD. Wow, you watch the action and BOOM, the screen blurs and then comes back. This is putrid to watch!

Is something wrong with my computer? This is the only site that looks like a deformed face after it loads!
 
Not sure what you're talking about there?

But I watched ESPN2 HD off and on all weekend and didn't notice any problems at all.
 
OK, I'll take a stab at getting it back on topic

I am already seeing some advantages and disadvantages to the new encoders. I have recorded several hours of National Geographic since the change. On the upside, 4 hours of programming only "cost" me 2 hours on my DVR; on the downside, I heard about 5 audio pops in the first 10 minutes.

I am sure there will be some tweaking in the coming days, but all in all I like the change. I will like it more once we get the benefit of additional programming.
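Just to put rough numbers on that 2-for-1 savings, here's a back-of-the-envelope sketch. The disk size and bitrates are made-up illustrations, not Dish's actual figures:

```python
# Rough DVR-capacity math: halving the bitrate doubles the recording time.
# The disk size and bitrates below are illustrative guesses only.

def hours_of_recording(disk_gb: float, bitrate_mbps: float) -> float:
    """Hours of video that fit on a disk at a constant bitrate."""
    disk_megabits = disk_gb * 8 * 1000   # GB -> megabits (decimal units)
    return disk_megabits / bitrate_mbps / 3600

disk_gb = 250          # hypothetical DVR recording partition
mpeg2_hd_mbps = 16.0   # assumed MPEG-2 HD bitrate
mpeg4_hd_mbps = 8.0    # assumed MPEG-4 HD bitrate

print(f"MPEG-2: {hours_of_recording(disk_gb, mpeg2_hd_mbps):.0f} hours")
print(f"MPEG-4: {hours_of_recording(disk_gb, mpeg4_hd_mbps):.0f} hours")
# Half the bitrate -> twice the hours, which matches "4 hours cost me 2".
```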
 
I seem to have a lot more video noise on my channels. It started a few days back; before that my HDTV and SDTV were amazing, now not so good.
 
Does anyone know if the current E* HD receivers and HD DVRs have hardware capable of receiving a 1080p signal? Are they 1080p-ready, or could they be upgraded to 1080p with a software update?

Would 1080p require a new receiver?

Sorry for all the questions but I was wondering.
 
I doubt E* will bother making 1080p receivers any time soon, since nobody broadcasts in that standard and probably never will. Cable channels have no reason to do it, since they would get downrezzed anyway (call it TRUE-HD Lite), and even the OTA channels have limits on how much information they can broadcast effectively. (And no satellite company is going to add this feature even if OTA were 1080p; there's nothing in it for them.)
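A quick back-of-the-envelope comparison shows the bandwidth problem. This is just a sketch of raw (uncompressed) sample rates; real encoders complicate things, but the ratios hold:

```python
# Raw luma-sample rates of the common HD formats, before any compression.
# 1080i60 delivers 60 fields/s, i.e. 30 full frames/s worth of samples.
formats = {
    "720p60":  (1280, 720, 60),
    "1080i60": (1920, 1080, 30),
    "1080p60": (1920, 1080, 60),
}

for name, (width, height, frames_per_s) in formats.items():
    msamples = width * height * frames_per_s / 1e6
    print(f"{name}: {msamples:.0f} million samples/s")

# 720p60:  ~55 Msamples/s
# 1080i60: ~62 Msamples/s
# 1080p60: ~124 Msamples/s -- double 1080i, with no extra bandwidth to carry it
```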
 
The difference is 9" is larger and finer pitch to get nearly true HD. I think it was some of the older Mitsu's of the 65" size that had 9" guns. 90% of CRTs used 7" guns. Don't get me wrong, I have a well calibrated CRT as well and feel it beats most sets out there. I fully agree with your comments.

Anyway, back on topic.

My 53" Pioneer has 9" guns. It's a great picture
 
Maybe even a magical "software" fix for my HDMI output on my 622....

I haven't been here in a while, but I noted your comment about an HDMI bug on your 622. I am running a 622 with HDMI at home without any real problems. There was some glitch with the dual tuner, but it didn't seem to have anything to do with HDMI.

Lori
 
And yet another thread of ignorant, goofy explanations vs. the science.

Folks, the original 1080i HDTV signal was spec'd to accommodate the then-popular CRT, which used electron-beam scanning. The horizontal "pixel" (line) resolution was derived mathematically as the maximum a broadcast signal could carry while filling the licensed 6 MHz channel bandwidth for broadcast TV. It was not arbitrarily decided on just to have a cool-looking image that had to be achieved to qualify as HDTV. The 1280 x 720p signal was designed to fill that same channel bandwidth as a progressive signal, to accommodate digital display devices that paint a solid image on the screen all at once, as opposed to interlaced scanning, which relies on phosphor persistence to build the full two-field image. Cross technologies allowed an interlaced signal to display on digital progressive monitors via deinterlacing (some methods worked better than others), and vice versa for progressive images on an interlaced display, but both were tainted with artifacting.

Until recently, progressive displays were dominated by 1280 x 720 pixel-array imagers that matched the 720p HDTV spec, so a good match of signal to "native" resolution could be achieved. Other devices, as already mentioned, needed interpolation to reach a non-standard native HD resolution, with varying degrees of artifacting. Among 1080i CRT devices, most cannot resolve full line detail, especially the 1920-pixel horizontal spec. Sony once published a white paper on a lab test of the G90, where they actually measured the G90 at 1920 lines of resolution under laboratory conditions. It was their only projector capable of doing this, but the image was not considered usable for home-theater viewing: the beam current had to be reduced so far to achieve focus that the image was quite dark, and the picture had to be shrunk to a screen size not typical of G90 installations. At a typical 72" diagonal screen size, the best horizontal resolution achieved was a tad over 1100 lines. Surprised? If you understand the physics involved, you shouldn't be.

But that was then. The science of HDTV has evolved, not in the CRT camp but in digital, with several leading progressive technologies. Not until mid-2006 did 1920 x 1080p displays begin to appear on the market, and then only in very limited production. Near the end of the 2006 holiday season the offerings had expanded, but most of the sets I saw in stores were still in the 720p camp, not 1080p. What Scott and others stated is essentially correct: many monitors can receive a 1080p signal but can't image it, due to the lesser native resolution of the screen. In 2007 this will change as 1920 x 1080p imagers become the main production line; during 2007, I believe 1080p monitors will be the main ones available.
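To put numbers on that 6 MHz constraint mentioned above: an ATSC channel's 8-VSB modulation carries about 19.39 Mb/s of payload, so the encoder has to squeeze the raw studio signal by a huge factor. A rough sketch, assuming 8-bit 4:2:0 sampling for illustration:

```python
# How much compression it takes to fit 1080i into a 6 MHz ATSC channel.
ATSC_PAYLOAD_MBPS = 19.39   # usable payload of an 8-VSB broadcast channel

width, height, frames_per_s = 1920, 1080, 30   # 1080i60 = 30 full frames/s
bits_per_pixel = 12         # 8-bit 4:2:0: 8 luma + 4 chroma bits per pixel

raw_mbps = width * height * frames_per_s * bits_per_pixel / 1e6
print(f"raw video:   {raw_mbps:.0f} Mb/s")
print(f"compression: {raw_mbps / ATSC_PAYLOAD_MBPS:.0f}:1 needed")
# Roughly 746 Mb/s raw vs ~19.4 Mb/s on air: about a 38:1 squeeze.
```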

But what about signal? What about program source? If you want a true 1920 x 1080p program source, better stick with the latest HD disc formats, HD DVD and Blu-ray, as no broadcast TV station will be allowed to transmit it. It is possible that a closed-circuit system could send you a 1920 x 1080p signal, but don't look to the traditional acquisition and distribution channels for this. Until recently, ALL field acquisition for broadcast was restricted to 1440 x 1080i pixels or less. Recent equipment advancements now allow recording the full 1920 pixels for 1080i acquisition. Unfortunately, the acquisition industry's HDTV infrastructure doesn't upgrade to the latest technology en masse as soon as it's released. For years to come, the majority of acquisition will be limited to 1440 horizontal pixels. On top of that, other "limiting" circumstances may reduce the resolution even further, for both business and artistic reasons. So if you insist on only watching HDTV at a full 1920 pixels of horizontal resolution and you don't have one of the new HD disc formats, you may as well shut off your TV and go play checkers, because you are a long way from seeing that resolution from any broadcast, DBS, or cable programming source.

Possible sources of true 1920 x 1080i could be HBO, Showtime, and other companies that transfer film to D5 tape and distribute those tapes to providers with modified D5 playback for transmission, allocating the full 19.4 Mb/s bandwidth to that signal. Will that happen? I doubt it (it just doesn't make good business sense when so few people could view it anyway). So again, if you want this level of quality, get one of the HD disc formats.
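For scale, here's what that 1440-pixel acquisition limit works out to in simple arithmetic:

```python
# Horizontal detail retained when HD is acquired at 1440 instead of 1920 pixels.
spec_width, acquired_width, height = 1920, 1440, 1080

print(f"{acquired_width / spec_width:.0%} of the horizontal samples")  # 75%
print(f"{spec_width * height:,} vs {acquired_width * height:,} pixels per frame")
# 2,073,600 vs 1,555,200 -- the 1440 material is stretched back to 1920 on display.
```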

The game plan should be to have all levels of program sources, from highly compressed SD up to Blu-ray and HD DVD. But don't insist that all HD signals meet one standard, the highest limit allowed under the spec. Instead, enjoy the programming for its content and have your home theater display whatever is made available for that program. Do keep complaining about providers who dumb the programming down below what the viewing public's equipment can currently display. My opinion is that today that level is 1280 x 720p, but maybe in two years it will be 1920 x 1080p. Remember, it's not what was pervasive at CES 2007, nor what is being sold in stores in 2007, but what is common in homes across America for the HDTV viewing audience. In fairness, that is the level at which I expect the majority of HDTV programming to be offered, within the law, while cutting-edge technologies like HD DVD and Blu-ray remain available for those who want to witness all that we can have today.
 
I'm out of breath just reading this one, lol! :)
 
And I gather that some of the source material for HD-DVD/Blu-ray is not film, and so may well be at the 1440 limit. So even those discs will not always have 1920 either.
 
I just compared my recordings of Underworld Evolution. The new recording looks every bit as good as the older one. Same quality, less disc space. I love it.
 
About 50% of the 622s had their HDMI outputs fail to send a signal anymore. Some think E*'s explanation of a software problem is bogus, and that the pins holding the port to the motherboard break, severing the connection (hence the fact that some can fix it for a while by jiggling the cable). Others have no problems (my father is one of them, but his receiver sits on top of his dresser with plenty of air; mine is in a corner cabinet, so heat buildup may have contributed to the problem).

The problem has an easy workaround in that the component outputs are unaffected, though if you don't have a component input free, it may be an issue.

Sorry to hijack, ... just wanted to explain it to Lori.
 
All he really said, though, was that he is not sure the majority of sets sold in that period can display 1080p. I don't know if that is true either, but it would be surprising if it were. I would agree that a 1080p set can do it, but what percentage of sets are 720p or 1080i?

People are mentioning high-end sets like the Sony KDS-50A2000. Sure, most $2,000-plus TVs can handle 1080p, but that is not the question. The question is whether the majority of sets sold can do it.

I'm way late getting back to this.

A 1080p microdisplay is an inherently progressive device. It shows one full screen at a time. A 1080i signal is deinterlaced by some means to get a 1080p display. Basic video these days.

Cheers,
 
What's a microdisplay? ;)

You know, you sure didn't read what I wrote, did you? I said exactly what a microdisplay was right after I used the term.

Microdisplay is a way of lumping together transmissive microchips (aka LCD panels), MEMS (micro-electromechanical systems, aka DLP with its mirrors), and reflective microchips (LCoS, including D-ILA and SXRD, which are variations on the LCoS theme).

Technically speaking, flat panels are not microdisplays, but they are usually lumped in with them because they too are inherently progressive.

Cheers,
 
I wonder what true 1080p shows those true-1080p-display owners are watching, and from where?

Get real: the majority of sets sold last year were not 1080p sets. 1080i-capable, yes.

This argument is getting old and sounds like Best Buy gibberish.

You guys really need to read some video primers...

Every scripted TV show in production today is film-based (16mm or 35mm, or digital "film" in a couple of cases). That material is converted from its film-based 24 fps to interlaced video by a process called telecine. This uses a 3:2 cadence: alternating film frames are held for three fields of video, then two fields, so 24 film frames fill 60 fields per second.

A microdisplay has to show a progressive image, so a 1080p set shows 1080p. There are a variety of ways to get from 1080i back to 1080p. Assuming the conversion is done well, most TV sets can reassemble the fields into their original progressive frames, albeit at 30p instead of 24p, which is then doubled for display at 60p. The bad part is that 60 is not an integer multiple of 24, so this can introduce an artifact called judder, which makes slower pans look a bit jumpy.

This is where high-quality video processors come in. They have the ability to identify the original film frames and get back to a true 24p signal, which can then be shown at an integer multiple of 24 (i.e., 24/48/72/96/120p).
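To make the cadence concrete, here's a toy sketch of 3:2 pulldown and its inverse. Frame labels stand in for actual images; a real processor works on pixels, not letters:

```python
# Toy 3:2 pulldown: spread 24 fps film frames across 60 fields/s of video,
# then recover the original frames (inverse telecine).

def pulldown_32(film_frames):
    """Hold alternating film frames for 3 fields, then 2 (the 3:2 cadence)."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

def inverse_telecine(fields):
    """Collapse repeated fields to recover the unique 24p film frames."""
    frames = []
    for field in fields:
        if not frames or frames[-1] != field:
            frames.append(field)
    return frames

film = ["A", "B", "C", "D"]       # 4 film frames -> 10 fields = 1/6 second
fields = pulldown_32(film)
print(fields)                     # ['A','A','A','B','B','C','C','C','D','D']
print(inverse_telecine(fields))   # ['A','B','C','D'] -- clean 24p again
```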

For material that is natively interlaced (think 1080i sports) it's a much harder job, since each field (half of a 1080p frame) is a discrete point in time. Simply putting the odd and even fields together looks terrible. Here, quality deinterlacing is a matter of motion compensation and, in some chips, directional correlation, which join the two fields intelligently to minimize any loss of resolution in the resulting progressive frame. This costs big $$$ at this time.
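As a rough illustration of the idea (nothing like the proprietary silicon, just the per-pixel weave-vs-bob decision described above):

```python
import numpy as np

def deinterlace_motion_adaptive(top, bottom, prev_top, threshold=10):
    """Toy motion-adaptive deinterlacer: weave where static, bob where moving.

    top, bottom: the two fields of the current frame (each height/2 x width)
    prev_top:    the top field of the previous frame, used to detect motion
    Real chips add motion compensation and directional filtering; this only
    shows the basic decision.
    """
    half_height, width = top.shape

    weave = np.empty((half_height * 2, width), top.dtype)
    weave[0::2], weave[1::2] = top, bottom   # interleave both fields

    bob = np.repeat(top, 2, axis=0)          # line-double one field

    moving = np.abs(top.astype(int) - prev_top.astype(int)) > threshold
    mask = np.repeat(moving, 2, axis=0)      # expand mask to full frame height
    return np.where(mask, bob, weave)

# Static areas keep full vertical detail (weave); moving areas trade detail
# for freedom from the "combing" artifact (bob).
```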

For channels that broadcast in 720p, the signal is simply scaled to 1080p and displayed.

So there is actually plenty of content out there that is "progressive". The set or video processor just needs to know how to put it back together from the interlaced input.
 
1) There was a home theater magazine article that reviewed several 1080 and 720 big-screen TVs of different technologies. If memory serves, several of the TVs were not using the full 1920x1080 signal; they were using only the first 540 lines and doubling them up (540 x 2 = 1080), dropping the other 540 lines.

That's called "Bob" deinterlacing. It isn't deinterlacing at all.

2) Currently Panasonic has a commercial 50" 1080p plasma, and Sony and Sharp have 1080p LCDs (with 1080p inputs), while JVC currently has a 1080p LCD display, but only with a 1080i input.

See my previous post about getting to 1080p from 1080i.

3) It would be nice to know whether the local OTA signals are true 1920x1080. There is talk that some stations are not sending the full 1920x1080 (maybe 1440x1080, etc.) and are using the rest of the bandwidth for other channels and/or private communications.

4) I hope E* will swap out my 6000 some day for free (not as a lease). The 6000 is still going strong, and the OTA tuner in my cheap LCD TV is not as good...

Beyond the scope of my discussion.
 

AT&T HomeZone In CT

508 & Xbox 360 Interference

Users Who Are Viewing This Thread (Total: 0, Members: 0, Guests: 0)

Who Read This Thread (Total Members: 1)

Latest posts