Guest Black Label Society

HDTV?


Guest Black Label Society

So, I'm watching the football game in HD, but I notice that when the camera is zoomed in on a player or object, the screen 'pixelates' a little bit when the object is moving.


Is that normal for a DLP or HD, or is it possible my signal is weak?

 

It says it's running in 1080i.


It's probably more a function of your provider. Most providers, satellite especially, use MPEG compression on their HD signals. This actually results in a worse picture than a native HD picture. It's like listening to MP3s vs. a CD. It's dumb, but it's how they can get as many channels into their bandwidth as possible.
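To illustrate the compression point, here's a toy sketch only: real MPEG uses DCT blocks and motion compensation, not this naive quantizer, but the trade-off is the same idea — throw away precision to save bits, and artifacts appear.

```python
# Toy illustration of lossy compression: quantizing values more coarsely
# saves bits but introduces error ("artifacts"). Purely a sketch of the
# idea; real MPEG is far more sophisticated.

def quantize(values, step):
    return [round(v / step) * step for v in values]

signal = [12, 13, 15, 200, 201, 199]   # a row of pixel brightness values
coarse = quantize(signal, 16)          # heavier compression
error = [abs(a - b) for a, b in zip(signal, coarse)]

print(coarse)       # blockier version of the original row
print(max(error))   # worst-case per-pixel error introduced
```

The heavier the quantization, the fewer bits per channel the provider needs — and the blockier fast-moving scenes get.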

Guest Black Label Society
It's probably more a function of your provider. Most providers, satellite especially, use MPEG compression on their HD signals. This actually results in a worse picture than a native HD picture. It's like listening to MP3s vs. a CD. It's dumb, but it's how they can get as many channels into their bandwidth as possible.


So should I invest in an HD antenna?

So should I invest in an HD antenna?

 

It would probably offer an improved signal for your local channels if it bothers you a lot. I have DISH and really don't notice the degradation. It may just be me though because I really don't notice much difference between the regular definition TV and the HDTV channels except in rare cases.

Guest Black Label Society
It would probably offer an improved signal for your local channels if it bothers you a lot. I have DISH and really don't notice the degradation. It may just be me though because I really don't notice much difference between the regular definition TV and the HDTV channels except in rare cases.


WHAT?!?!?!?!


Dude, you're getting ROBBED then. There's NO comparison my man.


Doesn't an interlaced signal have that issue, hence the preference for a progressive scan signal?

WHAT?!?!?!?!


Dude, you're getting ROBBED then. There's NO comparison my man.

 

Nah, don't think so. What I mean is it pretty much all looks damn good to me. I can definitely tell the difference on Discovery HD and some of the channels Dish got from VOOM!, but movies and what not look good on their standard def channels as well as the HD channels.

 

Doesn't an interlaced signal have that issue, hence the preference for a progressive scan signal?

 

Progressive scan is certainly preferable to an interlaced signal, and I can't say for sure as it's been a while since I read up on this stuff, but from memory the main issue with interlaced signals is the potential to "flicker" since it's drawing every other line. I don't think pixelization is a huge issue with interlaced signals but I know compression can cause it. I won't say that with 100% certainty though.

Guest Black Label Society
Doesn't an interlaced signal have that issue, hence the preference for a progressive scan signal?


I don't know... my TV just auto-adjusts based on the signal. Should I turn it to 720p?

 

(Can I turn it manually to 720p?)


From cnet.com:

 

HDTV resolutions

 

Resolution, or picture detail, is the main reason why HDTV programs look so good. The standard-definition programming most of us watch today has at most 480 visible lines of detail, whereas HDTV has as many as 1,080. HDTV looks sharper and clearer than regular TV by a wide margin, especially on big-screen televisions. It actually comes in two different resolutions, called 1080i and 720p. One is not necessarily better than the other; 1080i has more lines and pixels, but 720p is a progressive-scan format that should deliver a smoother image that stays sharper during motion (for more on progressive scanning, see our primer). Another format is also becoming better known: 1080p, which combines the superior resolution of 1080i with the progressive-scan smoothness of 720p. True 1080p content is extremely scarce, however, and none of the major networks has announced 1080p broadcasts. Check out our comparison chart to see how HDTV stacks up against standard TV and progressive-scan DVD.

 

Also, here is another link, from which I take it that progressive pictures can get flicker and interlaced ones are susceptible to pixellation.
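For what it's worth, the 720p-vs-1080i trade-off in that CNET excerpt can be sanity-checked with quick arithmetic. Nominal broadcast figures are assumed here: 720p is 1280x720 at 60 progressive frames/s, while 1080i is 1920x1080 at 30 frames/s (delivered as 60 fields of 540 lines each).

```python
# Back-of-the-envelope pixel throughput for the two HD broadcast formats.
# Nominal figures only: 720p = 1280x720 at 60 full frames/s,
# 1080i = 1920x1080 at 30 frames/s (60 interlaced fields of 540 lines).

def pixels_per_second(width, height, frames_per_second):
    return width * height * frames_per_second

p720 = pixels_per_second(1280, 720, 60)
i1080 = pixels_per_second(1920, 1080, 30)

print(f"720p:  {p720:,} pixels/s")
print(f"1080i: {i1080:,} pixels/s")
```

The two come out within about 12% of each other, which is one reason people call the formats a wash: 1080i packs more pixels per frame, 720p refreshes the whole frame more often.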

From cnet.com:

Also, here is another link, from which I take it that progressive pictures can get flicker and interlaced ones are susceptible to pixellation.

 

Not to be argumentative, but I don't see the word 'pixel' in any form on that page. I should also note that I don't find that page very credible, because interlaced video is when you get flicker, not progressive. The main benefit of progressive scanning is to eliminate the flicker caused by interlacing.

Not to be argumentative, but I don't see the word 'pixel' in any form on that page.

All-in. Call me. :D


Seriously, I inferred it from this line

Two fields comprise the original image and are called a frame. This gets rid of the flicker, but it also does not display frames as clearly and solidly as progressive does

All-in. Call me. :D

Seriously, I inferred it from this line

 

Yeah, I figured that's where you got it from, but that's a very vague description. They need to be more specific, and as I said above, I'm not giving that page too much credibility anyway.

Guest Black Label Society
Not to be argumentative, but I don't see the word 'pixel' in any form on that page. I should also note that I don't find that page very credible, because interlaced video is when you get flicker, not progressive. The main benefit of progressive scanning is to eliminate the flicker caused by interlacing.


It's PC terminology. I figured somebody would know what I meant.


Now, I have to figure out if I can even manually turn this down to 720p.


Thanks Cgod and Strike! :D


http://www.theavguide.co.uk/view_page.php?page=2

 

"

 

Non-Interlaced (De-Interlaced)

 

Also known as progressive, this term refers to the way a video image is displayed on screen where each line of a frame (one complete video image) is drawn on screen one after the other (one, two, three, four, five, and so on). This method of displaying a video image is in contrast to interlaced images that draw all the odd lines and then all the even lines in two separate fields to create the final frame or complete image.

 

However, progressive or non-interlaced video produces a higher quality image. Interlaced video suffers from flicker problems due to the full image not being displayed and from alignment problems where the odd lines do not exactly line up with the even lines. Alignment problems can be particularly bad in video containing fast moving images. Additionally, since fewer lines are projected at any given time using the interlaced format, there is a subjective degradation of picture quality and resolution compared to non-interlaced video."
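The odd/even-field mechanism that quote describes can be sketched in a few lines. Purely illustrative: real deinterlacers are far more sophisticated.

```python
# Toy model of interlacing: one frame is split into two fields --
# the odd lines and the even lines -- drawn 1/60 s apart, then woven
# back together. If the scene moves between fields, the two sets of
# lines no longer agree: the "alignment problem" the quote mentions.

frame = [f"line {n}" for n in range(6)]   # one complete image

field_a = frame[0::2]                     # lines 0, 2, 4 (first field)
field_b = frame[1::2]                     # lines 1, 3, 5 (second field)

rebuilt = [None] * len(frame)
rebuilt[0::2] = field_a                   # weave the fields back together
rebuilt[1::2] = field_b

assert rebuilt == frame                   # static scene: fields line up
```

With a static scene the weave reconstructs the frame exactly; with motion, field_b would be captured a fraction of a second later than field_a, and the woven frame would show combing on moving edges.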

 

It's PC terminology. I figured somebody would know what I meant.


Now, I have to figure out if I can even manually turn this down to 720p.


Thanks Cgod and Strike! :D

 

BLS,

 

Even if you can change the resolution/format, you may not want to. I think, although I'm not 100% sure, that most digital receivers will figure out what format is coming in and set themselves automatically. What you don't want to do is manually set your tuner to, say, 720p, and then watch a 1080i program. If you can do that and you do, you're converting it from one format to another. It will look a lot worse converted than in its native format, even if the native format is inferior to the converted format. In other words, even if you believe 720p is better than 1080i, the only time that's true is if you're watching a 720p program. And, from what I've read on some forums, a 720p picture actually has about the same perceived picture quality as a 1080i signal. It gains on the progressive-scan side of the equation but loses on the actual resolution side.

As a sidenote, I've always thought it was stupid for them to come out with so many formats for HDTV. I know why they did it, but it's dumb and just creates massive confusion, as this thread shows. The bottom line is that any HD format is going to look a hell of a lot better than what we've had for the last 40 years.
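That advice amounts to a simple decision rule, sketched here with hypothetical names — no real receiver exposes exactly this API:

```python
# Toy decision rule for the advice above: a program only reaches the
# screen in its native format when it matches the output the tuner is
# locked to; otherwise the box must convert (deinterlace and/or rescale),
# which usually looks worse. All names here are hypothetical.

def display_path(program_format, box_output):
    if program_format == box_output:
        return "native pass-through"
    return f"converted {program_format} -> {box_output}"

print(display_path("720p", "720p"))    # best case: no conversion
print(display_path("1080i", "720p"))   # forced conversion, likely worse
```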

Guest Black Label Society
http://www.theavguide.co.uk/view_page.php?page=2

 

"

 

Non-Interlaced (De-Interlaced)

 

Also known as progressive, this term refers to the way a video image is displayed on screen where each line of a frame (one complete video image) is drawn on screen one after the other (one, two, three, four, five, and so on). This method of displaying a video image is in contrast to interlaced images that draw all the odd lines and then all the even lines in two separate fields to create the final frame or complete image.

 

However, progressive or non-interlaced video produces a higher quality image. Interlaced video suffers from flicker problems due to the full image not being displayed and from alignment problems where the odd lines do not exactly line up with the even lines. Alignment problems can be particularly bad in video containing fast moving images. Additionally, since fewer lines are projected at any given time using the interlaced format, there is a subjective degradation of picture quality and resolution compared to non-interlaced video."


BLS,

 

Even if you can change the resolution/format, you may not want to. I think, although I'm not 100% sure, that most digital receivers will figure out what format is coming in and set themselves automatically. What you don't want to do is manually set your tuner to, say, 720p, and then watch a 1080i program. If you can do that and you do, you're converting it from one format to another. It will look a lot worse converted than in its native format, even if the native format is inferior to the converted format. In other words, even if you believe 720p is better than 1080i, the only time that's true is if you're watching a 720p program. And, from what I've read on some forums, a 720p picture actually has about the same perceived picture quality as a 1080i signal. It gains on the progressive-scan side of the equation but loses on the actual resolution side.

As a sidenote, I've always thought it was stupid for them to come out with so many formats for HDTV. I know why they did it, but it's dumb and just creates massive confusion, as this thread shows. The bottom line is that any HD format is going to look a hell of a lot better than what we've had for the last 40 years.


Word. Thanks, man.


Strike, I see what you mean on that article, I actually got myself all twisted up.

 

I did a quick search for something, because it seemed to me I recalled reading somewhere that the interlace doesn't always match up perfectly, especially during motion, due to the (albeit minimal) delay, and that could cause the pixellation. (I thought it was the cnet article, but a quick scan of it turned up only what I quoted.)

 

Also, I think I recall something about inadequate upconversion causing a similar problem.

 

I'm sorry I don't have links; this is from memory, from research I did over six months ago.


I think no matter what the resolution of the signal you receive, your set will display the image at its native resolution. My guess is that your set is <1080, and that's interlaced.


I find the focus in HD is slower than in SD. But it's more the focus on surrounding objects, not the main object.

 

So watching golf, when the camera focuses on the ball rolling on the green, the blades of grass look fuzzy, but once the ball stops the blades come into focus.

 

Football.

FOX did a good job of not zooming too fast or too far in on the ballcarrier. NBC zoomed in too far/fast and had the same loss of focus for me. Annoying.


Rule of thumb is to always set your set top box's output to the native res of your TV. If you happen to have a 1080P television, then 1080i is gonna be the "best" since the newer 1080P sets have quality deinterlacing technology.

 

But if your box automatically switches between 720p and 1080i I would leave it alone. When it's 720p material you are fine and when it switches to 1080i material, your television is handling the scaling. Chances are your TV has a better scaler than your box.

 

Hmm, think I might have just repeated what you said Strike :thumbsdown:

 

Anyhow, 720p vs. 1080i is nothing to lose sleep over...


BLS,

 

Even if you can change the resolution/format, you may not want to. I think, although I'm not 100% sure, that most digital receivers will figure out what format is coming in and set themselves automatically. What you don't want to do is manually set your tuner to, say, 720p, and then watch a 1080i program. If you can do that and you do, you're converting it from one format to another. It will look a lot worse converted than in its native format, even if the native format is inferior to the converted format. In other words, even if you believe 720p is better than 1080i, the only time that's true is if you're watching a 720p program. And, from what I've read on some forums, a 720p picture actually has about the same perceived picture quality as a 1080i signal. It gains on the progressive-scan side of the equation but loses on the actual resolution side.

As a sidenote, I've always thought it was stupid for them to come out with so many formats for HDTV. I know why they did it, but it's dumb and just creates massive confusion, as this thread shows. The bottom line is that any HD format is going to look a hell of a lot better than what we've had for the last 40 years.


I think no matter what the resolution of the signal you receive, your set will display the image at its native resolution.

Correct.

 

My guess is that your set is <1080, and that's interlaced.

Nah, he said he has a DLP - so it's either 720p or 1080p if it's one of the new bad boys... all fixed-pixel display sets have a progressive native resolution.

Rule of thumb is to always set your set top box's output to the native res of your TV. If you happen to have a 1080P television, then 1080i is gonna be the "best" since the newer 1080P sets have quality deinterlacing technology.

 

But if your box automatically switches between 720p and 1080i I would leave it alone. When it's 720p material you are fine and when it switches to 1080i material, your television is handling the scaling. Chances are your TV has a better scaler than your box.

 

Hmm, think I might have just repeated what you said Strike :huh:

 

Anyhow, 720p vs. 1080i is nothing to lose sleep over...

Correct.

Nah, he said he has a DLP - so it's either 720p or 1080p if it's one of the new bad boys... all fixed-pixel display sets have a progressive native resolution.


Which football game were you watching? I personally find that the top CBS games of the week (shown in 1080i btw) are the best looking football programs - hell, the best looking HDTV programs - out there. And yes, I watch my HDTV for free with a cheapo old bowtie UHF antenna on a (relatively) cheapo (and heavy!) CRT HDTV I bought from Wal-Mart with a built-in HDTV tuner. It is true that good old free over-the-air HDTV is uncompressed and (theoretically) the best quality picture available.

Guest Black Label Society
Which football game were you watching? I personally find that the top CBS games of the week (shown in 1080i btw) are the best looking football programs - hell, the best looking HDTV programs - out there. And yes, I watch my HDTV for free with a cheapo old bowtie UHF antenna on a (relatively) cheapo (and heavy!) CRT HDTV I bought from Wal-Mart with a built-in HDTV tuner. It is true that good old free over-the-air HDTV is uncompressed and (theoretically) the best quality picture available.


Last night's NBC game. I'm gonna go get an over-the-air HD antenna and try that.


Thanks everyone for the input. :unsure:

Nah, don't think so. What I mean is it pretty much all looks damn good to me. I can definitely tell the difference on Discovery HD and some of the channels Dish got from VOOM!, but movies and what not look good on their standard def channels as well as the HD channels.

Progressive scan is certainly preferable to an interlaced signal, and I can't say for sure as it's been a while since I read up on this stuff, but from memory the main issue with interlaced signals is the potential to "flicker" since it's drawing every other line. I don't think pixelization is a huge issue with interlaced signals but I know compression can cause it. I won't say that with 100% certainty though.


I could answer with 100% certainty.... :thumbsdown:

Guest Black Label Society

Anybody else notice the Vikes Raiders game looking poor in HD?

Anybody else notice the Vikes Raiders game looking poor in HD?

It was relative to the quality of the teams on the field and not the broadcast technology.


I'm missing your point (no offense). You are watching a 1080i signal on a (theoretical) 1080i television. That's optimal, I agree. And yes, OTA HD is the most "pure" signal; I agree there too.

 

All I am saying is 1080i vs. 720p to my eyes is a wash. I've had both an RPCRT (1080i) and now have a 720p set. I think a lot depends on the equipment each station uses and how your HD provider gets the signal to you. Fox (to me) always seems to have the worst HD picture. CBS and ESPN are both awesome - as are NESN-HD and YES-HD (my HD is through Comcast).

 

 

Which football game were you watching? I personally find that the top CBS games of the week (shown in 1080i btw) are the best looking football programs - hell, the best looking HDTV programs - out there. And yes, I watch my HDTV for free with a cheapo old bowtie UHF antenna on a (relatively) cheapo (and heavy!) CRT HDTV I bought from Wal-Mart with a built-in HDTV tuner. It is true that good old free over-the-air HDTV is uncompressed and (theoretically) the best quality picture available.

