joelny
@mycingular.net

joelny

Anon

Native pass-through on a 1080p TV

Should I use native pass-through on a 1080p TV? The TV is a Sharp from 2010. Should this give me better picture quality compared to fixing the output at 720p?
PJL
join:2008-07-24
Long Beach, CA

PJL

Member

You should try it, but be aware you will likely see a delay when your TV switches from 1080i to 720p to 480i (all are used on various FiOS channels). I have the STB set to output 1080i and feel that is the better approach. The TV then converts the signal to its native 1080p.

matcarl
Premium Member
join:2007-03-09
Franklin Square, NY

matcarl

Premium Member

Even though it's annoying when the TV takes a few seconds to adjust to each different resolution, a lot of people like native so the signals don't have to be processed so many times. If you have your box set to 1080i all the time, then when you are watching a 720p channel, the signal is processed from 720p into 1080i by the box and then processed again by your TV into 1080p, which could result in some picture quality issues.
bwintx
join:2005-08-21
Plano, TX

bwintx to joelny

Member

to joelny
Simple question to ask yourself: "Do I tend to channel-surf?" And, if so, the less simple follow-up: "Do I surf between 720p and 1080i channels?" If the answer to both is "Yes," you probably don't want to set the box to native. However, if you tend to find a channel and stay there for a long time without looking in on something else, then native may well be preferable.
floridafuzz
Premium Member
join:2008-02-23
Hope, RI

floridafuzz to joelny

Premium Member

to joelny
I would like to add my 2 cents as well. I had all three TVs in the house set to native pass-through. The delay when switching channels drove my wife crazy! I left native on the big screen but took it off the TVs in the kitchen and bedroom. Try it and see if the delay when switching channels is tolerable.

bluepoint
join:2001-03-24

bluepoint to PJL

Member

to PJL
said by PJL:

The TV then converts the signal to its native 1080p.

I'm not dismissing your statement as false, but I'd like to know why the TV would convert to 1080p if the source is 1080i. Is there a reference on this that we can read? Thanks.
shark2k
join:2008-06-01
West Orange, NJ

shark2k

Member

said by bluepoint:

said by PJL:

The TV then converts the signal to its native 1080p.

I'm not dismissing your statement as false, but I'd like to know why the TV would convert to 1080p if the source is 1080i. Is there a reference on this that we can read? Thanks.

The TV needs to convert to its native resolution, which on a 1080p TV is 1080p.

-Shark2k
PJL
join:2008-07-24
Long Beach, CA

PJL

Member

said by shark2k:

said by bluepoint:

said by PJL:

The TV then converts the signal to its native 1080p.

I'm not dismissing your statement as false, but I'd like to know why the TV would convert to 1080p if the source is 1080i. Is there a reference on this that we can read? Thanks.

The TV needs to convert to its native resolution, which on a 1080p TV is 1080p.

-Shark2k

Thanks for the reply, Shark. That's correct -- a display must use its native resolution, so any other resolution from the source must be converted to that native resolution.

bluepoint
join:2001-03-24

bluepoint

Member

So why is it that when the source channel is in 480i format, the picture is small with a black border around it when the STB is set to 1080i, but stretched when the STB is set to native?
PJL
join:2008-07-24
Long Beach, CA

PJL

Member

said by bluepoint:

So why is it that when the source channel is in 480i format, the picture is small with a black border around it when the STB is set to 1080i, but stretched when the STB is set to native?

That is a function of how the device zooms (or does not zoom) the image, not the resolution.

bluepoint
join:2001-03-24

bluepoint

Member

I assume the TV has converted the resolution to 1080p at that point. I guess my question is, why does a 480i source not look as good as a 1080i source when the TV has converted both to its native resolution?
PJL
join:2008-07-24
Long Beach, CA

PJL

Member

said by bluepoint:

I assume the TV has converted the resolution to 1080p at that point. I guess my question is, why does a 480i source not look as good as a 1080i source when the TV has converted both to its native resolution?

You're asking about picture quality, I guess. Converting for display does not (significantly) improve the quality of the source. A 1080i image is higher quality (smaller pixels, in a sense) than a 480i one. The upconversion doesn't add information that isn't in the source; it just displays it at the higher resolution by interpolating the signal.
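
To make that point concrete, here's a minimal Python sketch (illustrative only; the functions and numbers are mine, not anything from a real scaler): both simple and interpolating upscalers produce more pixels, but every output value is just a copy or blend of the input, so no new detail appears.

def upscale_nearest(row, factor):
    """Duplicate each source pixel 'factor' times (simple upscaling)."""
    return [p for p in row for _ in range(factor)]

def upscale_linear(row, factor):
    """Linearly interpolate between neighboring source pixels."""
    out = []
    for i in range(len(row) - 1):
        for k in range(factor):
            t = k / factor
            out.append(row[i] * (1 - t) + row[i + 1] * t)
    out.append(row[-1])
    return out

# One coarse scan line stands in for a 480i source.
src = [10, 200, 40, 90]
print(upscale_nearest(src, 2))  # [10, 10, 200, 200, 40, 40, 90, 90]
print(upscale_linear(src, 2))   # [10.0, 105.0, 200.0, 120.0, 40.0, 65.0, 90]
# More pixels, but only copies/blends of the originals: no added detail.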
elefante72
join:2010-12-03
East Amherst, NY

elefante72 to joelny

Member

to joelny
Don't confuse panel resolution with content resolution and scan type.

First off, you need to go to your TV manual and see what the native panel resolution is. For instance, a 720 TV is usually 1280x768 or the like; a 1080 panel is 1920x1080 or something around that.

Next, don't confuse content resolution. Today, high def in the US is either 720p or 1080i. SD is always interlaced, at 480i. Some programming is actually produced in 24p and sent as 1080i in the transport stream.

No broadcasters that I know of broadcast in 1080p (which really means 1080p30). SuperHD from Netflix is an example of a potential 1080p stream.

Next, p is progressive, meaning 24 or 30 full scans per second, and interlaced is 60 scans per second over half the field. The actual values are a little different, but you can read up on ATSC if you are really interested.

Since interlaced signals were meant for electron-gun (CRT) TVs, newer plasma and HD TVs convert the interlaced content into a progressive panel output using image processing and pulldown. Some TVs do it better than others, and this has little to do with the panel resolution and everything to do with the electronics in the TV.

Now if the panel is a 1080 panel, then no matter what signal you feed it (1080i, 720p, or 480i), the TV must process that signal and convert it to the panel's native resolution, typically at a 30 or 24 refresh rate depending on the source signal. When you see a 240Hz LCD panel, that means the TV is taking the signal and converting it to a 240Hz refresh on the panel, which, if you notice, is not a broadcast refresh rate. Plasmas refresh at 600Hz. All of these games affect output quality, depending on how the electronics handle the processing.

The delay with native pass-through when switching resolutions is due to an HDMI handshake, not an issue with the TV or STB. You can thank DRM for that.

So if you set the resolution to a fixed value in the STB, the STB processes the picture, and then the TV reprocesses it when it's sent over. Obviously two levels of processing is not ideal, but most people won't notice. If you leave it on native, then you deal with the HDMI handshake and one level of processing, on the TV.

Also be aware that certain channels on Verizon are now moving to H.264 (referred to as MPEG-4), and many older TVs do a piss-poor job of decoding them, so it may be necessary for the STB to decode them first.

So if you have a 1080 panel, it's best to set the STB resolution to 1080i, and 720p if you have a 720 panel. Again, you must look up the panel resolution in your TV manual. I have seen TVs advertised as 1080p that have 720 panels; just because they can decode 1080p does not mean they can display 1080p at its native resolution.

bluepoint
join:2001-03-24

bluepoint to PJL

Member

to PJL
Oops, sorry, I only just showed up because we lost power in the middle of the storm. It appears I was comparing apples and oranges, now that I've read elefante72's lengthy lesson. I think I have a good idea of what's going on now. Thanks for your time, and thanks to everyone who contributed to my question.

TitusTroy
join:2009-06-18
New York, NY

TitusTroy to joelny

Member

to joelny
this thread is confusing me...I have a Panasonic 1080p plasma set and I've always preferred setting the cable box to output 'native passthrough'...meaning a 720p channel like FOX gets output in 720p and a 1080i channel like NBC gets output in 1080i...I always prefer the channel outputting its native signal with no upscaling (I don't mind the slight delay when switching from a 720p channel to 1080i and vice versa)

so when does my 1080p TV upscale anything to 1080p?...if I set it to 1080i in the cable box won't it upconvert every channel to 1080i?...and if I set it to 720p won't it output a max of 720p?...how does 1080p figure into anything
tnsprin
join:2003-07-23
Bradenton, FL

1 edit

tnsprin

Member

1080p as used on most TV displays is actually 1080p60 (60 frames per second). 720p is 720p60. 1080i is 60 fields a second, but only 30 frames per second. Usually, going native pass-through will improve the picture on a 1080p set, as the frame rate is maintained throughout the conversion process. Converting to 1080i and then to 1080p means you first reduce the frame rate while interpolating the pixels to get the larger pixel count of the screen, and then try to double the frame rate back (some sets just show the same frame twice).
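
A toy sketch of that frame-rate argument in Python (my own simplification; no real STB works this way, and real 1080i fields can carry more temporal information than this suggests):

source_720p60 = list(range(60))     # 60 unique frames in one second

# STB path: 720p60 -> 1080i30 keeps only 30 full frames per second.
via_1080i = source_720p60[::2]

# TV then doubles 30 -> 60 Hz by showing each frame twice.
displayed = [f for f in via_1080i for _ in (0, 1)]

print(len(set(source_720p60)))  # 60 unique frames with native pass-through
print(len(set(displayed)))      # 30 unique frames via the 1080i step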

More Fiber
MVM
join:2005-09-26
Cape Coral, FL

More Fiber

MVM

said by tnsprin:

(60 frames per minute).

That would be frames per second, not minute.

bluepoint
join:2001-03-24

bluepoint to TitusTroy

Member

to TitusTroy
said by TitusTroy:

so when does my 1080p TV upscale anything to 1080p?...if I set it to 1080i in the cable box won't it upconvert every channel to 1080i?...and if I set it to 720p won't it output a max of 720p?...how does 1080p figure into anything

Exactly what I was thinking. When I change channels from 1080i to 720p/480i with the native setting, the TV pauses to adjust the resolution to what it receives from the source (this is what causes the delay). To my naked eye, there are differences between the resolutions. If all feeds were converted to 1080p by the TV, then why does a 1080i channel look very different from a 720p one?
elefante72
join:2010-12-03
East Amherst, NY

elefante72 to TitusTroy

Member

to TitusTroy
If the panel is a 1920x1080 panel, the electronics in the TV must convert whatever broadcast signal is coming in to the native panel resolution.

So a 720 signal -> 1920x1080
A 1080 signal -> 1920x1080
a 480 signal -> 1920x1080

Broadcast:

720p (30 or 60 fps, actually a little less). I don't see much 60 fps material.
1080i (60fps or 24fps (tagged))

Now most plasmas take that signal and convert it to a 600Hz refresh. LCD does 60, 120, or 240, or something in between. The "judder" from LCD is due to the GTG (gray-to-gray) response times of cheaper panels and the inferior electronics in some cheaper models. Plasmas rarely had that issue because they were designed as a presentation format, unlike LCD, which has been retrofitted.

The largest issue always comes from 24p sources, which are movies and certain HD prime-time broadcasts. The TV has to use pulldown or a number of other tricks (that is why a 120/240Hz LCD is theoretically better: it can sync to the source signal at a whole-number multiple of the refresh rate, while a 60Hz panel cannot). Amazingly, many of the 120/240 panels still do a crappy job at this, and people see judder.
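
Here's a small Python sketch of that whole-number point (the cadence function is a hypothetical illustration, but the arithmetic is standard 3:2 pulldown): 60 is not an even multiple of 24, so frames get unequal screen time; 120 is, so they don't.

def pulldown_pattern(source_fps, panel_hz, n=8):
    """How many panel refreshes each of the first n source frames gets,
    spreading the leftover refreshes as evenly as possible."""
    base, extra = divmod(panel_hz, source_fps)
    out, acc = [], 0
    for _ in range(n):
        acc += extra
        if acc >= source_fps:
            acc -= source_fps
            out.append(base + 1)   # this frame is held one refresh longer
        else:
            out.append(base)
    return out

print(pulldown_pattern(24, 60))   # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven cadence, judder
print(pulldown_pattern(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> even cadence, no judder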

bull3964
@208.40.147.x

bull3964 to bluepoint

Anon

to bluepoint
said by bluepoint:

Exactly what I was thinking. When I change channels from 1080i to 720p/480i with the native setting, the TV pauses to adjust the resolution to what it receives from the source (this is what causes the delay). To my naked eye, there are differences between the resolutions. If all feeds were converted to 1080p by the TV, then why does a 1080i channel look very different from a 720p one?

The pause is due to an HDMI handshake renegotiation caused by the change in resolution. When you have the STB just outputting 1080i, that's not necessary, since it's the first link in the chain and the resolution is changed before it hits the HDMI output. However, when the TV is handling the scaling, the STB basically has to break the connection to the TV and create a new one when the resolution of the signal changes. This causes the pause you see.

On digital displays, no matter the technology, the signal must ultimately be converted to the native resolution of the display. Each pixel is given a color and intensity to display. If your input signal is 1280x720 but your display is 1920x1080, pixels need to be added to the frame in order for the picture to fill the full screen. Otherwise, a 720p broadcast would be windowboxed completely on a 1080p TV.

This is upscaling. Upscaling can be simple (multiplication of pixels to increase the resolution of the picture) or it can be complex (a computer algorithm tries to guess what the missing information should be based on the surrounding pixels.) You can see examples of how these different techniques work here (»en.wikipedia.org/wiki/Im ··· _scaling). Things are even more complex with video as you have a temporal component (a series of images) that can assist in determining what the missing information could be and due to the fact that the signal may be interlaced (picture divided up into alternating lines from two different frames.)

So, the end image quality you see is determined by the source and by the upscaling method. With a 480i input source, upscaling can only do so much. Detail that doesn't exist just doesn't exist and even with the best scaling algorithms, the end result is going to be a bit fuzzy and lacking in detail. The same goes for 720p, only less severe since you start out with a much better signal.

As to the first question, whether or not to use native output, the real answer is "it depends." If you have a higher end TV with advanced picture processing or if you have a higher end HDMI receiver with a good scaling chip built in, it's probably better to set the output to native and let the other devices handle the scaling because they will probably do a better job. Otherwise, just set the output resolution to match (1080i -> 1080p doesn't need scaling, just deinterlacing and the content should be such that any TV can handle the deinterlacing more or less perfectly.)
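
For the "1080i -> 1080p just needs deinterlacing" case, here's a minimal sketch of the simplest method, a weave (the names are mine; real TVs use fancier motion-adaptive logic): the two half-height fields are interleaved back into one full progressive frame.

def weave(top_field, bottom_field):
    """Interleave two interlaced fields into one progressive frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # frame lines 0, 2, 4, ...
        frame.append(bottom_line)  # frame lines 1, 3, 5, ...
    return frame

# Two 2-line toy "fields" stand in for the 540 lines of each 1080i field.
print(weave(["line0", "line2"], ["line1", "line3"]))
# ['line0', 'line1', 'line2', 'line3']

Weave is essentially perfect when both fields come from the same instant (film-sourced content); motion between fields is what makes real deinterlacers more complicated.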

matcarl
Premium Member
join:2007-03-09
Franklin Square, NY

matcarl

Premium Member

said by bull3964 :

The pause is due to an HDMI handshake renegotiation caused by the change in resolution.

I don't believe so, as I think it happens if you are hooked up to Component also. I haven't had that hookup in quite some time, but I believe when I did, the same pause happened.

MeatChicken
join:2007-08-15
Paramus, NJ

MeatChicken

Member

said by matcarl:

said by bull3964 :

The pause is due to an HDMI handshake renegotiation caused by the change in resolution.

I don't believe so, as I think it happens if you are hooked up to Component also. I haven't had that hookup in quite some time, but I believe when I did, the same pause happened.

It's probably just a "resolution handshake," which would happen with HDMI or Component, where the TV needs a second or three to reconvert the input to the screen resolution...

bull3964
@208.40.147.x

bull3964 to matcarl

Anon

to matcarl
said by matcarl:

I don't believe so, as I think it happens if you are hooked up to Component also. I haven't had that hookup in quite some time, but I believe when I did, the same pause happened.

Same outward effect, but likely for slightly different reasons. No matter what interconnect technology is used, you are breaking sync with the display to shift modes when you have the STB outputting native. If the STB is doing the scaling, then the continuity of the video signal never breaks, so the TV doesn't have to resync the input.

You can see the same effect on a PC by playing with the driver options. On some video cards, you can have the video card do the scaling or allow the display to do the scaling. If you choose the former, your display won't blank out while going from a 1080p desktop to a 720p game. If you choose the latter, the video sync is broken as the display changes modes to accept the new input resolution, and you see a blink.

Really, the TV is shifting to a temporary mode of "no input signal" between changes in resolution and that causes the pause you see.

joelny
@mycingular.net

joelny to matcarl

Anon

to matcarl
720p feeds would look really bad at 1080i on my 1080p TV. Setting the TV to 720p helped. I think native pass-through fixed this issue. I only selected 720p and 1080i, so there's not that much switching. My other TVs I just left on 720p.

Napsterbater
Meh
MVM
join:2002-12-28
Milledgeville, GA
(Software) OPNsense
Ubiquiti UniFi UAP-AC-PRO

Napsterbater to elefante72

MVM

to elefante72
said by elefante72:

Also be aware that certain channels on Verizon are now moving to H.264 (referred to as MPEG-4), and many older TVs do a piss-poor job of decoding them, so it may be necessary for the STB to decode them first.

The TV sees no difference between an MPEG-2 and an MPEG-4 channel; it doesn't know any better, because it never sees that info (otherwise you would have needed a new TV when MPEG-4 came out). The cable box, no matter what mode it's in, "unwraps" the video and sends it to the TV. Pass-through/native just sends it at the same resolution it was received at, whereas the other mode (fixed resolution from the cable box) forces the box to unwrap the signal, convert the resolution, and then send it to the TV.
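
A toy model of that flow in Python (the function and field names are mine, not anything from actual FiOS firmware): decoding always happens in the box; the native/fixed setting only controls the rescale step.

def decode(channel):
    """The box always decodes MPEG-2 or H.264 into raw frames."""
    return {"res": channel["res"], "frames": "raw video"}

def stb_output(channel, mode):
    frame = decode(channel)        # happens in every mode
    if mode != "native":
        frame["res"] = mode        # fixed mode: the box also rescales
    return frame                   # native mode: resolution passes through

fox = {"codec": "H.264", "res": "720p"}
print(stb_output(fox, "native"))   # {'res': '720p', 'frames': 'raw video'}
print(stb_output(fox, "1080i"))    # {'res': '1080i', 'frames': 'raw video'}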

bluepoint
join:2001-03-24

2 edits

bluepoint to bull3964

Member

to bull3964
said by bull3964 :

However, when the TV is handling the scaling, the STB basically has to break the connection to the TV and create a new one when the resolution of the signal changes. This causes the pause you see.

I think you got it in reverse. When the STB is on the native setting, it is actually just passing the signal to the TV. Once the TV senses that the signal from the source has changed, it pauses and adjusts the resolution according to the signal it receives.
said by bull3964 :

If your input signal is 1280x720 but your display is 1920x1080, pixels need to be added to the frame in order for the picture to fill the full screen. Otherwise, a 720p broadcast would be windowboxed completely on a 1080p TV.

The above highlighted statement doesn't seem to be what happens on my cheap LED TV. When a 1280x720 input signal is received by the TV and the previous channel was in 1920x1080 resolution, it adjusts to 1280x720 resolution, as indicated by the TV during the pause, and likewise it adjusts when a 1920x1080 signal arrives after a 1280x720 channel.
said by bull3964 :

This is upscaling. Upscaling can be simple (multiplication of pixels to increase the resolution of the picture) or it can be complex (a computer algorithm tries to guess what the missing information should be based on the surrounding pixels.)

The upscaling you're describing is not what I experience with my LED TV. What I see is that it adjusts to the native resolution it receives, and that's why I'm confused by you and others saying the TV converts it to 1080p.

Here's my experience when the STB is in native mode.

a. When watching a 1920x1080 channel and I change to a 1280x720 channel, the TV adjusts to 1280x720 resolution.

b. When watching a 1280x720 channel and I change to a 1920x1080 channel, the TV adjusts to 1920x1080 resolution.

c. When watching a 1280x720 channel and I change to another 1280x720 channel, the TV does not pause and does not change resolution.

d. When watching a 1920x1080 channel and I change to another 1920x1080 channel, the TV does not pause and does not change resolution.

TitusTroy
join:2009-06-18
New York, NY

TitusTroy to elefante72

Member

to elefante72
said by elefante72:

If the panel is a 1920x1080 panel, the electronics in the TV must convert whatever broadcast signal is coming in to the native panel resolution.

So a 720 signal -> 1920x1080
A 1080 signal -> 1920x1080
a 480 signal -> 1920x1080

I'm still confused...so my 1080p TV is converting everything to 1080p (1920 x 1080) even if I have my cable box set to native passthrough?...if that's the case then what's the point of changing that setting?...instead of native, what happens if I set it to 720p or 1080i?...I must be missing something simple here...so if I change the setting in my cable box to native, my cable box is doing the scaling rather than my TV, correct?

as far as 24p sources (Blu-ray mostly) there is a setting in the Blu-ray player which can output 24p (60Hz, 90Hz etc)
TitusTroy

1 edit

TitusTroy to bull3964

Member

to bull3964
said by bull3964 :

On digital displays, no matter the technology, the signal must ultimately be converted to the native resolution of the display. Each pixel is given a color and intensity to display. If your input signal is 1280x720 but your display is 1920x1080, pixels need to be added to the frame in order for the picture to fill the full screen. Otherwise, a 720p broadcast would be windowboxed completely on a 1080p TV.

This is upscaling. Upscaling can be simple (multiplication of pixels to increase the resolution of the picture) or it can be complex (a computer algorithm tries to guess what the missing information should be based on the surrounding pixels.) You can see examples of how these different techniques work here (»en.wikipedia.org/wiki/Im ··· _scaling). Things are even more complex with video as you have a temporal component (a series of images) that can assist in determining what the missing information could be and due to the fact that the signal may be interlaced (picture divided up into alternating lines from two different frames.)

So, the end image quality you see is determined by the source and by the upscaling method. With a 480i input source, upscaling can only do so much. Detail that doesn't exist just doesn't exist and even with the best scaling algorithms, the end result is going to be a bit fuzzy and lacking in detail. The same goes for 720p, only less severe since you start out with a much better signal.

As to the first question, whether or not to use native output, the real answer is "it depends." If you have a higher end TV with advanced picture processing or if you have a higher end HDMI receiver with a good scaling chip built in, it's probably better to set the output to native and let the other devices handle the scaling because they will probably do a better job. Otherwise, just set the output resolution to match (1080i -> 1080p doesn't need scaling, just deinterlacing and the content should be such that any TV can handle the deinterlacing more or less perfectly.)

thanks for the in-depth explanation...it makes more sense now...so when setting the cable box to native passthrough, does that mean my TV does the scaling, or the receiver?...I have a high-end plasma from 2011 and a Denon receiver from the same year...and instead of native, if I set it to 720p or 1080i, that means my cable box is doing the scaling, correct?
PJL
join:2008-07-24
Long Beach, CA

PJL to TitusTroy

Member

to TitusTroy
said by TitusTroy:

I'm still confused...so my 1080p TV is converting everything to 1080p (1920 x 1080) even if I have my cable box set to native passthrough?...if that's the case then what's the point of changing that setting?...instead of native, what happens if I set it to 720p or 1080i?...I must be missing something simple here

If you set the STB to do the conversion to either 720p or 1080i, then the STB will convert all three signal types to that resolution and the TV will then convert it to 1080p. If you use pass-through, then the TV itself converts each signal to 1080p without an intermediate conversion in the STB. Some TVs may do the conversion better than the STB, but that is sometimes a subjective call (see the last comment below).

Knowing this, I played with pass-through, 720p, and 1080i into my 1080p TV. The picture looked best for me with 1080i out of the STB (720p was washed out - soft - and looked inferior), especially since my TV took time to switch resolutions when changing channels with native output, and there is no delay with the STB set to 1080i. I did not want such a big delay. (Other TVs may be better.)

As a last comment, I would again recommend you play with the settings and see what looks best for you.

bull3964
@208.40.147.x

bull3964 to bluepoint

Anon

to bluepoint
said by bluepoint:

I think you got it in reverse. When the STB is on the native setting, it is actually just passing the signal to the TV. Once the TV senses that the signal from the source has changed, it pauses and adjusts the resolution according to the source.

No, I do not. The STB is just passing along the signal it sees, but that output varies with the channel: for one channel the STB output is 480i, for another it's 720p, for another it's 1080i. Every single time that output resolution switches, the STB momentarily turns off the video output and then turns it back on at the new resolution. That causes the pause you see. When the STB is set to output a single resolution (such as 1080i), the signal continuity with your TV is never interrupted as you change between channels with different native resolutions, so you don't see a pause before the picture re-establishes itself.
said by bluepoint:

The above highlighted statement doesn't seem to be what happens on my cheap LED TV. When a 1280x720 input signal is received by the TV and the previous channel was in 1920x1080 resolution, it adjusts to 1280x720 resolution, as indicated by the TV during the pause, and likewise it adjusts when a 1920x1080 signal arrives after a 1280x720 channel.

The upscaling you're describing is not what I experience with my LED TV. What I see is that it adjusts to the native resolution it receives, and that's why I'm confused by you and others saying the TV converts it to 1080p.

You are confusing what your TV says in its info bar with what's actually happening. If I feed my 1080p TV a 720p signal, the TV info says 720p. However, it's only reporting what the input resolution is; the TV is still scaling it to 1080p. It is physically impossible for a fixed-pixel display to output anything other than its native panel resolution. Any input signal that is not the same as the native panel resolution must be scaled in one way or another. Otherwise, a 720p signal on a 1080p TV would show up as a little square in the middle of the screen, with the pixels mapped 1:1 between the input signal and the panel's native resolution.
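
The arithmetic behind that "little square" remark, as a quick sketch (standard resolution numbers, nothing vendor-specific):

src_w, src_h = 1280, 720        # 720p input signal
panel_w, panel_h = 1920, 1080   # native 1080p panel

coverage = (src_w * src_h) / (panel_w * panel_h)
print(f"1:1 pixel mapping would light only {coverage:.0%} of the panel")  # 44%
print(panel_w / src_w, panel_h / src_h)  # 1.5 1.5 -> the scaler must enlarge by 1.5x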