BonezX (Basement Dweller) Premium Member join:2004-04-13 Canada |
to snipper_cr
Re: 1080p?
said by snipper_cr: Wasn't there a post around here saying that 1080p has no real purpose?
Yeah, after blurring of the image and anti-aliasing you can't tell the difference between 1080p and 720p. That, and 1080p requires a lot of bandwidth, and is locked into using HDMI and all that DHCP crap.
|
Jerm join:2000-04-10 Richland, WA |
Jerm
Member
2006-Sep-7 1:59 pm
That's like saying there's no difference between...
That's like saying there's no difference between ... running your desktop @ 1600x1200 vs 1024x768. Duh, big difference.
Although for HD, it's actually 1920 x 1080 resolution or 1280 x 720.
XBOX360 & PS3 FTL - I'll keep my "higher than 1080p" resolution on my Dell 24" LCD, thank you very much.
|
BonezX (Basement Dweller) Premium Member join:2004-04-13 Canada |
BonezX
Premium Member
2006-Sep-7 2:12 pm
said by Jerm: That's like saying there's no difference between ... running your desktop @ 1600x1200 vs 1024x768. Duh, big difference. Although for HD, it's actually 1920 x 1080 resolution or 1280 x 720. XBOX360 & PS3 FTL - I'll keep my "higher than 1080p" resolution on my Dell 24" LCD, thank you very much.
You're comparing two different things. I run my computer at 2048x1536 and can't stand anything under 1600x1200. But remember, a console renders differently than a PC does: with a PC you can change the size of text and icons to make them look bigger or smaller in relation to the resolution, whereas a console always renders at the same scale regardless of resolution. Case in point, it's a TV, not a monitor. With a monitor you can easily tell the difference between 1024x768 and 1600x1200; with a TV, 1080i and 720p basically look the same.
|
dadkins (Can you do Blu?) MVM join:2003-09-26 Hercules, CA |
to BonezX
Re: 1080p?
[Attachments: 720p | 1080p | HDTV Resolution Chart]
Uh, not sure if your screen will display these, but mine will! There is a difference between 720p and 1080p... slightly. If your screen/TV/monitor cannot achieve 1920x1200 and simply shrinks a 1080p picture to fit on a lower res screen, yeah! There will be no difference. Some of us do have screens capable of displaying 1080p video @ 1920x1200. *WE* can see a difference. EDIT: This is my laptop - which just happens to be my TV/display/monitor/etc. Note the chart and the differences between Progressive and Interlaced...
|
BonezX (Basement Dweller) Premium Member join:2004-04-13 Canada |
BonezX
Premium Member
2006-Sep-7 2:35 pm
said by dadkins: Uh, not sure if your screen will display these, but mine will! There is a difference between 720p and 1080p... slightly. If your screen/TV/monitor cannot achieve 1920x1200 and simply shrinks a 1080p picture to fit on a lower res screen, yeah! There will be no difference. Some of us do have screens capable of displaying 1080p video @ 1920x1200. *WE* can see a difference. EDIT: This is my laptop - which just happens to be my TV/display/monitor/etc. Note the chart and the differences between Progressive and Interlaced...
2048x1536 - definitely below my display native. You can see the difference in the video because you are displaying 1920x1080 on a display above the video resolution. Set your res to 1280x720 and run the first vid, then 1920x1080 and run the second, and you will see my point.
|
dadkins (Can you do Blu?) MVM join:2003-09-26 Hercules, CA |
Your previous post is about 1080i; the PS3 games will be 1080p. The difference is night and day! The native resolution of this laptop *IS* 1920x1200. Lowering it will decrease clarity. Sorry pal, not going to happen! Here, I'll post the chart once again for you...
|
BonezX (Basement Dweller) Premium Member join:2004-04-13 Canada |
BonezX
Premium Member
2006-Sep-7 2:45 pm
Then you can't see the effect that a user using a TV will (except the effect a 1080p native TV does when you run 720p content on it).
|
dadkins (Can you do Blu?) MVM join:2003-09-26 Hercules, CA |
said by BonezX: Then you can't see the effect that a user using a TV will (except the effect a 1080p native TV does when you run 720p content on it).
EXACTLY! This *IS* my TV! Also my DVR, monitor, etc. There is a difference. If you purchase an "HDTV" that has a max of 1280x768 ~ 1340x800 resolution, then you just got burned! Anyone can shrink a 1920x1200 picture to a smaller size and it will only look as good as the screen will allow. What would be the point, though? Here, a stretched 720p picture on a 1920x1200 screen. Looks like crap, huh?
|
rawgerz (The hell was that?) Premium Member join:2004-10-03 Grove City, PA |
to dadkins
What the hell are you watching?! No wait, I don't want to know!
|
dadkins (Can you do Blu?) MVM join:2003-09-26 Hercules, CA |
|
|
Cheese Premium Member join:2003-10-26 Naples, FL |
to BonezX
said by BonezX: said by dadkins: Uh, not sure if your screen will display these, but mine will! There is a difference between 720p and 1080p... slightly. If your screen/TV/monitor cannot achieve 1920x1200 and simply shrinks a 1080p picture to fit on a lower res screen, yeah! There will be no difference. Some of us do have screens capable of displaying 1080p video @ 1920x1200. *WE* can see a difference. EDIT: This is my laptop - which just happens to be my TV/display/monitor/etc. Note the chart and the differences between Progressive and Interlaced... 2048x1536 - definitely below my display native. You can see the difference in the video because you are displaying 1920x1080 on a display above the video resolution. Set your res to 1280x720 and run the first vid, then 1920x1080 and run the second, and you will see my point.
Exactly what is your display, by chance?
|
|
owenhome (keeper of the magic blue smoke) Premium Member join:2002-07-13 Bentonville, AR |
to BonezX
You have it backwards. There's not any visible difference between 720p and 1080i, but there can be a very substantial difference with 1080p.
There's a big difference between the "p" and the "i": "p" stands for progressive, and "i" for interlaced. Basically, a 720p frame is painted on the screen as one solid image per frame. With 1080i, the image is created by splitting the original frame into lines: the image is split into fields of 540 lines each, displayed one after the other in an even/odd fashion 60 times per second. Your brain stacks these lines up into a solid image. Because of this process, each 540-line field is 1/60th of a second out of sync with the field before it, which means the lines in an even/odd pair of fields never quite match. The difference between the lines is referred to as "interlacing artifacts". The faster an object in the frame moves, the worse these artifacts become. When you view regular 480i (current non-HD TV) on a high resolution set (like a big-screen HDTV), these interlacing artifacts are horribly obvious, and 1080i has the same problem. Sometimes it looks like the image is being pulled apart line by line.
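If it helps, here's a rough sketch of that even/odd split (just toy Python, made-up names) showing why motion between the two fields turns into combing:

```python
# Toy sketch: split a 1080-line progressive frame into two 540-line fields.
# Purely illustrative; the names and the fake "motion" are made up.

def split_into_fields(frame_rows):
    """Return (even_field, odd_field) from a list of scanline rows."""
    return frame_rows[0::2], frame_rows[1::2]   # 540 lines each

# The two fields of a 1080i frame are captured 1/60 s apart, so a moving
# object sits in a slightly different spot in each one.
frame_t0 = [f"row {i} at t=0"    for i in range(1080)]
frame_t1 = [f"row {i} at t=1/60" for i in range(1080)]

even_field, _ = split_into_fields(frame_t0)   # even lines from time t=0
_, odd_field  = split_into_fields(frame_t1)   # odd lines from 1/60 s later

# Naive "weave" deinterlace: interleave the two fields back together.
# Any mismatch between them shows up as the combing artifacts described
# above, and it gets worse the faster things move.
woven = [line for pair in zip(even_field, odd_field) for line in pair]
print(len(woven))   # 1080 lines, but built from two different instants
```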
To lessen this horrible effect, the 1080i image is filtered frame by frame so that each 540-line field more closely matches its partner. This is done by reducing the real image's vertical resolution to far below 1080 actual lines - by around 60% - before it's split up into 540-line interlaced fields. This way there is much less actual picture information, so the difference between the lines is less apparent. Because this picture information is removed when the 1080i material is originally broadcast or recorded, it's gone forever, and no amount of technology in our TVs can ever replace it.
This 60% reduction in the real picture information is the Achilles heel of the 1080i format. With "p", there is no need to remove ANY picture information because the image is NEVER split up. With a 720p image, there is no reduction in image quality from the original source whatsoever.
When comparing the resolution of the actual presented image, the 1080i and 720p images are essentially the same. True, 720p has a lower resolution, but because so much picture information is removed from 1080i, the difference in the real information presented in the image on the TV screen is negligible. However, the "p" format has zero interlacing artifacts, so it is generally accepted as a better picture than 1080i.
Now comes 1080p. Imagine what we would have if that 60% was never removed. Wouldn't 1080 then be a better format than 720 if they were both "p"? Wouldn't all of the advantages 720p has over 1080i be gone? Wouldn't there be no more interlacing, no more 540-line BS, and no more artifacts? And wouldn't the higher resolution, without any kind of filtering and without any artifacts, be better? Absolutely! If there is no information removed from the image, and it is displayed as an entire frame instead of being split up, the higher real resolution 1080 offers is better. As such, 1080p carries much more picture information than 720p, whereas the picture information 720p and 1080i contain is pretty much the same, with 720p actually containing MORE at times.
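Quick back-of-the-envelope numbers if you want to sanity-check that - raw frame/field sizes only, ignoring the broadcast filtering I mentioned above:

```python
# Raw pixel counts per displayed frame or field, no compression or filtering.
px_720p_frame  = 1280 * 720     # 921,600 pixels per progressive frame
px_1080i_field = 1920 * 540     # 1,036,800 pixels per interlaced field
px_1080p_frame = 1920 * 1080    # 2,073,600 pixels per progressive frame

print(px_720p_frame, px_1080i_field, px_1080p_frame)
# A 720p frame and a single 1080i field are in the same ballpark, while a
# full 1080p frame carries roughly twice the picture information.
```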
Now, in the real world.
You don't sit 2' away from your TV like you do your computer monitor. With a 61" HDTV, the recommended viewing distance is 9-11'. At that distance, you wouldn't likely notice any difference between 1080i (and 720p) and 1080p in a direct comparison except for maybe the lack of some interlacing artifacts.
As for real HDTVs, some can't even display interlaced images anyway! With technologies like DLP and LCD, the way the picture is created and displayed on screen REQUIRES the picture to be displayed all at once. In a DLP set, the picture is created on a chip and reflected onto the screen; it can only work by displaying the entire image at once. With such TVs, the image must first be converted into a "p" format by the TV's image processor before it can be displayed. This is why they display a 720p image when given a 1080i source, and also why so many of them are natively 720p - there's no need to do anything else if the standard is 1080i. In order for these sets to operate truly at 1080, they must convert it to "p". The newer sets with 1080 lines of resolution are by nature going to be 1080p, so they convert 1080i into 1080p. But remember, the picture information was already lost forever when the source was created, so there is zero advantage to doing this. With a 1080i source, like broadcast HD, sets displaying 720p, 1080i, and 1080p will all contain approximately the same amount of actual picture information, meaning they will all look the same. But if an actual 1080p source is available, only the TV that is 1080p capable will receive any benefit.
As of right now, the only sources of 1080p in the world are Blu-ray and HD DVD players. Standard, non-Blu-ray players are only capable of 720p/1080i.
But even so, at a seated viewing distance, you will likely notice zero difference between the same set showing a movie at 720p and showing the same movie at true 1080p from a Blu-ray player. In fact, if the player and TV displaying the 720p movie are of sufficiently higher quality, it will likely even look vastly superior. We all know there's a lot more to how good something looks (or performs) than just what specs are stamped on it.
Broadcasters have adamantly opposed the 1080p format. Because of that, it's unlikely to become anything more than a novelty, at least in broadcast television (including cable/sat). It takes MUCH MUCH more bandwidth to transmit 1080p instead of 1080i. Only half of the 1080i image - only 540 lines - is being transmitted at any given time. With 1080p, the entire frame must be transmitted at once. But it's more than that: remember, this information is compressed when it's transmitted and decompressed when it's decoded by the HD receiver. Also remember that around 60% of the image's information is removed with 1080i, which also means it's more easily compressed. With 1080p, there's already twice as much information, plus that 60% is never removed, so the 1080p image is much less compressible. When you put it all together, it would take an enormous amount of bandwidth to carry a 1080p signal versus 1080i. Because of this, broadcasters consider 1080i "good enough" and adamantly oppose 1080p.
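For the curious, the raw (uncompressed) numbers behind that bandwidth argument come out roughly like this - assuming 60 fields/frames per second and about 1.5 bytes per pixel (8-bit 4:2:0); real broadcast bitrates depend entirely on the codec:

```python
# Rough uncompressed data-rate comparison, 1080i vs 1080p at 60 Hz.
# Assumes ~1.5 bytes per pixel (8-bit 4:2:0); actual broadcast bitrates
# depend on MPEG-2/H.264 compression and are far lower.
BYTES_PER_PIXEL = 1.5

rate_1080i = 1920 * 540  * 60 * BYTES_PER_PIXEL   # 60 half-height fields/s
rate_1080p = 1920 * 1080 * 60 * BYTES_PER_PIXEL   # 60 full frames/s

print(f"1080i: ~{rate_1080i / 1e6:.0f} MB/s uncompressed")
print(f"1080p: ~{rate_1080p / 1e6:.0f} MB/s uncompressed")
# Twice the raw data before compression, and it compresses less efficiently
# because none of the vertical detail has been filtered away.
```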
|
kamm join:2001-02-14 Brooklyn, NY |
to dadkins
You're confused, I believe, on multiple counts. 1. Downscaling certainly doesn't look like crap, unlike your example where you're uprezzing 720p to your native resolution with a pretty stupid, crappy quality software solution (the VGA driver). Besides this, almost every better monitor has smart stretch/zoom/etc. functions. Finally, 1080p-capable HDTVs are exclusively 1920x1080, not 1920x1200 like your notebook display, and they can be had for fair prices. I have a Sceptre 37" here which is native 1920x1080 - an excellent piece, you can have it for only $1,500, and it looks far better than your notebook display. PS: My Dell 24" - 1920x1200 native - has no problems with 1080p either.
|
BonezX (Basement Dweller) Premium Member join:2004-04-13 Canada |
to owenhome
That was a spelling error on my part; I did mean to put a P on the end of it. Remember, a 720p native and a 1080p native displaying the same content at each resolution will look exactly the same.
Displaying it on a device that does ABOVE that resolution (a computer monitor), you will see a difference in quality, that difference being the size of the display window relative to the rest of the display.
Cheese, my display is a 19" Samsung SyncMaster 997DF; native is 1920x1440, max is 2048x1536.
|
smcallah |
to BonezX
said by BonezX: and all that DHCP crap.
Man, I plugged in my HD-DVD player to HDMI, and it got an IP from my router. Wild.
|
BonezX (Basement Dweller) Premium Member join:2004-04-13 Canada |
BonezX
Premium Member
2006-Sep-7 5:16 pm
said by smcallah: said by BonezX: and all that DHCP crap. Man, I plugged in my HD-DVD player to HDMI, and it got an IP from my router. Wild.
That's the call-home; it's one of the 3 different iterations of DHCP. The other two are watermarking and self destruct.
|
|
Cheese Premium Member join:2003-10-26 Naples, FL |
to BonezX
said by BonezX: That was a spelling error on my part; I did mean to put a P on the end of it. Remember, a 720p native and a 1080p native displaying the same content at each resolution will look exactly the same. Displaying it on a device that does ABOVE that resolution (a computer monitor), you will see a difference in quality, that difference being the size of the display window relative to the rest of the display. Cheese, my display is a 19" Samsung SyncMaster 997DF; native is 1920x1440, max is 2048x1536.
Hmmmmmmm.... From the Samsung website: The Samsung SyncMaster 997DF is a 19-inch CRT monitor offering 0.20mm horizontal dot pitch, 30-96 kHz horizontal frequency, 50-160 Hz vertical frequency and a maximum 1920 x 1440 resolution. It utilizes Samsung DynaFlat(TM) display technology, which has no visible curvature of the internal screen surface (horizontally or vertically), resulting in razor-sharp images without distortion and only minimal glare.
|
| |
to BonezX
It's HDCP, not DHCP, which is completely different.
|
dadkins (Can you do Blu?) MVM join:2003-09-26 Hercules, CA |
to kamm
Ok, how good will a 1080p video look on a 720p screen? Only as good as the lower res screen will display, no more. You can't create more pixels than are there. Watching a 720p video source on a 1080p capable display will either be WAY letterboxed, or stretched to unwanted pixelation. No confusion there. From across the room, your TV will look better than my laptop screen at the same distance; up close... not so. At 1920x1080, there are fewer pixels. Plus, bigger screen, bigger pixels. All you are getting is more distance, not a better picture. PS: My laptop has a 17" display.
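The scaling math is simple enough to check yourself - just the ratios, nothing fancy:

```python
# Scale factors between a 720p source/panel and a 1080p source/panel.
scale_x = 1920 / 1280            # 1.5
scale_y = 1080 / 720             # 1.5
area_ratio = scale_x * scale_y   # 2.25

# Shrinking 1080p onto a 720p screen: every screen pixel has to stand in for
# about 2.25 source pixels, so detail is thrown away (but nothing is invented).
# Stretching 720p onto a 1080p screen: every source pixel has to cover about
# 2.25 screen pixels, which is why the blown-up picture goes soft and blocky.
print(area_ratio)
```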
|
BonezX (Basement Dweller) Premium Member join:2004-04-13 Canada |
to Cheese
said by Cheese: Hmmmmmmm.... From the Samsung website: The Samsung SyncMaster 997DF is a 19-inch CRT monitor offering 0.20mm horizontal dot pitch, 30-96 kHz horizontal frequency, 50-160 Hz vertical frequency and a maximum 1920 x 1440 resolution. It utilizes Samsung DynaFlat(TM) display technology, which has no visible curvature of the internal screen surface (horizontally or vertically), resulting in razor-sharp images without distortion and only minimal glare.
|
owenhome (keeper of the magic blue smoke) Premium Member join:2002-07-13 Bentonville, AR |
to BonezX
If you are dealing with a source that has the same resolution - let's say, for example, a standard 1080i source is being displayed - it's somewhat true that both a 720p and a 1080p HDTV will display a very similar picture. The quality won't be much different, but they will not be exactly the same.
The 720p unit will convert the 1080i source to 720p. The 1080p unit will simply de-interlace each frame. The pixels-per-second count of 720p and 1080i is roughly the same, but that's the only similarity.
A 1080i signal does not exactly equate to a 720p image when de-interlaced. It depends on several factors: the motion detection capabilities of the TV's video processor, the compression of the original source, and more. A 1080i image can only natively de-interlace to 1080p. For a 1080i image to be displayed at 720p, it must be converted: the two 540-line interlaced fields that make up a 1080i frame are summed and doubled, so it's really brought up to 1080p first. The biggest reason why we had 720p displays and not 1080p displays is simply that it was easier to make a display that could handle the amount of data involved with a 720-line non-interlaced image. The amount of data in a 1080i frame and a 720p image is almost the same, but not the actual resolution. A 1080-line non-interlaced image has about double the pixel rate, right at 2 million per frame. That's a crapload of data. Not to mention Texas Instruments was not able to make a DLP chip that had 1080 lines of resolution, and it was also cheaper and easier to make an LCD screen or rear projector with 720 lines of vertical resolution instead of 1080. But now the technology involved has gotten better, production costs have decreased, TI has a 1080 DLP chip, and the other manufacturers have followed suit. Other technologies could recreate 1080 lines more easily - CRT, for instance. Those have for the most part always been 1080i, but their electronics were not advanced enough to handle 2 MP frames, and there were not any 1080p sources anyway.
Yes, with computers, the situation is different. With games or Windows programs, the detail increases with increased resolution because every pixel has its own output. But when dealing with video on the computer, like playing a DVD or a video file, just like with a TV, the increased resolution makes no difference. The image is simply scaled to match the higher resolution, but there is no increase in quality whatsoever. One pixel in the original image becomes several. The image quality can actually be a lot worse, depending on how well it is scaled. It cannot, however, EVER be better - only worse.
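To make that last point concrete, here's a toy nearest-neighbour upscale - a made-up example, and real scalers use smarter filtering, but the principle is the same: no new detail gets created.

```python
# Toy nearest-neighbour upscale: each source pixel is simply repeated,
# so the bigger image contains exactly the same detail as the original.

def upscale_nearest(rows, factor):
    """Repeat each pixel `factor` times across and each row `factor` times down."""
    out = []
    for row in rows:
        wide_row = [px for px in row for _ in range(factor)]
        out.extend(list(wide_row) for _ in range(factor))
    return out

tiny = [[1, 2],
        [3, 4]]
print(upscale_nearest(tiny, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```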
|
dadkins (Can you do Blu?) MVM join:2003-09-26 Hercules, CA |
to BonezX
I can push my GPU that far as well; that doesn't make it the native res, though.
|
BonezX (Basement Dweller) Premium Member join:2004-04-13 Canada |
BonezX
Premium Member
2006-Sep-7 8:35 pm
Difference between mine and yours:
CRT, 4:3
And if you want to get specific, its "native" is 1600x1200, with its max being 1920x1440, and overscan to 2048x1536.
|
dadkins (Can you do Blu?) MVM join:2003-09-26 Hercules, CA |
Irrelevant, you seem to have missed this:
Maximum resolution: 1920 x 1440 at 64 Hz
Recommended resolution: 1600 x 1200 at 76 Hz
Maximum refresh rate: 1024 x 768 at 85 Hz
Horizontal frequency: 30 - 96 kHz
Vertical frequency: 50 - 160 Hz
That would make your native 1600x1200, friend! "Cheese, my display is a 19" Samsung SyncMaster 997DF; native is 1920x1440, max is 2048x1536." Uh, yeah. I give up, friend; you believe whatever you want.
|
BonezX (Basement Dweller) Premium Member join:2004-04-13 Canada |
BonezX
Premium Member
2006-Sep-7 8:51 pm
said by dadkins: Irrelevant, you seem to have missed this: Maximum resolution: 1920 x 1440 at 64 Hz; Recommended resolution: 1600 x 1200 at 76 Hz; Maximum refresh rate: 1024 x 768 at 85 Hz; Horizontal frequency: 30 - 96 kHz; Vertical frequency: 50 - 160 Hz. That would make your native 1600x1200, friend! "Cheese, my display is a 19" Samsung SyncMaster 997DF; native is 1920x1440, max is 2048x1536." Uh, yeah. I give up, friend; you believe whatever you want.
Did you not open the screencap? It IS 2048x1536 - or are you too stunned to click on an image?
|
Cheese Premium Member join:2003-10-26 Naples, FL |
Cheese
Premium Member
2006-Sep-7 9:11 pm
said by BonezX: said by dadkins: Irrelevant, you seem to have missed this: Maximum resolution: 1920 x 1440 at 64 Hz; Recommended resolution: 1600 x 1200 at 76 Hz; Maximum refresh rate: 1024 x 768 at 85 Hz; Horizontal frequency: 30 - 96 kHz; Vertical frequency: 50 - 160 Hz. That would make your native 1600x1200, friend! "Cheese, my display is a 19" Samsung SyncMaster 997DF; native is 1920x1440, max is 2048x1536." Uh, yeah. I give up, friend; you believe whatever you want. Did you not open the screencap? It IS 2048x1536 - or are you too stunned to click on an image?
Dunno then, those specs were directly from Samsung.
|
BonezX (Basement Dweller) Premium Member join:2004-04-13 Canada |
BonezX
Premium Member
2006-Sep-7 9:30 pm
[Attachment: 1080p VS my monitor]
said by Cheese: Dunno then, those specs were directly from Samsung.
If we believed specs, then Sony should have a supercomputer and not a game console. Also, you don't own or use one, so you're going on "stats and specs" where I'm going on "actual usage".
» stores.tomshardware.com/ ··· 135635//
» www.epinions.com/pr-Sams ··· ~reviews
Two others about doing 2048x1536.
|
sjr join:2006-08-27 Osseo, MN |
to dadkins
All of this sounds nice, but isn't this all a bit moot unless you are planning on playing a PS3 through your laptop?
|
koolman2 Premium Member join:2002-10-01 Anchorage, AK |
to BonezX
said by BonezX: ... and all that DHCP crap.
DHCP is a network protocol and has absolutely nothing to do with display technology.
|
BonezX (Basement Dweller) Premium Member join:2004-04-13 Canada |
BonezX
Premium Member
2006-Sep-7 10:05 pm
said by koolman2: said by BonezX: ... and all that DHCP crap. DHCP is a network protocol and has absolutely nothing to do with display technology.
HDCP; I was doing network crap around the time I wrote that.
|