BonezXBasement Dweller Premium Member join:2004-04-13 Canada |
to dadkins
Re: 1080p? said by dadkins: Irrelevant, you seem to have missed this: Maximum resolution: 1920 x 1440 at 64 Hz. Recommended resolution: 1600 x 1200 at 76 Hz. Maximum refresh rate: 1024 x 768 at 85 Hz. Horizontal frequency: 30 - 96 kHz. Vertical frequency: 50 - 160 Hz. That would make yours native 1600x1200, friend! "cheese, my display is a 19" samsung syncmaster 997DF, native is 1920x1440, max is 2048x1536." Uh, yeah. I give up, friend; you believe whatever you want. Did you not open the screencap? It IS 2048x1536, or are you too stunned to click on an image? |
|
Michieru2zzz zzz zzz Premium Member join:2005-01-28 Miami, FL |
.. Bad business model. If they expect most of their customers to have high-end equipment, they should think again. No kid will have a 1080p screen, or probably the money to pay for the PS3 anyway.
So who's going to buy? |
|
Cheese Premium Member join:2003-10-26 Naples, FL |
to BonezX
Re: 1080p? said by BonezX: said by dadkins: Irrelevant, you seem to have missed this: Maximum resolution: 1920 x 1440 at 64 Hz. Recommended resolution: 1600 x 1200 at 76 Hz. Maximum refresh rate: 1024 x 768 at 85 Hz. Horizontal frequency: 30 - 96 kHz. Vertical frequency: 50 - 160 Hz. That would make yours native 1600x1200, friend! "cheese, my display is a 19" samsung syncmaster 997DF, native is 1920x1440, max is 2048x1536." Uh, yeah. I give up, friend; you believe whatever you want. Did you not open the screencap? It IS 2048x1536, or are you too stunned to click on an image? Dunno then, those specs were directly from Samsung. |
|
Combat ChuckToo Many Cannibals Premium Member join:2001-11-29 Verona, PA |
to Maxo
Re: Heh... cutbacks said by Maxo: Yeah, it's pretty much a non-issue. While most of the processors made will not have all the cells working, the PS3 will be shipping with chips that have either 7 or 8 cells working. There is no reason (yet) to believe these cells have high burnout rates. If I were to buy a PS3 I would wait before buying to see what happens. It's a further non-issue for me as I'm going for the cheap-o Wii. It's just a CPU, and CPUs don't generally go bad. Besides, I seriously doubt the chip will be able to function if a cell burns out, whether it has 7 or 8 working to begin with. I imagine the chip goes through a grading process, and after that process something is done to lock in the number of cells the chip will be able to use; similar to how many of the first PII chips were made into Celerons. From what I've read previously about Cell, if Sony is shipping any PS3s with 7 cells, developers will only develop titles that use 7 cells. It doesn't seem to be a matter of throwing code at the processor and letting it figure out how to distribute it among the cells; the compiler generates code that says "I need six cells and here's what they'll do," so they'll have to compile the code they write for the lowest common denominator, that being seven cells. Furthermore, most of the games at launch won't even use the cells for much, relying on the PPC core to do the work. Again, this is what I get from various things I've read over the past year or so; correct me if I'm wrong. That DailyTech story seems fishy, if you didn't notice. |
|
| Combat Chuck |
to Karl Bode
Re: My prediction: said by Karl Bode: ...then having a hard time justifying a PS3 unless the killer exclusive titles are there, which I've yet to see... What? Hitting the weak point on giant historical crabs for massive damage isn't a killer game mechanic? |
|
|
BonezXBasement Dweller Premium Member join:2004-04-13 Canada 3 edits |
to Cheese
Re: 1080p? 1080p VS my monitor |
said by Cheese: Dunno then, those specs were directly from Samsung. If we believed specs, then Sony should have a supercomputer and not a game console. Also, you don't own or use one, so you're going on "stats and specs" where I'm going on actual usage. » stores.tomshardware.com/ ··· 135635// » www.epinions.com/pr-Sams ··· ~reviews. Two others about doing 2048x1536. |
|
sjr join:2006-08-27 Osseo, MN |
to dadkins
All of this sounds nice, but isn't this all a bit moot unless you are planning on playing a PS3 through your laptop?  |
|
|
viperpa33sWhy Me? Premium Member join:2002-12-20 Bradenton, FL |
to Michieru2
Re: .. said by Michieru: Bad business model. If they expect most of their customers to have high-end equipment, they should think again. No kid will have a 1080p screen, or probably the money to pay for the PS3 anyway.
So who's going to buy?
This is when the parents come into play.  |
|
koolman2 Premium Member join:2002-10-01 Anchorage, AK |
to BonezX
Re: 1080p? said by BonezX: ... and all that DHCP crap. DHCP is a network protocol and has absolutely nothing to do with display technology. |
|
Michieru2zzz zzz zzz Premium Member join:2005-01-28 Miami, FL |
to viperpa33s
Re: .. Yeah, I forgot how kids are spoiled by their parents these days. |
|
BonezXBasement Dweller Premium Member join:2004-04-13 Canada |
to koolman2
Re: 1080p? said by koolman2: said by BonezX: ... and all that DHCP crap. DHCP is a network protocol and has absolutely nothing to do with display technology. HDCP, I meant; I was doing network crap around the time I wrote that. |
|
koolman2 Premium Member join:2002-10-01 Anchorage, AK |
to Glaice
Re: My prediction:Indeed it does. Four GCN controller ports as well as two GCN memory card slots. |
|
| koolman2 |
to Rob A
said by Rob A:I want to get a wii so bad over ps3 for price. But just can't. That damn controller is gonna be why that system will fail ultimately unless they create an alternative. Wow. You're honestly the first person that I've seen that thinks that the controller is a bad idea. |
|
Rob AAdjusting Premium Member join:2005-01-17 Pompton Plains, NJ |
Rob A
Premium Member
2006-Sep-7 10:28 pm
said by koolman2: said by Rob A: I want to get a Wii so bad over the PS3 for the price. But I just can't. That damn controller is gonna be why that system ultimately fails, unless they create an alternative. Wow. You're honestly the first person I've seen who thinks the controller is a bad idea. That surprises me. Everyone I've talked to is opposed to the idea. The game I play the most is Madden; I can play it for hours on end. The last thing I wanna do is have to swing my hands around on every pass, run, kick, etc. |
|
djrobx Premium Member join:2000-05-31 Reno, NV |
to Michieru2
Re: ..There's a lot of middle-aged gamers.
-- Rob |
|
BonezXBasement Dweller Premium Member join:2004-04-13 Canada |
to Rob A
Re: My prediction: said by Rob A: said by koolman2: said by Rob A: I want to get a Wii so bad over the PS3 for the price. But I just can't. That damn controller is gonna be why that system ultimately fails, unless they create an alternative. Wow. You're honestly the first person I've seen who thinks the controller is a bad idea. That surprises me. Everyone I've talked to is opposed to the idea. The game I play the most is Madden; I can play it for hours on end. The last thing I wanna do is have to swing my hands around on every pass, run, kick, etc. Basically, try not to look completely insane while swearing at the players on screen. |
|
koolman2 Premium Member join:2002-10-01 Anchorage, AK |
to BonezX
Re: 1080p?An easy mistake to make!  |
|
owenhomekeeper of the magic blue smoke Premium Member join:2002-07-13 Bentonville, AR 2 edits |
to BonezX
Samsung started this BS, and it is BS. Anybody ever wonder why there are just a couple of monitors on the market that supposedly support these ridiculous resolutions? Samsung and Philips are the only two out there that do this, at least that I know of.
Anybody?
No?
Well, Mr. Bonez here is sort of right, and so is Dadkins. But, Dadkins is right....er.
See, a handful of manufacturers started doing this "virtual resolution" bullshit as a way to utilize the ultra-high resolutions newer video cards are capable of, to achieve more desktop real estate, and, in my opinion, to falsely state the capabilities of their product to increase sales. This is a function performed partly by the driver and partly by the screen itself. These manufacturers DO NOT list in their specifications the ACTUAL NATIVE RESOLUTION of the SCREEN, but RATHER the input resolution the MONITOR ITSELF IS CAPABLE OF ACCEPTING. Consequently, the NATIVE resolution we are so used to seeing and the resolution THE USER SELECTS FOR USE are two COMPLETELY DIFFERENT THINGS.
Any sane individual with at least marginal logic skills can understand that two 19" monitors, both of the same viewable area, same aspect ratio, and relatively identical pixel size, would have at least similar real (native) resolution. Simply put, they would have a similar number of real pixels. Not the trickery employed by some manufacturers (like Samsung), but rather the number of little colored dots. The total number of pixels, top to bottom and side to side, would be close to the same (probably the exact same). A monitor with a native resolution of, let's say, 1600x1200 would be composed of 1,920,000 total pixels. Why is it, then, that with two monitors which match in all other aspects, one will have 1.92 MP and the other will have about 64% more (3,145,728)? Well, IT CAN'T, because there ain't no friggin room to stick another million damn pixels!
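[Editor's note: the pixel arithmetic above is easy to check. A minimal sketch, using only the resolutions quoted in the post; nothing here is vendor-specific:]

```python
# Check the pixel counts quoted above: a panel's native resolution fixes
# the number of physical pixels, and a higher "virtual" resolution simply
# needs more pixels than the panel actually has.
def pixel_count(width, height):
    """Total physical pixels at a given resolution."""
    return width * height

native = pixel_count(1600, 1200)    # 1,920,000 pixels
virtual = pixel_count(2048, 1536)   # 3,145,728 pixels

# The "virtual" mode needs ~64% more pixels than the panel physically has,
# so it cannot be shown natively; it must be scaled down to fit.
extra = virtual / native - 1
print(f"native={native:,} virtual={virtual:,} extra={extra:.0%}")
```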
To be honest, it's a damn dishonest way to do business. But that doesn't stop them. They never claim it to be the native resolution, only the max, and they're right in that aspect. It will display a resolution of 2048x1536, albeit not actually at 2048x1536, but rather down-scaled to fit the monitor's actual (native) resolution. See, in order to actually and natively display 2048x1536, it must have 2048 pixels across and 1536 down. But, in fact, it does not. A monitor for which a recommended resolution is specified has an actual, native, true, pixel-by-pixel resolution equal to that recommended resolution.
This also means something else which is equally F-ED UP. See, since the resolution selected for use is not the native resolution, but rather one which is manufactured (down-scaled) by the driver, an updated driver can increase the display resolution selectable by the user. THIS DOES NOT INCREASE ACTUAL NATIVE RESOLUTION. It's just one higher notch that can be down-scaled to fit on the true native resolution. The reason the refresh is limited is a limitation only on how fast the monitor can handle down-scaling and how fast the video card can process it. With a fast enough scaler, and a fast enough video card, the maximum bullshit resolution's refresh rate would be equal to that of the recommended non-bullshit resolution.
For example, if we look at a monitor with these specifications....
Maximum resolution: 1920 x 1440 at 64 Hz Recommended resolution: 1600 x 1200 at 76 Hz Maximum refresh rate: 1024 x 768 at 85 Hz Horizontal frequency: 30 - 96 kHz Vertical frequency: 50 - 160 Hz
This monitor has a native resolution of 1600x1200 with the capability to display a 1920x1440 resolution through down-scaling. The 1920x1440 image is processed into 1600x1200, the true maximum. The maximum down-scalable resolution is never the recommended resolution because, due to the scaling process, a vast amount of picture information (pixels) is lost. There are sometimes millions of pixels that are simply not displayed, they are skipped or averaged into other pixels because the monitor just doesn't have that many.
Overscanning, well, Mr. Bonez is completely off base with that idea. Overscanning enlarges/crops the image and takes place on televisions, standard televisions. Overscanning cuts off all four sides of the actual image and "zooms in" toward the center. With overscanning, the beam that makes the image on the screen scans past the outside edges of the mask. Part of the image is literally drawn where we can never see it. Overscanning literally scans over the edges of the screen, hence the term.
Now, Mr. Bonez, your monitor does, in fact, have a native display resolution of 1600x1200. Higher resolutions, such as 1920x1440 or 2048x1536, are not in any way related to your native resolution; rather, they are a result of the down-scaling feature provided by your monitor's driver. Your monitor will accept resolutions this high, but they are converted to fit your actual pixel space of 1600x1200. Down-scaling the image to fit your screen destroys the image's fidelity, as nearly 40% of the pixels that make up the image are averaged away; thus, they no longer exist. That's why those resolutions are not "recommended".
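[Editor's note: the averaging down-scale described above can be sketched in a few lines. This is purely illustrative; it is not the algorithm any particular monitor's scaler actually implements:]

```python
def downscale(image, out_w, out_h):
    """Area-average down-scale of a grayscale image (list of rows).

    Each output pixel averages the block of source pixels that maps
    onto it, so detail from the larger "virtual" image is merged away.
    """
    in_h, in_w = len(image), len(image[0])
    out = []
    for y in range(out_h):
        row = []
        for x in range(out_w):
            # Source block covered by this output pixel.
            y0, y1 = y * in_h // out_h, (y + 1) * in_h // out_h
            x0, x1 = x * in_w // out_w, (x + 1) * in_w // out_w
            block = [image[yy][xx]
                     for yy in range(y0, max(y1, y0 + 1))
                     for xx in range(x0, max(x1, x0 + 1))]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A 4x4 "virtual" image squeezed onto a 2x2 panel: 16 source pixels
# become 4, and only averages survive.
img = [[0, 0, 100, 100],
       [0, 0, 100, 100],
       [200, 200, 50, 50],
       [200, 200, 50, 50]]
print(downscale(img, 2, 2))  # [[0.0, 100.0], [200.0, 50.0]]
```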
Arguing further on this topic is not necessary. Citing other product owners' equally deluded opinions is just as unnecessary. I would suggest you either research how monitors (yours in particular) actually work and what resolutions truly mean, or sit down and count the number of little lights from one end of your screen to the other. |
|
| |
xillusionx to djrobx
Anon
2006-Sep-8 2:09 am
Re: .. I think the gaming market is getting too pricey for parents, supposedly with the economy not as good as it was. The 360's pricing was not too bad, but the PS3 is an ungodly amount of money. I kept telling myself the 360 was too high, but I paid for it. Whether he's debating an LCD screen or not, there isn't much use for one other than movies and games. I mean, there isn't much content for it to subscribe to; only DISH Network offers almost 30 channels. I almost have to mount it above my regular TV, because LCD sucks when looking at regular cable channels.
-illusion |
|
GlaiceBrutal Video Vault Premium Member join:2002-10-01 North Babylon, NY |
to koolman2
Re: My prediction:Excellent! At least I can port over my work from my Mario games.  |
|
Speedy8 Premium Member join:2002-08-22 Alliance, OH 1 edit |
to ytsejammer39
Re: XBox I can actually run Quake 1 at 1920x1200 on my monitor, even better than 1080p, and the game is 10 years old.  |
|
| |
to djrobx
Re: .. I am a middle-aged gamer and I will buy the PS3 shortly after it comes out. The only problem is convincing my wife that we need another HDTV for it... I had a hard time getting my first one. LOL
I honestly can't see how a teenager can afford to buy equipment like this. I mean, when I was 15 years old, I was busy trying to save up for a car, let alone a video game console. |
|
| |
to snipper_cr
Re: 1080p? Anyone thinking there is little/no visible difference between 720p (progressive) and 1080i (interlaced) is limited by their eyesight, their display, or both. Sorry, I can easily tell the difference on my 92" display. And the larger my display, the more resolution I want in order to avoid pixel structure (digital) or scanlines (analog).
The chief benefit of 1080p source and transport is not having to a) DEinterlace, and b) live with DEinterlacing artifacts. Deinterlacing is a process-intensive task, and most manufacturers 'cheap' their way out of it by dropping half the fields in an interlaced frame and simply line-doubling what was left.
Also, someone's comment about 1080P needing HDMI is incorrect. That statement is ONLY applicable to the PS3. I can and have easily used RGBhv (analog) without HDCP (and not DHCP) on my computer for video playback as a source device. |
|
| GhostDoggy |
to kamm
Re: Can the PS3 Save Sony? I doubt any single product is going to make or break Sony. The PS3, costly due to its forced inclusion of the BD transport system, is part of a much larger project Sony hopes will bring in the revenue.
Adding 1080P playback for content (game, movie, etc.) to their SXRD (LCoS) display initiative is an even bigger and bolder attempt to do things their way and not consider anything from outside their own tight control. |
|
| GhostDoggy |
to Philmatic
Re: XBox Interlacing simply takes half of the vertical resolution and places it into a field instead of a frame. The other half of the vertical resolution is contained in a second field. The two fields are combined into a frame. The frame plays at the same rate as a progressive frame, BUT the two fields within the frame must share their allotted frame-time.
The act of interlacing video is neither difficult nor process-intensive, but the reverse is not true: deinterlacing an interlaced frame is very process-intensive if you want to avoid losing information or causing deinterlacing anomalies (artifacts).
A lot of manufacturers cheap their way out of deinterlacing by simply throwing away one of the fields in each frame, leaving a progressive frame of half the resolution. They then simply line-double this and call it the original progressive resolution. Those people should be shot, IMO.
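[Editor's note: the cheap path described above, discarding one field and line-doubling the other, can be sketched as follows. Illustrative only; the even-field/odd-field convention here is an assumption, and real deinterlacers interpolate far more carefully:]

```python
def split_fields(frame):
    """Interlacing: even-numbered lines form one field, odd lines the other."""
    return frame[0::2], frame[1::2]

def cheap_deinterlace(frame):
    """The shortcut criticized above: discard the odd field entirely,
    then duplicate each remaining line to restore the frame height.
    Half the vertical detail is simply gone."""
    even_field, _odd_field = split_fields(frame)
    out = []
    for line in even_field:
        out.append(line)
        out.append(list(line))  # line-double the kept line
    return out

frame = [[10], [20], [30], [40]]   # 4-line interlaced frame
print(cheap_deinterlace(frame))    # [[10], [10], [30], [30]]
```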
As video codecs get more and more efficient at compressing video, the task of deinterlacing gets even more difficult. Why? Because you can only scale a progressive frame, which means deinterlacing comes first. Deinterlacing must also be done in the uncompressed world, not while the video sits compressed in a cozy MPEG-2/4 codec.
The Microsoft Xbox 360 cannot do 1080P, but then again neither can any of the currently available HD DVD players, even though their movies are mastered to the disc as 1080P. But the 360 is worse, as its maker doesn't believe in anything beyond 720P at this time. |
|
| |
to BonezX
Re: 1080p? Actually, 1080p will still be possible with component cables. The only need for HDMI will be for Blu-ray movies (HDCP). This has been confirmed by Sony. |
|
pianojl join:2003-07-08 Sebastian, FL |
to HyPeRbAnD
Re: .. |
|
| |
to Michieru2
Sony is simply making a system with the future in mind. 1080p is not popular now, but within the next five years you can be sure there will be plenty of HDTVs that support that resolution. There is already a pretty decent number of high-end televisions out there that make that claim.
Having no 1080p capable games at launch is a non-issue. |
|
| SRFireside |
to owenhome
Re: 1080p? So what do you recommend we look for in a 1080p television to get true image quality, including display type (plasma, CRT, rear-projection, etc.)? |
|
MemphisPCGuyTaking Care Business Premium Member join:2004-05-09 Memphis, TN |
to SRFireside
Re: .. Will not having 1080P games at the product's end of life cycle be a non-issue as well? |
|