nwa122
Member
join:2002-01-29 Wayne, PA
2002-Jul-16 3:47 pm
16 bit or 32 bit
What is the difference between 16 and 32 bit, what do they do, and what are the advantages? Thanks guys
Hieronymus
16 and 32 bit what? Programs? Color? Can you be a little more specific?
slash MVM join:2001-03-01 Boston
to nwa122
16 bit is 16 bit and 32 bit is 32 bit. 32 bit is newer and more robust. The bits refer to the software and/or hardware architecture. 64 bit is currently in the works and should be widespread by the end of 2003.
edit: Hieronymus makes a good point. What are you referring to?
nwa122
Member
join:2002-01-29 Wayne, PA
2002-Jul-16 4:08 pm
Like 16 bit color or 32 bit color, and this 64 bit color soon to come out. What are the benefits, and how good of a computer do you need for it?
slash
MVM
join:2001-03-01 Boston
2002-Jul-16 4:12 pm
I wasn't referring to 64 bit color, I was referring to software and hardware architecture.
16 bit color can display 65,536 colors; 32 bit can display 16.8 million colors.
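Those counts fall straight out of powers of two; a quick Python sketch of the arithmetic (not from the thread, just illustrating where the numbers come from):

```python
# Colors displayable at a given depth = 2 ** bits.
for bits in (4, 8, 16, 24):
    print(f"{bits}-bit color: {2 ** bits:,} colors")
# 16-bit gives 65,536; 24-bit gives 16,777,216 (the "16.8 million")
```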
Sheesh (I Am The Lizard King)
Premium Member
join:2001-04-11 Windsor, ON
to nwa122
Depends on what you are doing. If you are doing photo editing or video editing, 32 bit is the way to go.
If you have a slower computer and you are playing a fast action 3D game, 16 bit is just fine. You will be too busy to notice any difference.
Bach Premium Member join:2002-02-16 Flint, MI
to nwa122
32 bit (or 24 bit) is better for displaying photos than 16 bit. The picture on the left is a 24 bit photo; on the right is the same pic reduced to 16 bit. If your display hardware is set to 24 or 32 bit you should see a difference: there is a swirling banding pattern in the sky, because 16 bit cannot reproduce the original colors as accurately.
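The banding Bach describes can be sketched in Python, assuming the common RGB565 layout for 16 bit color (5 bits red, 6 green, 5 blue); the helper name is made up for illustration. Nearby 24-bit shades collapse to the same 16-bit value, which is exactly why smooth gradients turn into visible bands:

```python
def to_565_and_back(r, g, b):
    """Quantize 8-bit-per-channel RGB to RGB565, then expand back to 8 bits."""
    r5, g6, b5 = r >> 3, g >> 2, b >> 3            # drop low-order bits
    # Replicate the top bits into the low bits when expanding back.
    return ((r5 << 3) | (r5 >> 2),
            (g6 << 2) | (g6 >> 4),
            (b5 << 3) | (b5 >> 2))

# Two slightly different sky blues become identical after the round trip:
print(to_565_and_back(100, 150, 200))
print(to_565_and_back(103, 150, 200))  # same output as the line above
```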
Epyon9283 Premium Member join:2001-12-26 Trenton, NJ
to nwa122
If I am correct, I do not think monitors can display over 32 bit color.
|
dave
Premium Member
join:2000-05-04 not in ohio
2002-Jul-16 5:18 pm
said by Epyon9283: If I am correct, I do not think monitors can display over 32 bit color.
You are incorrect for analogue monitors, because analogue monitors don't deal in 'bits'; they deal in voltages. It's the function of the video card to convert from bits to voltages. It's a fair question as to whether (1) the monitor can display more than 16 million colours, which probably translates to asking about the smallest voltage change that causes a detectable change in screen illumination, and (2) whether you can see any difference in the resulting image. I thought that 32 bits was more colours than the eye could see anyway? (Isn't 32 bit colour mostly 24 bit colour in a 32 bit frame, to make it faster to process because the data is naturally aligned?)
Ianguy
Member
join:2002-06-09 Tehachapi, CA
2002-Jul-16 5:45 pm
Well, in gaming, 32 bit is usually 24 bit with 8 bits for transparency. In the Photoshop/desktop publishing world, 24 bits for images is 8 bits/channel (RGB), 32 bits is 8 bits/channel (CMYK) (I think), and 48 bits is 16 bits/channel (RGB) (this is rarely used).
Hey, do you remember when 256 color (8 bit) was exciting? Big jump from 16 color (4 bit)!
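Ianguy's channel arithmetic checks out: total depth is just bits per channel times channel count. A quick sketch of the math (labels are illustrative):

```python
# (bits per channel, number of channels) for the layouts mentioned above.
layouts = {
    "gaming 32 bit (RGB + alpha)": (8, 4),   # 24 bits of color + 8 of transparency
    "print 32 bit (CMYK)":         (8, 4),
    "photo 24 bit (RGB)":          (8, 3),
    "high-end 48 bit (RGB)":       (16, 3),  # rarely used
}
for name, (per_channel, channels) in layouts.items():
    print(f"{name}: {per_channel * channels} bits total")
```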
Bassistguy (Alrighty Then) Premium Member join:2001-07-14 Ballwin, MO
said by Ianguy: Hey, do you remember when 256 color (8 bit) was exciting? Big jump from 16 color (4 bit)!
Yep. In high school I took computer info systems, and there were 2 rooms. The first year I was in the room with the 286's, with 16 color... the other room had 386's with 256 color. The next year I got the room with the 386's. I was sooooo excited because I got to actually work with Windows 3.1 and 256 colors! lol