JohnInSJ
Premium Member
join:2003-09-22
Aptos, CA

to mozerd
Re: A 64 bit world for Windows OS?

said by mozerd:

+1

and get ready for the 128 bit Windows world coming soon.

Native word sizes just keep getting longer - like iPhone screens
dave
Premium Member
join:2000-05-04
not in ohio
Perhaps, but by the usual rule of thumb (address needs grow by about one bit every 18 months), the extra 32 bits of addressing (from 32 to 64 bits) will keep us going for 48 years: 32 bits times 1.5 years per bit.

If you date the need for 64-bit systems as starting around the early 1990s (DEC Alpha), we've still got nearly 30 years of run-room left.

(I hold to the convention that an N-bit system means one with an N-bit virtual address, the size of an integer being a trivial concern that can easily be handled by the compiler).

JohnInSJ
Premium Member
join:2003-09-22
Aptos, CA
said by dave:

Perhaps, but by the usual rule of thumb, the extra 32 bits of addressing (from 32 to 64 bits) will keep us going for 48 years.

It's not just the addressing, though; it's also doubling the data bus width, which is one way to "double" (some aspects of) processing performance without needing to make the electrons go faster.

Blogger
Jedi Poster
Premium Member
join:2012-10-18

to dave
said by dave:

Perhaps, but by the usual rule of thumb, the extra 32 bits of addressing (from 32 to 64 bits) will keep us going for 48 years.

I'm an old man. My world, referred to by another poster, is the real world.

Guess I can scratch this off my things-to-deal-with list!
dave
Premium Member
join:2000-05-04
not in ohio

to JohnInSJ
Sure, but if you want 128-bit additions (for example) to go as fast as 64-bit additions today, just wait 18 months and they will. This is much more cost-effective than overhauling Windows to produce a '128-bit edition'.

howardfine
Member
join:2002-08-09
Saint Louis, MO

Data size has less to do with processing speed than anything else. A wider word mainly suits scientific computing and workloads that need large memory addressing; home users have little, if any, need for 64-bit computing. Even 32 bits only became necessary because of the large memory requirements of the graphics and video that are prevalent today.

Manufacturers would struggle to create a general-purpose, single-chip 128-bit processor. 128-bit processors exist today, but they don't resemble anything in a PC. The physical size would just be too large, which is why you see multiple cores rather than longer word sizes.
dave
Premium Member
join:2000-05-04
not in ohio

said by howardfine:

Data size has less to do with processing speed than anything else.

Just to make it clear, I'm not the one who claimed that data width was a performance issue.

Wider fixed-point integers are trivial to implement in software without a huge loss in efficiency. Greater-precision floating point is probably trickier (don't ask me, I'm an integer guy).
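
For instance, a 128-bit add can be built from two 64-bit adds plus a carry test. A minimal C sketch (the u128 type and the names are illustrative, not any particular library's API):

    #include <stdint.h>
    #include <stdio.h>

    /* A 128-bit unsigned integer stored as two 64-bit halves. */
    typedef struct { uint64_t lo, hi; } u128;

    /* Add two 128-bit values using only 64-bit operations. */
    static u128 u128_add(u128 a, u128 b) {
        u128 r;
        r.lo = a.lo + b.lo;
        r.hi = a.hi + b.hi + (r.lo < a.lo); /* carry if the low half wrapped */
        return r;
    }

    int main(void) {
        u128 x = { UINT64_MAX, 0 }; /* 2^64 - 1 */
        u128 y = { 1, 0 };
        u128 z = u128_add(x, y);    /* expect 2^64: lo = 0, hi = 1 */
        printf("hi=%llu lo=%llu\n", (unsigned long long)z.hi,
                                    (unsigned long long)z.lo);
        return 0;
    }

A couple of instructions instead of one, which is the "not a huge loss in efficiency" point.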

x86 has had 80-bit floating point since the 8087 math coprocessor, so floating-point width is clearly not tied to whether we're talking about a 16-, 32-, or 64-bit CPU. IEEE 754 specifies a 128-bit format, but AFAIK x86 doesn't implement it in hardware - though, per the preceding, it ought to be able to do so without turning x86 into a 128-bit architecture.
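
That 80-bit format is still visible from C on typical x86 toolchains, where (with GCC or Clang, at least) long double usually maps to the x87 extended format - a hedged sketch, since the mapping is ABI-dependent:

    #include <stdio.h>
    #include <float.h>

    int main(void) {
        /* On x86 with GCC/Clang, long double is typically the 80-bit
           x87 extended format (64-bit significand); IEEE double has 53. */
        printf("double significand bits:      %d\n", DBL_MANT_DIG);
        printf("long double significand bits: %d\n", LDBL_MANT_DIG);
        return 0;
    }

On a typical x86 Linux box this prints 53 and 64.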

x86 has 128-bit registers for SIMD instructions, but they're used to hold 2 x 64-bit numbers, 4 x 32-bit numbers, etc.
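
Concretely, with the SSE2 intrinsics a single 128-bit XMM register carries two independent 64-bit lanes (a small sketch; the values are arbitrary):

    #include <stdio.h>
    #include <stdint.h>
    #include <emmintrin.h> /* SSE2 */

    int main(void) {
        /* Pack two 64-bit lanes into one 128-bit register
           (_mm_set_epi64x takes the high lane first). */
        __m128i a = _mm_set_epi64x(10, 20);
        __m128i b = _mm_set_epi64x(1, 2);
        __m128i sum = _mm_add_epi64(a, b); /* two independent 64-bit adds */

        int64_t out[2];
        _mm_storeu_si128((__m128i *)out, sum);
        printf("lo lane: %lld, hi lane: %lld\n",
               (long long)out[0], (long long)out[1]); /* 22, 11 */
        return 0;
    }

Note there's no carry between the lanes: it's two 64-bit additions, not one 128-bit addition.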

The VAX, bless it, had some 128-bit operations back in the late 1970s (though it was only a 32-bit machine).

said by howardfine:

The physical size would just be too large, which is why you see multiple cores rather than longer word sizes.

That seems implausible. The cost of a 128-bit processor is, what, doubling the register size of a couple of dozen registers, and (probably) making ALU data paths twice as wide.

The reason why there aren't 128-bit general-purpose processors is that 128 bits of VA are not yet needed. And if it doesn't have 128 bits of VA, it's not a 128-bit machine. That is all I meant.

EDIT: I just read on Wikipedia that the amount of data stored on Earth is around 2^70 bytes (roughly 1.2 zettabytes); i.e., around 70 bits is enough to uniquely address every byte we've got. So it'll be a while before any individual computer needs 128-bit addressing.

EDIT2: I was once part of a team that defined a software shared-memory architecture that used 128-bit addresses, but the idea there was that addresses were never reused, i.e., the address space was sparse. Once you'd destroyed the object at address 0x1234, nothing else would ever have address 0x1234.
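
The allocator for such a scheme can be as simple as a 128-bit counter that only moves forward - a hypothetical single-threaded sketch (the names are mine, not the actual architecture's):

    #include <stdint.h>

    /* A 128-bit address kept as two 64-bit words. */
    typedef struct { uint64_t lo, hi; } addr128;

    static addr128 next_addr = { 1, 0 }; /* 0 reserved as "null" */

    /* Hand out the next never-before-used address; nothing is reused,
       so a destroyed object's address simply stays dead forever. */
    addr128 alloc_address(void) {
        addr128 a = next_addr;
        if (++next_addr.lo == 0) /* low word wrapped: carry into high */
            ++next_addr.hi;
        return a;
    }

Even at one allocation per nanosecond, exhausting a 2^128 space would take on the order of 10^22 years, which is the point of making the space that sparse.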