said by »www.petapixel.com/2012/10/10/why···ertised/ :
Humans think about numbers in base 10, the decimal numeral system, because we have 10 fingers and 10 toes. That's why the parts of numbers are called digits, just like the parts of our hands and feet.
Computers, on the other hand, think in base 2, the binary numeral system.
Herein lies the root of the issue. The brilliant marketing gurus at data storage companies decided early on that all their products should be marketed in the decimal system, since that's what consumers understand.
Therefore, one megabyte on their products is equal to 1,000,000 bytes, and one gigabyte is equal to 1,000,000,000 bytes. To a computer, however, a megabyte is 1,048,576 bytes and a gigabyte is 1,073,741,824 bytes.
Thus, for each gigabyte advertised in base 10, you're actually receiving about 70 megabytes less than a gigabyte in base 2.
Looks about right to me.