|reply to jlivingood |
Re: The big question
...and they are wrong.
If 100 binary Gigabytes (107,374,182,400 bytes, or 'octets') are counted by the [true binary gigabyte] meter, the meter will show '100 GB'. However, an application that treats 1 billion bytes as '1 GB' will show '107 GB' (or '107.37 GB' if it displays decimals instead of rounding).
Therefore, counting traffic by billions of bytes results in a +7.37% error relative to a meter that counts in binary units (true Gigabytes). The POSITIVE sign means the reported value is HIGH: the decimal meter OVERreports relative to the [standard true binary gigabyte] meter.
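The arithmetic above can be sketched in a few lines, assuming a "binary gigabyte" of 2^30 bytes and a "decimal gigabyte" of 10^9 bytes:

```python
# Same byte count, read by a binary-gigabyte meter vs. a decimal (billion-byte) meter.
BINARY_GB = 2**30     # 1,073,741,824 bytes (a true binary gigabyte, i.e. GiB)
DECIMAL_GB = 10**9    # 1,000,000,000 bytes (a "billion-byte" gigabyte)

byte_count = 100 * BINARY_GB              # 107,374,182,400 bytes

binary_reading = byte_count / BINARY_GB   # 100.0 "true" GB
decimal_reading = byte_count / DECIMAL_GB # ~107.37 decimal GB

error_pct = (decimal_reading - binary_reading) / binary_reading * 100
print(f"binary meter:  {binary_reading:.2f} GB")   # 100.00 GB
print(f"decimal meter: {decimal_reading:.2f} GB")  # 107.37 GB
print(f"error:        +{error_pct:.2f}%")          # +7.37%
```

This makes the sign of the error explicit: the same traffic, measured in smaller (decimal) units, yields a larger number.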