said by dvd536:
Does Comcast consider 1000 or 1024 bits a kilobit?
You may want to review »netforecast.com/documents/NFR510···racy.pdf
where that is addressed on page 4 in the chart "How Much is a Number?"
How Much Is a Number?
The Comcast meter reports Gigabytes, which is a binary number not to be confused with the similar decimal number. There is an easy numbers trap that appears to make the two systems the same. A thousand is often referred to as the metric kilo, followed by a million that starts with the same "M" as mega. But in fact these are very different values. The following table illustrates the difference.
Counting traffic by billions of bytes will result in a -6.9% error relative to the meter which uses binary numbers. A negative error indicates that the value is low relative to the standard value. In this case the decimal is underreporting relative to the meter.
...and they are wrong.
If 100 binary Gigabytes (107,374,182,400 bytes, or 'octets') are counted by the [true binary gigabyte] meter, the meter will show '100 GB'. However, an application that reports 1 billion bytes as '1 GB' will show '107 GB' (or '107.37 GB' if it does not round).
Therefore, 'counting traffic by billions of bytes' will result in a +7.37% error relative to the meter, which uses binary numbers (true Gigabytes). This POSITIVE error indicates that the value is HIGH relative to the standard [true binary gigabyte] value. In this case, the decimal count is OVERreporting relative to the [standard true binary gigabyte] meter.
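The arithmetic above is easy to check. A minimal sketch (the variable names are mine, not from the PDF), which also shows where the PDF's -6.9% figure plausibly comes from: it is the same ratio taken in the opposite direction (binary relative to decimal).

```python
# Binary vs. decimal gigabyte, as discussed above.
GIB = 2**30   # binary gigabyte (gibibyte): 1,073,741,824 bytes
GB = 10**9    # decimal gigabyte: 1,000,000,000 bytes

traffic_bytes = 100 * GIB  # 107,374,182,400 bytes, the example above

meter_reading = traffic_bytes / GIB  # binary meter shows 100.0
decimal_reading = traffic_bytes / GB  # decimal app shows ~107.37

# Error of the decimal count relative to the binary meter:
error_pct = (decimal_reading - meter_reading) / meter_reading * 100
print(f"decimal vs. meter: {error_pct:+.2f}%")  # +7.37%

# The reverse comparison (binary relative to decimal) gives roughly
# -6.87%, which appears to be where the PDF's "-6.9%" comes from:
reverse_pct = (GB / GIB - 1) * 100
print(f"meter vs. decimal: {reverse_pct:+.2f}%")  # -6.87%
```

So both numbers describe the same mismatch; the PDF just attaches the minus sign to the wrong direction for the scenario it describes.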