said by XknightHawkX: Now who is the idiot that decided to define the speeds in bits?
The entire communications industry, by convention, expresses data rates in bits, since that is what is sent down the "wire": the data travels one bit after another along the transmission medium. Computer memories, on the other hand, are organized in chunks called "bytes", so bytes are the natural unit in that context. And years ago there were machines that didn't use 8-bit bytes: the CDC 6000 series in the '70s used a 60-bit word, and DEC PDP-10 systems used 36-bit words.
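The practical upshot is that converting an advertised line rate to file-transfer units is just a divide-by-8. A minimal sketch in Python, assuming the modern 8-bit byte (which, as noted above, was not always a given):

```python
BITS_PER_BYTE = 8  # assumes the modern octet; older machines differed

def bits_to_bytes_per_sec(bit_rate: float) -> float:
    """Theoretical byte throughput for a given bit rate, ignoring overhead."""
    return bit_rate / BITS_PER_BYTE

# A "100 Mbps" link tops out at 12.5 million bytes per second,
# before any protocol overhead.
print(bits_to_bytes_per_sec(100_000_000))  # 12500000.0
```

Real-world throughput will be lower still, since framing, headers, and retransmissions all eat into that raw bit rate.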