A Byte is a character. Multiple bits make up a character.
Depending upon the number of control bits, there could be between 8 and 12 bits per Byte. I use 10 as a general rule, going back to the old signalling days when you had 8 data bits, 1 start bit, 1 stop bit and no parity bits.
Therefore, using simple conversion math:
1 MBps = (1,000,000 Bytes / sec) * (10 bits / 1 Byte) = 10,000,000 bits / sec = 10 Mbps
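As a rough sketch of that same arithmetic (assuming the 10-bits-per-Byte rule of thumb above, which folds in the start and stop bits; the function name is just for illustration):

# Rough sketch, assuming 10 bits per framed Byte
# (8 data bits + 1 start bit + 1 stop bit, no parity).
BITS_PER_BYTE_FRAMED = 10

def mbytes_to_mbits(mbytes_per_sec, bits_per_byte=BITS_PER_BYTE_FRAMED):
    # Multiply the Byte rate by the bits carried per Byte.
    return mbytes_per_sec * bits_per_byte

print(mbytes_to_mbits(1))  # 1 MBps -> 10 Mbps with the 10-bit rule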
@MarkJFine wrote: A Byte is a character. Multiple bits make up a character. Depending upon the number of control bits, there could be between 8 and 12 bits per Byte. I use 10 as a general rule, going back to the old signalling days when you had 8 data bits, 1 start bit, 1 stop bit and no parity bits. Therefore, using simple conversion math: 1 MBps = (1,000,000 Bytes / sec) * (10 bits / 1 Byte) = 10,000,000 bits / sec = 10 Mbps
But nearly all commercial speed tests use a straight 8 bits per Byte, not 10, in their conversions: in speed tests, 1 MBps = 8 Mbps and 1 Mbps = 0.125 MBps. I've never seen a speed test use the 10-bit conversion before, or at least not in anything commercial.
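For comparison, a minimal sketch of that straight 8-bits-per-Byte conversion (function names are just for illustration):

# Minimal sketch of the 8-bits-per-Byte conversion speed tests use.
def mbytes_to_mbps(mbytes_per_sec):
    return mbytes_per_sec * 8   # 1 MBps -> 8 Mbps

def mbps_to_mbytes(mbits_per_sec):
    return mbits_per_sec / 8    # 1 Mbps -> 0.125 MBps

print(mbytes_to_mbps(1))   # 8
print(mbps_to_mbytes(1))   # 0.125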
And remember, this is data transfer rate, not data amounts like storage.
@GabeU wrote: And remember, this is data transfer rate, not data amounts like storage.
Helps to know how it's being transferred.
Thanks for answering my question and providing more detailed information.