I'm just looking for some clarity on something I read. It doesn't sound right, so I was hoping someone could explain it.
Say, for example, you have a 100 Mbps FastEthernet link. The router will be transmitting data onto the line at 100,000,000 bits per second. Since there are 1,000 milliseconds in a second, dividing by 1,000 gives 100,000 bits per millisecond. That makes sense to me so far.

The next part said that a 1500 byte packet would take 1.5 ms to be serialized, which sounded right at first. But I gave it some thought and this is where I'm getting confused. As you know there are 8 bits in a byte, so a 1500 byte packet is 1500 × 8 = 12,000 bits. At 100,000 bits per millisecond, that should only take 12,000 ÷ 100,000 = 0.12 ms, not 1.5 ms. Am I working this out correctly, or am I missing something about where the 1.5 ms figure comes from?
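To double-check my own arithmetic I put it into a few lines of Python (the function name is just something I made up for the check):

```python
def serialization_delay_ms(packet_bytes: int, link_bps: int) -> float:
    """Time to clock a packet onto the wire, in milliseconds."""
    packet_bits = packet_bytes * 8            # 8 bits per byte
    delay_seconds = packet_bits / link_bps    # bits / (bits per second)
    return delay_seconds * 1000               # seconds -> milliseconds

# 1500-byte packet on a 100 Mbps link
print(f"{serialization_delay_ms(1500, 100_000_000):.2f} ms")  # 0.12 ms
```

So by my reckoning the delay scales with the packet size in bits divided by the link rate, which gives 0.12 ms here.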