Serialization Question

Oct 27th, 2007

I'm just looking for some clarity on something I read. It doesn't sound right, so I was hoping someone could explain it.

Say, for example, you have a 100 Mbps FastEthernet link. The router transmits data onto the line at 100,000,000 bits per second. So that's 1,000 bits per millisecond if you divide by 100,000. That makes sense to me so far. The next part said that a 1500-byte packet would take 1.5 ms to be serialized (which sounds right at first). But I gave it some thought, and this is where I'm getting confused. If we were talking about 1500 bits, that makes sense to me, because you are working in the same unit. As you know, there are 8 bits in a byte, so surely for a 1500-byte packet you would need to multiply by 8 and then divide by 1,000? So it would be 12 ms?

Cheers,

Mike


Replies

bvsnarayana03 Sun, 10/28/2007 - 03:27

The formula for calculating serialisation delay is:

no. of bits sent / speed of link

In your case, a 1500-byte packet is 1500 * 8 = 12,000 bits, so 12,000 / 100,000,000 = 0.00012 s = 0.12 ms. Note that 100 Mbps is 100,000 bits per millisecond (divide by 1,000 milliseconds per second), not 1,000.
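The formula above is easy to sanity-check with a few lines of Python (the function name and structure are mine, not from the thread):

```python
def serialization_delay_ms(packet_bytes: int, link_bps: int) -> float:
    """Serialization delay in milliseconds.

    Formula from the reply: no. of bits sent / speed of link.
    """
    bits = packet_bytes * 8        # 8 bits per byte
    seconds = bits / link_bps      # bits divided by bits-per-second gives seconds
    return seconds * 1000          # convert seconds to milliseconds

# 1500-byte packet on a 100 Mbps FastEthernet link
print(serialization_delay_ms(1500, 100_000_000))  # 0.12 (ms)
```

Running this confirms the 0.12 ms figure: the factor of 8 for bytes-to-bits is correct, and the factor of 1,000 for seconds-to-milliseconds is what the original question slipped on.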

michael.whittle Sun, 10/28/2007 - 07:59

So my instinct to multiply by 8 was right, but the answer is 0.12 ms rather than 12 ms: I slipped a factor of 100 when converting to bits per millisecond (100 Mbps is 100,000 bits per millisecond, not 1,000). That also means the 1.5 ms figure was probably a typo in the document I was reading.

Thanks,

Mike