# Basic throughput calculations

Hi all,

I am doing some basic calculations for the time it should take to transfer a file over my network. Apparently it's been a while since I had to do math, as I am doubting myself!

I just wondered how long it would take to transfer my 90 GB file over a network connection that I knew to be 800 Mbps.

So I did the following:

1 Mbps = 0.125 MBps

800 Mbps × 0.125 = 100 MBps

So that's 100 MB a second, correct?

1000 MB = 1 GB

90,000 MB = 90 GB

At that rate it should take roughly 900 seconds, or 15 minutes, to transfer the 90 GB file. Is this right?
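For anyone double-checking the arithmetic, here is the same calculation as a quick Python sketch (the 90 GB size, 800 Mbps rate, and decimal units are taken straight from the post; nothing else is assumed):

```python
# Ideal transfer-time estimate, decimal units (1 GB = 1000 MB, 1 byte = 8 bits)
link_mbps = 800          # link speed in megabits per second
file_gb = 90             # file size in gigabytes

throughput_mbytes = link_mbps / 8          # 800 Mbps -> 100 MB/s
file_mbytes = file_gb * 1000               # 90 GB -> 90,000 MB

seconds = file_mbytes / throughput_mbytes  # 90,000 / 100
print(seconds, "seconds =", seconds / 60, "minutes")  # 900.0 seconds = 15.0 minutes
```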

TIA,

R

Correct Answer by Joseph W. Doherty (reply below)

## Replies

Joseph W. Doherty Fri, 09/21/2007 - 07:20

Your basic calculations are correct for the time to transmit 90 GB across 800 Mbps. In practice, due to frame/packet overhead, and how well the stack deals with BDP (bandwidth-delay product), the actual transfer time might be considerably longer than the calculated value.
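Joseph's point about overhead can be made concrete with a rough sketch. The figures below are illustrative assumptions, not measurements: a standard 1500-byte Ethernet MTU, 40 bytes of TCP/IP headers, and 38 bytes of Ethernet framing (header, FCS, preamble, inter-frame gap) per frame. Actual overhead and window behaviour depend on the stack, as the reply notes.

```python
# Rough goodput estimate accounting for per-frame overhead
# (assumed figures for standard 1500-byte Ethernet frames)
link_mbps = 800
mtu_payload = 1460        # 1500-byte MTU minus 40 bytes of TCP/IP headers
wire_bytes = 1538         # payload + headers + Ethernet framing, preamble, gap

efficiency = mtu_payload / wire_bytes          # ~0.949
goodput_mbytes = link_mbps / 8 * efficiency    # ~94.9 MB/s instead of 100

seconds = 90 * 1000 / goodput_mbytes
print(round(seconds), "seconds")               # ~948 s vs the ideal 900 s

# BDP side of the answer: the TCP window needed to keep an 800 Mbps
# pipe full at an assumed (hypothetical) 20 ms round-trip time
rtt = 0.020
bdp_mbytes = link_mbps / 8 * rtt
print(bdp_mbytes, "MB window needed")          # 2.0 MB
```

If the sender's window is smaller than the BDP, throughput drops below the link rate regardless of frame overhead, which is why real transfers can come in well under the back-of-the-envelope number.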

Thanks for your reply. I do realize that this is a perfect-world scenario with no overhead and such, and that it will never be the case with an actual transfer, but just for the sake of the math I gave it a go! Thanks again.