I am doing some basic calculations for the time it should take to transfer a file over my network. Apparently it's been a while since I had to do this math, as I am doubting myself!
I just wondered how long it would take to transfer my 90 GB file over a network connection that I know to be 800 Mbps.
So I did the following:
1 Mbps = 0.125 MBps
800 Mbps * 0.125 = 100 MBps
So that's 100 MB a second, correct?
1000 MB = 1 GB
90,000 MB = 90 GB
At that rate it should take roughly 900 seconds to transfer the 90 GB file, or 15 minutes. Is this right?
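To double-check myself, here is the same arithmetic as a quick Python sketch (the values are just the 90 GB and 800 Mbps from above, using decimal units where 1 GB = 1000 MB):

```python
# Sanity check of the arithmetic above (decimal units: 1 GB = 1000 MB).
file_size_gb = 90          # file size in GB
link_speed_mbps = 800      # link speed in megabits per second

link_speed_mBps = link_speed_mbps / 8    # 800 Mbps -> 100 MB/s
file_size_mb = file_size_gb * 1000       # 90 GB -> 90,000 MB

seconds = file_size_mb / link_speed_mBps # 90,000 / 100 = 900 seconds
print(f"{seconds:.0f} seconds (~{seconds / 60:.0f} minutes)")
```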
Your basic calculation is correct for the time to transmit 90 GB across an 800 Mbps link. In practice, due to frame/packet overhead and how well the TCP stack deals with the BDP (bandwidth-delay product), the actual transfer will likely take somewhat longer than the calculated value.
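As a rough illustration only, you can fold an assumed efficiency factor into the same arithmetic. The ~94% figure below is just a ballpark for Ethernet/IP/TCP header overhead on standard 1500-byte frames, not a measured value; real-world efficiency also depends on MTU, latency, window size, and disk speed:

```python
# Same calculation with an assumed protocol-efficiency factor folded in.
# 0.94 is a rough ballpark for Ethernet + IP + TCP header overhead on
# 1500-byte frames; treat it as an assumption, not a measurement.
file_size_bits = 90 * 1000**3 * 8     # 90 GB in bits (decimal units)
link_speed_bps = 800 * 1000**2        # 800 Mbps in bits per second
efficiency = 0.94                     # assumed usable fraction of the link

effective_bps = link_speed_bps * efficiency
seconds = file_size_bits / effective_bps
print(f"~{seconds:.0f} seconds (~{seconds / 60:.1f} minutes)")
```

With those assumptions the estimate moves from 900 seconds to roughly 960 seconds, i.e. about 16 minutes instead of 15.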