I am trying to understand a simple scenario regarding bandwidth usage. Assume there is an Ethernet link of 40 Mbps capacity between two locations.
A large file of around 400 MB is transferred from one end to the other.
Once the transfer is complete, the link usage shows around 10 Mbps, while at the same time the end host doing the transfer reports the completed transfer at around 1420 kbps.
How should I understand the difference between these two readings? Does 1420 refer to the speed accepted/used by the protocol during the transfer only, or is it something else?
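For what it's worth, here is the rough arithmetic I did on these numbers (just a sketch in Python; the unit assumptions, i.e. whether the host figure means kilobits or kilobytes per second, are my own guesses and not something the tool states):

    # Rough arithmetic on the numbers above (unit assumptions are mine).
    FILE_SIZE_MB = 400      # file size in megabytes
    LINK_USAGE_MBPS = 10    # utilization reported on the link
    HOST_RATE_KBPS = 1420   # rate reported by the end host (kbps? or KB/s?)

    file_size_megabits = FILE_SIZE_MB * 8  # 400 MB ~= 3200 Mb

    # Transfer time implied by each reading:
    time_at_link_usage = file_size_megabits / LINK_USAGE_MBPS            # ~320 s
    time_if_kilobits = file_size_megabits * 1000 / HOST_RATE_KBPS        # ~2254 s if kbps

    # If the host figure were kilobytes per second instead, it would be:
    rate_if_kilobytes_mbps = HOST_RATE_KBPS * 8 / 1000                   # ~11.4 Mbps

    print(f"time at 10 Mbps: {time_at_link_usage:.0f} s")
    print(f"time at 1420 kbps: {time_if_kilobits:.0f} s")
    print(f"1420 KB/s expressed in Mbps: {rate_if_kilobytes_mbps:.1f}")

So the two readings imply very different transfer times unless I am misreading the units, which is part of what I am asking.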
Appreciate all help!
Thanks in Advance.