I have a router with an ATM interface with a bandwidth of 256 kbps. When I ping the other end with a 1500-byte datagram size, the 30-second counter shows about 80 kbps of bandwidth utilization. When I run two simultaneous tests with a 1500-byte datagram size, my bandwidth utilization jumps to 180 kbps. Likewise, with a third simultaneous test, my bandwidth utilization jumps to 350 kbps.
My question is: how is this bandwidth utilization of 80 kbps, 180 kbps, and 350 kbps on a 256 kbps ATM link calculated? What is the equation? Any guidance or help will be greatly appreciated.
There is approximately 9.4% header overhead in ATM cells (5 header bytes in every 53-byte cell), so 256 × 0.094 ≈ 24 kbps is lost to overhead, and 256 − 24 ≈ 232 kbps should be your actual usable bandwidth. Hope this answers your question.
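As a quick sketch of that "cell tax" arithmetic (assuming the 256 kbps line rate from the thread, and counting only the 5-byte cell header, not AAL5 framing):

```python
CELL_SIZE = 53                      # bytes per ATM cell
HEADER = 5                          # header bytes per cell
PAYLOAD = CELL_SIZE - HEADER        # 48 payload bytes per cell

line_rate_kbps = 256                # line rate from the original question

overhead_fraction = HEADER / CELL_SIZE                  # ~9.4%
usable_kbps = line_rate_kbps * PAYLOAD / CELL_SIZE      # ~232 kbps

print(f"header overhead: {overhead_fraction:.1%}")
print(f"usable payload rate: {usable_kbps:.0f} kbps")
```

Real-world throughput will be a bit lower still, since AAL5 framing and padding add further per-packet overhead on top of the per-cell header.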
If I understand your question correctly: generally speaking, ATM uses cells. A cell is 53 bytes: a 5-byte header and 48 bytes of payload. Accordingly, every 53 bytes on the wire carry 5 bytes of overhead and 48 bytes of pure payload.
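For the 1500-byte pings in the question, you can also count cells per packet. This is a sketch assuming plain AAL5 framing (an 8-byte trailer, padded up to a multiple of 48 bytes); LLC/SNAP encapsulation, if configured, would add a further 8 bytes per packet:

```python
import math

def atm_cells(packet_bytes, trailer=8, cell_payload=48):
    """Cells needed to carry one packet after AAL5 trailer and padding."""
    return math.ceil((packet_bytes + trailer) / cell_payload)

packet = 1500                        # ping datagram size from the question
cells = atm_cells(packet)            # 32 cells
wire_bytes = cells * 53              # 1696 bytes actually sent
efficiency = packet / wire_bytes     # ~88% payload efficiency

print(cells, wire_bytes, f"{efficiency:.1%}")
```

So each 1500-byte ping costs roughly 1696 bytes of ATM bandwidth, which is one reason the utilization counters run higher than the raw ping traffic would suggest.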
HTH; please rate if it helps.
The thing is that you cannot base any serious calculation on the output of show interface or on pings made from the router. That output is meant only to give an approximate value, not to serve as a measurement. If you want to know the real speed of your circuit, you have to use a measurement tool on both sides that measures everything accurately.
Hope this helps, please rate post if it does!