I am trying to figure out how much airtime is consumed by beacons at different data rates. There is no practical reason behind this - I just want to understand the theory.
I understand that beacon frames are sent at the lowest configured (basic) data rate. So a beacon frame stays longer on the air at 1 Mbit/s than at 54 Mbit/s :-)
With 1 Mbit/s allowed as the basic rate, the medium is busy with beacons for longer than with higher rates. I want to calculate how much time per second the medium is kept busy by beacons.
So here are my questions and assumptions:
1.) Does the AP have to wait for an IFS (DIFS, SIFS, or PIFS) before sending a beacon?
2.) If yes, does it also have to perform a random backoff?
3.) If the beacon interval is 0.1 s, then 10 beacons are sent per second. Assuming 100 bytes per beacon, that makes 1,000 bytes per second of beacon traffic. (I know a typical default is 100 TU = 102.4 ms, but 0.1 s keeps the numbers round.)
4.) At a data rate of 1 Mbit/s (125,000 bytes/s), those 1,000 bytes of beacons need 0.008 seconds of airtime per second in total.
Are these assumptions correct?
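To make the arithmetic in 3.) and 4.) explicit, here is a minimal sketch. It uses my assumed numbers (0.1 s interval, 100-byte beacons) and deliberately ignores the PHY preamble/header, IFS waits, and backoff, so it is a lower bound on the real airtime, not a full model:

```python
# Rough beacon airtime estimate under the assumptions above.
# Ignores PHY preamble/header, IFS, and backoff overhead on purpose.

beacon_interval_s = 0.1      # assumed beacon interval (100 TU = 0.1024 s is typical)
beacon_size_bytes = 100      # assumed beacon frame size
data_rate_bps = 1_000_000    # basic rate: 1 Mbit/s

beacons_per_second = 1 / beacon_interval_s
bytes_per_second = beacons_per_second * beacon_size_bytes
airtime_per_second = bytes_per_second * 8 / data_rate_bps

print(beacons_per_second)    # 10.0
print(bytes_per_second)      # 1000.0
print(airtime_per_second)    # 0.008
```

Swapping `data_rate_bps` for 54 Mbit/s shrinks the payload airtime by the same factor, which is the effect I am trying to quantify.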