Does anyone know how often WRED calculates the average queue length? I know the general formula for AVG_Q(t) at time t, with exponential weighting constant n, is
AVG_Q(t) = (AVG_Q(t-1) * (1 - 1/2^n)) + (CURRENT_Q * 1/2^n)
So, for the default n=9, the formula becomes:
AVG_Q(t) = (AVG_Q(t-1) * (511/512)) + (CURRENT_Q / 512)
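In code form, one iteration of that update looks like this (just a sketch of my reading of the formula, not anything out of vendor source):

    def update_avg_q(avg_q_prev, current_q, n=9):
        """One EWMA step: AVG_Q(t) = AVG_Q(t-1)*(1 - 1/2^n) + CURRENT_Q/2^n."""
        w = 1.0 / 2 ** n          # exponential weight, 1/512 for n=9
        return avg_q_prev * (1.0 - w) + current_q * w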
I assume that CURRENT_Q is an instantaneous reading at the time of the calculation. Does anyone know how much time passes between successive calculations, i.e., how long until AVG_Q(t) becomes the AVG_Q(t-1) of the next update? Is it once a second? Every time a new packet arrives?
With the default value of n=9, it takes several thousand iterations for AVG_Q to converge on the mean queue depth. I set up a quick spreadsheet to calculate AVG_Q with various values of n, out to 4096 iterations. If I hold CURRENT_Q at a constant 40, it takes 2242 iterations for AVG_Q to catch up; if I set CURRENT_Q to a random number between 0 and 40, AVG_Q takes about 1500 iterations to catch up to the mean.
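In case anyone wants to reproduce that without a spreadsheet, here's the constant-queue test in Python. It assumes AVG_Q starts at 0 and that "caught up" means within 0.5 of the target (i.e., it rounds to 40), which is the criterion that produces 2242:

    def iterations_to_catch_up(target=40.0, n=9, tol=0.5):
        """Count EWMA iterations until AVG_Q is within tol of a constant queue."""
        w = 1.0 / 2 ** n
        avg_q, t = 0.0, 0
        while target - avg_q >= tol:        # AVG_Q approaches target from below
            avg_q = avg_q * (1.0 - w) + target * w
            t += 1
        return t

    print(iterations_to_catch_up())   # 2242 for n=9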
I'm trying to figure out whether to stay with the default n=9 or tune it lower, but that depends on how frequently the computation runs. If it's once per incoming packet, that's probably fine -- I often see several hundred packets/second on my WAN interfaces, so on the order of 1800 iterations would only take about 10 seconds. Anyone know for certain? Thanks!
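To make the tuning trade-off concrete: if the average really is recomputed once per arriving packet (which is exactly the open question), you don't even need to iterate, since for a constant queue depth Q the closed form is t = ln(tol/Q) / ln(1 - 1/2^n). Using ~180 packets/sec as the rate, which is what my "1800 iterations in about 10 seconds" arithmetic assumes:

    import math

    # Iterations (and seconds at an assumed 180 pps) for AVG_Q to get
    # within 0.5 of a steady queue depth of 40, for various weights n.
    for n in range(5, 10):
        w = 1.0 / 2 ** n
        t = math.ceil(math.log(0.5 / 40.0) / math.log(1.0 - w))
        print(f"n={n}: {t} iterations, ~{t / 180:.1f} s at 180 pps")

For n=9 that works out to 2242 iterations, or roughly 12.5 seconds at that packet rate; each step down in n roughly halves the convergence time.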