We connect to our Metro Ethernet provider via multiple Gigabit Ethernet connections. On each connection we have configured 802.1Q encapsulation and created multiple subinterfaces, each defined by an 802.1Q tag and assigned a /30 IP subnet. (Essentially, it's the Ethernet equivalent of a channel bank!)
Each subinterface has been configured with a bandwidth statement that corresponds to the bandwidth provisioned by the ME provider (mostly 10M, but there are some 5M and 20M circuits, one 30M, and one 100M).
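For illustration, a representative subinterface configuration looks roughly like this (the interface number, VLAN tag, addressing, and the 10M figure are all made up, but the shape matches what we have deployed):

    interface GigabitEthernet0/1.101
     description ME circuit to remote site (10 Mbps CIR)
     encapsulation dot1Q 101
     ip address 192.0.2.1 255.255.255.252
     bandwidth 10000

(The bandwidth command takes its argument in kilobits, so 10000 = 10 Mbps.)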
However, the delay parameter remains at the interface default (10 usec) and is the same across all subinterfaces.
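So, for example, show interfaces on one of the 10M subinterfaces reports something along these lines (output abbreviated and illustrative):

    Router# show interfaces GigabitEthernet0/1.101
      ...
      MTU 1500 bytes, BW 10000 Kbit/sec, DLY 10 usec,
         reliability 255/255, txload 1/255, rxload 1/255
      ...

The BW reflects the bandwidth statement, while the DLY is still the Gigabit Ethernet default.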
My question: should the delay be set, per subinterface, to a value that corresponds to the lower actual bandwidth available on each subinterface?
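If so, I assume the tool would be the delay interface command. As I understand it, its argument is in tens of microseconds, so setting one of the 10M subinterfaces to the 1000 usec delay that IOS uses by default on a 10 Mbps Ethernet interface would look something like:

    interface GigabitEthernet0/1.101
     delay 100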
By the way, I have searched extensively and have yet to find how IOS determines the default delay value for an interface. Does anyone know?