Nexus vPC question

Unanswered Question

I am curious how failover is accomplished in the following scenario:

2 NX5ks and 2 NX2ks (each connected to a single N5K, NOT dual-homed) and a Server is connected to each NX2k, with vPC configured.  See diagram below.

In the event of a peer link failure, I understand how the server knows NOT to send traffic to the secondary 2K switch (vPC shuts the port-channel).

However, in the event of a SWITCH failure (i.e., n5k01), how is the SERVER notified to STOP sending traffic to n2k01?  n2k01 has no upstream connectivity at this point.  The server has not lost link to either n2k01 or n2k02, so how does IT know to stop sending to n2k01?

Do the port-channels automatically shut down on n2k01 when the upstream 5K connectivity is lost, or is information passed to the server via n2k02 over the etherchannel?

n5k01 ----peer link---- n5k02

  ||                      ||
  ||                      ||
   X                      ||
  ||                      ||
  ||                      ||

n2k01                   n2k02

    \                    /
     \                  /
      \                /
       \              /
        server (with vPC)
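For reference, here is a rough sketch of the kind of vPC configuration I mean on the 5Ks (domain ID, keepalive addresses, and port-channel numbers are just placeholders, not my actual config):

```
! n5k01 (n5k02 mirrors this with its own keepalive addresses)
vpc domain 10
  peer-keepalive destination 10.0.0.2 source 10.0.0.1

! Peer link between the two 5Ks
interface port-channel 1
  switchport mode trunk
  vpc peer-link

! Server-facing port-channel: one member lands on a FEX port
! behind each 5K, tied together as vPC 20
interface port-channel 20
  switchport mode access
  vpc 20
```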

Thanks in Advance!

Overall Rating: 5 (1 ratings)
scott1322 Wed, 08/04/2010 - 12:37

Hi Andrew,

Is the server a blade chassis? If it is something like the HP c-Class chassis, then with Cisco 312x switches or Flex-10s I think there is software available for this exact type of scenario; I think it was called something like SmartLink. If it's a single server with two NICs that are "teamed", then I am sure similar software exists from the server vendor.

I think the new NX-OS supports vPC links to dual-homed 2Ks, but I haven't touched the Nexus for a few months now and cannot confirm this.

Collin Clark Thu, 08/05/2010 - 08:34
User Badges:
  • Purple, 4500 points or more

If I understand your question correctly, the answer would be spanning tree. You're running a U-shaped L2 domain; spanning tree will see the links go down and open blocked ports as necessary.

I opened a TAC case on this and they confirmed that when an upstream 5K dies, all local interfaces on the 2K die as well. From the server/host's perspective, link is lost to the 2K that has no upstream connectivity. So it is a link-loss event for all ports on the 2K when its upstream 5K dies, according to TAC.
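For anyone who wants to verify this behavior themselves, a few NX-OS show commands on the surviving 5K (and link/port-channel status on the server or access switch) should tell the story after a failure:

```
! On the surviving N5K:
show vpc brief        ! peer-link and vPC status
show fex              ! state of each attached Fabric Extender
show port-channel summary

! On the downstream device (server NIC teaming utility or switch):
! the member facing the dead 5K's FEX should show link down,
! leaving traffic on the surviving port-channel member.
```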

Collin Clark Thu, 08/05/2010 - 09:06
User Badges:
  • Purple, 4500 points or more

Huh, never would have guessed that. Thanks for posting the info.

ronbuchalski Wed, 08/25/2010 - 06:49


Two things....

1) NIC teaming is not the same as etherchannel.  So, in the case of a server having two NICs teamed as a single interface, it will strictly react based on external link status.  Of course, it will also be able to react to NIC failure, which is considered internal to the server.

2) Regarding port-channel and vPC, I am using a scenario where I have a Catalyst 2950 connected via Gi0/1 and Gi0/2 to each of the N2Ks in a rack, configured as an LACP-speaking port-channel to a vPC on the Nexus 5K.  This provides path diversity in the event of an interface, cable, N2K, or N5K failure.  I tested the scenario of losing the link between the N2K and N5K (which also simulates the loss of the N5K).  Upon loss of the N5K-N2K link, the Catalyst 2950 immediately saw loss of link and dropped that interface, but maintained connectivity via the remaining port-channel member.
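A rough sketch of the relevant pieces of that setup (interface and channel numbers are illustrative, not copied from my running config):

```
! Catalyst 2950 -- one uplink toward each N2K, bundled with LACP
interface range GigabitEthernet0/1 - 2
  switchport mode access
  channel-group 1 mode active

! On each N5K, the FEX host port facing the 2950 joins the same vPC
interface Ethernet100/1/1
  switchport mode access
  channel-group 20 mode active

interface port-channel 20
  vpc 20
```

With this in place, losing one N5K (or its fabric link) shows up on the 2950 as a simple link-down on one member, and LACP keeps traffic on the surviving member.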

Upon re-establishment of the N5K-N2K link, it takes 20-30 seconds for the N2K to re-establish communication with the N5K, after which it activates its ports.  At that point, the port-channel on the Catalyst 2950 regained its second member and the full port-channel was re-established.

Hope this answers your question.


iwearing Mon, 08/30/2010 - 03:36


I have been reading this post with interest, as I have a situation where servers with dual NICs will be connected to two different N5Ks. In this scenario, must I configure the N5Ks as a vPC pair?



chris.warren Wed, 09/29/2010 - 06:31

I am curious as to how you connected a Catalyst to the Nexus 2K. I didn't believe this could work, since you cannot disable BPDU Guard on a Fabric Extender. Were you able to disable it?

