Hulc LED Process

billy.bakti
Level 1

Hi all,

I am seeing high CPU utilization on my switch from PID 110. Could you please advise what this process is and how to resolve it?

Device type: WS-C3560G-48PS

-----------------------------------------

Switch#sh proc cpu sor

CPU utilization for five seconds: 23%/0%; one minute: 22%; five minutes: 21%

PID Runtime(ms)   Invoked      uSecs   5Sec   1Min   5Min TTY Process
110    11990607   1436516       8347 15.97% 15.83% 15.84%   0 Hulc LED Process

-----------------------------------------

Thanks


Yudong Wu
Level 7

Hulc LED Process is the process responsible for link status detection.

If SFP ports are in use, it can run a bit high.

You can check whether you are hitting the following bug:

CSCsi78581 Hulc LED Process uses 10-15% CPU on Catalyst 3750/3560.
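To see whether this might apply to you, you can check the running image and watch just this process with filtered commands along these lines (the filters only shorten the output; exact output will vary by platform and image):

Switch#show version | include IOS
Switch#show processes cpu sorted | include Hulc LED
Switch#show inventory

show inventory should also list any installed SFPs, which is relevant since the process tends to run higher when SFP ports are in use.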

Hi,

We have a similar problem on a 2960 with SFP ports.

Is this bug only related to the 3750/3560, or does it also affect the 2960? I can't find a matching bug in the Bug Toolkit...

Our CPU is running at around 25-30%, so we would like to decrease it if possible.

Do you know of a workaround?

Regards

That bug is not related to 2960.

I tested a 2960 in the lab. On a 2960 with PoE, the Hulc LED process will use about 15% CPU.

In general, it is normal for a 2960 to have CPU usage around 15-40%.

If you don't have any performance issues, you can ignore this.

CSCsi78581 lists this as fixed in 12.2(44)SE only. When will it be fixed in other, newer versions?

I have it on a 48-port 2960 that can only run 12.2(50) and higher.

On one of the 2960-48 switches that suffers from this, I loaded 12.2(52)SE as a test.

Hulc LED remained the same, but overall CPU utilization actually went up, with "REP LSL" now showing as well?!

CPU utilization for five seconds: 34%/0%; one minute: 30%; five minutes: 29%
PID Runtime(ms)   Invoked      uSecs   5Sec   1Min   5Min TTY Process
114     9582762   1518651       6310 15.65% 15.40% 15.50%   0 Hulc LED Process
 96        1834       108      16981  5.11%  0.41%  0.08%   1 SSH Process
238      192955     73776       2615  3.35%  0.63%  0.40%   0 SNMP ENGINE
236       78907    133133        592  2.07%  0.31%  0.17%   0 IP SNMP
183     4008328 164163844         24  1.27%  7.01%  7.28%   0 REP LSL Hello PP

Can't Cisco fix this?

Disable the not-connected ports and you'll see a drop in the CPU usage for this process.
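For example (a sketch only; the interface range below is hypothetical, substitute whichever SFP or not-connected ports are unused on your switch):

Switch#configure terminal
Switch(config)#interface range gigabitEthernet 0/49 - 52
Switch(config-if-range)#shutdown
Switch(config-if-range)#end
Switch#show processes cpu sorted | include Hulc LED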

I shut down the SFP ports not in use and saw CPU drop by a minimum of 20% on all of my 2960S-PoE switches.

Then it came right back up to 60%.

Upgrade to either 12.2(55)SE3 or 12.2(58)SE1.
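If you do upgrade, these platforms take the .tar image via archive download-sw; a rough sketch, with a placeholder TFTP server and image filename (substitute your own):

Switch#archive download-sw /overwrite /reload tftp://192.0.2.10/c2960-lanbasek9-tar.122-55.SE3.tar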

Hi,

After disabling two SFP ports on the 2960, the "Hulc LED Process" dropped to 6.23%. Before, it was 15.65%!

leolaohoo,

Isn't this IOS (58) consuming a lot of memory?

I've got news for you, George: STAY AWAY from IOS 15.0. Across the board, it has a memory leak issue even WITHOUT any config.

I'm currently using 12.2(58)SE1 and I have no issue. 

Hi,

I've got the same problem. We have two 3560X-48P switches running IOS 12.2(58)SE2, and the Hulc LED process takes over 15% CPU load.

What is this process responsible for?

    3333333333333333333333333344444444443333344444333333333333
    7766663333344444888889999922222555557777700000999988888888
100
 90
 80
 70
 60
 50                                *****
 40 ******          ******************************************
 30 **********************************************************
 20 **********************************************************
 10 **********************************************************
   0....5....1....1....2....2....3....3....4....4....5....5....
             0    5    0    5    0    5    0    5    0    5
               CPU% per second (last 60 seconds)

    4444445444444444444444444444444444444444444444444444444444
    5533537321232332633343242222223322445244332224325233535342
100
 90
 80
 70
 60       *
 50 **  * *         *                   *           *   * *
 40 ##########################################################
 30 ##########################################################
 20 ##########################################################
 10 ##########################################################
   0....5....1....1....2....2....3....3....4....4....5....5....
             0    5    0    5    0    5    0    5    0    5
               CPU% per minute (last 60 minutes)
              * = maximum CPU%   # = average CPU%

     1       1                 1     1111                       1
    7068757780566565576559785560556660000566555666976756656774670765667657
    3032391000910845988349359340222180000048268309429544494069810073970443
100  *       *           *     *     ****                       *
 90  *       *           * *   *     ****         *             *
 80  * *    **       *   * *   *     ****         *  *      *   *
 70 ** ** ****       **  ***   *    *****  *     *****     ** ***** ***  *
 60 *******************  **** **  ******* ** ********* ****** ***** **** *
 50 **********************************************************************
 40 ######################################################################
 30 ######################################################################
 20 ######################################################################
 10 ######################################################################
   0....5....1....1....2....2....3....3....4....4....5....5....6....6....7.
             0    5    0    5    0    5    0    5    0    5    0    5    0
                   CPU% per hour (last 72 hours)
                  * = maximum CPU%   # = average CPU%

3560-x#sh processes cpu sorted 5min

CPU utilization for five seconds: 35%/0%; one minute: 35%; five minutes: 38%

PID Runtime(ms)   Invoked      uSecs   5Sec   1Min   5Min TTY Process
129  1769580351 396248960       4465 15.97% 15.39% 14.11%   0 Hulc LED Process
 60  1017745522 191907631       5303  8.62%  7.60%  7.13%   0 RedEarth Tx Mana
211  1379704348  90033624      15324  0.15%  1.23%  5.45%   0 REP LSL Hello PP
 59   266213943 266690463        998  2.23%  1.93%  1.92%   0 RedEarth I2C dri
 97   394926822  33075411      11940  1.59%  1.55%  1.54%   0 hpm counter proc

I have the same issue as ssch1ndler does: exact same code revision, but my usage for this particular process sits at 20% even after shutting down the four Gi1/1 - 4 ports.

Does anyone know if a newer version of the code has fixed this high-CPU process?

Thank you

I have fought this matter with TAC for quite a long time, and they were adamant:

no fix is possible (hard to believe, I know) and one has to live with it as it is.
