[rbak-nsp] Slot 2 CRASH after upgrade to SEOS-12.1.1.12p15

Wojciech Wrona w0jtas at w0jtas.com
Tue Jul 28 02:46:48 EDT 2020


Hi guys,
We're having a strange problem with an SE600 equipped with 4 10ge-4-port cards 
and 2 xcrp4-base. It had been working fine on SEOS-12.1.1.11p7-Release with 
more than 2.5 years of uptime, until 23rd July, when we decided to upgrade to 
SEOS-12.1.1.12p15-Release to mitigate the potential BGP UPDATE flags problem 
which is mentioned in the changelog and was discussed on this list earlier.

The upgrade went fine, and everything was working OK until today. At 4 am 
something started to go wrong on Slot 2. The card crashed, and after 
coming back up it started to lose around 30% of traffic. To make a long 
story short, we pulled it out and determined that the only problematic 
card is the one in slot 2.

After every insertion of a card into slot 2 (or a chassis restart, of 
course) we get the following logs (which was not the case before):

Jul 28 08:14:08: %PPAQOS-3-QOS_ERROR: 
bc51a3df/0000000016/470900000:02/EPPA/EU00:For cct 255/22:13:58/7/2/20 
policy type 200 failed to process parameter 6, error code 1 - 001a5eee 
003a30cc 001a512e 0016f84e 0003f30e 0020f94c 0021348c 0003dbae 0039ebcc
0039eb80 00000000
Jul 28 08:14:08: %PPAQOS-3-QOS_ERROR: 
bc55ea03/0000000016/471500000:02/EPPA/EU00:For cct 255/22:13:58/7/2/21 
policy type 200 failed to process parameter 6, error code 1 - 001a5eee 
003a30cc 001a512e 0016f84e 0003f30e 0020f94c 0021348c 0003dbae 0039ebcc
0039eb80 00000000
Jul 28 08:14:08: %PPAQOS-3-QOS_ERROR: 
bc5a2ccd/0000000016/472100000:02/EPPA/EU00:For cct 255/22:13:58/7/2/22 
policy type 200 failed to process parameter 6, error code 1 - 001a5eee 
003a30cc 001a512e 0016f84e 0003f30e 0020f94c 0021348c 0003dbae 0039ebcc


And after some time something like the following starts:
Jul 28 08:06:10: [255/22:13:58/1/2/83]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/84]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/84]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/84]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/84]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/85]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/85]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/86]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/86]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/86]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/86]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/87]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/87]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/88]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/88]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/88]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/88]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/89]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/89]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/89]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/89]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/90]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/90]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/90]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/90]: %STAT-7-CALC: 
[stat_cct_reset_counter_base_hw]source_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/91]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
Jul 28 08:06:10: [255/22:13:58/1/2/91]: %STAT-7-CALC: 
[stat_base_counter_cg_archive_reset_ippa]source_p->rx_pad_byte_overhead 
: 0,target_p->rx_pad_byte_overhead : 0
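In case anyone wants to compare their own logs, here is a quick Python sketch 
(just an illustration, not anything vendor-provided) that pulls the trailing 
per-circuit index out of each %STAT-7-CALC line and counts repeats, to check 
whether the flood walks through consecutive circuits the way it does above:

```python
import re
from collections import Counter

# Matches the bracketed circuit ID, e.g. [255/22:13:58/1/2/84],
# and captures the trailing per-circuit index (84).
CCT_RE = re.compile(r"\[\d+/\d+:\d+:\d+/\d+/\d+/(\d+)\]: %STAT-7-CALC")

def circuit_counts(log_lines):
    """Count %STAT-7-CALC messages per trailing circuit index."""
    counts = Counter()
    for line in log_lines:
        m = CCT_RE.search(line)
        if m:
            counts[int(m.group(1))] += 1
    return counts

sample = [
    "Jul 28 08:06:10: [255/22:13:58/1/2/84]: %STAT-7-CALC: ...",
    "Jul 28 08:06:10: [255/22:13:58/1/2/84]: %STAT-7-CALC: ...",
    "Jul 28 08:06:10: [255/22:13:58/1/2/85]: %STAT-7-CALC: ...",
]
print(circuit_counts(sample))  # Counter({84: 2, 85: 1})
```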

And then the card crashes. It does not look like a hardware issue with the 
card, because we have a spare one; we swapped it in and the story looks the 
same. The first type of log has already shown up, so I think the second one 
is only a matter of time.

It does not feel like a coincidence. It has to be software related. Has 
anyone had similar issues?

Best regards,
-- 
Wojciech Wrona

