Problems this morning with litespeed going unresponsive

LSUser12

Well-Known Member
#1
For everyone dealing with the issues this morning, here's what I'm seeing:

The litespeed (lshttpd) worker process running as the nobody user becomes unresponsive:

nobody 722810 65.9 0.0 131856 18156 ? R<l 11:03 5:05 litespeed (lshttpd)

... and will appear to max out the CPU in top.

strace reveals no system calls, and the process itself does not respond to kill/TERM signals.
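
For anyone who wants to confirm the same symptom, these are roughly the commands I'm using (the PID is just the example from the ps line above, substitute your own):

# find the worker running as nobody and its CPU use
ps aux | grep '[l]shttpd'

# attach to the spinning PID; in my case it prints nothing, so the
# process is busy-looping in userspace rather than stuck in a syscall
strace -p 722810

# a normal TERM is ignored; only SIGKILL gets rid of it
kill -TERM 722810
kill -9 722810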

I'm still trying to capture a debug level 9 error log while it's down.

It appears to affect a lot of versions, even older ones.
 
#3
We have the same issue at random on some servers.

We are still investigating but have found nothing so far.

Please reply if you find something. We hope LiteSpeed support answers our ticket...
 

LSUser12

Well-Known Member
#5
Looks like an attack?

Seeing this Russian network on different servers right before they go down:

188.120.224.119

2017-02-17 11:28:04.987 [DEBUG] [188.120.224.119:61167] Headers: GET /rss/catalog/notifystock/ HTTP/1.1
Host:xxx
Connection: keep-alive
Accept-Encoding: gzip,deflate
Authorization: Basic AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
 

LSUser12

Well-Known Member
#7
So far I have found no example of a server going down that wasn't hit from that network.

2017-02-17 09:12:20.208 [NOTICE] [188.120.224.119:56848] Status 400: Http request header is too big, abandon!
2017-02-17 10:04:14.863 [NOTICE] [188.120.224.119:65358] Status 400: Http request header is too big, abandon!
2017-02-17 10:12:20.100 [NOTICE] [188.120.224.119:60603] Status 400: Http request header is too big, abandon!
2017-02-17 10:19:28.835 [NOTICE] [188.120.224.119:52043] Status 400: Http request header is too big, abandon!
2017-02-17 10:26:03.751 [NOTICE] [188.120.224.119:59108] Status 400: Http request header is too big, abandon!
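
If you want to check whether your own servers are getting hit, something like this against the LiteSpeed error log will pull out the offending IPs (the log path is the usual default install location, adjust for your setup):

# count "header is too big" notices per source IP
grep 'Http request header is too big' /usr/local/lsws/logs/error.log \
  | grep -oE '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+' \
  | sort | uniq -c | sort -rn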
 

LSUser12

Well-Known Member
#8
The issue definitely has to do with overflowing the HTTP Authorization header.

I'm seeing it in every occurrence. The only servers that have gone down since I blocked the Russian IP (I blocked the /24) are ones that get proxied through another IP I haven't blocked:

2017-02-17 11:35:16.011 [DEBUG] [141.101.79.84:57138] Headers: GET /rss/catalog/notifystock/ HTTP/1.1
Host: www.arenaflowers.co.in
User-Agent: Railgun/5.3.0
Authorization: Basic AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA

(Dead)
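
If you sit behind Cloudflare or Sucuri and can't simply block the origin network, a quick check like this against a DEBUG-level error log can at least confirm whether the oversized Authorization header is still reaching you through the proxy (the 256-character cutoff and the log path are just assumptions on my part):

# count unusually long Authorization header lines in the debug log;
# the 256-character threshold and the path are assumptions, tune to taste
awk '/^Authorization: Basic/ && length($0) > 256' /usr/local/lsws/logs/error.log | wc -l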
 

hd-sam

Active Member
#9
We are also seeing the same thing.

Until this is patched...

Block these IPs in your firewall (a sample iptables loop follows the list):
188.120.224.0/24
188.120.226.0/24
188.120.232.0/24
188.120.234.0/24
188.120.235.0/24
188.120.236.0/24
69.30.193.133
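
For reference, roughly how we are adding them (plain iptables shown; CSF users can use "csf -d <ip>" per address instead, and you should verify the ranges against your own logs before blocking):

# drop traffic from the ranges listed above
for net in 188.120.224.0/24 188.120.226.0/24 188.120.232.0/24 \
           188.120.234.0/24 188.120.235.0/24 188.120.236.0/24 \
           69.30.193.133; do
    iptables -I INPUT -s "$net" -j DROP
done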
 

LSUser12

Well-Known Member
#10
I'm blocking all of 188.120.224.0/20

LS mentioned 69.30.193.133, but I'm skeptical that one is involved.

Keep in mind that if some network serves as a proxy for a request, it will show up in the error log too.

The big pain here is that the bug will still break LS when requests come through Cloudflare or Sucuri, etc.
 

hd-sam

Active Member
#12
I would still block it. We have tons of hits from that 69.30.193.133 IP. WHOIS also lists a Russian name as the customer for that IP at a datacenter in Kansas City: http://whois.domaintools.com/69.30.193.133

69.30.193.133 - satrun77 [17/Feb/2017:09:10:10 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - satrun77 [17/Feb/2017:09:11:16 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - Meanbee [17/Feb/2017:09:12:21 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - Meanbee [17/Feb/2017:09:13:28 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - Meanbee [17/Feb/2017:09:14:34 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - Meanbee [17/Feb/2017:09:15:40 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - therouv [17/Feb/2017:09:17:04 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - therouv [17/Feb/2017:09:17:51 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - therouv [17/Feb/2017:09:18:57 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - therouv [17/Feb/2017:09:20:04 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - FireGento_Team [17/Feb/2017:09:21:11 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - FireGento_Team [17/Feb/2017:09:22:17 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - FireGento_Team [17/Feb/2017:09:23:23 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - FireGento_Team [17/Feb/2017:09:24:29 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - Zopim [17/Feb/2017:09:25:35 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - Zopim [17/Feb/2017:09:26:45 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - Zopim [17/Feb/2017:09:27:21 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - Zopim [17/Feb/2017:09:27:50 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - MangoExtensions [17/Feb/2017:09:28:19 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - MangoExtensions [17/Feb/2017:09:28:47 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - MangoExtensions [17/Feb/2017:09:29:18 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - MangoExtensions [17/Feb/2017:09:30:04 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - sherrie [17/Feb/2017:09:31:11 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - sherrie [17/Feb/2017:09:32:17 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - sherrie [17/Feb/2017:09:33:24 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - sherrie [17/Feb/2017:09:34:30 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - bolasevich [17/Feb/2017:09:35:37 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - bolasevich [17/Feb/2017:09:36:42 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - bolasevich [17/Feb/2017:09:37:45 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - bolasevich [17/Feb/2017:09:38:52 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - SOFORT_AG [17/Feb/2017:09:40:00 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - SOFORT_AG [17/Feb/2017:09:41:06 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - SOFORT_AG [17/Feb/2017:09:42:14 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - SOFORT_AG [17/Feb/2017:09:43:22 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - VnEcoms [17/Feb/2017:09:44:29 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - VnEcoms [17/Feb/2017:09:45:36 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - VnEcoms [17/Feb/2017:09:46:43 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - VnEcoms [17/Feb/2017:09:47:46 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - migrashop [17/Feb/2017:09:48:53 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - migrashop [17/Feb/2017:09:50:01 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - migrashop [17/Feb/2017:09:51:08 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - migrashop [17/Feb/2017:09:52:15 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
69.30.193.133 - AddThis [17/Feb/2017:09:53:23 -0800] "GET /rss/catalog/notifystock/ HTTP/1.1" 401 44 "-" "-"
 

LSUser12

Well-Known Member
#14
Well, FWIW I haven't had any issues since blocking that network both in the firewall and in LS <deny/> rules.

Looks like a zero-day. So far nothing suggests it did anything beyond DoSing the LS service. I'm rolling out the latest update as recommended.
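
For anyone else rolling the update out, the usual way to confirm the version and bounce the service (standard /usr/local/lsws install path assumed, adjust if yours differs):

# check the installed LiteSpeed version
/usr/local/lsws/bin/lshttpd -v

# restart the service after updating
/usr/local/lsws/bin/lswsctrl restart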
 

Jon K

Administrator
Staff member
#20
pardis, what is your current CPU % for litespeed? The fix for the issue was included in 5.1.13, which makes me think this is something different. Have you tried restarting litespeed? What is using all your CPU?
 