Intermittent 403 Forbidden errors

Monarobase

Well-Known Member
#1
Hello,

We have been using LiteSpeed for quite a few months now and are very pleased with it. However, a few customers have complained about 403 errors that appear intermittently on certain PHP pages and go away after a few minutes.

We run PHP in daemon mode and have set quite high limits:

PHP suEXEC Max Conn: 100

Per Client Throttling

Static Requests/second: 60
Dynamic Requests/second: 50
Connection Soft Limit: 50
Connection Hard Limit: 80

---

Are these settings sensible?

We previously had Dynamic Requests/second: 15

Is this too low for bots like Google?

What level of logging would I need to enable to find out whether any IPs were shown 403 errors because of too many connections? I presume 403 is the error returned when a single IP hits the maximum number of allowed connections?

Thanks
 

webizen

Well-Known Member
#2
Dynamic Requests/second should be a lot lower than Static (roughly 1/10). The recommended Connection Soft Limit and Hard Limit are 20 and 30, but you can raise them whenever you see fit.

Maybe those users are behind a proxy?

The INFO log level should let you see this. Yes, a 403 error is returned to clients that reach the connection limit.
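For example, once INFO logging is on, something like this should show what the server logged about a given client (a rough sketch; CLIENT_IP is just a placeholder for the address a customer reports, and I'm assuming the stock cPanel error log path):

# show everything LiteSpeed logged about one client IP (exact message wording varies by version)
grep "CLIENT_IP" /usr/local/apache/logs/error_log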
 

Monarobase

Well-Known Member
#3
No, they are not behind a proxy, and a customer has reported Googlebot being blocked (numerous 403 errors).

Is there some way I can see the list of IPs that were blocked because of rate limiting? What level of logging do I need to activate to see whether this is what is causing the 403 errors?

Thanks
 

Monarobase

Well-Known Member
#5
I've activated NOTICE-level logging but am not seeing the reason for the blocks.

In /usr/local/apache/domlogs/[USERNAME]/[DOMAIN]

I see 403 errors like this:

66.249.76.96 - - [11/Apr/2013:16:13:56 +0200] "GET /1175-2531-thickbox/file.jpg HTTP/1.1" 403 380 "-" "Googlebot-Image/1.0"

When I enter the complete URL in my browser, the file loads fine.

I've checked both:

/usr/local/apache/logs/access_log

And in

/usr/local/apache/logs/error_log

And I can't find the reason for the 403 errors.

The user's .htaccess files are not blocking anything.

I'm getting 403 errors for both static image files and dynamic PHP files, without any apparent reason.

What can I do to find out why this is happening and prevent it from happening again? I can't have Google being blocked.
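In the meantime I'm tallying the 403s per client IP with something like this (a rough, untested one-liner that assumes the domlogs use the standard combined log format shown above):

# the status code is the 9th whitespace-separated field; [USERNAME]/[DOMAIN] is the placeholder path from above
awk '$9 == 403 {print $1}' /usr/local/apache/domlogs/[USERNAME]/[DOMAIN] | sort | uniq -c | sort -rn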
 

Monarobase

Well-Known Member
#6
Could this be an issue with the option to block badly formed requests?

I've just set it to "No" in the hope that it will stop the 403 errors.
 

Monarobase

Well-Known Member
#8
Setting Block Bad Request to "No" seems to have fixed the 403 errors for the moment.

If this option blocks Google, then I presume it needs fixing so that it doesn't block Google.
 

Monarobase

Well-Known Member
#12
I found this:

2013-04-12 18:12:14.190 [NOTICE] [66.249.76.94:55996-0] too many bad requests, block.

And lines like this in the user's domlogs:

66.249.76.96 - - [11/Apr/2013:16:42:28 +0200] "GET /xxxx-xxxx-thickbox/x-xxx-xxx.jpg HTTP/1.1" 403 380 "-" "Googlebot-Image/1.0"

Both IPs are Google's bots:

host 66.249.76.94
94.76.249.66.in-addr.arpa domain name pointer crawl-66-249-76-94.googlebot.com.

host 66.249.76.96
96.76.249.66.in-addr.arpa domain name pointer crawl-66-249-76-96.googlebot.com.
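To be thorough I also checked in the other direction, since a forward-confirmed lookup is what Google recommends for verifying Googlebot (the PTR hostname should resolve back to the same IP):

# the hostnames come from the PTR records above and should resolve back to 66.249.76.94 / 66.249.76.96
host crawl-66-249-76-94.googlebot.com
host crawl-66-249-76-96.googlebot.com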
 

Monarobase

Well-Known Member
#14
How does the blocking work, exactly?

I see in the domlogs that 66.249.76.96 got 403 errors over multiple days, several times per hour, and that since I deactivated the option to block bad requests there are no more 403 errors. 66.249.76.96 is crawling the site again and getting 200 responses instead of 403 errors.

Would the error log contain only the first block? If so, I guess I won't have that information, as I only activated more detailed logging a few days ago.
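For anyone else hitting this, here is a rough, untested sketch of how I'm pulling the block events out of the error log, now that the NOTICE line above shows the message text to search for:

# list every "bad request" block event recorded in the error log
grep "too many bad requests" /usr/local/apache/logs/error_log

# count block events per client IP (splits on the square brackets, then drops the :port suffix)
grep "too many bad requests" /usr/local/apache/logs/error_log | awk -F'[][]' '{split($4, a, ":"); print a[1]}' | sort | uniq -c | sort -rn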
 