403 Forbidden errors that happen sometimes

Discussion in 'Install/Configuration' started by Monarobase, Apr 11, 2013.

  1. Monarobase

    Monarobase Well-Known Member


We have been using LiteSpeed for quite a few months now and are very pleased with it. However, a few customers have complained about 403 errors that appear occasionally on certain PHP pages and go away after a few minutes.

We have PHP in daemon mode and have set quite high limits:

PHP suEXEC Max Conn: 100

    Per Client Throttling

Static Requests/second: 60
    Dynamic Requests/second: 50
    Connection Soft Limit: 50
    Connection Hard Limit: 80


Are these settings sensible?

We previously had Dynamic Requests/second: 15.

    Is this too low for bots like Google?

What level of logging would I need to enable to find out if any IPs were shown 403 errors because of too many connections? I presume a 403 error is what is returned when the maximum number of allowed connections for a single IP is reached?

  2. webizen

    webizen Well-Known Member

Dynamic Requests/second should be a lot lower than Static (roughly 1/10 of it). The recommended soft and hard connection limits are 20 and 30, but you may raise them whenever you see fit.

Maybe those users are behind a proxy?

The INFO log level should let you see it. Yes, a 403 error is returned to clients that reach the connection limit.
  3. Monarobase

    Monarobase Well-Known Member

No, they are not behind a proxy, and a customer has reported Googlebot being blocked (numerous 403 errors reported).

Is there some way I can see the list of IPs that were blocked because of rate limiting? What level of logging do I need to activate to see if this is what is causing the 403 errors?

  4. webizen

    webizen Well-Known Member

The NOTICE logging level should do. You should see something like 'reached per client hard limit'.
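For example, you can scan the error_log for that message and pull out the offending IPs. This is just a sketch with made-up sample data: the real log usually lives under the LiteSpeed logs directory (e.g. /usr/local/lsws/logs/error.log), and the exact message text may differ slightly from 'reached per client hard limit', so adjust the pattern to what your log actually shows.

```shell
# Sample error_log lines (format approximated from entries quoted in this thread)
cat > /tmp/sample_error.log <<'EOF'
2013-04-12 18:12:14.190 [NOTICE] [66.249.76.94] reached per client hard limit, block.
2013-04-12 18:13:02.551 [NOTICE] [66.249.76.96] reached per client hard limit, block.
2013-04-12 18:14:11.003 [INFO] [10.0.0.5] connection closed.
EOF

# Find the limit messages and extract the unique client IPs
grep 'per client hard limit' /tmp/sample_error.log \
  | sed -E 's/.*\[([0-9.]+)\].*/\1/' \
  | sort -u
```

Run against your real error_log, this gives a quick list of which clients are being throttled.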
  5. Monarobase

    Monarobase Well-Known Member

I've activated NOTICE-level logging but am not seeing the reason for the blocks.

    In /usr/local/apache/domlogs/[USERNAME]/[DOMAIN]

I see 403 errors like this:

     - - [11/Apr/2013:16:13:56 +0200] "GET /1175-2531-thickbox/file.jpg HTTP/1.1" 403 380 "-" "Googlebot-Image/1.0"

When I enter the complete URL in my browser, the file comes up.

I've checked both:


    And in


    And I can't find the reason for the 403 errors.

    The user's .htaccess files are not blocking anything.

    I've got 403 errors for both static image files and dynamic php files without any apparent reason.

What can I do to find out why this is happening and prevent it from happening again? I can't have Google being blocked.
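One way to get a feel for how widespread this is: count the 403s per client IP in the domlog. A sketch with sample data in Apache combined log format (the IPs below are illustrative; in combined format the IP is field 1 and the status code is field 9):

```shell
# Sample domlog lines in Apache combined format
cat > /tmp/sample_domlog <<'EOF'
66.249.76.94 - - [11/Apr/2013:16:13:56 +0200] "GET /a.jpg HTTP/1.1" 403 380 "-" "Googlebot-Image/1.0"
66.249.76.94 - - [11/Apr/2013:16:14:10 +0200] "GET /b.jpg HTTP/1.1" 200 5120 "-" "Googlebot-Image/1.0"
66.249.76.96 - - [11/Apr/2013:16:42:28 +0200] "GET /c.jpg HTTP/1.1" 403 380 "-" "Googlebot-Image/1.0"
EOF

# Count 403 responses per client IP ($1 = IP, $9 = status)
awk '$9 == 403 { n[$1]++ } END { for (ip in n) print ip, n[ip] }' /tmp/sample_domlog | sort
```

Pointing the same one-liner at /usr/local/apache/domlogs/[USERNAME]/[DOMAIN] shows at a glance whether the 403s are concentrated on a few IPs (which would point at per-client limits) or spread out.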
  6. Monarobase

    Monarobase Well-Known Member

Could this be an issue with the option to block badly formed requests?

I've just set it to "No" in the hope that it will stop the 403 errors.
  7. webizen

    webizen Well-Known Member

Set the logging level to INFO to show more context for the 403 errors in the error_log, which should help figure out the reason.
  8. Monarobase

    Monarobase Well-Known Member

    Setting Block Bad Request to "No" seems to have fixed the 403 errors for the moment.

If this option blocks Google, then I presume it needs fixing so that it doesn't block Google.
  9. NiteWave

    NiteWave Administrator

Or was a mod_security rule triggered?
  10. Monarobase

    Monarobase Well-Known Member

mod_security is disabled, as it caused too many issues.
  11. NiteWave

    NiteWave Administrator

Not 100% sure, but if a bad request is blocked, it may leave a log entry like
    "Status 400: Bad request method"
    instead of a 403.
  12. Monarobase

    Monarobase Well-Known Member

I found this:

    2013-04-12 18:12:14.190 [NOTICE] [] too many bad requests, block.

And lines like this in the user's domlogs:

     - - [11/Apr/2013:16:42:28 +0200] "GET /xxxx-xxxx-thickbox/x-xxx-xxx.jpg HTTP/1.1" 403 380 "-" "Googlebot-Image/1.0"

Both IPs are Google's bots:

    host domain name pointer crawl-66-249-76-94.googlebot.com.

    host domain name pointer crawl-66-249-76-96.googlebot.com.
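A quick sanity check on those reverse lookups: a genuine Googlebot hostname ends in googlebot.com (or google.com), and for full confirmation you would also do the forward lookup (`host crawl-66-249-76-94.googlebot.com`) and check it resolves back to the same IP. A small helper sketch for the hostname part (the function name and the sample non-Google hostname are made up for illustration):

```shell
# Check whether a reverse-DNS hostname looks like a real Googlebot host.
# Note: a complete check also requires the forward DNS lookup to match the IP.
is_googlebot_host() {
  case "$1" in
    *.googlebot.com|*.google.com) echo yes ;;
    *) echo no ;;
  esac
}

is_googlebot_host "crawl-66-249-76-94.googlebot.com"
is_googlebot_host "fake-crawler.example.com"
```

This matters because anyone can fake a Googlebot user agent in the domlog; only the DNS round trip proves it's really Google.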
  13. NiteWave

    NiteWave Administrator

But that raises another question: can you find matching entries, with the same IP and time stamp, in both the error_log and the domlog?
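To line the two logs up, one can convert the domlog time stamp into the error_log's format and grep for it. A sketch with sample data only (the time stamp formats are taken from the entries quoted in this thread; file paths are made up):

```shell
# Time stamp as it appears in the domlog entry
domlog_ts="11/Apr/2013:16:42:28"

# Convert 11/Apr/2013:16:42 -> "2013-04-11 16:42" (error_log prefix, to the minute)
err_ts=$(echo "$domlog_ts" \
  | awk -F'[/:]' '{
      m = index("JanFebMarAprMayJunJulAugSepOctNovDec", $2);
      printf "%s-%02d-%s %s:%s", $3, (m + 2) / 3, $1, $4, $5
    }')
echo "$err_ts"

# Sample error_log to search (replace with your real error_log)
cat > /tmp/sample_error.log <<'EOF'
2013-04-11 16:42:28.102 [NOTICE] [66.249.76.96] too many bad requests, block.
EOF
grep "^$err_ts" /tmp/sample_error.log
```

If a grep around the same minute turns up a NOTICE for the same IP, that ties the 403 in the domlog directly to the block in the error_log.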
  14. Monarobase

    Monarobase Well-Known Member

    How does the blocking work exactly ?

I see 403 errors in the domlogs over multiple days, several times per hour, and since I deactivated the option to block bad requests there are no more 403 errors. The bot is crawling the site again and getting 200 success responses instead of 403 error responses.

Would the error log contain just the first block? If so, I guess I won't have the info, as I activated more detailed logging only a few days ago.
