Bots are not visiting my website because of LiteSpeed Cache

AndreyPopov

Well-Known Member
#2
What CMS do you use?

As far as I know, Google bots (and other bots) do not accept cookies.
LSCache uses the _lscache_vary cookie to mark ("store") the cache variation.

For example, on my OpenCart 3.x site I added this to .htaccess:

Code:
# Flag common bot/crawler user agents...
RewriteCond %{HTTP_USER_AGENT} "bot|compatible|images|cfnetwork|favicon|facebook|crawler|spider|addthis" [NC]
# ...and let LSCache store/serve a separate cache copy (vary) for them
RewriteRule .* - [E=Cache-Control:vary=isBot]
and do not set the _lscache_vary cookie for bots (see the commit for the OpenCart 3.x LSCache module)


This gives bots a separate cache.
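
If you would rather not cache bot traffic at all instead of keeping a separate copy for it, a sketch based on LiteSpeed's rewrite-rule cache-control directives (not part of the module commit above) would look like this:

Code:
# Sketch: skip the cache entirely for bot user agents
# instead of storing a separate cached copy for them.
RewriteCond %{HTTP_USER_AGENT} "bot|compatible|images|cfnetwork|favicon|facebook|crawler|spider|addthis" [NC]
RewriteRule .* - [E=Cache-Control:no-cache]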
 
#5
You should add all domains to Google Webmaster Tools and find out why the bot doesn't crawl your sites.
Do you mean I should add the "HTTP" and non-WWW versions of my website's domain?

How did you solve this problem? I have similar problems with my site. My site address: https://www.handpowertoolslist.com/
I am still trying to fix it, and I use QUIC.cloud too... I get a lot of DNS connection problems when I use the QUIC.cloud DNS.
 

Germont

Well-Known Member
#6
Do you mean I should add the "HTTP" and non-WWW versions of my website's domain?
I assume non-secured URLs are redirected to https; then it doesn't matter anymore.
Same for non-www.
Then you can add a sitemap and check what Googlebot (or Bing, which has its own webmaster platform) says about a URL. Once these small details are clarified, investigate at the server level.
 
#7
Yes, I redirected all non-secured URLs to HTTPS. But if I add sitemaps for all the other domain variants of my website, won't that cause crawl problems? I am using Bing too; I submitted all the sitemap URLs there, but I only get low traffic from Bing.
 
#8
I use WordPress. How can I set up that bot rule in WordPress?
 

serpent_driver

Well-Known Member
#9
@harwester

You don't have a problem with LiteSpeed or LSCache. You seem to have a misconfiguration about which address is the unique URL of your page. Before you give Google & Co. any URLs or sitemaps, determine the exact URL of your page. https://www.domain.com is not https://domain.com, although it doesn't matter whether you use www. or not. Set the exact URL in your .htaccess and define what happens if a user requests https://domain.com but the URL of your page is https://www.domain.com. In that case you must set a redirection in .htaccess, so that your page is guaranteed to always be reachable under one unique URL.

Example:

Code:
  RewriteCond %{SERVER_PORT} !^443$
  RewriteRule (.*) https://%{HTTP_HOST}/$1 [R=301,L] 
  RewriteCond %{HTTP_HOST} !^www\.
  RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]
The rewrite rules above define:

http -> https
non-www -> www
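
One note (not from the original post): with the two rules above, a request to http://domain.com is redirected twice, first to https://domain.com and then to https://www.domain.com. If you prefer a single 301, a sketch that collapses both steps (domain.com is still a placeholder for your real host):

Code:
# Sketch: anything that is not already https://www.domain.com
# gets one 301 straight to the canonical URL.
RewriteCond %{HTTPS} !=on [OR]
RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]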

Comprende? ;)

Once that's done, I will give you some "secret" information on how to speed up connection time...
 
#10

I have added this .htaccess code. I hope this setting will work well. Thank you so much.
 

serpent_driver

Well-Known Member
#13
Okay, sit down and fasten your seatbelt. ;)

First of all, some basic information. When a browser requests a URL over https, a handshake between browser and server happens. Both parties "negotiate" the connection before any data can be sent. If a server supports only HTTP/2, this negotiation takes more time than with HTTP/3 on LSWS, because LSWS and HTTP/3 use QUIC, a UDP-based protocol specialized for https connections. With QUIC, server and browser still negotiate the connection conditions, but faster and with fewer round trips.
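
Side note (not in the original post): browsers usually learn that a site also speaks HTTP/3 from an Alt-Svc response header. LSWS normally sends an equivalent header by itself when QUIC/HTTP/3 is enabled, so this is only an illustrative sketch; the port and lifetime are assumptions:

Code:
<IfModule mod_headers.c>
# Sketch: advertise HTTP/3 (QUIC) on UDP port 443, remembered by the browser for 24 hours.
Header always set Alt-Svc 'h3=":443"; ma=86400'
</IfModule>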

Part of this negotiation time can be bypassed, which reduces the time until data can be sent to the client. For many years Google has offered a service named "HSTS Preload List Submission", with which you can add your page's URL to a preload list. Once added, the URL is automatically built into almost every browser. If a browser has stored the information that your page is always and only reachable over https, it no longer tries http first and waits for a redirect to https; that extra round trip disappears, which reduces the time before data can be sent.

Go to https://hstspreload.org/ and add the URL of your page to the submission list, but only the bare domain name, without www or https://. Before you do that, you must add the following code to your .htaccess:

Code:
<IfModule mod_headers.c>
# Tell browsers to use https only for the next two years, including subdomains,
# and allow the site to be included in the preload list.
Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload"
</IfModule>
It can take some weeks until all browsers have added this information to their own configuration, so be patient.

Enjoy :)

The image below shows what happens when a URL is requested and illustrates the difference between HTTP/2 over TCP and HTTP/3 with QUIC. You don't need to understand every step; just count the arrows. The fewer arrows, the faster it is.

[Attachment: quic.jpg (handshake diagram comparing HTTP/2 over TCP with HTTP/3 over QUIC)]
 
#15
I have added this code to the .htaccess file.
Thanks for this useful information. I saw my website's flag in Firefox.
Thanks for your help.
 