CACHE

serpent_driver

Well-Known Member
#61
my choice is to recache all URLs for all possible UAs

A 1-2 second response time for a cached page is preferable to 10-15 seconds for a non-cached page.
If you reduce that to the URLs actually used, you don't have this problem. But that requires not only intelligence, but logic. You seem to lack both. Sorry, but I like neither advice-resistant users nor ignorance nor arrogance. You actually assume that what you have to offer on your website is so popular that all of your pages are regularly accessed by users. There is hardly a bigger misconception.
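The point about recaching only the URLs actually used can be sketched with a short script that builds the warm-up list from the server access log instead of the full sitemap. This is a hypothetical illustration, not part of any LiteSpeed tool; the log format assumed here is the common Apache/nginx "combined" format, and the sample lines are invented.

```python
# Hypothetical sketch: build a recache list from URLs users actually request,
# instead of recaching every URL for every user agent.
# Assumes an Apache/nginx "combined" access log; the sample lines are examples.
import re
from collections import Counter

LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[\d.]+"')

def popular_urls(log_lines, min_hits=2):
    """Return URLs requested at least `min_hits` times, most popular first."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m:
            hits[m.group("url")] += 1
    return [url for url, n in hits.most_common() if n >= min_hits]

sample = [
    '1.2.3.4 - - [01/Jan/2024] "GET /wool/socks HTTP/1.1" 200 512 "-" "Mozilla"',
    '1.2.3.4 - - [01/Jan/2024] "GET /wool/socks HTTP/1.1" 200 512 "-" "Mozilla"',
    '5.6.7.8 - - [01/Jan/2024] "GET /contact HTTP/1.1" 200 128 "-" "Mozilla"',
]
print(popular_urls(sample))  # → ['/wool/socks']
```

Feeding only this list to a cache crawler keeps the cache small enough that a 3-day purge window matters far less.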
 

AndreyPopov

Well-Known Member
#63
Googlebot's crawling and how long it takes for your page to load does not affect your ranking in any way, precisely because Googlebot does not take any measurements that affect the ranking.
are you sure?

But I see the opposite results on
https://search.google.com/search-console/page-experience
https://search.google.com/search-console/core-web-vitals
https://search.google.com/search-console/mobile-usability
when lscache for Google bots is absent!

Now: green indicator.
Without lscache for Google bots: red indicator (not even orange)!
 

AndreyPopov

Well-Known Member
#64
This should not make other users believe that this would be possible in principle. Especially since it doesn't solve your problem that your hosting provider empties the /lscache directory every 3 days.
when the lscache store folder is placed in my account's storage space, I decide what, where, and when to clean!
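A self-managed retention policy like the one described above can be sketched in a few lines. This is a generic illustration, not LiteSpeed's own purge mechanism: the cache directory path and the 7-day cutoff are arbitrary examples of "I decide when to clean".

```python
# Hypothetical retention sketch: when the lscache folder lives in your own
# account storage, you choose the retention window yourself, e.g. delete
# cache files untouched for more than `max_age_days`.
import time
from pathlib import Path

def purge_stale(cache_dir, max_age_days=7, now=None):
    """Delete regular files under cache_dir not modified within max_age_days."""
    now = time.time() if now is None else now
    cutoff = now - max_age_days * 86400
    removed = []
    for f in Path(cache_dir).rglob("*"):
        if f.is_file() and f.stat().st_mtime < cutoff:
            f.unlink()
            removed.append(f.name)
    return removed
```

Run from cron (or any scheduler) this replaces the hoster's blanket 3-day wipe with a policy you control.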
 

serpent_driver

Well-Known Member
#65
are you sure?

But I see the opposite results on
https://search.google.com/search-console/page-experience
https://search.google.com/search-console/core-web-vitals
https://search.google.com/search-console/mobile-usability
when lscache for Google bots is absent!

Now: green indicator.
Without lscache for Google bots: red indicator (not even orange)!
Don't keep asking me that stupid question. Unlike you, I think before I post.

I've told you several times that Google can't and doesn't want to measure the performance of your website with Googlebot, because the result would not only always be wrong, but also unfair. To get safe and fair performance measurements there is a user-metrics API built into the Chrome browser, and only this API measures anything and sends the relevant data home to Google. Anyone who claims otherwise has no idea of the matter. You can post as many links as you like; it doesn't change that. You don't need any special knowledge for this, just the logical understanding that what you believe in is not possible.
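The field data serpent_driver is describing is what Google publishes as the Chrome UX Report (CrUX), collected from real Chrome users; Search Console's Core Web Vitals report draws on the same data. A minimal sketch of querying it, assuming you have a Google API key; the exact metric names requested here are just examples, and the network call obviously needs a valid key to succeed.

```python
# Sketch of a Chrome UX Report (CrUX) API query: real-user field data,
# not anything Googlebot measures while crawling. API key required.
import json
from urllib import request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_payload(url, form_factor="PHONE"):
    """Request body for one URL's field metrics (metric list is an example)."""
    return {
        "url": url,
        "formFactor": form_factor,
        "metrics": ["largest_contentful_paint", "cumulative_layout_shift"],
    }

def crux_query(url, api_key, form_factor="PHONE"):
    """Send the query; returns the decoded JSON response."""
    req = request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=json.dumps(build_crux_payload(url, form_factor)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:  # network call; needs a valid key
        return json.load(resp)
```

If a URL has too little real-user traffic, CrUX returns no record for it, which is consistent with the on-demand argument in the next paragraph.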

But there is another decisive factor that is just as foreign to you. Googlebot doesn't crawl every one of your URLs just because you provide links to crawl. Googlebot crawls on demand, and on-demand means that Google only crawls what is of interest or what is being searched for. In your case this means that, in case of doubt, only the start page appears in the search results, because Google believes it makes little sense to display countless search results if the target page only offers products for wool. You can verify this assumption, which for you is only a guess, by examining an analysis of your traffic, especially the first requested page. At the latest then you should realize that you have to throw your views overboard.

Everything I do, I've been doing for more than 20 years. You can therefore assume that I am light years ahead of you on many topics. Above all, my knowledge is based on practice. So if I make comments that seem clever, I have tested them several times in practice beforehand.
 

serpent_driver

Well-Known Member
#68
you again hear and listen only to yourself
From the moment you utter this sentence, I can no longer accept you as a user I can respect. If you want me to take you seriously, then please only post comments that don't call your reputation into question. Or to put it another way: your comments do not come across as trustworthy.

If you have no trouble determining the location of the cached files, then why are you whining about your hosting provider emptying the /lscache directory every 3 days?

Hence the repeated question: are you kidding us, or what do you want?
 

AndreyPopov

Well-Known Member
#71
Of course I read what you post.
no! You hear and listen only to yourself :(

- I found that the hoster clears lscache every 3 days, because there were no hits on pages already crawled
- I asked the hoster to store /lscache in my account's storage space
- after that I found that some files in the /lscache folder were compressed (when requested by a user) and some were not compressed (when created by the crawler)
- after that I added the ENCODING option to the crawler
This increases crawler speed and reduces the /lscache folder size.
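The "ENCODING option" described above presumably means having the warm-up crawler send the same Accept-Encoding header real browsers send, so the cache entry the server stores is the compressed variant users will actually hit. A hedged sketch of that idea; the function names and header values here are illustrative, not the actual crawler's code.

```python
# Sketch of the encoding idea: prime the cache with requests that look like
# a browser's, so the stored entry is the compressed variant.
# Names and header values are illustrative assumptions.
from urllib import request

def build_warm_request(url, user_agent="Mozilla/5.0"):
    """A cache-warming request with browser-like headers."""
    return request.Request(url, headers={
        "User-Agent": user_agent,
        "Accept-Encoding": "gzip, br",  # ask for the compressed variant
    })

def warm_url(url):
    """Fetch the URL to prime the cache; returns status and encoding used."""
    with request.urlopen(build_warm_request(url)) as resp:  # network call
        return resp.status, resp.headers.get("Content-Encoding")
```

If the crawler omitted Accept-Encoding, the server would cache an uncompressed copy alongside the compressed one users trigger, which matches the doubled folder size described above.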
 

serpent_driver

Well-Known Member
#79
Live demo? I never sent you a link to the live demo for the Kitt Cache Crawler! Which CMS do you need a crawler for? I have several demos depending on the CMS.
 
#80
Live demo? I never sent you a link to the live demo for the Kitt Cache Crawler! Which CMS do you need a crawler for? I have several demos depending on the CMS.
WordPress. Do you have any reviews, or anyone using this plugin? Where is the support from LiteSpeed? You sound like a programmer selling a product. Where are the guys who built LiteSpeed?
 