OpenCart LS Module Scroll Mobile Issue

Lee

Well-Known Member
#23
ESI is disabled in the GUI; my site would not work at all with ESI turned on.

I'm trying your lscache rules now, I will report back.
 

Lee

Well-Known Member
#26
It appears that when I use your lscache in the .htaccess file, it will not cache for my iPhone unless I manually do it!

here's what I have:
(screenshots 1.jpg to 5.jpg attached)
 

AndreyPopov

Well-Known Member
#27
It appears that when I use your lscache in the .htaccess file, it will not cache for my iPhone unless I manually do it!
- Purge All Cache
- leave only the iPhone UA
- start building the cache

what is your PHP max_execution_time?
how many product and other links are on your site?

nearly 1000 links can be cached per hour (for one UA)

P.S. if possible, create an info.php in the site root with
PHP:
<?php
phpinfo();
?>
and check it on your iPhone that these entries are present in the PHP Variables section:

$_COOKIE['_lscache_vary']        browser:chrome,device:mobile,os:linux
$_SERVER['LSCACHE_VARY_VALUE']   ismobile
$_SERVER['LS_CACHE_CTRL']        vary=ismobile
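If you don't want to leave a full phpinfo() page on the server, a small script that prints only those values is enough. A sketch only (the file name lscheck.php is just a suggestion, not part of the module):

PHP:
<?php
// Sketch: print only the LiteSpeed cache vary values instead of the full phpinfo() output.
header('Content-Type: text/plain');

echo "_lscache_vary cookie: ", $_COOKIE['_lscache_vary'] ?? '(not set)', PHP_EOL;
echo "LSCACHE_VARY_VALUE:   ", $_SERVER['LSCACHE_VARY_VALUE'] ?? '(not set)', PHP_EOL;
echo "LS_CACHE_CTRL:        ", $_SERVER['LS_CACHE_CTRL'] ?? '(not set)', PHP_EOL;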
 

Lee

Well-Known Member
#28
what is your PHP max_execution_time? 3600
how many product and other links are on your site? about 3K

Still not caching on my iPhone!

BTW, I just installed your caching program too...
 

AndreyPopov

Well-Known Member
#29
like I wrote above, nearly 1000 links can be cached per hour (for one UA)

you must start the standard recache again and again: a full site recache for one UA takes 4-5 runs

read my previous post about the php info check
 

Lee

Well-Known Member
#30
Well... with your cache software and .htaccess it's not caching for Windows or iPhone now!

So I have to keep recaching for an entire day for this to work?

My code and settings worked far better! I'll be going back to my settings...
 

Lee

Well-Known Member
#32
Am I understanding that I have to change the UA to just one like Apple, then recache, then change the UA to Windows and recache, etc.?
 

Lee

Well-Known Member
#34
I don't use the popups; I've written my site to have all pertinent data on the main screens.

Looks like I was correct: LScache is buggy, immature software. It does not work anything like it should. I've spent most of the day just to find out that my stupid settings work far better. I wish I had my time back...

Thanks for your help, it was much appreciated.
 

AndreyPopov

Well-Known Member
#35
Am I understanding that I have to change the UA to just one like Apple, then recache, then change the UA to Windows and recache, etc.?
yes, the cache must be built for each UA.

you can add as many UAs as you want, but:
- for one UA - ~1000 links recached per hour
- for two UAs - ~500 links recached per hour
- for three UAs - ~330 links recached per hour

and if you don't use my "advanced crawler" mode, the standard crawler recache algorithm always starts from the beginning and recaches already cached links.
in my experience the standard recache algorithm with restart after restart after restart can recache nearly 4000 already-cached links per hour :( the remaining links never get cached by the standard crawler :(
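(rough worked example with the numbers above: with ~3K links and one UA you need about 3000 / 1000 ≈ 3 hours of crawling per full pass, and because the standard crawler restarts from the first URL every run, several passes are needed before the last URLs are ever reached - that is where the 4-5 restarts come from)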
 

serpent_driver

Well-Known Member
#36
Code:
<IfModule LiteSpeed>
RewriteCond %{HTTP_USER_AGENT} "bot|compatible|images|cfnetwork|favicon|facebook|crawler|spider|addthis" [NC]
RewriteCond %{HTTP_USER_AGENT} !Chrome [NC]
RewriteCond %{HTTP_USER_AGENT} !Mobile [NC]
RewriteCond %{HTTP_USER_AGENT} !Macintosh [NC]
RewriteRule .* - [E=Cache-Control:vary=isBot]
</IfModule>
When will you start learning that it doesn't make sense to try to make the cache available for bots? ;) First, this rule is completely wrong. Second, there are hundreds of bots and almost every bot has a different user agent. Third, this rule prevents caching for Google, Bingbot and other popular bots/search engines.

Code:
<IfModule LiteSpeed>
RewriteCond %{HTTP_USER_AGENT} Bot [NC]
RewriteCond %{HTTP_USER_AGENT} Android [NC]
RewriteCond %{HTTP_USER_AGENT} Chrome [NC]
RewriteRule .* - [E=Cache-Control:vary=ismobilebot]
</IfModule>
Same as above.

Code:
<IfModule LiteSpeed>
RewriteCond %{HTTP_USER_AGENT} Macintosh [NC]
RewriteRule .* - [E=Cache-Control:vary=isMac]
RewriteCond %{HTTP_USER_AGENT} "iPhone|iPad|Petal" [NC]
RewriteRule .* - [E=Cache-Control:vary=ismobileapple]
</IfModule>
You only need "iPhone" to make the cache available for Apple-based mobile phones, but why do you vary the cache for Apple mobile devices at all? Because of webp images? Then this rule is also wrong: newer versions of Mac OS X support webp images, so you must vary on the OS version in the UA.

Code:
<IfModule LiteSpeed>
RewriteCond %{HTTP_USER_AGENT} Android [NC]
RewriteCond %{HTTP_USER_AGENT} "Chrome|Firefox|Opera|OPR" [NC]
RewriteCond %{HTTP_USER_AGENT} !Bot [NC]
RewriteRule .* - [E=Cache-Control:vary=ismobile]
</IfModule>
You only need "Android" in the user agent; there is no Android browser outside the known browser vendors. And why do you exclude the "bot" UA when there are hundreds of bots that don't use "*bot" in their user agent? Sorry, that is nonsense!
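If you want to check any of these rules yourself, request a page with the user agent in question and look at the x-litespeed-cache response header ("hit" or "miss"). A small sketch with PHP cURL; the URL and UA below are only placeholders:

PHP:
<?php
// Sketch: see whether LiteSpeed answers from cache for a given user agent.
// Replace the URL and UA with your own; these are placeholders, not from this thread.
$url = 'https://www.example.com/index.php?route=product/product&product_id=1';
$ua  = 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_USERAGENT, $ua);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // capture the response instead of printing it
curl_setopt($ch, CURLOPT_HEADER, true);          // keep the response headers in the output
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);

$response   = curl_exec($ch);
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
curl_close($ch);

// "x-litespeed-cache: hit" means LScache served the page for this UA,
// "miss" means OpenCart generated it (and it should normally be stored for the next request).
echo substr($response, 0, $headerSize);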
 

AndreyPopov

Well-Known Member
#38
You only need "iPhone" to make the cache available for Apple-based mobile phones, but why do you vary the cache for Apple mobile devices at all? Because of webp images? Then this rule is also wrong: newer versions of Mac OS X support webp images, so you must vary on the OS version in the UA.
first, webp support is a matter of the Safari web browser, not of Mac OS X!
second, it doesn't matter whether new versions of Safari support webp or not, because the current version of the Journal theme always serves Safari jpg and png images (not webp)


When will you start learning that it doesn't make sense to try to make the cache available for bots? ;) First, this rule is completely wrong. Second, there are hundreds of bots and almost every bot has a different user agent. Third, this rule prevents caching for Google, Bingbot and other popular bots/search engines.
first, I am NOT interested in the other "hundreds of bots": I logged the requests to my site for 6 months and chose the most frequent bots!
second, it really "prevents caching for Google, Bingbot and other popular bots/search engines"!?!?! are you sure?
because:
- Desktop Google bot is
Code:
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
- Desktop Bing bot is
Code:
Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)
- Desktop Yandex bot is
Code:
Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)
third, the rule for desktop bots is very-very-very right and it works!

I already told you that building the cache for bots makes sense because:
1. Google Search Console gives better results for more effective urls
2. the rewrite rules prevent errors in the mobile bot part of GSC like "small fonts", "wider screen", "elements close to each other"




May I help you to make up 100,000 requests per hour?
this is really FAKE!!!
because only simple text pages can be built that fast!
simple text pages without:
- images
- internal js
- external js (google shopping, analytics, tag manager) and webfonts (google fonts)

in real conditions, a REAL product page needs 2-4 seconds to fully build (load) and save the generated page cache to disk!!!
that is from 450 to 1800 pages per hour.
 

serpent_driver

Well-Known Member
#39
first, webp support is a matter of the Safari web browser, not of Mac OS X!
second, it doesn't matter whether new versions of Safari support webp or not, because the current version of the Journal theme always serves Safari jpg and png images (not webp)

Yes and no, but the OS version matters for whether Safari supports webp. Check the .htaccess of the LScache plugin for Wordpress.


first, I am NOT interested in the other "hundreds of bots": I logged the requests to my site for 6 months and chose the most frequent bots!
second, it really "prevents caching for Google, Bingbot and other popular bots/search engines"!?!?! are you sure?
because:
- Desktop Google bot is
Code:
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
- Desktop Bing bot is
Code:
Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)
- Desktop Yandex bot is
Code:
Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)
third, the rule for desktop bots is very-very-very right and it works!

Yes, I am sure! :) "Chrome" is part of the Googlebot user agent, so if you exclude Chrome from the Googlebot UA you exclude Googlebot!


I already told you that building the cache for bots makes sense because:
1. Google Search Console gives better results for more effective urls
2. the rewrite rules prevent errors in the mobile bot part of GSC like "small fonts", "wider screen", "elements close to each other"

How should that work if you exclude Googlebot from caching?!

this is really FAKE!!!
because only simple text pages can be built that fast!
simple text pages without:
- images
- internal js
- external js (google shopping, analytics, tag manager) and webfonts (google fonts)

in real conditions, a REAL product page needs 2-4 seconds to fully build (load) and save the generated page cache to disk!!!
that is from 450 to 1800 pages per hour.
This is not fake. Check the attached log file: it shows requests made by my crawler against a Wordpress installation with 50,000 posts. Check the request times. To cache a page you don't need to make a complete request the way a browser does.
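To give an idea of what such a request looks like: a cache-warming request only has to fetch the HTML document itself; images, JS and webfonts are separate requests that a crawler simply never makes. A rough sketch of a warming loop in PHP cURL (the URLs and UA are placeholders, not my actual crawler code):

PHP:
<?php
// Sketch: warm the cache for a list of URLs with one user agent.
// Only the HTML document is requested; no images, JS or fonts are loaded.
$urls = [
    'https://www.example.com/',
    'https://www.example.com/index.php?route=product/category&path=20',
];
$ua = 'Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.0 Mobile/15E148 Safari/604.1';

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_USERAGENT, $ua);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // the body is discarded; the request itself fills the cache
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // don't hang on a slow page
    curl_exec($ch);
    curl_close($ch);
}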
 


Lee

Well-Known Member
#40
How do I keep curl from crashing and timing out every few minutes? Are there any settings I can change?

I have to sit in front of the computer and babysit this when rebuilding the cache so I can constantly restart the script, which is very annoying!

The error:
curl: (18) transfer closed with outstanding read data remaining
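One thing I'm considering: if the recache can be driven from my own PHP script instead of the curl command line, I can set explicit timeouts and retry the URLs that fail, so one dropped transfer doesn't stop the whole run. A sketch only, not part of the module (the function name and limits are made up):

PHP:
<?php
// Sketch: retry a URL a few times with explicit timeouts so one
// "transfer closed with outstanding read data remaining" doesn't stop the crawl.
function fetch_with_retry(string $url, string $ua, int $tries = 3): bool
{
    for ($i = 1; $i <= $tries; $i++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_USERAGENT, $ua);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // give up connecting after 10 s
        curl_setopt($ch, CURLOPT_TIMEOUT, 60);        // give up on the whole transfer after 60 s
        curl_exec($ch);
        $failed = curl_errno($ch) !== 0;
        curl_close($ch);
        if (!$failed) {
            return true;  // success, the cache for this URL should now be built
        }
        sleep(2);         // short pause before the next attempt
    }
    return false;         // still failing, log the URL and move on
}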
 