Plugin Author
KeyCDN
(@keycdn)
Sure, what is your preference for how we detect AO?
if (defined('AUTOPTIMIZE_PLUGIN_DIR')) {
// echo warning
}
seems like the easiest way. No need to disable the options; we can assume our users are smart enough to choose which plugin does HTML (& inline JS) minification 🙂
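For illustration, the warning could be hooked into admin_notices. This is just a sketch: the notice text, the hook choice, and the capability check are my assumptions, not Cache Enabler's actual implementation.

```php
// Sketch: show an admin warning when Autoptimize is active alongside Cache Enabler.
// AUTOPTIMIZE_PLUGIN_DIR is defined by Autoptimize; everything else here is illustrative.
add_action( 'admin_notices', function () {
    if ( defined( 'AUTOPTIMIZE_PLUGIN_DIR' ) && current_user_can( 'manage_options' ) ) {
        echo '<div class="notice notice-warning"><p>';
        echo 'Autoptimize is active. Make sure only one plugin handles HTML minification.';
        echo '</p></div>';
    }
} );
```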
frank
Plugin Author
KeyCDN
(@keycdn)
Just added the AO check to the trunk. Thanks!
@keycdn The .gz file is generated correctly (I can retrieve the HTML file from it manually). I got it to work, but perhaps someone can explain to me why the following error happens. Let’s say I have this in my Nginx config:
location / {
try_files /wp-content/cache/cache-enabler/${http_host}${cache_uri}index.html $uri $uri/ /index.php$is_args$args;
}
Now, this works. After the page is loaded for the first time and Cache Enabler creates the index.html and index.html.gz files, Nginx will retrieve the .gz file on the second try. But if I break the config by making it so that Nginx doesn’t find the cached file:
location / {
try_files /wp-content/cache/cache-enabler/${http_host}${cache_uri}index2.html $uri $uri/ /index.php$is_args$args;
}
This will produce a content decoding error in Chrome on the second try. But I don’t entirely understand what happens that causes the error. Since it doesn’t find index2.html, I imagine Cache Enabler will try to serve the file via PHP. But is it somehow gzipped twice in the process (once via PHP, and then again by Nginx)?
My previous example was overly complicated; this probably illustrates the problem better:
location / {
try_files $uri $uri/ /index.php$is_args$args;
}
This, combined with “Pre-compression of cached pages.” in Cache Enabler, will give a content decoding error (but only from the second time the page is loaded; the first time the page still displays correctly). Surprisingly, this happens even when I disable gzip in Nginx altogether. For example, here are the headers for the first two requests on the same page in WordPress:
1st Page Load (this one displays correctly)
HTTP/1.1 200 OK
Server: nginx/1.10.0 (Ubuntu)
Date: Wed, 18 May 2016 19:35:57 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: keep-alive
2nd Page Load (ERR_CONTENT_DECODING_FAILED in Chrome)
HTTP/1.1 200 OK
Server: nginx/1.10.0 (Ubuntu)
Date: Wed, 18 May 2016 19:35:59 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: keep-alive
X-Cache-Handler: php
Content-Encoding: gzip
Perhaps I’m seeing the same thing described here: https://ww.wp.xz.cn/support/topic/difficult-problem-with-double-content-encoding-gzip-content-encoding-gzip?replies=2
Is there a way to tell Cache Enabler to only create the .gz file, but not actually send gzipped content to Nginx?
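For what it’s worth, one way to have Nginx serve the pre-compressed file itself (so PHP never has to send gzipped output) might be the gzip_static module. This is only a sketch based on my config above; it assumes Nginx was built with ngx_http_gzip_static_module, and it is not a confirmed Cache Enabler recommendation:

```nginx
# Sketch: let Nginx serve Cache Enabler's index.html.gz directly.
# Assumes ngx_http_gzip_static_module is compiled in (--with-http_gzip_static_module).
location / {
    # If index.html.gz exists next to index.html and the client sends
    # Accept-Encoding: gzip, Nginx serves the .gz file as-is.
    gzip_static on;
    try_files /wp-content/cache/cache-enabler/${http_host}${cache_uri}index.html $uri $uri/ /index.php$is_args$args;
}
```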
Plugin Author
KeyCDN
(@keycdn)
How did you solve your previous problems?
It is important to let others know how you fixed it so they can benefit from this thread as well. It also allows us to add more checks to give users the right hints about what is missing.
Please open a new thread for the nginx configuration issue.
I didn’t resolve the previous problem; I’m just testing this from the copy of the website where I did get Cache Enabler to work. I may go back and try to fix it on the live website, but that will prove difficult since I can’t replicate the problem on the copy of the website.
I could be wrong, but I haven’t seen Cache Enabler alert the user to potential permission problems, something the other caching plugins I’ve used typically point out in their settings. A check like that would be helpful. WP Super Cache also has the option to log various events to a debug log, which really helps to figure out what the plugin is doing and where it fails.
I made a new thread for the Nginx + gzip issue.
Plugin Author
KeyCDN
(@keycdn)
Ya, replicating problems can be difficult sometimes.
The permission check has been included in the latest release (wp-content/cache).