Hey guys, I just wanted to let you know I've created a zenphoto pre-cache script that iterates across all gallery images, discovers which cached thumbnails and other sized images are missing, and generates them. It is suitable to run from cron once an initial run has completed. Take a look at https://github.com/benklop/zenphoto-precache if you're interested. So far it seems to work pretty well.
Comments
I created this because I've got some very large images and a very slow server, and many of the cached sizes take a few seconds to generate, which makes album pages painfully slow to load.
Thanks, contributions like this are always welcome. I will add it to the extensions section soon.
Are you aware that there is already a pre-cache facility provided by the included cacheManager plugin via a utility button? However, to work properly, themes and plugins need to register any image sizes they use. (Edit: I see your script can be used as a cron job, which is indeed different.)
There is also an older Python pre-cache script (which I never used, as my server does not have Python): http://www.zenphoto.org/news/precache-script/
I loosely based this on the Python script you mentioned, though by the time I was done just about all of it had been replaced. I did know about cacheManager, and in fact used it as a reference when figuring out how to extract the information from the database that makes this script work.
At first I made this because cacheManager works through the browser, and it will try to render more than one image at once, which totally overwhelms my sad little server. Running it via cron to automatically keep things pre-cached is really my ultimate goal at this point, though.
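For anyone wanting the same cron setup, a crontab entry along these lines would do it. The script path, schedule, log path, and theme name below are all placeholders, not something from the repository:

```
# Run the pre-cache script every night at 03:00 (paths/theme are examples)
0 3 * * * /path/to/zenphoto-precache/precache --themes mytheme >> /var/log/precache.log 2>&1
```

Appending output to a log file rather than discarding it makes it easier to spot images that repeatedly fail to cache.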
This script also has the same limitation: themes (and, I'd imagine, plugins too) need to register their image sizes.
I haven't tried to detect enabled plugins, though you can give the script a
--themes
flag with a list of additional theme (and I'd assume plugin) names to enable. How are enabled plugins stored in the database? If I can get that info, I could pretty easily have this script automatically cache the image sizes plugins need as well.
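As a side note on that question: per the reply further down, plugin state lives in the options table under names like zp_plugin_<name> with a 0|1 value. A minimal sketch of filtering such rows; the function name, the example rows, and the connection-free approach are all hypothetical, only the naming scheme comes from the thread:

```python
# Assumed Zenphoto convention (see the reply below): one options-table row
# per plugin, named 'zp_plugin_<plugin name>', with '1' = enabled, '0' = disabled.
PLUGIN_PREFIX = 'zp_plugin_'

def enabled_plugins(option_rows):
    """Given (name, value) rows from the options table, return enabled plugin names."""
    return sorted(
        name[len(PLUGIN_PREFIX):]
        for name, value in option_rows
        if name.startswith(PLUGIN_PREFIX) and str(value) == '1'
    )

# Example rows as they might come back from a query like:
#   SELECT name, value FROM options WHERE name LIKE 'zp_plugin_%';
rows = [
    ('zp_plugin_cacheManager', '1'),
    ('zp_plugin_colorbox_js', '0'),
    ('zp_plugin_zenpage', 1),
]
print(enabled_plugins(rows))  # -> ['cacheManager', 'zenpage']
```

The script would then only collect registered image sizes for plugins whose flag is 1.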
All right, and a cron job doing this in the background is of course more efficient.
The enabled status of a plugin is stored in the options table under the name
zp_plugin_<name of the plugin>
with
0|1
as the value.
Hello all. This script helps my server a great deal. However, I am having problems getting all the photos processed. Each time I run the script, it tells me it is processing the same files again. Also, if I run in verbose mode, there are some files with two check marks and an X, some with all X's, etc. On each run, a given file's check marks and X's are the same. Here is the output of the work to complete:
Will create 71298 new caches and refresh 0 existing caches (45417 already cached) for 38905 images (155 non-image files skipped)...
No matter how many times I run this, I get the same thing.
Any help would be appreciated. BTW, the script does not output any errors at all. I thought maybe it was a memory issue; however, nothing is apparent. Just FYI, I am running this inside a Docker container which hosts Zenphoto, MySQL, etc.
Thanks!
I can't help with the script itself and hope that @benklop still reads here.
Did you review your server error logs? An X naturally means some processing error; maybe memory, corrupt metadata, or a corrupt image itself.
But Zenphoto will create the image sizes when needed anyway, so it is only a one-time delay. What happens if you visit the theme pages with the images that don't get processed?
Did you try the album in question using the cache_manager?
When I browse to a page with un-cached photos, Zenphoto does cache them correctly; it is just time consuming, which means the response time is slower, often extremely so depending on the number of photos to be cached.
I turned on verbose logging for the script, which is how I noticed the line:
Will create 71298 new caches and refresh 0 existing caches (45417 already cached) for 38905 images (155 non-image files skipped)...
This "status" line is the same each time I run the script, and the files it processes are the same each time. As I mentioned, the script does process many files each run, e.g.:
Caching /var/www/html/cache/2000/Easter2000/P4230156_200_w80_h160_cw80_ch160_thumb.jpg
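To illustrate, the cache filename above encodes the requested size parameters. Here is a hypothetical parser for that pattern; the field meanings (size, w/h for width/height, cw/ch for crop width/height, a _thumb marker) are my reading of the example filename, not a documented spec:

```python
import re

# Assumed suffix layout, inferred from the example filename in this thread:
#   <original>_<size>[_w<width>][_h<height>][_cw<cropw>][_ch<croph>][_thumb].<ext>
SUFFIX_RE = re.compile(
    r'^(?P<base>.+?)'
    r'_(?P<size>\d+)'
    r'(?:_w(?P<w>\d+))?'
    r'(?:_h(?P<h>\d+))?'
    r'(?:_cw(?P<cw>\d+))?'
    r'(?:_ch(?P<ch>\d+))?'
    r'(?P<thumb>_thumb)?'
    r'\.(?P<ext>[A-Za-z]+)$'
)

def parse_cache_name(filename):
    """Split a cached image filename into its original name and size parameters."""
    m = SUFFIX_RE.match(filename)
    if not m:
        return None
    d = m.groupdict()
    return {
        'original': d['base'] + '.' + d['ext'],
        'size': int(d['size']),
        'width': int(d['w']) if d['w'] else None,
        'height': int(d['h']) if d['h'] else None,
        'crop_width': int(d['cw']) if d['cw'] else None,
        'crop_height': int(d['ch']) if d['ch'] else None,
        'thumb': bool(d['thumb']),
    }

print(parse_cache_name('P4230156_200_w80_h160_cw80_ch160_thumb.jpg'))
```

Comparing the parsed parameters against the sizes a theme registers might reveal why the same files are rebuilt every run.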
but they are never included in the "already cached" count.
I also assume that the following: Scanning (✔✔✘) IMG_5101.JPG
indicates that two themes have this file cached and one does not?
I checked in /var/log (Ubuntu image), but there are no errors indicated for any of the associated processes (MySQL, PHP, Apache, system, etc.).
I also "cleaned" the database through the Zenphoto admin function, and rebuilt the db metadata using the admin function as well.
Storage is not an issue for me; latency and processing power (running in a Docker container on a Synology NAS) are much more important.
When browsing photos which are pre-cached, the response is lightning fast!
I cannot say anything about this script and how it works; I have no idea where it actually gets the sizes to cache. This is outside of Zenphoto.
In Zenphoto there are default sizes you can set in a theme's options. These sizes are known, but a theme is not required to use them. There is no fixed set of sizes that is always the same, as there is with other CMSes.
Themes can define/request numerous custom sizes of their own on the theme pages, as can plugins.
So there can be several sizes per image that need to be cached.
These sizes should be registered with the cache_manager via the theme or plugin options so it can handle them for pre-caching. I'm not sure this script covers all of that?
Just to follow up: I ran the script twice back to back in verbose mode and redirected the output to a log file. I diff'd the output from both runs and they were identical. That means the exact same files were "cached" again, and all the rest were missed. Thanks for the feedback.
Its author could probably tell you why existing files are re-cached (perhaps it is simple and it just redoes everything fresh?) or why others are missed. I would suggest opening a ticket on the script's GitHub repository; the author may spot the problem better there than in a post here.
Awesome, thanks so much!