Zenphoto: The simpler media website CMS
I have been looking for an EXIF/XMP-aware image management tool for a long time, and today I discovered this product. It looks nearly perfect for my needs once the xmpMetadata and tag_suggest plugins are enabled.
(Side note: I come from a long background in software development, so I was able to work around the install directions not telling the user where to get the software; I used "git clone". But maybe the install instructions should include a download link for others.)
I'm looking for general advice about adding ~30,000 photos to my install. I intend to put them in the /albums/ directory directly. I gather that an "album" should have 1,000 to 2,000 photos tops. Does that mean a single subdirectory under /albums/, or can I nest them? Currently I have them organized by camera, and some folders are well over 5,000 photos. If I create /albums/camera-a/yyyy-mm/ directories and split them up like that, would that be okay?
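The camera/year-month split described above can be sketched as a small script. This is a hypothetical helper, not anything Zenphoto ships; it uses the file modification time as a stand-in for the capture date (a real run would read EXIF DateTimeOriginal instead), and the "camera-a" name is just the example from the question.

```python
import os
import shutil
from datetime import datetime

def plan_album_layout(source_dir, albums_dir, camera="camera-a"):
    """Map each photo to an /albums/<camera>/yyyy-mm/ subdirectory.

    Uses file modification time as a stand-in for the capture date;
    a real run would read the EXIF DateTimeOriginal tag instead.
    """
    layout = {}
    for name in sorted(os.listdir(source_dir)):
        if not name.lower().endswith((".jpg", ".jpeg", ".png")):
            continue
        path = os.path.join(source_dir, name)
        taken = datetime.fromtimestamp(os.path.getmtime(path))
        layout[path] = os.path.join(albums_dir, camera, taken.strftime("%Y-%m"))
    return layout

def apply_layout(layout):
    """Move each photo into its planned subdirectory."""
    for src, subdir in layout.items():
        os.makedirs(subdir, exist_ok=True)
        shutil.move(src, os.path.join(subdir, os.path.basename(src)))
```

Splitting 5,000+ photos this way yields roughly a few hundred images per monthly album, comfortably under the 1,000-2,000 figure mentioned above.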
How does the import actually work? My test gallery is 28 photos, and running a metadata resync after enabling xmpMetadata (with the embedded JPEG scan enabled) was pretty slow. Would it be better to extract the XMP to sidecar files? I generally prefer embedded, to keep the metadata from getting separated from the image.
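The cost difference between the two approaches can be illustrated with a sketch. This is not Zenphoto's actual parser, just a minimal illustration: a sidecar is one small file read, while an embedded scan has to read and search the whole image for the XMP packet, which is roughly why the resync over embedded XMP feels slower.

```python
import os

def read_xmp(image_path):
    """Return the raw XMP packet for an image, preferring a sidecar.

    Sidecar case: read <image>.xmp, one small file.
    Embedded case: read the entire image file and search its bytes
    for the <x:xmpmeta ...> ... </x:xmpmeta> packet.
    Illustrative sketch only, not Zenphoto code.
    """
    sidecar = os.path.splitext(image_path)[0] + ".xmp"
    if os.path.exists(sidecar):
        with open(sidecar, "rb") as f:
            return f.read()
    with open(image_path, "rb") as f:
        data = f.read()  # the whole file must be read and scanned
    start = data.find(b"<x:xmpmeta")
    end = data.find(b"</x:xmpmeta>")
    if start == -1 or end == -1:
        return None
    return data[start:end + len(b"</x:xmpmeta>")]
```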
I've repurposed an old Dell Optiplex to host this. It has a decent amount of RAM but only a 4-core CPU. I intend it to be used only by me, and I have great flexibility to modify the system, but I'm not keen on having to modify the images.
Comments
There is no Zenphoto limitation on nesting. I would not nest too deeply, though, primarily for usability reasons: as a visitor you can easily get lost.
There are also no technical limits in Zenphoto itself on how many images an album may hold. We have users with large sites; it is server power that limits it. If you don't have a fairly powerful dedicated server, I would not add all 30,000 at once. The same goes for the size of the images (resolution, not file size). You have to try what your server can manage.
Zenphoto is filesystem based and creates the smaller sizes on the fly, on visitor request. This may feel slow(er) on the first visit. Please read here how this image caching and discovery (import, if you add files directly via FTP) works:
https://www.zenphoto.org/news/caching/
Also search the forum for some newer topics about the same.
I'm perfectly fine with resized images only being created on demand; my concern is with the metadata scanning that enables search. I want that to happen as quickly as is reasonable.
Since this will be a single user system, I'm not concerned with visitors getting lost.
I found some early posts about large galleries, which is how I came to the impression about album size.
Since this project is just PHP code, my impression is that all discovery happens when a page load triggers PHP, which then sees the new files. Apache/PHP likes to limit how long a process can run, and I know there are workarounds, like breaking a job into pieces and running multiple consecutive processes (possibly with page refreshes). But is there also a php-cli way to force discovery?
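The "break the job into pieces" workaround mentioned above can be sketched as a resumable batch loop. This is a generic pattern, not a Zenphoto feature: each invocation handles one batch and records its offset on disk, so repeated runs (cron jobs, page refreshes) walk the full list without any single run exceeding a PHP-style execution time limit.

```python
import json
import os

def process_in_batches(items, state_file, batch_size, handle):
    """Process `items` one batch per call, remembering progress on disk.

    Reads the last offset from `state_file`, handles the next batch,
    writes the new offset back, and returns True once everything is
    done. Repeated invocations thus cover the whole list even under a
    per-process time limit. Illustrative sketch only.
    """
    offset = 0
    if os.path.exists(state_file):
        with open(state_file) as f:
            offset = json.load(f)["offset"]
    batch = items[offset:offset + batch_size]
    for item in batch:
        handle(item)
    offset += len(batch)
    with open(state_file, "w") as f:
        json.dump({"offset": offset}, f)
    return offset >= len(items)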
Metadata is scanned when the images are discovered.
Yes, that's how it works: no page visit, no discovery. This also happens in the backend, though, not only on the front end.
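The discovery step described above amounts to diffing the filesystem against what is already registered. A minimal sketch, assuming a plain set of known relative paths standing in for the database (not Zenphoto's actual schema):

```python
import os

def discover_new_images(albums_dir, known):
    """Return image paths on disk that are not yet in `known`.

    Walks the albums tree, compares each image's relative path against
    the set of already-registered paths (standing in for the database),
    and reports only the new files. Illustrative sketch only.
    """
    found = []
    for root, _dirs, files in os.walk(albums_dir):
        for name in sorted(files):
            if not name.lower().endswith((".jpg", ".jpeg", ".png")):
                continue
            rel = os.path.relpath(os.path.join(root, name), albums_dir)
            if rel not in known:
                found.append(rel)
    return found
```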
No. But we have the cacheManager plugin, which lets you go through everything, basically doing the "on the fly" part on request. On such a large site that will likely take some hours.