add memories.image.highres/convert_all_images_formarts/format/quality/_max_x/_max_y/ #653
base: master
Conversation
…ew.x, memories.preview.y
memories.preview.format supports jpeg, webp, avif
memories.preview.quality between 0-100
memories.preview.x & memories.preview.y set the max size
Example:
'memories.preview.format' => 'webp',
'memories.preview.quality' => '80',
'memories.preview.x' => '2048',
'memories.preview.y' => '2048',
Okay so there are some major issues here:
Now it might be worth considering capping the max size of the loaded image and not caching it, if that's what you desire; I just want to note that this will not be very beneficial in most cases, since network bandwidth tends to be much cheaper than CPU time in general, and image processing is very expensive. A second missing piece that would definitely be useful is disallowing loading max-res altogether, as a global admin setting.
Yes, that's why I want it to be optimal and with limited resolution.
No idea what kind of machine you're using, but WebP/AVIF encoding (at least with Imagick) is basically unusable. On my dev machine (AMD EPYC 64 core/128 thread, 256G RAM),
As I mentioned, I'm okay with a config option to restrict the max size and encode quality of the "full" image (at the admin's risk), but WebP is a no-go for now. Also the props need to be named more accurately, e.g.
That's comparing apples and oranges. If you make a fair comparison of JPEG with WebP, the differences are much smaller (see the representative numbers above): an improvement of <20%, but 5-20 times slower. AVIF, on the other hand, is very efficient, but no good encoders exist.
We no longer use this API for image editing, so this is an acceptable compromise for now Signed-off-by: Varun Patil <[email protected]> #653
BTW I bumped down the default quality to 85; it was 95 earlier because this API was used for image editing. Also, there is indeed a "proper" solution to this whole problem, but that's unfortunately very hard to implement. When the user zooms in, we don't need to load the entire image at all, but only those parts that are visible in the viewport. So theoretically we could decode the image just once, store a temporary bitmap and cut out parts of the bitmap for the frontend on-demand. Since these parts are downsized to match the viewport, it might be reasonably possible to use webp here (and it'll be super efficient even with JPEG). There's some ways to do the UI (see https://dimsemenov.github.io/photoswipe-deep-zoom-plugin/), but the backend is very non-trivial. This is also how Google Photos solves this problem.
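To make the "only load what's visible" idea concrete, here's a minimal sketch (not part of Memories; all names are made up for illustration) of the tile math such a deep-zoom backend would need: given a viewport over the full-resolution bitmap, compute which fixed-size tiles have to be cut out and sent to the frontend.

```php
<?php
// Hypothetical sketch: which tiles of a tiled full-res bitmap does a
// viewport cover? Each tile would then be cropped from the cached
// bitmap and encoded (e.g. as WebP) on demand.
function visibleTiles(
    int $viewX, int $viewY,  // viewport top-left, in image pixels
    int $viewW, int $viewH,  // viewport size, in image pixels
    int $tileSize = 256
): array {
    $tiles = [];
    $firstCol = intdiv($viewX, $tileSize);
    $firstRow = intdiv($viewY, $tileSize);
    $lastCol  = intdiv($viewX + $viewW - 1, $tileSize);
    $lastRow  = intdiv($viewY + $viewH - 1, $tileSize);
    for ($row = $firstRow; $row <= $lastRow; ++$row) {
        for ($col = $firstCol; $col <= $lastCol; ++$col) {
            $tiles[] = [$col, $row];
        }
    }
    return $tiles;
}
```

A 1000x800 viewport at the origin covers a 4x4 grid of 256px tiles; each tile is small enough that even JPEG re-encoding per tile is cheap.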
Yes, when it comes to AVIF, I would advise against using resolutions above 2048 pixels. It is well-suited for generating Full HD images for mobile data. However, the computational power AVIF requires on a standard server is quite substantial for the amount it saves, so whether it's worth it is questionable. It's not suitable for generating such high-resolution previews in a short time, especially through a PHP module. I agree that AVIF could be used as a benchmark.
WebP, yes, the processing times are acceptable. Block artifacts are not as quickly visible as in JPEG, but the image appears somewhat blurred due to the overall reduction. The quality varies depending on the settings; with my very high settings it's questionable whether it really makes sense, and I agree with that. The perceived quality is somewhat better, around 25% to 35%, I would say. With many users, saving 200 KB per image is not that significant for an individual, but it can allow more users to access the server simultaneously. JPEG remains the fastest option, and I fully agree with that. So, in summary, it heavily depends on the settings and the number of users on the cloud. AVIF is quite challenging for resolutions above Full HD, but it's impressive how beautiful the images still look, and how the CPU melts.
Yes, I think I observed this solution in Google Photos as well. It's actually very efficient. With different zoom levels, DPI, and precise client positioning, the communication between the server and client needs to be well-coordinated for it to work. But yes, in essence, you only need to inform the server about the visible area and the screen resolution. Now comes the more complicated part: whether it's possible to implement this 1:1 with Imagick is questionable. However, it's definitely possible to generate a certain area and send it to the client. Whether it's equally efficient would need to be tested.
Wow that looks really great, yes that would actually be the perfect solution.
And yes, I should have named and explained it better.
Haha very true.
It would actually be much easier if the server was implemented in something other than PHP. The problem with PHP is that
To get around the last two, we either need a dedicated PHP file to skip request bootstrapping (otherwise loading each chunk is too expensive) or run a separate server (with a separate endpoint) for this. For both, we also need an entirely separate (fast) auth mechanism (this part might be relatively easy, e.g. with JWT). The next (very) hard part is the UI. Unfortunately the viewer has become very complex due to the various corner cases (slideshow, live photo, full res on zoom in, videos of different types etc.). Using the above library directly would likely break things.
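The "fast auth" piece mentioned above could be as small as an HMAC-signed, expiring token checked by the dedicated endpoint, skipping the full Nextcloud session bootstrap. This is only a sketch under that assumption; the function names and token format are invented, and a real implementation would likely use a proper JWT library.

```php
<?php
// Hypothetical sketch of a lightweight signed-token scheme for a tile
// endpoint. The main app issues the token; the dedicated endpoint only
// needs the shared secret to verify it.
function makeTileToken(string $fileId, int $expires, string $secret): string {
    $payload = $fileId . '|' . $expires;
    $sig = hash_hmac('sha256', $payload, $secret);
    return base64_encode($payload) . '.' . $sig;
}

// Returns the file id on success, null on any failure (expired,
// tampered, or malformed token).
function verifyTileToken(string $token, string $secret, int $now): ?string {
    [$b64, $sig] = explode('.', $token, 2) + [null, null];
    if ($b64 === null || $sig === null) {
        return null;
    }
    $payload = base64_decode($b64, true);
    if ($payload === false) {
        return null;
    }
    [$fileId, $expires] = explode('|', $payload, 2) + [null, null];
    if ($fileId === null || $expires === null || (int) $expires < $now) {
        return null;
    }
    // Constant-time comparison to avoid timing side channels
    $expected = hash_hmac('sha256', $payload, $secret);
    return hash_equals($expected, $sig) ? $fileId : null;
}
```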
Unfortunately, it is only worthwhile for 1080p users and would have to be limited to that, and even then the CPU utilization is very high.
'memories.image.highres_convert_all_images_formarts_enabled' => 'true',
'memories.image.highres_format' => 'webp',
'memories.image.highres_quality' => '80',
'memories.image.highres_max_x' => '6144',
'memories.image.highres_max_y' => '6144',
lib/Controller/ImageController.php
Outdated
switch ($format) {
    case 'jpeg':
        $format = 'jpeg';
        break;
    case 'webp':
        $format = 'webp';
        break;
    /*case 'avif': // CPU benchmark
        $format = 'avif';
        break;*/
    default:
        $format = 'jpeg';
}
Don't need this check. If the admin sets the value to something invalid, it's not our problem.
lib/Controller/ImageController.php
Outdated
// Convert to JPEG
try {
    $image->autoOrient();
    $image->setImageFormat('jpeg');
    $image->setImageCompressionQuality(95);
    $format = $this->config->getSystemValueString('memories.highres_format', 'jpeg');
Use Util::getSystemConfig. Also see Util::systemConfigDefaults.
The config values need to be correctly scoped, e.g. memories.image.highres.format
lib/Controller/ImageController.php
Outdated
@@ -276,7 +276,8 @@ public function decodable(string $id): Http\Response
$blob = $file->getContent();

// Convert image to JPEG if required
if (!\in_array($mimetype, ['image/png', 'image/webp', 'image/jpeg', 'image/gif'], true)) {
$highres_enabled = $this->config->getSystemValueString('memories.image.highres_convert_all_images_formarts_enabled', 'false');
This is not desirable. What we really want is a cap on the resolution, so the original image should be used to determine if we want to scale it.
E.g. if the cap is set to 12MP, then scaling a 4MP image is meaningless (even decoding it is a resource waste)
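The reviewer's suggestion above amounts to a cheap pre-check on the original dimensions before any decoding happens. A minimal sketch (function and parameter names are assumptions, not the app's actual API):

```php
<?php
// Sketch: decide from the original dimensions whether a downscale is
// needed at all. If the image is already within the cap, skip decoding
// and re-encoding entirely (scaling a 4MP image under a 12MP cap is a
// waste of resources).
function needsDownscale(int $width, int $height, int $maxX, int $maxY): bool {
    if ($maxX <= 0 || $maxY <= 0) {
        return false; // cap disabled
    }
    return $width > $maxX || $height > $maxY;
}
```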
Also need to always exclude GIF, since webp animation depends on library support
Yes, I just noticed when testing that the webp image wasn't really animated.
Then I wrote code for each case.
But it was so terribly big that I deleted it, and the best version was this:
// Convert image to JPEG if required
// You might want to lower the maximum execution time here,
// and raise it back to the default value when the image is finished.
// Also set a maximum number of concurrent executions, which might prevent thrashing:
// a temporary JSON "database" where you save each run as a numbered entry and
// delete entries when they are done or older than 5 minutes.
$highres_enabled = $this->config->getSystemValueString('memories.image.highres.convert_all_images_formarts_enabled', 'false');
$format = $this->config->getSystemValueString('memories.image.highres.format', 'jpeg');
if ($highres_enabled === 'true') {
    switch ($format) {
        case 'jpeg':
            if (!\in_array($mimetype, ['image/png', 'image/webp', 'image/gif'], true)) {
                [$blob, $mimetype] = $this->getImageJPEG($blob, $mimetype);
            }
            break;
        case 'webp':
            if (!\in_array($mimetype, ['image/gif'], true)) {
                [$blob, $mimetype] = $this->getImageJPEG($blob, $mimetype);
            }
            break;
    }
} else {
    if (!\in_array($mimetype, ['image/png', 'image/webp', 'image/jpeg', 'image/gif'], true)) {
        [$blob, $mimetype] = $this->getImageJPEG($blob, $mimetype);
    }
    // why did I do that? :D
    /* Set maximum width and height
    $maxWidth = (int) $this->config->getSystemValue('memories.image.highres_max_x', '0');
    $maxHeight = (int) $this->config->getSystemValue('memories.image.highres_max_y', '0');
    if (!\in_array($mimetype, ['image/png', 'image/webp', 'image/jpeg', 'image/gif'], true)) {
        // Check if the image exceeds the maximum resolution
        //$imageInfo = getimagesize($blob); // Doesn't work with a blob; would need Imagick
        //$width = $imageInfo[0];
        //$height = $imageInfo[1];
        $img = imagecreatefromstring($blob); // Better than Imagick?
        $width = imagesx($img);
        $height = imagesy($img);
        if ($maxWidth > 0 && $maxHeight > 0) {
            if ($width > $maxWidth || $height > $maxHeight) {
                [$blob, $mimetype] = $this->getImageJPEG($blob, $mimetype);
            }
        } else {
            [$blob, $mimetype] = $this->getImageJPEG($blob, $mimetype);
        }
    }*/
}
I think I should go to sleep. :D
Have to admit, if a user always loads this at full resolution, he has a PC from the future.
PHP lets the script run as long as the config allows, and thus permits an unlimited number of concurrent calls.
So you should be very careful with "always full".
But I think it's more of an experimental feature anyway.
Thank you for your support. :)
Learned some new things; using a bitmap to divide the image into parts, as Google does, was very interesting.
I'll try to finish adapting the code after sleeping.
I think you've tried what I'm doing before, judging by the experience. :D
Edit: I added the funny code, so you should always sleep enough, otherwise you'll do such funny things. :D
Update:
https://imagemagick.org/script/resources.php
https://www.php.net/manual/de/imagick.setresourcelimit.php
putenv("MAGICK_THREAD_LIMIT=maxcores");
$image->setResourceLimit($image::RESOURCETYPE_THREAD, maxcores);
putenv("MAGICK_THROTTLE_LIMIT=limit");
putenv("MAGICK_TIME_LIMIT=sec");
These parameters can be used to speed up the process and set a timeout.
However, a limit on concurrent conversions must also be tracked, via database entries or a JSON file.
This stores the current number of running conversions and the timestamp of each one, deleting entries after the timeout period. Resetting after a certain time helps avoid errors after server restarts.
This was the only way I could prevent server overload with multiple users.
However, it does require a few lines of code and is only useful in environments that need protection against such attacks or have a high number of users using this function.
The resolutions are stored in the database for each image, for whatever purpose I needed them back then.
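The JSON-file limiter described above could be sketched roughly like this (a hypothetical illustration, not the code the commenter wrote; file format and names are assumptions). Stale entries expire after a timeout, which covers crashed conversions and server restarts:

```php
<?php
// Hypothetical sketch: allow at most $maxConcurrent running conversions,
// tracked as {id: startTimestamp} in a JSON file guarded by flock().
// Returns a slot id on success, or null if the limit is reached
// (in which case the caller could serve the original image instead).
function tryAcquireConversionSlot(
    string $stateFile,
    int $maxConcurrent,
    int $timeoutSec,
    int $now
): ?string {
    $fh = fopen($stateFile, 'c+');
    if ($fh === false || !flock($fh, LOCK_EX)) {
        return null;
    }
    $raw = stream_get_contents($fh);
    $entries = $raw !== '' ? (json_decode($raw, true) ?: []) : [];
    // Drop stale entries: finished jobs delete their own entry,
    // crashed ones simply expire after $timeoutSec.
    $entries = array_filter($entries, fn ($ts) => $now - $ts < $timeoutSec);
    $id = null;
    if (count($entries) < $maxConcurrent) {
        $id = bin2hex(random_bytes(8));
        $entries[$id] = $now;
    }
    ftruncate($fh, 0);
    rewind($fh);
    fwrite($fh, json_encode($entries));
    flock($fh, LOCK_UN);
    fclose($fh);
    return $id;
}
```

A finished conversion would remove its id under the same lock; a production version would likely prefer the database over a flat file.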
lib/Controller/ImageController.php
Outdated
$aspectRatio = $width / $height;
if ($width > $height) {
    $newWidth = $maxWidth;
    $newHeight = $maxWidth / $aspectRatio;
} else {
    $newHeight = $maxHeight;
    $newWidth = $maxHeight * $aspectRatio;
}
$image->scaleImage($maxWidth, $maxHeight, true);
does the same thing.
lib/Controller/ImageController.php
Outdated
$blob = $image->getImageBlob();
$mimetype = $image->getImageMimeType();

// getImageMimeType() doesn't work for webp; you could use pathinfo() and strtolower(), but I made it shorter
getImageMimeType seems to work fine for me. What does it return for you? Imagick version?
ImageMagick 6.9.11-60 Q16 x86_64 2021-01-25 https://imagemagick.org
is already the newest version (8:6.9.11.60+dfsg-1.3ubuntu0.22.04.3)
imagemagick/jammy-updates,jammy-security 8:6.9.11.60+dfsg-1.3ubuntu0.22.04.3 i386
imagemagick/jammy 8:6.9.11.60+dfsg-1.3build2 i386
It doesn't work, and I don't know why.
But actually that's not too bad: even if the format is missing, the browser still displays the image.
I've also tried installing ImageMagick 7.1, but it won't work with PHP, so I haven't been able to test it.
When I save the images from the browser, they are saved without a format extension, but the browser still knows it is an image.
…eMimeType [Line 283](https://github.com/JanisPlayer/memories/blob/master/lib/Controller/ImageController.php#L283) needs to be changed and the setting name is wrong memories.image.highres.convert_all_images_formarts_enabled
It was unnecessary, as it was just an idea to increase performance.
I just reviewed the code that I wrote back then, and I realized there might be something we could do without.
@JanisPlayer I'll try to take a look later today and get back. Basically I agree WebP would be useful if there's an alpha channel etc. What's most important here is to let the admin know what they're doing with good documentation. Especially if the output resolution is capped then WebP encoding performance may be acceptable.
Sure, but that likely needs you to re-encode the image anyway. If that's discarded it's a wasted effort. Memories in general needs a better server-side cache which may solve a lot of these issues. With a well-sized cache these WebP (or even AVIF) images can be used easily.
"I experimented with estimating the potential file size of a JPEG, not the resolution, to determine whether compression would be worthwhile. However, accurately predicting the file size is challenging, and the estimates are often imprecise. I then attempted to set a target file size for the JPEG. In Imagick Desktop, there's a command for this purpose. In PHP, I had to take a different approach. To make this process efficient, it's necessary to reduce the steps and estimate how much each step could reduce the file size with minimal effort. This allows for resizing the image to a target file size at a reasonably good speed. However, there's room for improvement in the estimation techniques. The primary challenge is that I can't reliably predict whether data savings compared to the original file will occur with the initial compression. Additionally, achieving an ideal target file size for users without creating multiple JPEGs at different qualities can be difficult. While there are ways to enhance this process, it often involves generating multiple versions of a JPEG if the initial estimates don't meet the desired file size, especially when the goal is to keep the file size below a certain threshold compared to the original." I also tried reducing the resolution of the image to lower computational workload and memory usage. While it can be estimated to some extent, achieving perfection in this regard is still challenging. The only approach that works quite well is when the preview has lower resolution than the original, as that can be estimated more accurately. However, finding a solution with minimal computational effort and still producing good results for this scenario is genuinely difficult. So, I'm not sure if that's possible.
Yes, WebP often results in better file sizes compared to PNG, and it's relatively easy to implement. However, I recently noticed that Imagick's ability to detect whether an image uses an alpha channel or not may not work with every format; that could be the only issue. Absolutely, I completely agree. Documentation plays a crucial role in situations like this. If it's misconfigured or not properly estimated by the admin, it can significantly slow down Nextcloud or potentially lead to issues.
I also once asked Chat GPT if there is a PHP solution. It actually kept trying to rewrite this project using imagecopyresampled GD and provided me with some very interesting versions that needed correction. I also examined in the network protocol how Google determines the first image. First, a preview is created in the display resolution that the user can view. Then, at different zoom levels, similar to the alternative you and I mentioned, the image is divided until the last zoom level divides the image into many blocks. It is clear that encoding the new images takes time, and it cannot be done live. After all the experiments, I've learned one thing: it's challenging to implement this live while saving enough data, as initially assumed. So, the best solution, if it needs to be compressed and consume fewer data, might be as follows: First, make a rough estimate: Is the resolution lower than the original image? What is the quality? This is quite imprecise except for the resolution but is sufficient for a rough estimate and a check to see if the original image exceeds a maximum size, which would always be impractical for the user. Then, perhaps another solution as mentioned above: recompress with lower resolution steps until a target value below the original is achieved and try this value for full resolution. The downside of this method is that it's about 5-10% inaccurate at a resolution of 70% of the original image, and the accuracy worsens as the resolution deviates further. However, it's adequate for a usable estimate. But it doesn't save a tremendous amount of processing power, so it's only possible to achieve a smaller file with a lot of effort. But I think it's not worth it; it's better to just output the original if the initial estimate fails. 
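The "rough estimate from the resize ratio" idea above can be made concrete with a very simple first-pass model (the proportionality assumption is mine and only holds roughly, at a fixed quality setting): JPEG output size tends to scale with the pixel count, so the resized file size can be guessed from the ratio of new to original pixels before spending any CPU on encoding.

```php
<?php
// Hypothetical sketch of a first-pass size estimate: assume output
// bytes scale linearly with pixel count at fixed quality. Good enough
// to decide whether re-encoding is worth attempting at all; real
// results vary with image content.
function estimateResizedSize(
    int $origBytes,
    int $origW, int $origH,
    int $newW, int $newH
): int {
    $ratio = ($newW * $newH) / ($origW * $origH);
    return (int) round($origBytes * $ratio);
}
```

If the estimate is not clearly below the original size, serving the original (as suggested above) avoids wasting the encode.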
So, I did my best, but unfortunately, it seems quite challenging, and the solutions for Deep Zoom that come to mind are almost like generating previews at different resolutions or rewriting Nextcloud in that regard, which, of course, consumes storage depending on the settings but causes less traffic, although it also depends on that. I had more fun experimenting with code for myself and can now say that solutions exist, but whether they are really useful is questionable. I can only say that compression is possible if the conditions and settings are optimal. If the estimate is off, performance is consumed. And better solutions simply require a new form of preview generator, not just in one app. Whether it's worth it, you'd have to test again. I would say that if you achieve the goal of not sending images that are larger than 3 MB, that's already a significant advantage. Sorry for the many edits and the length, but I was curious about what is possible. 😄
Solution to my suggestion:
#652
memories.image.highres_format supports: jpeg, webp
memories.image.highres_quality: 0-100
memories.image.highres_max_x & memories.image.highres_max_y for the max resolution of the picture
Example:
'memories.image.highres.convert_all_images_formarts_enabled' => 'true',
'memories.image.highres.format' => 'webp',
'memories.image.highres.quality'=> '80',
'memories.image.highres_max_x'=> '6144',
'memories.image.highres_max_y'=> '6144',
To Do:
Line 283 needs to be changed, and the setting name memories.image.highres.convert_all_images_formarts_enabled is wrong.
The config settings should still be changed to something like this, but before I again start on something that takes too long, I'll just ask first.
'memories.image.highres.convert_all_images_formarts_enabled' => 'true',
'memories.image.highres.format' => 'webp',
'memories.image.highres.quality'=> '80',
'memories.image.highres.max_x'=> '6144',
'memories.image.highres.max_y'=> '6144',
And please try it yourself; I've tried it and I'm happy, but I don't know if you'll be happy with the result.
Unfortunately, I don't know how to build a settings interface for an app.
Otherwise I would have done it and added the settings there.
But in the short term I don't think it would be good enough, so I'll leave that to you if you wish.
I'm just a hobby developer, so please watch out for mistakes.
I can also add the documentation if you want.
I also had problems with:
composer run cs:fix
In Factory.php line 319:
"./composer.json" does not match the expected JSON schema:
- name : Does not match the regex pattern ^[a-z0-9]([_.-]?[a-z0-9]+)*/[a-z0-9](([_.]|-{1,2})?[a-z0-9]+)*$
Maybe you can tell me what I did wrong.