Volumetric Collapse When Creating 3D-LUT for Resolve #456
Comments
a) It seems like you did the verification the wrong way. b) There is something weird about that low contrast for an IPS display, ~500:1 vs. 900-1000:1. Maybe you pushed the RGB gains too far, maybe a video levels vs. full range mismatch, maybe a huge correction applied at the upper or lower end in the VCGT, maybe you enabled uniformity compensation on a display with bad QC. All these things added together point to user misconfiguration rather than an app bug. I don't know what you were trying to do with the calibrated measurement report, but your "uncalibrated.pdf" looks like "calibrated" (the first point in my reply), since you are making a comparison between a custom tailor-made profile and the display itself.
a) I can try again with a simulation profile set, but I can also visually see that the LUT is creating this desaturation and these inaccurate colors. It causes very obvious blocking artifacts. Mac profiles/color management should be irrelevant, as I am running Resolve on Windows but running DisplayCAL on macOS and connecting it over the internet. This is because I have not been able to get DisplayCAL working on Windows in the past, but maybe I should try with the new version. b) All of my configuration changes are in my original post, so if it was a configuration issue, it should be in there? I'll try setting the calibration to video levels instead of full, but that goes against the original best practices claimed by Florian: https://hub.displaycal.net/forums/topic/data-or-video-levels-davinci-resolve-calibration/ So if this is a misconfiguration with data levels, then the way DisplayCAL sets video/full levels is now incorrect. Also, just to clarify, the monitor is not plugged into a GPU, but into a video I/O device from Blackmagic Design. There is no VCGT calibration with this sort of device, and I set DisplayCAL up accordingly.
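For reference, the video-vs-full-levels question comes down to how 8-bit code values are scaled. A minimal sketch (the helper names are mine, not DisplayCAL's) of why a single levels mismatch anywhere in the chain lifts blacks or shifts mid-tones:

```python
def full_to_limited(v):
    """Compress a full-range 8-bit code value (0-255) into video range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Expand a video-range code value (16-235) back to full range (0-255)."""
    return round((v - 16) * 255 / 219)

# A matched round trip is lossless at the endpoints:
assert limited_to_full(full_to_limited(255)) == 255
assert limited_to_full(full_to_limited(0)) == 0

# Mismatch: expanding a signal that was *already* full range shifts values
# and clips everything below code 16 -- the classic washed-out/crushed look.
print(limited_to_full(128))  # a full-range mid-grey misread as video levels -> 130
```

Every device in the chain (DisplayCAL, Resolve, the DeckLink output, the monitor's input-range setting) must agree on which convention is in use; any single disagreement produces exactly the kind of contrast loss discussed above.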
a) "Calibrated.pdf" shows that you are using the display profile as the reference colorspace. There may be other issues in the pipeline, but this is not the way to test a LUT3D, for the reasons explained above. b) Low contrast may be caused by things other than HDMI levels, as explained above. The display itself may have such low contrast (500:1) if that model has very bad/low quality control and/or you enabled uniformity compensation... but it is unusual. This is why I guess that there is some user misconfiguration, which is not limited to DisplayCAL but extends to the OSD settings. As a general rule before reporting possible issues:
I tricked DisplayCAL into thinking there was a profile applied when there wasn't one (simply by not loading the LUT into Resolve after creating it), in an attempt to get a verification of the display with no profile applied, as a comparison report. When I get a spare moment (likely next month; I'm too busy to do much more than reply to these comments at the moment), I will do verification reports with the proper simulation profile.

As for the OSD settings, this is a more professional kind of display than most, and the default setting is not intended for creating calibrations. So I set it to Rec.709 mode, gamma 2.4, D65 white point, gain R=50 G=50 B=47, brightness at 62, input range limited (16-235), everything else on defaults. For the gain levels, the default value is 50, so all I've done is reduce the blue channel to hit D65 better.

I'm very confused why I would get a good report in uncalibrated.pdf and a bad report in broken_calibration.pdf if there is something wrong with my OSD/Resolve settings; I included uncalibrated.pdf specifically to prove there was no possible misconfiguration outside DisplayCAL. If I were configuring something wrong, surely I would have to change it between verifications to get this kind of result? I'd rather not go back and run more tests if I don't have to, especially since my understanding was that the Resolve preset should work out of the box even with some OSD adjustments. If you really think it's needed I can do it, but then the timeline for a solution is a month from now, and it could all just be a waste of my time.
One thing I could try is lowering the black level signal on the OSD to get more contrast, or perhaps setting the contrast dial to 100? I'll try these things and get back to you. If they don't work, I'll reset everything to factory defaults and see what I get, but it will take a while to do all these tests. I'm also unsure what you mean by uniformity compensation; this is a decently accurate display out of the box and should have good panel uniformity, and as far as I know no OSD setting adjusts it.
It does not work that way.

No simulation profile & "use simulation as display profile" = test display behavior against the display profile.

LUT3D applied + no simulation profile & "use simulation as display profile" => Resolve will output Rec.709 RGB data "re-encoded" in the display colorspace... but you had configured the check as if it behaved as a native-gamut display. Hence the desaturation shown in the CIE a*b* graph at the bottom, and the overall errors.

That does not exclude an error in the generated LUT3D, but it means app misconfiguration. DisplayCAL reports offer a lot of details to spot user misconfiguration; you just need to look at the details.
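To illustrate the re-encoding point above: a Rec.709 primary expressed in a wider gamut's coordinates sits well inside that gamut, so a verification expecting native-gamut values reads it as desaturation even when the display is behaving correctly. A rough numpy sketch, using the rounded linear Rec.709-to-Display-P3 matrix as a stand-in for the display's (unknown) native primaries:

```python
import numpy as np

# Linear-light Rec.709/sRGB -> Display-P3 conversion matrix (D65, rounded).
# Illustrative only; the actual panel primaries will differ.
RGB709_TO_P3 = np.array([
    [0.8225, 0.1774, 0.0000],
    [0.0332, 0.9669, 0.0000],
    [0.0171, 0.0724, 0.9108],
])

rec709_red = np.array([1.0, 0.0, 0.0])
in_p3_coords = RGB709_TO_P3 @ rec709_red
# Fully saturated Rec.709 red lands well inside the P3 cube:
print(in_p3_coords)  # ~[0.8225, 0.0332, 0.0171]
```

The re-encoded red is correct for reproducing Rec.709 on the wide-gamut panel, but compared against the native-gamut value (1, 0, 0) it looks "pulled in" toward neutral, which is the desaturation pattern the CIE a*b* graph shows.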
Try to measure connected through DisplayPort and in another factory OSD preset like Standard (sRGB may have issues).
Explained in this message above.
Checking contrast through DP will cost less than 5 minutes, and it is a tool for your work, so...
Some monitors have a "uniformity compensation" setting. It MAY fix delta-C color uniformity and brightness uniformity at the sides... at the expense of contrast. Sometimes it is activated for some OSD modes and the user cannot deactivate it. I don't know for your model.
Also, another thing: your display is not "White LED", hence the colorimeter correction is wrong. If it has 9x% P3 coverage it MAY be WLED PFS phosphor, which is not "White LED" (blue LED + yellow phosphor). You have suitable corrections bundled with DisplayCAL for your display (Panasonic VVX CCSS or the PFS family CCSS). Is this right? Also, on wide-gamut displays we usually do not limit them to Rec.709; we profile them at native gamut instead and let the DisplayCAL LUT3D handle the mapping to Rec.709, so you can use the same 100-nit D65 setting to create other LUT3Ds to simulate other colorspaces. But doing it the way you did, you can use the factory Rec.709 calibration, trusting it to be somehow accurate.
That's useful info, and perhaps the source of the issue. Unfortunately there is no correction profile for my exact display, but at least with PFS phosphor I should get closer. I'll run the DisplayPort test using the tool you mentioned and get back to you.
Ok, so I did a new calibration with all your suggested changes and got worse results. Let me know if you see anything that looks misconfigured. Here's the verification HTML: https://drive.google.com/file/d/1K4AkLv7qfRd6c2_K1wbUGx81477Qoc4r/view?usp=sharing
You did it wrong again.
Also, CCMX colorimeter corrections are not portable between i1d3 units. They are OK only for the display unit and colorimeter unit used for their creation. As said before, you have a set of CCSS corrections suitable for all these "multimedia" 9x% P3 displays, dumped directly from vendor EDRs (X-Rite & other sources).
I'm sure it is, but I lack the calibration knowledge to understand what all these terms (simulation profile, display profile, etc.) mean in this context. @eoyilmaz this highlights an issue that I think needs to be addressed long-term: we need to at least rethink the preset options in DisplayCAL and perhaps even do a full UI redesign. I'm sure the current UI is great for advanced users, with all the options laid out clearly, but it's far too jargon-heavy for anyone outside the calibration world. I'll get back to you with a new verification soon.
I figured it out! But it's not what either of us thought... Basically, I assumed that the prompt to create a 3D LUT that pops up after calibration would use the settings I entered in the 3D LUT tab... for some reason it does not, instead creating a LUT reset to default gamma options. To get a working LUT, I had to manually re-create it from the .icc profile instead of accepting DisplayCAL's prompt. I would argue this is still a bug, as it's not obvious DisplayCAL is working off defaults unless you really know what to look for in the logs, so maybe we should look into that next? Should I close this and make a new report?
Ok, I'm going to close this and run a few more tests at some point to confirm this is what's happening. If I'm certain it's a bug, I'll make a new report. |
Describe the bug
Unsure if the term "volumetric collapse" from the HDR TV world is 100% accurate here, but it best describes what I'm seeing. It seems that when using relative gamma and/or the "Absolute Colorimetric with White Point Scaling" rendering intent, there is a severe loss of saturation at certain luminance values. Looking into the calibration report, a potential source of the issue is the relative gamma always being 2.2 even when set to 2.4. Here is the report showing the issue:
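The gamma discrepancy in the report (relative gamma reading 2.2 where 2.4 was requested) is easy to quantify; a quick sketch of how much the mismatch shifts a mid-grey:

```python
# Mid-grey through mismatched transfer functions: a signal mastered for
# gamma 2.4 but displayed through a 2.2 transfer lands noticeably brighter,
# and if the mismatch differs per channel it also shifts saturation.
signal = 0.5
print(round(signal ** 2.4, 4))  # intended display light: ~0.1895
print(round(signal ** 2.2, 4))  # what a 2.2 transfer produces: ~0.2176
```

That is roughly a 15% relative luminance error at mid-grey, which is well within the range that would show up as visible tonal and saturation shifts in a verification report.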
Broken_Calibration.pdf
To Reproduce
Steps to reproduce the behavior:
Expected behavior
A calibration better than uncalibrated; the report for the uncalibrated monitor is shown below (DisplayCAL was tricked into making a verification without a LUT applied):
Uncalibrated.pdf
Versions:
Additional context
The LUT is being loaded through Resolve's tetrahedral interpolation via a DeckLink Mini Monitor 4K card, which uses an HDMI cable to connect to an ASUS ProArt PA278CGV monitor set to Rec. 709.
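For context, tetrahedral interpolation (as opposed to trilinear) splits each 3D LUT cell into six tetrahedra and blends only four of the eight corners. This is an illustrative sketch of the standard scheme, not Resolve's or DisplayCAL's actual implementation:

```python
import numpy as np

def tetra_lookup(lut, rgb):
    """Look up one RGB triple in a LUT of shape (N, N, N, 3) in [0, 1]
    using standard six-tetrahedra interpolation."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, float), 0.0, 1.0) * (n - 1)
    i = np.minimum(pos.astype(int), n - 2)  # lower cell index per axis
    r, g, b = pos - i                       # fractional position inside the cell

    def c(dr, dg, db):                      # fetch one cell corner
        return lut[i[0] + dr, i[1] + dg, i[2] + db]

    # Pick one of six tetrahedra based on the ordering of (r, g, b).
    if r >= g >= b:
        return (1 - r) * c(0,0,0) + (r - g) * c(1,0,0) + (g - b) * c(1,1,0) + b * c(1,1,1)
    if r >= b >= g:
        return (1 - r) * c(0,0,0) + (r - b) * c(1,0,0) + (b - g) * c(1,0,1) + g * c(1,1,1)
    if b >= r >= g:
        return (1 - b) * c(0,0,0) + (b - r) * c(0,0,1) + (r - g) * c(1,0,1) + g * c(1,1,1)
    if g >= r >= b:
        return (1 - g) * c(0,0,0) + (g - r) * c(0,1,0) + (r - b) * c(1,1,0) + b * c(1,1,1)
    if g >= b >= r:
        return (1 - g) * c(0,0,0) + (g - b) * c(0,1,0) + (b - r) * c(0,1,1) + r * c(1,1,1)
    return (1 - b) * c(0,0,0) + (b - g) * c(0,0,1) + (g - r) * c(0,1,1) + r * c(1,1,1)

# Sanity check with an identity LUT: output should match input.
n = 17
grid = np.linspace(0.0, 1.0, n)
ident = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(tetra_lookup(ident, [0.25, 0.5, 0.75]))  # ~[0.25, 0.5, 0.75]
```

Because each tetrahedron's weights are barycentric, the scheme reproduces any affine mapping exactly (hence the identity check), while trilinear interpolation blends all eight corners and smears hue transitions slightly more.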