OpenImageIO
OpenImageIO copied to clipboard
docs: Convert more examples within the imagebufalgo chapter.
Description
Convert C++ and Python examples from the "Image transformations and data movement" section of the "imagebufalgo" chapter into tests within the "docs-examples" testsuites (#3992).
Update 'oiiotool' tabs to use a simple 'tab' instead of 'code-tab' to prevent synchronized tab selection, which is inconsistent with the other tabs using the 'literalinclude' directive.
Checklist:
- [x] I have read the contribution guidelines.
- [x] I have updated the documentation, if applicable.
- [x] I have ensured that the change is tested somewhere in the testsuite (adding new test cases if necessary).
- [x] If I added or modified a C++ API call, I have also amended the corresponding Python bindings (and if altering ImageBufAlgo functions, also exposed the new functionality as oiiotool options).
- [x] My code follows the prevailing code style of this project. If I haven't already run clang-format before submitting, I definitely will look at the CI test that runs clang-format and fix anything that it highlights as being nonconforming.
CI errors are occurring when comparing resulting images with expected outcomes for theImageBufAlgo::fit and ImageBufAlgo::rotate tests (in C++ and Python):
rotate-45.exr : 256 x 256, 4 channel, half openexr
- SHA-1: D71846E92E351953CAB0A77F7FA0E8153871F642
+ SHA-1: 1D29EDC970AE6C6518D05B72D294B1168FEFD279
fit.exr : 640 x 480, 4 channel, half openexr
- SHA-1: 70A5442AACA66D4DD2B0B4E81E10C44C63BAC8AB
+ SHA-1: 7A758E26E4470375582078F6D4E1D3AC24B46F6A
I feel it could be due to rounding errors, but I'm not entirely sure 🧐
After analyzing the "dumpdata" produced on different CI jobs for the ImageBufAlgo::rotate test, I found only two pixels with a very minor value difference:
< SHA-1: D71846E92E351953CAB0A77F7FA0E8153871F642
---
> SHA-1: 1D29EDC970AE6C6518D05B72D294B1168FEFD279
15773c15773
< Pixel (138, 61): 0.355712891 0.189331055 0.189331055 1.000000000
---
> Pixel (138, 61): 0.355468750 0.189331055 0.189331055 1.000000000
54966c54966
< Pixel (163, 214): 0.704589844 0.704589844 0.704589844 1.000000000
---
> Pixel (163, 214): 0.704101562 0.704101562 0.704101562 1.000000000
Updating the test to use TIFF input and output instead of EXR solved the problem, but it doesn't solve it for the ImageBufAlgo::fit test.
Besides, both tests using TIFF and a few other tests are failing on the CI-Mac-ARM job:
rotate-45.tif : 512 x 384, 3 channel, uint8 tiff
- SHA-1: B6616C3F6DC18017114EA68F2F825824FBF543A7
+ SHA-1: 39165D99E92649A6474B3EB52A1E601E45C8E8B1
resize.exr : 640 x 480, 4 channel, half openexr
- SHA-1: 37DA4EB4E212FA81CFFEC3AAFDB5F89158769DC1
+ SHA-1: 79331C1ECCB76D26E6286955D9447A03EC0FD0A6
resample.exr : 320 x 240, 4 channel, half openexr
- SHA-1: 16FC7DCFE01DC312593B00B9F90D71BAF3D52450
+ SHA-1: 5C0CC0A293FCD30B2AE44A0BED7D27DB9B8C8E50
fit.tif : 640 x 480, 3 channel, uint8 tiff
- SHA-1: 82EA29C19B5C978A0D4C2FDBDDB8FDFA64CD1F74
+ SHA-1: B3E73BBFDFE2B43F02CE3A429591EC956EC192BA
warp.exr : 256 x 256, 4 channel, half openexr
- SHA-1: E3B18F22BE132AB6FF9F5F74EE4E25A98AC06304
+ SHA-1: EEB460766F07B47C647D630B568F25FFEEBF5A62
So, I feel like comparing image hashes for these specific tests is probably not a reasonable testing strategy, as multiple parameters likely play a role in the outcome. It would be interesting to get down the rabbit hole to understand what produces these differences (whether it comes from the OIIO codebase, a third-party dependency, or the platform?). But, for now, I will simply adjust these tests to move on with this issue.
The remaining CI errors don't seem to be related to this update:
147/178 Test #149: tiff-misc ...............................***Failed 3.98 sec
The tiff-misc failures are new, happening on several branches, none of which have changed the tiff code recently. I think something updated on the CI machines, new TIFF library or something like that. I will chase it down separately when I'm feeling better.
I think you're seeing some numerical differences. No need to switch to TIFF -- I don't see why that would fix it. But the way we usually deal with this is just to check in a second test output to the ref directory (call it out-alt.txt or something like that). The way the tests work is that they'll pass as long as it matches any output in ref, there can be multiple.
I'm happy to clean it all up in a few days, you don't need to chase down this numerical nonsense if you don't want to. It's not a necessary rabbit hole.
I actually removed the extra command to write output images to be compared for the tests with numerical differences. It is still being tested, but there is no longer any hash comparison so it wouldn't fail in any CI jobs.
Let me know if this is good enough!