
"Correctness" is not very useful in image processing; it's all about perception and performance.


Well, "correct" in the case of linear vs. non-linear luminance is pretty much exactly about perception (or at the very least, about preventing perception differences further down the pipeline), even if the effect is somewhat minor.
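A minimal sketch of the perceptual difference: averaging a black pixel and a white pixel directly on gamma-encoded sRGB values gives a visibly darker result than averaging in linear light (the function names here are my own; the transfer-function constants are from the sRGB standard).

```python
def srgb_decode(c):
    # Gamma-encoded sRGB value in [0, 1] -> linear light
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb_encode(c):
    # Linear light in [0, 1] -> gamma-encoded sRGB
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# Blending 50% black with 50% white, two ways:
naive = (0.0 + 1.0) / 2                                     # average the encoded values
correct = srgb_encode((srgb_decode(0.0) + srgb_decode(1.0)) / 2)
# naive = 0.5, correct ≈ 0.735 — the naive blend comes out noticeably too dark.
```

This is exactly what happens when a resizer averages neighboring pixels without linearizing first: high-contrast detail gets darkened.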

"The Importance of Being Linear"[1], which came up on HN a few days ago, discusses some issues with non-linearity, specifically in the context of 3D rendering.

[1] - http://http.developer.nvidia.com/GPUGems3/gpugems3_ch24.html


There are lots of specific examples where incorrectness leads to perceptually significant problems. I'd agree that you don't normally notice it in the wild, which is why so much software has gotten away with it for so long.

It is possible to optimize the linearization steps so that they don't detract too much from the performance, especially if you're using one of the better (i.e. slower) interpolation methods.
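One common way to keep the linearization cheap, sketched below under my own naming: for 8-bit input there are only 256 possible channel values, so the sRGB decode can be precomputed once into a lookup table, and the per-pixel cost of working in linear light becomes a table index plus one encode on the (much smaller) output.

```python
def srgb_to_linear(c):
    # Gamma-encoded sRGB [0, 1] -> linear light
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Linear light [0, 1] -> gamma-encoded sRGB
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# One-time table: linearizing an 8-bit channel becomes a single lookup.
LINEAR_LUT = [srgb_to_linear(i / 255.0) for i in range(256)]

def box_downsample_2x(pixels, width, height):
    """2x box-filter downsample of a flat list of 8-bit gray pixels,
    averaging in linear light, then re-encoding the result."""
    out = []
    for y in range(0, height, 2):
        for x in range(0, width, 2):
            s = sum(LINEAR_LUT[pixels[(y + dy) * width + (x + dx)]]
                    for dy in (0, 1) for dx in (0, 1))
            out.append(round(linear_to_srgb(s / 4.0) * 255))
    return out
```

For example, downsampling a 2x2 black/white checkerboard yields a gray of about 188 rather than the 128 a gamma-naive average would produce. With a slower, higher-quality filter (Lanczos, bicubic) the lookup is an even smaller fraction of the total work.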


This page[1] documents some effects of gamma-incorrect resizing. Specifically, check out Figure 12.

[1] http://blog.johnnovak.net/2016/09/21/what-every-coder-should...



