I've been making a simple game in OpenGL and implemented a screen fade-out using the old 'draw a black fullscreen quad and ramp up the alpha' trick. I do all my shading in linear space and use GL_FRAMEBUFFER_SRGB to apply gamma automatically. When fading out the screen I noticed that the fade itself also behaves nonlinearly: to fade down to ~25% brightness I need an alpha of ~0.95.
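For reference, here's roughly what the setup looks like (a minimal sketch; context creation, the quad VAO, and shader compilation are omitted, and the names are made up):

```cpp
// Minimal sketch of the fade setup described above. With an sRGB framebuffer and
// GL_FRAMEBUFFER_SRGB enabled, the destination is decoded to linear before the
// blend and the result is re-encoded to sRGB on write.
glEnable(GL_FRAMEBUFFER_SRGB);                      // framebuffer writes get sRGB-encoded
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  // standard "over" blend

// Fragment shader for the fade quad: solid black with a variable alpha.
const char* fadeFragmentSrc = R"(
    #version 330 core
    uniform float uFadeAlpha;   // ramped from 0 to 1 over the fade
    out vec4 fragColor;
    void main() {
        fragColor = vec4(0.0, 0.0, 0.0, uFadeAlpha);
    }
)";

// Per frame: draw the scene, then draw the fullscreen quad with uFadeAlpha set
// to the current fade amount.
```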
I fired up Unity for comparison, and its alpha blending also seems to work this way - when rendering in linear mode most of the opacity is bunched towards the top of the alpha range.
Is this expected behaviour? Is it correct from a PBR point of view? It seems very unintuitive. As a workaround, to get more intuitive blending, would it be reasonable to apply gamma correction of 0.45 to the alpha channel before blending?
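To make the workaround concrete, I mean something like this (just a sketch; the 0.45 is my approximation of 1/2.2, and the function name is made up):

```cpp
// Sketch of the proposed workaround: remap the fade amount with a ~1/2.2
// exponent before using it as the black quad's alpha.
#include <cmath>
#include <cstdio>

float fadeAlpha(float fade) {
    return std::pow(fade, 0.45f);   // "gamma correct" the alpha, 0.45 ~= 1/2.2
}

int main() {
    // Print the remapped alpha over a 16-frame fade to see how it ramps.
    for (int frame = 0; frame <= 16; ++frame) {
        float fade = frame / 16.0f;
        std::printf("frame %2d: fade %.3f -> alpha %.3f\n",
                    frame, fade, fadeAlpha(fade));
    }
    return 0;
}
```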
EDIT: this is the result I get after blending a 0.95-alpha black quad over a pure white button. It could be bad gamma on my display, as Simon F suggested in the comments, but it looks a lot brighter than 5% to me.
Oh! It just occurred to me: are you expecting a perceptually linear fade-out? I.e., over, say, 16 frames, frames 4, 8, and 12 should look, respectively, 3/4, 1/2, and 1/4 as bright as the original? – Simon F Jul 09 '18 at 08:24
Assuming an sRGB mapping, 61 (i.e. hex 3D) is 5% brightness. If your display is reasonably calibrated, you can verify this by creating a, say, 5x4 black image, setting one pixel to white, and then tiling that to a suitable size, e.g. 200x200 pixels. Then draw a, say, 50x50 square in the middle and fill it with 0x3D3D3D. Take a few steps back, squint, and the two should be almost indistinguishable. (Again, this relies on your monitor behaving correctly.) Maybe I should post the image.
– Simon F Jul 09 '18 at 13:37
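For anyone who wants to try Simon F's test, here's a quick sketch that writes such a pattern to a PPM file (the 200x200 and 50x50 sizes and the hex value just follow the numbers in the comment; the file name is arbitrary):

```cpp
// Tile a 5x4 black block with one white pixel (average linear brightness
// 1/20 = 5%) across a 200x200 image, put a solid 0x3D3D3D square in the
// middle, and write the result as a binary PPM.
#include <fstream>

int main() {
    const int W = 200, H = 200;   // overall image size
    const int SQ = 50;            // solid grey square size
    std::ofstream out("dither_test.ppm", std::ios::binary);
    out << "P6\n" << W << " " << H << "\n255\n";
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            bool inSquare = x >= (W - SQ) / 2 && x < (W + SQ) / 2 &&
                            y >= (H - SQ) / 2 && y < (H + SQ) / 2;
            unsigned char v;
            if (inSquare)
                v = 0x3D;                                 // the "5% brightness" grey
            else
                v = (x % 5 == 0 && y % 4 == 0) ? 255 : 0; // 1 white pixel per 5x4 tile
            out.put(v).put(v).put(v);                     // greyscale written as RGB
        }
    }
    return 0;
}
```

Viewed from a few steps back on a reasonably calibrated display, the dithered surround and the solid square should look close to the same brightness, as the comment describes.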