
I've been making a simple game in OpenGL, and implemented a screen fade-out using the old 'draw a black fullscreen quad and ramp up the alpha' trick. I'm doing all my shaders in linear space and using GL_FRAMEBUFFER_SRGB to automatically apply gamma. When fading out the screen I noticed that the alpha blending also becomes nonlinear - to fade to ~25% brightness I need to use an alpha of ~0.95.
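To put numbers on it: blending in linear space and then encoding to sRGB produces exactly this bunching. A minimal Python sketch of the arithmetic (my actual code is GLSL; this just assumes the standard sRGB encoding curve):

```python
def linear_to_srgb(x):
    # Standard sRGB encoding: linear light -> display value (IEC 61966-2-1)
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

alpha = 0.95
white = 1.0                                  # background, linear space
blended = (1.0 - alpha) * white              # black quad over white: 0.05 linear
display = linear_to_srgb(blended)            # what GL_FRAMEBUFFER_SRGB writes out
print(round(blended, 3), round(display, 3))  # 0.05 linear comes out ~0.248 on screen
```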

I fired up Unity for comparison, and its alpha blending also seems to work this way - when rendering in linear mode most of the opacity is bunched towards the top of the alpha range.

Is this expected behaviour? Is it correct from a PBR point of view? It seems very unintuitive. As a workaround, to get more intuitive blending, would it be reasonable to apply gamma correction of 0.45 to the alpha channel before blending?

EDIT: this is the result I get after blending a .95 alpha black quad over a pure white button. It could be bad gamma on my display as Simon F suggested in the comments, but it looks a lot brighter than 5% to me.

[Image: screenshot of the blended result]

russ
  • IMHO, the alpha channel should always be linear and treated as such, though I've no idea what Unity is doing.

    Oh! It just occurred to me: are you expecting a perceptually linear fade-out? I.e. over, say, 16 frames, frames 4, 8, and 12 should look, respectively, 3/4, 1/2, and 1/4 as bright as the original?

    – Simon F Jul 09 '18 at 08:24
  • Yeah that's what I mean. When using the gamma rendering mode that's what happens, but in linear mode it gets bunched to one end - rendering a black quad over a white one at .95 alpha gives a final pixel value of 63,63,63. – russ Jul 09 '18 at 08:34
  • So blending a black over white with alpha=0.95 implies you want something that's only 5% of "fully bright"?

    Assuming an sRGB mapping, then 61 (i.e. hex 3D) is 5% brightness. If your display is reasonably calibrated, you can verify this by creating, say, a 5x4 black image, setting one pixel to white, and then tiling that to a suitable size, e.g. 200x200 pixels. Then draw a, say, 50x50 square in the middle and fill it with 0x3D3D3D. Take a few steps back, squint, and they should be almost indistinguishable. (Again, this relies on your monitor behaving correctly.) Maybe I should post the image.

    – Simon F Jul 09 '18 at 13:37
  • RE: "EDIT: this is the result I get after blending a .95 alpha black quad over a pure white button. It could be bad gamma on my display as Simon F suggested in the comments, but it looks a lot brighter than 5% to me." Ahh but our visual system is non-linear - much as our hearing is too. For example, the analogue volume control on a hifi amplifier will use a logarithmic potentiometer rather than a linear one (https://electronics.stackexchange.com/questions/101191/why-should-i-use-a-logarithmic-pot-for-audio-applications). You simply need to change your per-frame alpha non-linearly. – Simon F Jul 10 '18 at 10:47
  • 1
    Ok that makes sense. So a pixel value of 63 would be physically around .05 brightness on a properly calibrated monitor, but perceptually it will appear brighter. Thanks for your help. – russ Jul 10 '18 at 11:15

1 Answer


Just turning the comments into a fully-fledged answer as it may prove useful to others.

IMHO, the alpha channel should always be linear and treated as such.

You described blending a black quad over a white background using alpha = 0.95 in linear space, then mapping the result back to non-linear space for display. We'd expect a resulting physical intensity of 0.05.

If the non-linear space is sRGB, then a linear value of 0.05 corresponds to an sRGB value of about 0.248 which, when mapped to 8-bit, is around 63 (hex 0x3F).
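That conversion is easy to sanity-check in the other direction; a sketch, assuming the standard piecewise sRGB decoding function:

```python
def srgb_to_linear(v):
    # Standard sRGB decoding: encoded 0..1 display value -> linear light
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

# The 8-bit value 63 decodes back to almost exactly 5% linear intensity
print(round(srgb_to_linear(63 / 255), 4))  # ~0.0497
```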

If your display is reasonably calibrated, you can verify this by creating, say, a 5x4 black image, setting one pixel to white, and then tiling that to a suitable size, e.g. 200x200 pixels. Then draw a, say, 50x50 square in the middle and fill it with 0x3D3D3D. Here's an example:

[Image: dithered black-and-white tile pattern with a solid 0x3D3D3D square in the centre]

If you take a few steps back from your monitor and squint, the centre grey region and the dithered outer should be almost indistinguishable, demonstrating a correct physical model.
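Why the squint test works: the eye spatially averages the tile, and one white pixel out of 5x4 = 20 is 1/20 = 0.05 of full linear intensity, which is very nearly what the solid 0x3D grey decodes to. A quick check in Python (a sketch of the arithmetic, not the code that produced the image):

```python
def srgb_to_linear(v):
    # Standard sRGB decoding: encoded 0..1 display value -> linear light
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

tile_average = 1.0 / 20                    # one white pixel per 5x4 black tile
grey_linear = srgb_to_linear(0x3D / 255)   # the solid fill used above
print(tile_average, round(grey_linear, 4))  # 0.05 vs ~0.0467: within a couple of 8-bit steps
```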

The problem you are describing appears to be related to the fact that our visual system is non-linear, much as our hearing is (though probably not with the same curve). For example, the analogue volume control on a hi-fi amplifier will use a logarithmic potentiometer rather than a linear one (https://electronics.stackexchange.com/questions/101191/why-should-i-use-a-logarithmic-pot-for-audio-applications).

You thus 'simply' need to change your per-frame alpha non-linearly to get a perceptually even fade-out.
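As a sketch of one way to do that (assuming you want the *displayed* sRGB value of a white background to fall off linearly over the fade; `fade_alphas` is a hypothetical helper, not from the question's code):

```python
def srgb_to_linear(v):
    # Standard sRGB decoding: encoded 0..1 display value -> linear light
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def fade_alphas(frames):
    """Per-frame alpha for a black quad so a white background's *displayed*
    value ramps down linearly, giving a perceptually even fade."""
    alphas = []
    for i in range(1, frames + 1):
        target_display = 1.0 - i / frames       # linear ramp in display space
        target_linear = srgb_to_linear(target_display)
        alphas.append(1.0 - target_linear)      # blended = (1 - alpha) * white
    return alphas

# Note how the later steps need alphas bunched near 1.0 -- the ~0.95 for
# ~25% display brightness from the question falls out at frame 3 of 4
print([round(a, 3) for a in fade_alphas(4)])  # ~[0.478, 0.786, 0.949, 1.0]
```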

Simon F