
So I have implemented motion blur via depth reconstruction, using the depth texture and the previous frame's view-projection matrix. The shader reprojects each fragment through the previous frame's view-projection matrix and subtracts the current UV (the one given to the fragment shader) to find a blur vector. Then I simply blur along this vector by sampling along it and dividing by the number of samples, a constant I set at the beginning.
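To make the setup concrete, here is a minimal sketch of that reprojection in Python/numpy rather than shader code (the matrices, sample count, and `sample_texture` callback are placeholders, not the actual shader):

```python
import numpy as np

NUM_SAMPLES = 8  # the constant sample count mentioned above (illustrative value)

def reconstruct_blur_vector(uv, depth, inv_view_proj, prev_view_proj):
    """Reproject a fragment through last frame's matrices to get a UV-space blur vector."""
    # UV + depth -> NDC in [-1, 1]
    ndc = np.array([uv[0] * 2.0 - 1.0, uv[1] * 2.0 - 1.0, depth * 2.0 - 1.0, 1.0])
    # back to world space via the inverse of the current view-projection
    world = inv_view_proj @ ndc
    world = world / world[3]
    # forward through the previous frame's view-projection, then perspective divide
    prev_clip = prev_view_proj @ world
    prev_ndc = prev_clip[:3] / prev_clip[3]
    prev_uv = prev_ndc[:2] * 0.5 + 0.5
    return prev_uv - np.array(uv)  # blur vector in texture space

def blur(sample_texture, uv, blur_vec):
    """Average NUM_SAMPLES taps along the blur vector and normalize by the count."""
    acc = 0.0
    for i in range(NUM_SAMPLES):
        t = i / (NUM_SAMPLES - 1)
        acc += sample_texture(np.array(uv) + blur_vec * t)
    return acc / NUM_SAMPLES
```

With identical current and previous matrices the blur vector is zero, which is a handy sanity check for the reprojection math.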

The problem I'm having is that even a tiny movement will cause the motion blur to ... well, blur the scene. I find this too invasive and would like to soften the effect. I have tried to cap it by only blurring when the dot product of the blur vector with itself is greater than 0.01. This sadly creates an oval-like artifact in the middle of the screen ...
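One likely reason the hard cutoff shows up as an oval is that it is a step function: fragments just above the threshold get the full blur while neighbours just below get none. A common alternative (not from the original post, just a sketch) is to fade the blur vector smoothly to zero below a threshold, e.g. with a smoothstep; the `min_len`/`max_len` values here are illustrative:

```python
import numpy as np

def smoothstep(edge0, edge1, x):
    """Same cubic remap as the GLSL smoothstep built-in."""
    t = np.clip((x - edge0) / (edge1 - edge0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def soften_blur_vector(blur_vec, min_len=0.002, max_len=0.01):
    """Scale the blur vector smoothly to zero for small motions instead of
    switching it off with a hard if-test (thresholds are hypothetical)."""
    length = np.linalg.norm(blur_vec)
    if length < 1e-8:
        return np.zeros_like(blur_vec)
    weight = smoothstep(min_len, max_len, length)
    return blur_vec * weight
```

Because the weight ramps continuously from 0 to 1, there is no visible boundary between blurred and unblurred regions.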

Anyone have good suggestions on how to handle this?

ChaoSXDemon
  • "the dot product of this blur vector" with what? dot product has two operands. For your question, don't forget the possibility that an error in the code itself is what's making the effect too strong. – Dan Hulme Jul 05 '17 at 07:54
  • Are you happy to show your code? – trichoplax is on Codidact now Jul 05 '17 at 11:38
  • @DanHulme, I'm using the dot product of the blur vector, so basically dot(blurVector, blurVector). This is basically the length squared to see if the blur vector is large enough. – ChaoSXDemon Jul 05 '17 at 17:09
  • Is the 'oval like artefact' because the blur is sampled in device coordinate space? – PaulHK Jul 07 '17 at 06:01
  • @PaulHK, what do you mean? NDC? I'm drawing the screen as a post-processing effect, meaning I'm drawing a square that's the size of the screen and with texture coordinate (0,0) being top left and (1, 1) being bottom right. Therefore the UV will be interpolated from top left to bottom right. – ChaoSXDemon Jul 07 '17 at 20:30
  • Yes, that's what I was thinking. The problem is that if you go, say, 0.01 in any direction, it will be stretched horizontally because it is not corrected for aspect. When sampling for a blur, you want to be in pixel space, or at least an aspect-corrected version of NDC, e.g. vec2 samplePos = ndcPos + blurOffset * vec2(aspectRatio, 1); – PaulHK Jul 09 '17 at 08:03
  • @PaulHK I will try that but I doubt it will change much. I'm already calculating everything in pixel space ... so I reconstruct the 3D position from uv and depth texture, then multiply by inverse view-projection matrix. I then do perspective divide myself after multiplying by previous view-projection matrix. This coordinate is then converted to texture space [0, 1]. I then use this to subtract uv in fragment shader to get the delta uv. – ChaoSXDemon Jul 10 '17 at 17:37
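PaulHK's aspect-correction suggestion from the comments above can be sketched as follows (Python rather than shader code; the aspect ratio and sample count are placeholder values):

```python
import numpy as np

def aspect_corrected_taps(uv, blur_vec, aspect_ratio, num_samples=8):
    """Generate blur sample positions whose step is isotropic on screen:
    scaling the x component of the offset by the aspect ratio makes a given
    blur length cover the same on-screen distance in x and y."""
    correction = np.array([aspect_ratio, 1.0])  # the vec2(aspectRatio, 1) from the comment
    taps = []
    for i in range(num_samples):
        t = i / (num_samples - 1)
        taps.append(np.array(uv) + blur_vec * correction * t)
    return taps
```

With `aspect_ratio = 1.0` this reduces to plain uniform sampling along the blur vector; on a 16:9 screen the horizontal offsets are stretched so the taps trace a visually straight, evenly spaced line.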

0 Answers