I am currently writing a simple demo for a 3D raytracing engine. The program basically has the following structure: an array stores all the planes, each defined by three points. For every (2D) pixel on the screen I cast a ray and calculate the intersection point with every plane. If that point lies within the bounded plane (since mathematical planes are infinitely large) defined by its three points, AND the intersection is nearest to the camera, it is the selected collision point. I calculate the angle and reflection vector using the formulas below. Based on that information I cast a few more rays recursively to achieve a "reflection on a rough surface" effect. The result looks almost right, but it has (had) a problem: the reflections did not look correct: img1
I realised that the problem was that if the ray comes from below a plane, the plane's normal vector still points up, and therefore the reflection vector calculation is not correct. My plan to solve this was to calculate the angle between the two vectors (the plane's normal vector and the ray's direction vector). If that angle is >90°, the normal vector points in the wrong direction, and in that case I invert it by negating its components. The problem is that when I do that, exactly nothing changes: the result still looks the same. To validate whether my angle calculation was correct, instead of inverting the normal vector I just return null if the angle is >90°. Now the result looks better, but some planes only "work" when the ray hits them from one direction (of course, since I'm not inverting the normal vector). img2
As you can see, the white plane at the top is gone entirely and the blue plane no longer has a reflection, so my code checking the angle should be correct. But the reflections still look a bit strange, not the way they should.
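For reference, the ">90°" test above is equivalent to checking the sign of the dot product, which avoids the sqrt/acos round trip. A minimal sketch with plain {x, y, z} objects (dot and normalFacesRay are hypothetical helper names, not my actual classes):

```javascript
// Dot product of two plain {x, y, z} vectors.
function dot(a, b) {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

// The angle between d and n exceeds 90° exactly when d·n < 0,
// so the orientation check reduces to a sign test.
function normalFacesRay(n, d) {
  return dot(n, d) < 0; // true: n points against the incoming ray
}
```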
This is the code I use to calculate the intersection point, angle, and reflection vector:
// a, b, c are the three points of the plane, s is the starting point of the ray and d its direction vector
static intersect(a, b, c, s, d) {
    // Plane normal: cross product of (b - a) and (c - a)
    let n = new Vector(0, 0, 0);
    n.x = ((b.y-a.y)*(c.z-a.z)) - ((b.z-a.z)*(c.y-a.y));
    n.y = ((b.z-a.z)*(c.x-a.x)) - ((b.x-a.x)*(c.z-a.z));
    n.z = ((b.x-a.x)*(c.y-a.y)) - ((b.y-a.y)*(c.x-a.x));
    if (Geometry.angleBetweenVectors(d, n.negate()) >= Math.PI/2) {
        // n = n.negate();
        return null;
    }
    let t = - (n.x*s.x + n.y*s.y + n.z*s.z - n.x*a.x - n.y*a.y - n.z*a.z) / (n.x*d.x + n.y*d.y + n.z*d.z);
    let intersectionPoint = new Point();
    intersectionPoint.x = s.x + t*d.x;
    intersectionPoint.y = s.y + t*d.y;
    intersectionPoint.z = s.z + t*d.z;
    // Reflection Angle
    let angle = Math.PI/2 - Geometry.angleBetweenVectors(d, n.negate());
    let normLength = Math.sqrt(n.x*n.x + n.y*n.y + n.z*n.z);
    // Reflection Vector
    n.x /= normLength;
    n.y /= normLength;
    n.z /= normLength;
    let scalar = 2 * (d.x*n.x + d.y*n.y + d.z*n.z);
    n.x *= scalar;
    n.y *= scalar;
    n.z *= scalar;
    let reflectionVector = new Vector();
    reflectionVector.x = d.x - n.x;
    reflectionVector.y = d.y - n.y;
    reflectionVector.z = d.z - n.z;
    return {point: intersectionPoint, angle: angle, reflectionVector: reflectionVector};
}

static angleBetweenVectors(a, b) {
    let dirLength = Math.sqrt(a.x*a.x + a.y*a.y + a.z*a.z);
    let normLength = Math.sqrt(b.x*b.x + b.y*b.y + b.z*b.z);
    let dotP = a.x*b.x + a.y*b.y + a.z*b.z;
    return Math.acos(dotP/(dirLength*normLength));
}
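The intersection parameter t used above can also be checked in isolation. A minimal sketch of the same formula, t = n·(a − s) / (n·d), with plain {x, y, z} objects (rayPlaneT is a hypothetical standalone helper, not part of my classes):

```javascript
// t = (n·(a - s)) / (n·d); the hit point is then s + t*d.
// n: plane normal, a: a point on the plane, s: ray origin, d: ray direction.
function rayPlaneT(n, a, s, d) {
  const denom = n.x * d.x + n.y * d.y + n.z * d.z;
  if (denom === 0) return null; // ray is parallel to the plane
  const num = n.x * (a.x - s.x) + n.y * (a.y - s.y) + n.z * (a.z - s.z);
  return num / denom;
}
```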
I am not sure what the problem is, and I suspect that my math is not quite right. It also seems strange to me that inverting the plane's normal vector has no effect on the resulting image. I hope that someone can help me or point me in the right direction, since I am really lost. Thanks a lot :)
Edit: This is the formula I used for calculating the reflection vector: $r = d - 2(d \cdot n)n$, where r is the reflection vector, d is the direction vector of the ray (in my case) and n is the normalised normal vector of the plane. (Formula from here)
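As a sanity check, the formula can be evaluated directly. A minimal sketch with plain {x, y, z} objects, assuming n is already normalised (reflect is a hypothetical standalone helper, not part of my classes):

```javascript
// r = d - 2(d·n)n, with n a unit normal.
function reflect(d, n) {
  const s = 2 * (d.x * n.x + d.y * n.y + d.z * n.z);
  return { x: d.x - s * n.x, y: d.y - s * n.y, z: d.z - s * n.z };
}
```

Evaluating this directly shows that r comes out the same when n is negated (both (d·n) and n flip sign, so the product is unchanged), which matches the behaviour I observed above.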