I’ve spent the last few days reworking and further implementing waypoints and displayed waypoint icons in the ol’ space game. This has involved adding multiple waypoint types for locations and objects, starting to plumb through the multiple world tech, and rewriting the icons to use flat 2D rendering instead of the projective cockpit stuff from the earlier iteration, as seen in the screenshot. This last part ended up involving some slightly annoying math.

See, I wanted the waypoint markers for planets to be a little line touching the top of the planet, as in the screenshot. I recently fixed up some longstanding hassles with projecting points between the 2D and 3D worlds in my engine, so it’s now really easy to take any point in the universe and project it onto the screen. You’d think all that’s needed would be to start with the center of the planet, shift it up along the camera’s vertical axis by the planet’s radius, project that into the 2D world, and we’re good to go… right?

In the words of Wreck-it Ralph, HA. And also, no.

The problem is with the very nature of perspective projection. A planet is a 3D object with depth. The front of it bulges towards the camera, with the result that the point we just located is going to be concealed behind that bulge. As we get closer to the planet, the front of it bulges ever closer, and the point we projected gets ever more wrong. (The size of the bulge effect is completely unaffected by the camera’s field of view, by the way. Which makes sense: changing the FOV doesn’t give us any ability to see “around” the bulge any more than adjusting the zoom lens on a camera lets you see the back side of an object.)

How to address this? I ended up deriving a solution myself (after a few false starts) and feeling very smug about it, although you can also find a good description of the problem and a similar solution in this paper. Essentially: we want to locate the highest point on the sphere, along the vertical axis, that the camera can actually see. Imagine we draw a line from that point to the camera. The angle between that line and a line connecting the point to the center of the planet is a right angle. Knowing this, simple trigonometry gives us both the length of that line and the angle between it and the line connecting the camera to the planet’s center. Then we take the camera-to-planet vector, rotate it around the horizontal axis by that angle, rescale it to the length we derived, and we’ve found our point!
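For the aligned case, that trigonometry can be sketched roughly like this. This is a minimal NumPy sketch, not the game’s actual code — every name here is made up, and I’m assuming the camera’s `right_axis` is its horizontal direction, perpendicular to both the view and vertical axes:

```python
import numpy as np

def rotate_about_axis(v, axis, theta):
    """Rodrigues' rotation of vector v about a unit axis by theta radians."""
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(theta)
            + np.cross(axis, v) * np.sin(theta)
            + axis * np.dot(axis, v) * (1.0 - np.cos(theta)))

def visible_top_of_sphere(camera_pos, planet_center, radius, right_axis):
    """Highest visible point of the sphere along the camera's vertical axis.

    The tangent line from the camera meets the sphere's radius at 90 degrees,
    so the camera/center/tangent triangle is right-angled at the tangent point.
    """
    to_center = planet_center - camera_pos
    d = np.linalg.norm(to_center)                   # camera-to-center distance
    tangent_len = np.sqrt(d * d - radius * radius)  # length of the tangent line
    angle = np.arcsin(radius / d)                   # angle at the camera
    # Rotate the camera-to-center vector around the horizontal axis by that
    # angle, then rescale it from length d to the tangent length.
    rotated = rotate_about_axis(to_center, right_axis, angle)
    return camera_pos + rotated * (tangent_len / d)
```

Whether the rotation tips the vector up or down depends on the handedness of your coordinate system; if the marker lands under the planet instead of on top, flip the sign of `angle`.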

There is a little complication, though, in that this only works if the planet is perfectly aligned in front of the camera. As the planet moves to the left or right, the lines from the camera to the center and the edge stop lying purely in the vertical plane. I resolved this by finding the point where the camera would be if you slid it horizontally to line up with the planet, and rotating around that, though I’m fairly sure there’s a simpler way to address the issue. Probably involves changing coordinate spaces, because what graphics technique *doesn’t* involve changing coordinate spaces, am I right? Huh? Guys, back me up on this one!
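The sliding-camera fix can be sketched on top of the same math. Again, this is a hedged NumPy sketch with invented names rather than the real implementation, assuming `right_axis` is the camera’s horizontal direction:

```python
import numpy as np

def rotate_about_axis(v, axis, theta):
    """Rodrigues' rotation of vector v about a unit axis by theta radians."""
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(theta)
            + np.cross(axis, v) * np.sin(theta)
            + axis * np.dot(axis, v) * (1.0 - np.cos(theta)))

def visible_top_off_axis(camera_pos, right_axis, planet_center, radius):
    """Visible top of a sphere that may sit to the camera's left or right."""
    right = right_axis / np.linalg.norm(right_axis)
    # Slide a virtual camera along its horizontal axis until the planet
    # has no sideways offset from it.
    side_offset = np.dot(planet_center - camera_pos, right)
    virtual_cam = camera_pos + right * side_offset
    # From the virtual camera's position, this is the aligned case again:
    # right triangle at the tangent point, rotate and rescale.
    to_center = planet_center - virtual_cam
    d = np.linalg.norm(to_center)
    tangent_len = np.sqrt(d * d - radius * radius)
    angle = np.arcsin(radius / d)
    rotated = rotate_about_axis(to_center, right, angle)
    return virtual_cam + rotated * (tangent_len / d)
```

Note that the returned point is tangent as seen from the *virtual* camera, which is the approximation described above; the “simpler way” would presumably do the whole thing in camera space instead.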

On an unrelated note, I like how all the fast travel screenshots in this game look pretty badass.