Mayfly Studio

Jan 27

Sooo I’m kind of at the point where I need to get some local space environment functionality into the game. Like, space stations to dock with and stuff like that. I’m not planning anything terribly fancy 3D modeling-wise right now, so I figured that to make some simple placeholder art I’d crack open Blender again for the first time in a year or two, and Jesus wept, what the hell is it with that program’s user interface? I mean, it just, it… it… There is simply no excuse for such an obtuse, random mess. (Shamus Young’s Blender rant is still quite applicable here; the only thing that’s changed since 2006 is that the interface has been revised once or twice and is still equally awful, just in slightly different ways.) After I got stymied just trying to rotate the damn camera I gave up before things got violent and decided to once more embark on my usual fruitless quest to find a functional 3D modeler. With trepidation, knowing that the end point of the journey would be failure, I stepped out into the Internet, and —

Oh, Wings 3D. Huh. Well, that was easy. Took me like five minutes to slap that baby up there together. The only thing I had to look up was how to open the precise movement menu, and even that, I realized afterwards, was called out in the status bar. So. Uh. …Good, I guess.

Now, the one thing Blender does have which Wings 3D doesn’t is my custom exporter that I wrote a while back. In order to write one for Wings I’ll apparently have to do it in Erlang. Well, learning an entire new computer language is probably easier than figuring out how to move the camera in Blender, and definitely more fun, so I promise to give it the old college try.

(The other thing to remain vigilant about is to what extent I can get flat surface normals exported out of the application. Oh sure, everything looks fine now, but I still remember the nightmarish and ultimately failed struggle to get Milkshape to produce the flat shading I wanted. So, you know, trust but verify as a great man once said.)

Jan 26

[video]

Just workaday stuff here, pretty much. I spent today going back in and cleaning up the code for reprojecting objects that would be beyond the far clipping plane so they can get rendered normally and sort correctly. The long and short of it is, objects in a depth range from 0.997 to 1.002 (yes, depth buffer math is goofy) now get reprojected into the 0.997-0.99999 range. Though they’re not as teensy as the planets in the previous update, both of these objects would still be quite invisible otherwise.
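
For the curious, the remap itself is nothing fancy: something along these lines, although the function name and the straight linear squeeze here are my own shorthand rather than the actual engine code.

```cpp
// Sketch only: squeeze depths in [0.997, 1.002] down into [0.997, 0.99999]
// so faraway objects stay in front of the far plane but still sort
// behind anything genuinely nearer.
static double RemapFarDepth(double depth)
{
    const double kBandStart = 0.997;    // where the remapped band begins
    const double kBandEnd   = 1.002;    // deepest raw depth we expect to see
    const double kOutMax    = 0.99999;  // deepest depth we let through

    if (depth <= kBandStart)
        return depth;                   // normal objects pass through untouched

    double t = (depth - kBandStart) / (kBandEnd - kBandStart);  // 0..1 across the band
    return kBandStart + t * (kOutMax - kBandStart);
}
```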

I also plumbed the rescaling/positioning information through to objects associated with planets, like moons and the “halo” (used for atmosphere on planets and a half-assed glow on suns). In the latter case, since the halos are transparent, I have to take extra steps to sort them correctly: once the planet knows where it’s going to render, it just adds the transparent object to a list which, after all solid bodies are drawn, gets sorted and rendered back-to-front so all transparent blending is correct. The only downside of this scheme is that — I think — these objects still take up more depth buffer space than they would otherwise, so in the case of large halos I’ve seen them visibly overlap nearby objects that they shouldn’t have touched. I’m not sure how serious this issue is, though it would show up with very close binary stars, but it also tells me I should find a better way to draw a glow around stars anyway. I’ve never liked this method.
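
In case the list bookkeeping is hard to picture, it amounts to something like this; the type and function names are made up, stand-ins for whatever the engine actually calls them.

```cpp
#include <algorithm>
#include <vector>

struct Renderable                       // stand-in for the engine's drawable type
{
    virtual void Render() const = 0;
    virtual ~Renderable() = default;
};

struct TransparentDraw
{
    float             distanceToCamera; // filled in when the planet queues its halo
    const Renderable *object;
};

static std::vector<TransparentDraw> g_transparentQueue;

// Called once per frame, after every solid body has already been drawn.
void FlushTransparentQueue()
{
    // Back-to-front: farthest first, so alpha blending composites correctly.
    std::sort(g_transparentQueue.begin(), g_transparentQueue.end(),
              [](const TransparentDraw &a, const TransparentDraw &b)
              { return a.distanceToCamera > b.distanceToCamera; });

    for (const TransparentDraw &draw : g_transparentQueue)
        draw.object->Render();

    g_transparentQueue.clear();
}
```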

There is one remaining annoyance: the near clipping plane. Especially in the case of smaller planets, they vanish past that plane before you can get anywhere near them. I’ll probably try reprojecting near values as well, although my gut feeling is that’s absolutely not going to work. I can, of course, also move the near plane much closer, but I’m really skeptical that I can pull it in close enough to allow satisfyingly close planetary approaches.

Jan 22

Now, see, the cool thing about this screenshot is not that the planets are big, but that they are tiny.

In order to draw a big solar system with a minimum of hassle I needed — well, not “needed” in the sense that I’d die if I didn’t get one, but you know what I mean — a method for drawing objects like distant planets that are vastly farther away from the camera than the far clipping plane. So the idea came to me: what if I draw them closer to the camera, but just smaller?

And that’s what’s going on here. I’m using gluProject to find out the position of the planet onscreen, then gluUnProject to project it back out to a different depth value that is within the clipping planes. I do a similar thing with another vector indicating the edge of the planet’s disc to find out what the planet’s size should be. Then I actually render it at the reprojected position with the recomputed size. Works great!
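
For reference, the GLU calls below are the real ones; everything else (variable names, the fixed target depth) is my paraphrase of the idea, not the code as written.

```cpp
#include <GL/glu.h>
#include <cmath>

// Given a planet center (and a point on its rim) that may lie far beyond the
// far plane, compute a render position and radius safely inside the clip range.
bool ReprojectPlanet(const GLdouble center[3], const GLdouble rim[3],
                     const GLdouble model[16], const GLdouble proj[16],
                     const GLint view[4],
                     GLdouble outCenter[3], GLdouble *outRadius)
{
    const GLdouble kTargetDepth = 0.999;  // comfortably inside the far plane

    // Where do the planet center and its rim land on screen at true distance?
    GLdouble cx, cy, cz, rx, ry, rz;
    if (!gluProject(center[0], center[1], center[2], model, proj, view, &cx, &cy, &cz))
        return false;
    gluProject(rim[0], rim[1], rim[2], model, proj, view, &rx, &ry, &rz);

    // Push the same screen position back out to a depth inside the clip planes.
    gluUnProject(cx, cy, kTargetDepth, model, proj, view,
                 &outCenter[0], &outCenter[1], &outCenter[2]);

    // Unproject the rim point at the same depth to recover the rescaled radius.
    GLdouble ex, ey, ez;
    gluUnProject(rx, ry, kTargetDepth, model, proj, view, &ex, &ey, &ez);
    GLdouble dx = ex - outCenter[0], dy = ey - outCenter[1], dz = ez - outCenter[2];
    *outRadius = std::sqrt(dx * dx + dy * dy + dz * dz);
    return true;
}
```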

Note that when I say “works great” I’m lying because the scale part works terrible — it breaks down for semi-obvious reasons when the object is large and near the edge of the screen. This actually isn’t how you should calculate the scale of a faraway object at all. That said, if I don’t want to be bothered to bang out the right math I could probably just gluUnProject a point relative to the center of the screen instead of one relative to the onscreen location of the object and it would be fine. Also, I’m going to have the distance correction blend in only on faraway objects, further suppressing the issue.

Will have to noodle around with it and see if I can count on consistent behavior or what, but aside from the scale issue this strikes me as a good approach.

Jan 20

The code is currently a slapdash mess, but I’ve managed to implement a system where the planets and local objects exist at two different scales. In normal travel mode, you won’t see distant planets growing closer because you’re just going too slow to get there (or at least that’s my excuse). Switch to the special travel mode, though, and now you can fly super fast and reach the various planets. The general idea is that the still-unnamed special drive is used for long range travel within a solar system, and then once you reach a point of interest or just feel like stopping you drop out of it and fly normally to interact with the various space stations and whatnot you might find. I put together a toy solar system at 1/10th scale — which is still colossal — to see how it feels to fly around in. I’ll say this: it certainly feels big, even when driving around it at Ludicrous Speed.

(It turns out that this is the system that the Elite: Dangerous guys have settled on, too, which I thought was an amusing coincidence. But I’ll take it as evidence that it’s a workable idea.)

In order to get a better sense of how this hangs together I’ll need to implement some more stuff for drawing the planets when they’re far enough away that they would be behind the far clipping plane. Sadly, I do need to have a depth buffer, and thus a far clipping plane, so that render queries will work. I’m thinking that it should be possible to project the planets to their actual distances to figure out what size and position they “would” be rendered at, then locate the actual object closer and rescale/move it as appropriate. The extra benefit of this is I could create some dizzying effects for the super speed mode by scaling the planets in a clearly wrong fashion.

Jan 17

Design Limitations

There are enough different things I want to work on in the space engine that it’s hard to pick any one thing. So instead I’ll yammer unproductively.

Jan 12

Quick menagerie of gas giants. Although it’s subtle, I also fixed some issues with the galactic nebula in the background that were bugging me — basically, it was projected onto a sphere, which caused the 3D texture to generate artifacts around the galactic equator. Now it’s just a cylinder, no more artifacts.
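
If “just a cylinder” sounds vague: the idea is that the background mesh is a cylinder whose vertex positions feed the 3D texture lookup. That last part is my guess at the details; this is purely an illustrative sketch, with every name in it invented.

```cpp
#include <cmath>
#include <vector>

struct SkyVertex { float px, py, pz; float tx, ty, tz; };

// Builds a unit-radius cylinder around the galactic axis (Y up here),
// with vertex positions doubling as 3D texture coordinates.
std::vector<SkyVertex> BuildNebulaCylinder(int slices, float halfHeight)
{
    std::vector<SkyVertex> verts;
    for (int i = 0; i <= slices; ++i)
    {
        float angle = 2.0f * 3.14159265f * float(i) / float(slices);
        float x = std::cos(angle), z = std::sin(angle);
        verts.push_back({x,  halfHeight, z, x,  halfHeight, z});  // top ring
        verts.push_back({x, -halfHeight, z, x, -halfHeight, z});  // bottom ring
    }
    return verts;  // draw as a triangle strip wrapped around the horizon
}
```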

It’s been fun fooling with the space engine again for a bit. Not quite sure if I’ll do more on it right now, as I’m a little hazy about where I’m going with the whole thing.

Jan 11

[video]

Okay, that’s the lighting issues sorted out. Which is to say that’s not the lighting issues sorted out, because if there’s one fundamental thing I’ve learned about 3D graphics it’s that the lighting issues never, ever get sorted out.

Bitterness aside, the real problem with my lighting system was philosophical: namely, it tried to be a comprehensive solution without the needed systems support. A major part of most 3D renderers is the scene graph, which is a fancy term for “all the stuff we want to render this frame” and infrastructure which lets you query that. My little engine has an extremely rudimentary scene graph: basically just a list of objects (and object hierarchies) independently floating in the world. Trying to run a query of objects that might get lit, vs. objects that might light them, is a bewildering pain. The original “lighting system” I wrote — I hesitate to even dignify it with that term — had several light types and a vague wishful methodology for keeping a list of most influential lights on the world itself and sorting them based on distance and brightness. Needless to say it didn’t work even a little bit.

And you know? That’s fine. Not every engine needs to have a fancy lighting system. The stuff I’m writing with my little toy renderer is generally designed to produce very specific, tuned effects. So I’m gonna just run with that. The various planets and moons and whatnot happen to be organized hierarchically under a cSolarSystem parent object. I just put some accessors on that object which return a point light position and color based on the primary star. Good enough for government work.
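
Concretely, I mean accessors about this simple. The cSolarSystem name is from the engine; the member names and the little math types below are stand-ins.

```cpp
struct Vec3   { float x, y, z; };
struct Color3 { float r, g, b; };

class cSolarSystem
{
public:
    // The one and only light: wherever the primary star happens to be.
    Vec3   GetPrimaryLightPosition() const { return mPrimaryStarPosition; }
    // Light color comes straight from the star's own color.
    Color3 GetPrimaryLightColor()    const { return mPrimaryStarColor; }

private:
    Vec3   mPrimaryStarPosition{0.0f, 0.0f, 0.0f};
    Color3 mPrimaryStarColor{1.0f, 0.95f, 0.85f};
};
```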

As a side note, there’s a nice touch to how the rings are lit. A planet’s rings are made up of millions of tiny particles that just look like a single object from far away. So the overall color of the rings will be the color you’d expect to see from light bouncing off a round particle to your eye: if you were closer to the sun than the particle, you’d see a bright particle as most light was reflected directly at you, while if you were farther away you’d see a dim particle as most of the light reflected away from you. In this case, I’m taking a dot product of the vector from the light source to the ring system center, and the vector from the camera to the system center. This isn’t physically accurate, but computing the lighting on a per-pixel basis would create gradient effects and the fundamental principle of this graphical style is NO GRADIENTS!!! The single color I’m computing this way is a good approximation that delivers the result I want.
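
Spelled out, the calculation is roughly the following; the remap from the dot product to an actual brightness value is my own invented tuning, not a number from the engine.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  Normalize(Vec3 v)
{
    float len = std::sqrt(Dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// One brightness value for the entire ring system, no gradients anywhere.
float ComputeRingBrightness(Vec3 lightPos, Vec3 cameraPos, Vec3 ringCenter)
{
    Vec3 lightToRings  = Normalize(Sub(ringCenter, lightPos));
    Vec3 cameraToRings = Normalize(Sub(ringCenter, cameraPos));

    // +1: camera on the sunward side, light bounces back toward us (bright).
    // -1: camera on the far side, light scatters away from us (dim).
    float facing = Dot(lightToRings, cameraToRings);

    return 0.25f + 0.75f * (facing * 0.5f + 0.5f);  // remap [-1, 1] to [0.25, 1.0]
}
```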

Anyway. With the lighting temporarily sorted, that means there’s no obstacle to making tomorrow Gas Giant Day. I’m very excited!

Jan 08

Okay, this bit of the space engine is sort of working again too, emphasis on “sort of.” There should be a shadow on the rings, and there kind of is, except the lighting direction in the engine is simply broken and so the whole thing looks dark. Man, there’s nothing I love more than fighting with broken light direction calculations in OpenGL/GLSL. Wait a minute, not “love,” what’s that word…

As a side note, the Jovian planet generation isn’t nearly as sophisticated as the terrestrial planet stuff. Yes, it does use a 1D texture for cloud bands and a 3D texture for detail and blah blah blah, but the actual color/noise parameters are hardcoded right now. It would be fun to add code for generating Jovian planets which took into account the size of the planet and the heat delivered to the planet (or even generated by it) to come up with the cloud bands. Off the top of my head I’d judge that a larger planet receiving more heat from the sun will have a more dynamic atmosphere, with larger, more contrasty cloud bands containing more swirly detail. As the planet got colder and smaller, the atmosphere would be less active until you reached largely featureless gas giants like Uranus. As for color, that would be mostly random (the varying colors of gas giants in our own Solar System are due to trace gas proportions) but as we edged up towards the brown dwarf category the planet would become more red due to its own radiation.
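
To make that a little more concrete, here is the sort of parameterization I have in mind; every name and constant below is invented on the spot, and none of it is implemented.

```cpp
#include <algorithm>

struct JovianBandParams
{
    float bandContrast;  // how strongly neighboring cloud bands differ
    float swirlDetail;   // amount of turbulent detail fed into the 3D noise
    float redTint;       // extra red as the planet edges toward brown dwarf territory
};

JovianBandParams GenerateBandParams(float planetMass,    // in Jupiter masses
                                    float absorbedHeat,  // stellar heating, Jupiter = 1
                                    float internalHeat)  // self-generated heat, Jupiter = 1
{
    // Bigger, hotter planets get a more dynamic, higher-contrast atmosphere;
    // small cold ones trend toward featureless, Uranus-style haze.
    float activity = std::min(1.0f,
        0.5f * planetMass + 0.5f * (absorbedHeat + internalHeat));

    JovianBandParams p;
    p.bandContrast = 0.2f + 0.8f * activity;
    p.swirlDetail  = 0.1f + 0.9f * activity;
    // Start tinting red near the top of the planetary mass range
    // (brown dwarfs conventionally begin around 13 Jupiter masses).
    p.redTint = std::clamp((planetMass - 10.0f) / 3.0f, 0.0f, 1.0f);
    return p;
}
```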

It’d also be good to add a system for generating a few storms, Great Red Spot-style; way back in the day when I was fooling around with ray-traced planet generation in a more realistic style, I had some nice tricks that should drop right in to a shader for adding good-looking storms. Like here, see?

There’s actually a huge variety of things I’d like to do in this engine now. That may be a bad thing as I could just be paralyzed between them and go do something else. Gonna wager on that being the final outcome here.