Thursday, August 28, 2014

Okay, so designing things before I implement them doesn’t seem to be working out. Therefore, I’ll violate my own arguments against speculative development and just implement something I think might be cool, and see where it goes. This is a basic system for identifying the nearby triangles on a planet’s surface so we could potentially draw surface features.

Wednesday, August 27, 2014

Current Status

So many times in the past few weeks I’ve charged over to Xcode full of vigor and ideas, ready to CHANGE THE WORLD, only to be brought up short by the minor problem that I really don’t know where I want to go with any of these game-enginey things I’ve been messing around with.

This is a recurring problem.

Thursday, July 31, 2014

Eh, not too bad. I adjusted her right arm and gun to create a more consistent directional feel to the image. It’s kind of typical of my half-assed approach to laying out artwork that I missed that in my original pencil draft, but the result is a surprisingly interesting composition. I also added elbow gloves, and somehow an enigmatic smile ended up on her face, which is very strange, as I’d pictured A. CYBORG as a dour, cranky, humorless straight-man kind of character. Hmm, I may have to reconsider some stuff.

Monday, July 28, 2014

Working on a new A. CYBORG title screen for some reason.

Sunday, July 13, 2014

As it happens, not long after I made my big pronouncement about taking a break from gamedev I had lunch with a friend of mine and he had some interesting ideas about spaceship AI and I decided to try and get some stuff basically working. Several hours of screaming and crying later, that little test guy in the middle distance is now using basic steering behaviors to path between randomly placed waypoints, such as the star he’s approaching now. So there’s that.

This tech is mildly more sophisticated than that used by the enemies in Neon Galaxy. The foes in that game just accelerated constantly towards their target, relying on frictional forces to wear away their residual velocity in less helpful directions in the meantime. By contrast, this ship is actually computing a steering force to adjust its current velocity towards a desired velocity that will take it towards its destination at a desired speed. It’s much easier to keep a handle on the ship’s speed without relying on guesswork-computed friction values this way.
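
For the record, the core of that idea fits in a handful of lines. A rough sketch (GLM for the vector math; all names and numbers here are invented for illustration, not pulled from the actual engine):

```cpp
#include <algorithm>
#include <glm/glm.hpp>

// Sketch of "steer toward a desired velocity". Everything here is
// illustrative: the real ship code presumably has mass, orientation,
// thruster limits, and so on.
struct TestShip {
    glm::vec3 position{0.0f};
    glm::vec3 velocity{0.0f};
    float     cruiseSpeed = 30.0f;   // how fast we'd like to approach the waypoint
    float     maxSteer    = 10.0f;   // cap on steering acceleration

    void steerToward(const glm::vec3& target, float dt) {
        glm::vec3 toTarget = target - position;
        float distance = glm::length(toTarget);
        if (distance < 0.001f)
            return;

        // Desired velocity: straight at the target, slowing as we get close
        // so we don't endlessly overshoot and orbit the waypoint.
        float speed = std::min(cruiseSpeed, distance);
        glm::vec3 desiredVelocity = (toTarget / distance) * speed;

        // The steering force is just the difference between the velocity we
        // want and the velocity we have, clamped to the ship's authority.
        glm::vec3 steering = desiredVelocity - velocity;
        float steerLen = glm::length(steering);
        if (steerLen > maxSteer)
            steering *= maxSteer / steerLen;

        velocity += steering * dt;
        position += velocity * dt;
    }
};
```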

Monday, July 7, 2014

Development Status

Figured I owed readers an update: I am not dead, ill, or going through some other personal trauma (no more so than usual, anyway). I just haven’t been especially happy with the game designs I have floating around, so I’m going to take a little break from coding, maybe do some other creative things, and hopefully come up with an idea I really like.

FURTHER BULLETINS AS EVENTS WARRANT

Monday, June 9, 2014

What’s this? A mysterious derelict spacecraft floating at Waypoint 1??!?!

Or, you know, it’s just the other test ship model I slapped together a while back. But this is a proof of concept of the multi-world system. When the player gets close enough to Waypoint 1, they are automatically knocked out of fast travel and placed in the 3D world associated with that waypoint, which happens to contain this spacecraft.

There are two other styles of waypoint: “objects” like the planet, which currently kill you if you run into them, and “proximity” waypoints, which don’t knock you out of fast travel automatically, but if you voluntarily leave FT near one, you’re placed in its associated world. This could be used to create fields of asteroids, mines, or other debris around planets: drop out of fast travel anywhere near the planet and you’d find yourself in the minefield. Alternatively it could be part of elaborately scripted missions, where you need to reach a certain distance from a planet before scanning it to find your next objective or some such thing.
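
For what it’s worth, the drop-out logic for the three waypoint styles boils down to something like this sketch (all names are invented for illustration; this isn’t the actual code):

```cpp
// Sketch of the three waypoint behaviors described above. All names are
// invented; the real implementation surely differs.
enum class WaypointKind {
    Location,   // knocks you out of fast travel and loads its world
    Object,     // a solid body like the planet; flying into it is fatal
    Proximity   // only loads its world if you choose to drop out nearby
};

struct Waypoint {
    WaypointKind kind;
    float triggerRadius;   // how close counts as "near" this waypoint
    int   worldId;         // which 3D world to swap in, if any
};

// Called each tick while the player is in fast travel. Returns the id of
// the world to place the player in, or -1 to stay in fast travel.
int resolveFastTravel(const Waypoint& wp, float distanceToWaypoint,
                      bool playerChoseToDropOut) {
    const bool nearWaypoint = distanceToWaypoint < wp.triggerRadius;

    switch (wp.kind) {
    case WaypointKind::Location:
        // Getting close is enough: the player is pulled out automatically.
        return nearWaypoint ? wp.worldId : -1;
    case WaypointKind::Object:
        // Collision with the body itself is handled elsewhere (fatally).
        return -1;
    case WaypointKind::Proximity:
        // Only fires if the player voluntarily leaves fast travel nearby,
        // e.g. dropping out near the planet lands you in its minefield.
        return (nearWaypoint && playerChoseToDropOut) ? wp.worldId : -1;
    }
    return -1;
}
```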

Sunday, June 8, 2014

Just a bunch of waypoint-related gruntwork. WPs can now detect whether they are occluded (only by the nearest body, for the moment) to lessen the chances of careless pilots crashing into planets while trying to reach waypoints or moons on the far side. They also now have visibility ranges to keep them from overlapping their parent body’s waypoint at large distances.
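
An occlusion check like that typically comes down to a segment-versus-sphere test against the nearest body. A sketch (GLM for the math, invented names, not the actual code):

```cpp
#include <glm/glm.hpp>

// Sketch: is the line of sight from the viewer to the waypoint blocked by
// a spherical body? All names here are invented for illustration.
bool waypointOccluded(const glm::vec3& viewer, const glm::vec3& waypoint,
                      const glm::vec3& bodyCenter, float bodyRadius) {
    glm::vec3 seg = waypoint - viewer;
    float segLength = glm::length(seg);
    if (segLength < 0.001f)
        return false;

    glm::vec3 dir = seg / segLength;

    // Project the body's center onto the viewer-to-waypoint segment and
    // clamp, so a body behind the viewer or beyond the waypoint doesn't
    // count as blocking the view.
    float t = glm::clamp(glm::dot(bodyCenter - viewer, dir), 0.0f, segLength);
    glm::vec3 closest = viewer + dir * t;

    // Occluded if the segment passes inside the body.
    return glm::length(bodyCenter - closest) < bodyRadius;
}
```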

The names of the sun, planets, and moon in this test environment (“Alsace,” “Seraphim,” “Clementine,” “Belfunk”) are taken from a 3D landscape game I was working on years and years ago, way back before I got into professional game development. Seeing them again creates feelings of strange nostalgia in me.

Saturday, June 7, 2014

I’ve spent the last few days reworking and further implementing waypoints and their displayed icons in the ol’ space game. This has involved adding multiple waypoint types for locations and objects, starting to plumb through the multi-world tech, and rewriting the icons to use flat 2D rendering instead of the projective cockpit stuff from the earlier iteration, as seen in the screenshot. That last part ended up involving some slightly annoying math.

See, I wanted the waypoint markers for planets to be a little line touching the top of the planet, as in the screenshot. I recently fixed up some longstanding hassles with projecting points between the 2D and 3D worlds in my engine, so it’s now really easy to take any point in the universe and project it onto the screen. You’d think all that’s necessary is to start with the center of the planet, adjust it up along the camera’s vertical axis by the planet’s radius, project that into the 2D world, and we’re good to go… right?

In the words of Wreck-it Ralph, HA. And also, no.

The problem is with the very nature of perspective projection. A planet is a 3D object with depth. The front of it bulges towards the camera, with the result that the point we just located is going to be concealed behind that bulge. As we get closer to the planet, the front of it bulges ever closer, and the point we projected gets ever more wrong. (The size of the bulge effect is completely unaffected by the camera’s field of view, by the way. Which makes sense: changing the FOV doesn’t give us any ability to see “around” the bulge any more than adjusting the zoom lens on a camera lets you see the back side of an object.)

How to address this? I ended up deriving a solution myself (after a few false starts) and feeling very smug about it, although you can also find a good description of the problem and a similar solution in this paper. Essentially: we want to locate the highest point of the sphere, along the camera’s vertical axis, that we can actually see. Imagine drawing a line from that point to the camera. The angle between that line and a line connecting the point to the center of the planet is a right angle, because a tangent line meets the sphere’s radius at 90 degrees. Knowing this, simple trigonometry gives us the length of that line and the angle between it and the line connecting the camera to the planet’s center. Then we take the camera-to-planet vector, rotate it around the horizontal axis by that angle, change its length to the value we derived, and we’ve found our point!
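
To put numbers on it: with d the distance from the camera to the planet’s center and r the planet’s radius, that tangent line has length sqrt(d² - r²) and makes an angle of arcsin(r/d) with the camera-to-center line. A sketch of the whole thing (GLM for the math, invented names, and assuming for now that the planet sits dead ahead of the camera, which is exactly the complication discussed next):

```cpp
#include <cmath>
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Sketch: find the visible "top of the planet" point for the waypoint
// marker. Assumes the planet is roughly centered in front of the camera;
// handling an off-center planet is the extra wrinkle described below.
glm::vec3 topOfVisibleDisc(const glm::vec3& cameraPos,
                           const glm::vec3& cameraRight,   // camera's horizontal axis
                           const glm::vec3& planetCenter,
                           float planetRadius) {
    glm::vec3 toPlanet = planetCenter - cameraPos;
    float d = glm::length(toPlanet);

    // Right triangle: camera, planet center, tangent point. The planet's
    // radius meets the tangent line at 90 degrees, so:
    float tangentLength = std::sqrt(d * d - planetRadius * planetRadius);
    float angle         = std::asin(planetRadius / d);    // radians

    // Tilt the camera-to-planet direction "up" around the camera's
    // horizontal axis by that angle (the sign depends on your handedness
    // conventions), then scale out to the tangent length.
    glm::quat tilt = glm::angleAxis(angle, glm::normalize(cameraRight));
    glm::vec3 dir  = tilt * (toPlanet / d);
    return cameraPos + dir * tangentLength;
}
```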

There is a little complication, though, in that this would only work if the planet were perfectly aligned in front of the camera. As the planet moves to the left or right, the lines from the camera to the center and to the edge stop lying purely along the vertical axis. I resolved this by finding the point where the camera would be if you slid it horizontally to line up with the planet and rotating around that instead, though I’m fairly sure there’s a simpler way to handle it. Probably involves changing coordinate spaces, because what graphics technique doesn’t involve changing coordinate spaces, am I right? Huh? Guys, back me up on this one!

On an unrelated note, I like how all the fast travel screenshots in this game look pretty badass.

Sunday, June 1, 2014

Getting some code infrastructure rearranged so that the player can actually travel between points of interest in the star system and see different things at each one; not quite there yet, but it’s getting close. Here’s the thought process I followed:

  1. Okay, so I’ve decided to use my existing “world” system (which simply collects a bunch of entities into a single group that can be ticked and rendered all at once) to hold all the entities at a POI. Then, I can delete them all at once when the player leaves.
  2. It would be nice if you could leave a POI, come back, and see the same configuration of objects. That would require somehow being able to serialize entity state into a POI when you leave and then deserialize it into new entities upon return.
  3. Of course, when entities are serialized they wouldn’t tick, so (say) a battle that was going on would freeze until you returned. Maybe some ridiculous, roundabout way of letting entities simulate what would happen if they hadn’t been serialized…
  4. Wait. Why am I deleting them at all?

It’s 2014, and there is more computing power in the slightly aging iMac sitting in front of me than existed in the entire God damn world in 1985. (Which raises uncomfortable questions about the things I unthinkingly waste that power on, but anyway.) Why shouldn’t all these worlds just run simultaneously, and I only render the one the player is in? Now, if I were planning to simulate wars between thousands of spaceships, that would still cause trouble, but it’s unlikely there will be a whole lot going on in any other world besides a single combat encounter: you’re summoned across the system to defend a station that’s under attack from no more than a dozen enemies, say.

So, yeah. Each POI in your current system will have a world associated with it; they all tick forward normally, and as the player arrives at one we swap it into the “render this world” slot. Easy-peasy. Probably.
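
In sketch form, the structure comes out something like this (World stands in for the engine’s existing entity-group class; everything else is an invented name for illustration):

```cpp
#include <memory>
#include <unordered_map>

// Sketch of "every POI keeps its world alive". World stands in for the
// engine's existing entity-group class; the rest is invented.
struct World {
    void tick(float dt)  { /* update every entity in this group */ }
    void render() const  { /* draw every entity in this group */ }
};

struct StarSystem {
    std::unordered_map<int, std::unique_ptr<World>> worldsByPoi;
    int activePoi = -1;   // the POI the player is currently at, if any

    void update(float dt) {
        // Every world keeps simulating whether or not the player is there,
        // so a battle at a distant station keeps playing out on its own.
        for (auto& entry : worldsByPoi)
            entry.second->tick(dt);
    }

    void render() const {
        // Only the world sitting in the "render this world" slot is drawn.
        auto it = worldsByPoi.find(activePoi);
        if (it != worldsByPoi.end())
            it->second->render();
    }

    // Called when the player arrives at a point of interest.
    void arriveAt(int poiId) { activePoi = poiId; }
};
```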