RetroGirl/Gun
Revision as of 23:14, 1 June 2012
Background
I've been enjoying tinkering with WaveSim and genetic algorithms, and I've been trying to think of interesting ways to apply them. I got to thinking that a perceptual bot might be a good thing to try to evolve with a GA, and eventually I focused on the perceptual gun. With WaveSim, I can evaluate gun fitness orders of magnitude faster than I could evaluate other bot aspects with full Robocode battles. Also, I'm in kind of a gun nut phase.
It's not hard to imagine there are movement tendencies shared by most bots, so I was pretty sure I could do better than Head-On Targeting, Linear Targeting, or Circular Targeting. Coming up with a way to decode gun logic from a bit string was the challenging part. As part of that, you have to make sure the translation from bits to gun logic allows crossover and mutation to retain traits - otherwise, the GA wouldn't be able to evolve and improve fitness.
I decided to try using decision trees.
How it works
Using the tree to aim
The tree itself is really simple. Each node specifies either an output value (GuessFactor) or an attribute/value pair to split on. If the root is "lateral velocity"/5.5, then all situations with lateral velocity less than 5.5 go to the left sub-tree and the rest go to the right. This repeats at each node until we reach a node that contains an output value. So to decide on a firing angle, we just walk the tree with the current situation's attributes, then translate the output value (GuessFactor) into a firing angle.
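The walk described above is just a loop. Here's a minimal Java sketch (all names are mine, not from the actual bot): a node either splits on an attribute or holds a GuessFactor output.

```java
// Hypothetical sketch of the tree lookup, not the actual RetroGirl code.
public class TreeAim {
    static class Node {
        int splitAttribute;   // index into the situation's attribute array
        double splitValue;    // go left if attribute value < splitValue
        double guessFactor;   // output value, only meaningful at leaves
        Node left, right;

        boolean isLeaf() { return left == null; }
    }

    // Walk from the root to a leaf using the current situation's
    // attributes, then return that leaf's GuessFactor.
    static double aim(Node root, double[] attributes) {
        Node node = root;
        while (!node.isLeaf()) {
            node = (attributes[node.splitAttribute] < node.splitValue)
                ? node.left : node.right;
        }
        return node.guessFactor;
    }
}
```

Translating the returned GuessFactor into an actual firing angle is the usual GuessFactor-gun step (scaling by maximum escape angle and lateral direction), which is outside this sketch.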
Decoding/building the tree
Interpreting a string of bits as a decision tree is really the interesting part. Here's how I do it.
First off, my decision tree is a completely filled binary tree, so the total number of nodes is <math>2^{height}-1</math>. The leaf nodes hold output values and all the other nodes are splitting nodes.
The splitting nodes are each two bytes: splitting attribute and value. The leaf nodes are one byte: output value (GuessFactor). The first two bytes are the root, the next two are its left child, then its right child, then root.left.left, root.left.right, root.right.left... just like you'd lay out an array-based binary heap. Right now I ignore the first bit of each byte, so it's an unsigned number from 0 to 127.
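Since the nodes are laid out level-order like an array-based binary heap, but splitting nodes take two bytes and leaves only one, the byte offset of each node is easy to compute. A small sketch of that arithmetic (hypothetical helper names, assuming the level-order layout described above):

```java
// Hypothetical layout helper for the gene string: nodes stored
// level-order (heap style); splitting nodes take 2 bytes, leaves 1.
public class GeneLayout {
    // Number of splitting (internal) nodes in a filled tree: 2^(h-1) - 1.
    static int internalCount(int height) {
        return (1 << (height - 1)) - 1;
    }

    // Byte offset of node i (0-based level-order index). Internal nodes
    // occupy the first 2*internalCount bytes; leaves follow, 1 byte each.
    static int byteOffset(int i, int height) {
        int internals = internalCount(height);
        return i < internals ? 2 * i : 2 * internals + (i - internals);
    }

    // The first bit of each byte is ignored, giving an unsigned 0-127.
    static int sevenBit(byte b) {
        return b & 0x7F;
    }
}
```

With the heap layout, node i's children sit at level-order indices 2i+1 and 2i+2, which is what makes the root, root.left, root.right, root.left.left... ordering work.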
I decide what attributes I'm going to use and create a 2D array, rootCube[num_attributes][2], specifying the min and max possible values for each attribute. When we query the tree, we pass a 1D array of attribute values corresponding to these same attribute indices. So the splitting attribute byte value is an index into the first dimension of those arrays.
As I build the tree from the bit string, I store with each node the hypercube of attribute value ranges that could be input to that node. For the root node, any value within rootCube can come in, so I just make a copy of it. For every other node, I copy the parent's hypercube, then limit it based on the parent's splitting attribute/value.
But the byte for the split value is 0-127 - how does that work? When I set a node's split value, I treat the byte value as a fraction of the range (from the hypercube) for the splitting attribute. So if the lateral velocity range is [3, 6] and the split value byte is 42, I split at <math>3 + (((1 + 42) / 128) * (6 - 3)) \approx 4</math>. If a mutation changes the range to [0, 1], we'll now split at about 0.34. If we didn't do it this way, huge portions of a tree could end up unreachable - e.g., if you split at 4, then split at 3 on the right sub-tree for the same attribute. This setup ensures a coherent decision tree no matter what, and hopefully still makes sense for passing on traits even when the split value's range changes.
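The scaling formula from the worked example above, as a one-liner (hypothetical helper name):

```java
// Hypothetical sketch: map a 0-127 split byte onto the node's current
// [min, max] range for the splitting attribute.
public class SplitScale {
    static double splitValue(int byteValue, double min, double max) {
        // (1 + byteValue) / 128 gives a fraction in (0, 1], so the split
        // always lands strictly inside or at the top of the range.
        return min + ((1 + byteValue) / 128.0) * (max - min);
    }
}
```

Note the exact value for byte 42 on [3, 6] is 3 + (43/128)*3 = 4.0078125, which the text rounds to ~4; on [0, 1] the same byte gives 43/128 = 0.3359375.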
So I create each node as defined by the byte values in the bit string and the hypercubes, connect nodes to their parents, and voila - gun logic!
Notes
My current test data set is 2 seasons against 500 bots covering the full range of the RoboRumble population. Head-On Targeting scores 14.91%, my best tree so far scores 17.23%, and Diamond's main gun scores about 23.5%. Right now my best tree is 1023 nodes (9 levels of splits), using lateral velocity, bullet flight time, relative heading, wall distance, and reverse wall distance. I'm still playing with settings quite a bit.
I'll probably release my perceptual bot soon. As is usually the case, it turns out movement matters at least as much as the gun =) ... I don't want to release it until I've got a nicely tuned movement and I can crush all existing perceptual bots. In the meantime, I may try this gun in Diamond for those opening ticks when I have little to no data. Right now I use HOT when I have no data, and this might even outperform my Main Gun for a fraction of the first round.