Dynamic Reweighting

Having optimized weights appears to be very important, especially the weight on old vs. new data. However, these optimal weights may not be the same against all opposing bots, particularly when hitting adaptive vs. non-adaptive movement. If your kD-Tree provides dynamic weighting functionality, you could theoretically have good weightings against opponents after a few rounds. (See my post in "Hard-coded segmentation".) The difficulty is thus finding the optimal weights for hitting a given opponent within a short learning period.

One approach would be to change your weights after a missed wave, in a manner similar to training a neural network. When a wave hits and your bullet misses, do kNN searches during non-firing ticks (when you have extra calculation time) to see whether the weight on a particular predictor would have produced a better prediction (closer to the correct GF, or perhaps DV) had it been higher or lower. Then look at the results of the past several calculations for trends, to avoid training on noise. (This is somewhat similar to what Gaff's gun does: it stores results from the past 5 missed waves and trains the neural network with all of them on a miss, if I remember correctly.)

This seems like it would tend to converge on better weights, which might help against opponents with adaptive movement. It would also partially alleviate the curse of dimensionality in kNN classification by lowering the weights of unimportant predictors (against a particular opponent) to close to zero, essentially removing them. Does anyone else have ideas for optimizing weights during combat?
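
A rough sketch of what this miss-triggered reweighting could look like, assuming a simple attribute-weighted kNN gun. The class, method, and constant names are illustrative, and a brute-force search stands in for a real kD-Tree:

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    // Sketch only: perturb each attribute weight after a missed wave, and only
    // commit a change when the last few misses agree on the direction.
    public class DynamicWeightTuner {
        private static final double STEP = 0.1;   // trial perturbation of one weight
        private static final double RATE = 0.05;  // how far a weight moves per confirmed trend
        private static final int K = 20;          // neighbours per prediction
        private static final int HISTORY = 5;     // misses that must agree before adjusting

        private final double[] weights;
        private final List<double[]> waveAttrs = new ArrayList<>(); // logged wave attributes
        private final List<Double> waveGfs = new ArrayList<>();     // observed guess factors
        private final List<double[]> recentGradients = new ArrayList<>();

        public DynamicWeightTuner(int attributes) {
            weights = new double[attributes];
            Arrays.fill(weights, 1.0);
        }

        // Record a completed wave so it can be used by later kNN searches.
        public void recordWave(double[] attrs, double gf) {
            waveAttrs.add(attrs.clone());
            waveGfs.add(gf);
        }

        // Called after a missed wave, during non-firing ticks when time is cheap.
        public void onMiss(double[] attrs, double actualGf) {
            double base = Math.abs(predict(weights, attrs) - actualGf);
            double[] grad = new double[weights.length];
            for (int i = 0; i < weights.length; i++) {
                double[] up = weights.clone();   up[i] += STEP;
                double[] down = weights.clone(); down[i] = Math.max(0, down[i] - STEP);
                double upErr = Math.abs(predict(up, attrs) - actualGf);
                double downErr = Math.abs(predict(down, attrs) - actualGf);
                if (upErr < base && upErr <= downErr)  grad[i] = 1;   // heavier weight would have helped
                else if (downErr < base)               grad[i] = -1;  // lighter weight would have helped
            }
            recentGradients.add(grad);
            if (recentGradients.size() > HISTORY) recentGradients.remove(0);
            if (recentGradients.size() < HISTORY) return;             // not enough evidence yet
            for (int i = 0; i < weights.length; i++) {
                double sum = 0;
                for (double[] g : recentGradients) sum += g[i];
                if (Math.abs(sum) == HISTORY) {                       // all recent misses agree
                    weights[i] = Math.max(0, weights[i] + RATE * Math.signum(sum));
                }
            }
        }

        // Brute-force weighted kNN: mean guess factor of the K nearest logged waves.
        private double predict(double[] w, double[] query) {
            int n = waveAttrs.size();
            if (n == 0) return 0;
            Integer[] order = new Integer[n];
            double[] dist = new double[n];
            for (int i = 0; i < n; i++) {
                order[i] = i;
                double d = 0;
                double[] a = waveAttrs.get(i);
                for (int j = 0; j < w.length; j++) {
                    double diff = w[j] * (a[j] - query[j]);
                    d += diff * diff;
                }
                dist[i] = d;
            }
            Arrays.sort(order, (x, y) -> Double.compare(dist[x], dist[y]));
            int k = Math.min(K, n);
            double sum = 0;
            for (int i = 0; i < k; i++) sum += waveGfs.get(order[i]);
            return sum / k;
        }
    }

The step size, learning rate, and five-miss window above are guesses; they are exactly the sort of constants that would themselves need tuning.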

    Straw (talk) 07:29, 13 December 2013

    In a gun, the easiest way would be to have many sets of weights and put them all in a virtual gun array. But if all the sets perform similarly to each other, the virtual gun will not be able to pick the optimal one due to noise.
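
    Something along these lines, as a sketch only; the names here are illustrative and the kNN gun itself is abstracted behind a small interface:

        // Virtual-gun-array idea: each candidate weight set is scored on every
        // wave as if it had fired, and the best scorer is used for the real shot.
        public class WeightSetSelector {
            private final double[][] weightSets;  // one candidate weight set per virtual gun
            private final double[] scores;        // rolling hit score per weight set

            public WeightSetSelector(double[][] weightSets) {
                this.weightSets = weightSets;
                this.scores = new double[weightSets.length];
            }

            // Called when a wave breaks: score each weight set on whether its
            // prediction would have hit the observed guess factor.
            public void onWaveBreak(double[] attrs, double actualGf, double hitWidth, Gun gun) {
                for (int i = 0; i < weightSets.length; i++) {
                    double predicted = gun.predict(weightSets[i], attrs);
                    if (Math.abs(predicted - actualGf) <= hitWidth) {
                        scores[i] += 1.0;   // this weight set would have hit
                    }
                }
            }

            // Use the highest-scoring weight set for the real shot.
            public double[] bestWeights() {
                int best = 0;
                for (int i = 1; i < scores.length; i++) {
                    if (scores[i] > scores[best]) best = i;
                }
                return weightSets[best];
            }

            // Whatever kNN gun is in use; only its prediction is needed here.
            public interface Gun {
                double predict(double[] weights, double[] attrs);
            }
        }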

    And flatteners actively try to screw up most statistics, including virtual gun scores.

      MN (talk) 15:37, 13 December 2013

      For tuning weights in a gun, I'd strongly recommend testing things offline with WaveSim or similar. It's a couple orders of magnitude faster to test against pre-gathered data.

      Also, I've done quite a bit of tuning of gun weights this way. The gun weights in Diamond were evolved through a genetic algorithm with WaveSim, but honestly they barely outperform the hand-tuned weights I had before that. I'm fairly skeptical that it's worth ever straying from "generally optimal" in order to eventually get "bot-specific optimal". That is, I think a gun with generally optimal weights tuned over 10,000 matches will beat bot-specific optimal weights tuned over <= 35 rounds.
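
      The general shape of such an offline evolution loop might look like the toy sketch below. WaveSim's actual API isn't shown; the Fitness interface is a stand-in assumed to replay pre-gathered waves and return the hit rate a kNN gun would have achieved with the candidate weights, and all names and constants are illustrative:

          import java.util.Random;

          // Toy genetic algorithm for evolving attribute weights against logged data.
          public class WeightEvolver {
              public interface Fitness {
                  double evaluate(double[] weights);  // e.g. hit rate over pre-gathered waves
              }

              private static final int POP = 30;
              private static final int GENERATIONS = 200;
              private static final double MUTATION = 0.1;
              private final Random rng = new Random();

              public double[] evolve(int numAttributes, Fitness fitness) {
                  // Start from random weight vectors.
                  double[][] pop = new double[POP][numAttributes];
                  for (double[] w : pop) {
                      for (int j = 0; j < numAttributes; j++) w[j] = rng.nextDouble() * 10;
                  }
                  for (int gen = 0; gen < GENERATIONS; gen++) {
                      double[] scores = new double[POP];
                      for (int i = 0; i < POP; i++) scores[i] = fitness.evaluate(pop[i]);
                      // Build the next generation from tournament winners.
                      double[][] next = new double[POP][];
                      for (int i = 0; i < POP; i++) {
                          next[i] = crossoverAndMutate(tournament(pop, scores), tournament(pop, scores));
                      }
                      pop = next;
                  }
                  // Return the fittest member of the final generation.
                  double[] best = pop[0];
                  double bestScore = fitness.evaluate(best);
                  for (double[] w : pop) {
                      double s = fitness.evaluate(w);
                      if (s > bestScore) { best = w; bestScore = s; }
                  }
                  return best;
              }

              private double[] tournament(double[][] pop, double[] scores) {
                  int a = rng.nextInt(pop.length), b = rng.nextInt(pop.length);
                  return scores[a] >= scores[b] ? pop[a] : pop[b];
              }

              private double[] crossoverAndMutate(double[] a, double[] b) {
                  double[] child = new double[a.length];
                  for (int j = 0; j < a.length; j++) {
                      child[j] = rng.nextBoolean() ? a[j] : b[j];   // uniform crossover
                      if (rng.nextDouble() < MUTATION) {
                          child[j] = Math.max(0, child[j] + rng.nextGaussian());
                      }
                  }
                  return child;
              }
          }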

      Worth noting that all the tuning I've done was specifically on attribute weights though, not decay rate or other factors. My main gun doesn't decay, just my Anti-Surfer gun. (And you may not want to tune an AS gun against pre-gathered data, though Skilgannon has.)

        Voidious (talk) 17:32, 13 December 2013