Good Jump
Wow! SimpleBot jumped 130 places in the rankings. 65 is really good. You forced me to make a new bot =).
It's simply because I finally decided to try some kNN instead of random ;) It's still very simple, without any kernel density ;) And I'm really excited and expecting to see your new bot beat this! :D
Yes, neural networks typically have a hard time at the start ;p How many layers are you using? Would you try using two networks, one fast-learning and one deep?
It is a perceptron with some pre-processing. I can try deep learning; it may be useful for hitting simple =) movements.
I tested it with some tuning. It can't surpass my Anti-Surfer Gun against any of the sample bots. =(
Have any of you had any success with a DC+NN gun? I got my NN gun working pretty well against surfers, and even though my DC gun is still doing better, I think I can improve it enough to surpass it. But it performs really badly against random movers compared to my anti-random gun, even when I set up more than one NN, like in Gaff. Does that combination make any sense?
- Yes, it makes sense. I once gained +5.2% APS by taking the sum of the Anti-Surfer and Anti-Random guns' outputs, weighted by their hit rates. (The old gun was just the AS one.) There's a sketch of that idea after this reply.
- Do you use pre-processing in your NN? It can be really helpful.
- For example, suppose you have two attributes x and y.
- Apply a simple transformation to them, for example input = {|x|, |0.5 - x|, |1 - x|, |y|, |0.5 - y|, |1 - y|}.
- Then feed that vector to the neural network (a minimal sketch follows below).
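A minimal sketch of that kind of pre-processing in Java. The class and method names are placeholders, not anyone's actual gun code; it just expands two normalized attributes into the six-element vector above:

```java
/**
 * Expands two attributes (each assumed to be normalized into [0, 1])
 * into the input vector {|x|, |0.5 - x|, |1 - x|, |y|, |0.5 - y|, |1 - y|}
 * before it is fed to the network.
 */
public class InputPreprocessor {
    public static double[] expand(double x, double y) {
        return new double[] {
            Math.abs(x), Math.abs(0.5 - x), Math.abs(1.0 - x),
            Math.abs(y), Math.abs(0.5 - y), Math.abs(1.0 - y)
        };
    }
}
```

If x is, say, a normalized lateral velocity and y a normalized distance, the expanded vector hands the network explicit "distance from the low end, the middle, and the high end of the range" features, so a single-layer perceptron doesn't have to learn those non-linearities by itself.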
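And on the earlier point about combining the two guns: a rough sketch of what "sum weighted by hit rates" could look like. The firing-angle score arrays and hit-rate values are hypothetical inputs, not the actual gun code:

```java
/**
 * Sketch of combining an Anti-Surfer gun and an anti-random gun by
 * weighting each gun's firing-angle scores with its observed hit rate.
 */
public class GunCombiner {
    public static double[] combine(double[] antiSurferScores, double antiSurferHitRate,
                                   double[] antiRandomScores, double antiRandomHitRate) {
        double total = antiSurferHitRate + antiRandomHitRate;
        if (total == 0) {
            // No hits recorded yet: fall back to an unweighted average.
            antiSurferHitRate = antiRandomHitRate = 0.5;
            total = 1.0;
        }
        double[] combined = new double[antiSurferScores.length];
        for (int i = 0; i < combined.length; i++) {
            combined[i] = (antiSurferScores[i] * antiSurferHitRate
                         + antiRandomScores[i] * antiRandomHitRate) / total;
        }
        return combined; // fire at the angle bin with the highest combined score
    }
}
```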
Yeah, I did this. My worry about an NN + DC gun is exactly because of this. The backprop algorithm gets really slow as my NN grows in the number of perceptrons; my general-purpose DC gun is pretty slow, while my anti-surfer DC gun is blazingly fast. When I put my AS NN together with my general-purpose DC gun, I have to shrink the NN and/or simplify the training process to avoid skipping turns, which ends up reducing my score. I was just wondering whether the training process really needs to be that complex, or whether I have room to improve other things, like this pre-processing step, pair that with a very simple training process, and still get a good result against surfers without using another NN, for example.
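One common way to keep an expensive gun from causing skipped turns (not claiming this is what either of our guns does) is to queue training samples and cap how much backprop work happens in a single tick. A rough sketch, where the NeuralNetwork interface and the per-tick budget are placeholders:

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Placeholder for whatever network class the gun actually uses. */
interface NeuralNetwork {
    void backprop(double[] input, double[] target);
}

/** Queues training samples and runs only a few backprop updates per tick. */
public class BudgetedTrainer {
    private static final int MAX_UPDATES_PER_TICK = 4; // tune while watching for skipped turns
    private final Deque<double[][]> pending = new ArrayDeque<>();

    /** Store an (input, target) pair as soon as the wave result is known. */
    public void queue(double[] input, double[] target) {
        pending.add(new double[][] { input, target });
    }

    /** Call once per tick from the robot's main loop. */
    public void trainSome(NeuralNetwork net) {
        for (int i = 0; i < MAX_UPDATES_PER_TICK && !pending.isEmpty(); i++) {
            double[][] sample = pending.poll();
            net.backprop(sample[0], sample[1]);
        }
    }
}
```

This trades learning speed for a bounded per-tick cost, so the network size doesn't have to shrink just to stay inside the turn time limit.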
I just found it amazing that you and Darkcanuck could get such good results with NNs. I found them really hard to debug and work on compared to DC, since there are a lot of things you can tweak in a neural targeting system. Congrats for that, and I hope to see Xor's results from his NN experiments soon as well :P