- I have simple precise prediction, but I see these kinds of things as more of a guaranteed improvement than a necessity for tuning.
- What do you mean by using only firing waves?=)
- I might have to move to a -1 to 1 GF system rather than the current system with 51 bins (a rough conversion sketch follows this list).
- The problem I had was that after tuning against 3 Raiko micro matches I had a 0.9% improvement in TCRM, but after 3 matches against 9 different surfers I had a 5% score loss.
- Any attributes that help a lot against surfers?
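For reference, a minimal sketch of the mapping between the two representations, assuming a 51-bin layout with the middle bin as GF 0 (the indexing convention is an assumption, not WhiteFang's actual code):

```java
// Minimal sketch of converting between binned GFs and a continuous -1..1 GF.
// The 51-bin layout with bin 25 as GF 0 is an assumed convention.
public class GuessFactorMapping {
    static final int BINS = 51;
    static final int MIDDLE = (BINS - 1) / 2; // bin 25 = GF 0

    // continuous GF in [-1, 1] -> bin index in [0, BINS - 1]
    static int gfToBin(double gf) {
        return (int) Math.round(MIDDLE * (gf + 1.0));
    }

    // bin index -> continuous GF at the bin's center
    static double binToGf(int bin) {
        return (double) (bin - MIDDLE) / MIDDLE;
    }
}
```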
Imo the attributes and weights for random movement and surfers may be completely different, so I just tune them completely separately.
And there is no single magic attribute that helps a lot against random movement, and the same goes for surfers.
And you shouldn't expect good performance against surfers when you use virtual waves either, since those waves are irrelevant to them.
Firing waves means only waves where there was a real bullet. Against non-adaptive movement, the more waves the better, but adaptive opponents only dodge your actual bullets, so the other waves will give bad information.
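As a rough illustration of that distinction, a sketch of logging only firing waves; the Wave fields and the list here are hypothetical placeholders, not any particular bot's code:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch: targeting data is recorded only for waves that carried a real bullet.
// Wave, situation and firedRealBullet are hypothetical placeholders.
class FiringWaveLogger {
    static class Wave {
        boolean firedRealBullet;
        double[] situation; // KNN attributes captured at fire time
    }

    final List<double[]> firingData = new ArrayList<>();

    void onWaveBroken(Wave w, double hitGuessFactor) {
        if (!w.firedRealBullet) {
            return; // virtual waves: useful vs. non-adaptive movement, noise vs. adaptive
        }
        double[] entry = Arrays.copyOf(w.situation, w.situation.length + 1);
        entry[w.situation.length] = hitGuessFactor; // attributes + observed GF
        firingData.add(entry);
    }
}
```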
For me, what did well against adaptive movements was recording data, doing maybe 10 generations of genetic tuning, then re-recording the data.
Make sure to add the adaptive speed to your genetic parameters. You might also want to use parameters people don't surf with; I did some odd things in DrussGT.
But really, the secret to a good score is good movement.
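A compressed sketch of that record → tune → re-record idea; the mutation step below is a simplified hill-climb stand-in for a real GA, and the genome layout (attribute weights plus something like a decay-speed gene) is an assumption:

```java
import java.util.Random;
import java.util.function.ToDoubleFunction;

// Sketch: tune KNN weights against a fixed batch of recorded data, then
// re-record against the newly tuned gun and repeat. "hitsOnRecordedData"
// stands in for whatever fitness harness you use (e.g. replaying recorded
// firing waves and counting predicted hits).
class WeightTuner {
    static final Random RNG = new Random();

    static double[] tune(double[] genome, int generations,
                         ToDoubleFunction<double[]> hitsOnRecordedData) {
        double[] best = genome.clone();
        double bestScore = hitsOnRecordedData.applyAsDouble(best);
        for (int g = 0; g < generations; g++) {        // ~10 generations per batch
            double[] candidate = best.clone();
            int i = RNG.nextInt(candidate.length);     // one gene could be decay speed
            candidate[i] = Math.max(0, candidate[i] + RNG.nextGaussian());
            double score = hitsOnRecordedData.applyAsDouble(candidate);
            if (score > bestScore) {                   // keep the genome that hits more
                best = candidate;
                bestScore = score;
            }
        }
        return best;
    }
}
```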
- After Xor said precise intersection I was searching for another meaning in real waves.=)
- My fitness function uses the KNNPredictor class in WhiteFang, so basically everything is included in the algorithm.
- When I actually succeed at making Robocode allow more data saving, I'll move on to the recursive technique.
- "But really, the secret to a good score is good movement." I know but I have been working on movement since 2.2.7.1 and I want to stop my suffering for a while. Maybe genetic algorithm against Simple Targeting strategies and for the flattener?
- Edit:
- After tuning with three more parameters, three things happened:
- My AS gun outperformed my Main Gun against Shadow for the first time
- I found out that my GA always maximizes K, minimizes Divisor (probably because I forgot to activate bot width calculations), and minimizes shots taken
- Manhattan distance works much better than squared Euclidean (see the distance sketch after this list)
- The random weights started out with 1542 hits.
- GA got it to 1923 hits.
- I made K 100, Divisor 1 and Decay 0, and hits rose to 2086.
- I used Manhattan distance and it got 2117 hits.
- Finally, when I rolled really high and low values to 10 and 0, it got 2120 hits.
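For what it's worth, the distance swap being compared is just the following; the weighted forms and the [0, 10] clamp (one reading of "rolled really high and low values to 10 and 0") are assumptions about the setup, not WhiteFang's actual code:

```java
// Sketch of weighted squared Euclidean vs. weighted Manhattan distance for
// the KNN lookup, plus clamping tuned weights into [0, 10].
class Distances {
    static double squaredEuclidean(double[] a, double[] b, double[] weights) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            double d = weights[i] * (a[i] - b[i]);
            sum += d * d;
        }
        return sum;
    }

    static double manhattan(double[] a, double[] b, double[] weights) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            sum += weights[i] * Math.abs(a[i] - b[i]);
        }
        return sum;
    }

    // clamp a tuned weight so extreme values end up at 10 or 0
    static double clamp(double w) {
        return Math.min(10.0, Math.max(0.0, w));
    }
}
```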
I use a patched version of Robocode to allow unlimited data saving, but only from my data recording bot. Anyway, a normal Robocode with debug mode on is sufficient to do so; just make sure the robots in your test bed are free from file writing bugs.
Have you ever tried using k = sqrt(tree size)? This is a common practice when it comes to knn.
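In code the heuristic is just this, assuming the tree can report how many points it currently stores:

```java
// Common KNN heuristic: scale k with the amount of stored data rather than
// fixing it. treeSize is whatever the kd-tree reports as its point count.
class KnnSizing {
    static int chooseK(int treeSize) {
        return Math.max(1, (int) Math.round(Math.sqrt(treeSize)));
    }
}
```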
- I finally succeeded at increasing the data file quota to 20MB and will probably increase it even more when I turn back to TCRM.
- I'll try the sqrt(tree size); I already have the code and it can be easily added to my algorithm.
- The only problem I have now is that Robocode truncates my data files if I finish the battle at max TPS.
- Note: I am saving a double[] array, an Integer array, and a Double array.
20MB is too small. I generally record 2G of data via RoboRunner, with 4 Robocode instances at 500M each.
I’m not experiencing data truncation. I’m using a worker thread that logs data asynchronously with java.nio FileChannel. However, the OutputStream API should be enough, and you shouldn’t experience data truncation anyway. Where do you do your file writing? Did you flush the higher-level stream when it’s done? If you don’t, Robocode will close the lower-level streams, resulting in lost data.
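For reference, a minimal sketch of the flush-before-close pattern in a robot; getDataFile and RobocodeFileOutputStream are standard Robocode API, while the data layout and file name are just placeholders:

```java
import java.io.IOException;
import java.io.ObjectOutputStream;
import robocode.AdvancedRobot;
import robocode.RobocodeFileOutputStream;
import robocode.RoundEndedEvent;

// Sketch: write recorded data through Robocode's restricted file API and flush
// the high-level stream yourself before the round ends. If Robocode only closes
// the underlying RobocodeFileOutputStream, buffered data in the wrapper is lost.
public class DataSavingBot extends AdvancedRobot {
    private ObjectOutputStream out;

    private void openLog() throws IOException {
        out = new ObjectOutputStream(
                new RobocodeFileOutputStream(getDataFile("recorded.bin")));
    }

    // one record: the attribute array plus the Integer and Double values being saved
    private void log(double[] attributes, Integer hits, Double guessFactor) throws IOException {
        out.writeObject(attributes);
        out.writeObject(hits);
        out.writeObject(guessFactor);
    }

    public void onRoundEnded(RoundEndedEvent event) {
        if (out == null) {
            return;
        }
        try {
            out.flush(); // without this, the buffered wrapper's data never reaches disk
            out.close();
        } catch (IOException ignored) {
        }
    }
}
```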