Thread history

From User talk:Kev
Viewing a history listing
Time User Activity Comment
06:17, 17 January 2023 Xor (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
01:36, 17 January 2023 Beaming (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
20:15, 16 January 2023 Kev (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
08:23, 15 January 2023 Xor (talk | contribs) Deleted (content was: "I used to run RoboRumble on very old computers as well, but I think as long as the cpu constant is actually computed on the corresponding hardware, the results can be trusted if no other tasks are running. If the computer is under heavy...)
08:23, 15 January 2023 Xor (talk | contribs) New reply created (since deleted) (Reply to BeepBoop seems to be the new king)
08:22, 15 January 2023 Xor (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
06:19, 15 January 2023 Beaming (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
13:46, 13 January 2023 Xor (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
12:58, 13 January 2023 Xor (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
00:35, 5 January 2023 Beaming (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
17:16, 4 January 2023 Xor (talk | contribs) Comment text edited  
17:16, 4 January 2023 Xor (talk | contribs) Comment text edited  
17:15, 4 January 2023 Xor (talk | contribs) Comment text edited  
05:05, 4 January 2023 Xor (talk | contribs) Comment text edited  
05:03, 4 January 2023 Xor (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
04:57, 4 January 2023 Xor (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
04:20, 4 January 2023 Beaming (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
16:10, 3 January 2023 Kev (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
16:02, 3 January 2023 Kev (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
02:07, 26 December 2022 Xor (talk | contribs) New reply created (Reply to BeepBoop seems to be the new king)
01:22, 26 December 2022 Beaming (talk | contribs) New thread created  

BeepBoop seems to be the new king

Congratulations! BeepBoop is at the very top.

Do you mind hinting at the new top-of-the-line research direction?

Beaming (talk) 01:22, 26 December 2022

Congratulations (again) from me too ;) BeepBoop since 1.2 has had very surprising results (nearly 95!!!). And yet nothing worked when I tried to use gradient descent in training models. Would you mind sharing a little bit more about this section? E.g. initialization, learning rate, how to prevent getting a zero or negative exponent in the x^a formula…

Xor (talk) 02:07, 26 December 2022

I’ve been meaning to release the code for the training, but it’s currently a huge mess and I’m pretty busy! In the meantime, here are some details that might help (with a rough sketch of the formulas after the list):

  • I initialized the powers to 1, biases to 0, and multipliers to a simple hand-made KNN formula.
  • I constrained the powers to be positive, so I guess the formula should really be written as w(x+b)^abs(a).
  • I used Adam with a learning rate of 1e-3 for optimization.
  • Changing the KNN formula of course changes the nearest neighbors, so I alternated between training for a couple thousand steps and rebuilding the tree and making new examples.
  • For simplicity/efficiency, I used binning to build a histogram over GFs for an observation. Simply normalizing the histogram so it sums to 1 to get an output distribution doesn’t work that well (for one thing, it can produce very low probabilities if the kernel width is small). Instead, I used the output distribution softmax(t * log(histogram + abs(b))) where t and b are learned parameters initialized to 1 and 1e-4.
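In code, the formulas in the bullets above look roughly like the sketch below (illustrative only, not the actual BeepBoop training code; exactly how w, b, and a are applied per KNN attribute is an assumption based on the description above):

  // Illustrative sketch only; not BeepBoop's actual code.
  public final class BeepBoopFormulaSketch {

      // Assumed per-attribute KNN weighting w * (x + b)^abs(a):
      // powers a initialized to 1, biases b to 0, multipliers w from a hand-made formula.
      static double weightedFeature(double x, double w, double b, double a) {
          // x + b is assumed non-negative; abs(a) keeps the exponent positive
          return w * Math.pow(x + b, Math.abs(a));
      }

      // Output distribution over GF bins: softmax(t * log(histogram + abs(b))),
      // with t initialized to 1 and b to 1e-4.
      static double[] outputDistribution(double[] histogram, double t, double b) {
          double[] logits = new double[histogram.length];
          for (int i = 0; i < histogram.length; i++) {
              logits[i] = t * Math.log(histogram[i] + Math.abs(b));
          }
          // numerically stable softmax
          double max = Double.NEGATIVE_INFINITY;
          for (double logit : logits) {
              max = Math.max(max, logit);
          }
          double sum = 0.0;
          double[] probs = new double[logits.length];
          for (int i = 0; i < logits.length; i++) {
              probs[i] = Math.exp(logits[i] - max);
              sum += probs[i];
          }
          for (int i = 0; i < probs.length; i++) {
              probs[i] /= sum;
          }
          return probs;
      }
  }

The abs(b) term keeps the log finite for empty bins, and the learned temperature t controls how peaked the resulting distribution is compared to plain normalization.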
--Kev (talk) 16:10, 3 January 2023

Thanks for the detailed explanation! It is not easy to get so many details right, which explains how mighty BeepBoop is, not to mention the innovations.

Xor (talk) 04:57, 4 January 2023
 
 

Thanks! My guess for the next innovation that could improve bots is active bullet shadowing. Instead of always shooting at the angle your aiming model gives as most likely to hit, it is probably better to sometimes shoot at an angle that is less likely to hit if it creates helpful bullet shadows for you. This idea would especially help against strong surfers whose movements have really flat profiles (so there isn’t much benefit from aiming precisely). I never got around to implementing it, so it remains to be seen if it actually is useful!
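A rough sketch of what that angle selection could look like (hypothetical, since this was never implemented; hitProbability and shadowValue are placeholder stand-ins for an aiming model and a bullet-shadow estimator):

  // Hypothetical sketch of "active bullet shadowing": score each candidate
  // firing angle by hit probability plus the estimated value of the bullet
  // shadows it would create, and fire at the best combined score.
  final class ActiveShadowingSketch {

      // Placeholder: probability the aiming model assigns to hitting at this angle.
      static double hitProbability(double angle) { return 0.0; }

      // Placeholder: estimated benefit of the shadows a bullet fired at this
      // angle would cast over our own predicted movement.
      static double shadowValue(double angle) { return 0.0; }

      static double bestFiringAngle(double[] candidateAngles, double shadowWeight) {
          double bestAngle = candidateAngles[0]; // assumes at least one candidate
          double bestScore = Double.NEGATIVE_INFINITY;
          for (double angle : candidateAngles) {
              double score = hitProbability(angle) + shadowWeight * shadowValue(angle);
              if (score > bestScore) {
                  bestScore = score;
                  bestAngle = angle;
              }
          }
          return bestAngle;
      }
  }

With shadowWeight set to 0 this reduces to the usual "fire at the most likely angle", so the trade-off between hitting and shadow creation stays tunable.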

--Kev (talk) 16:02, 3 January 2023

Thanks for the insights and ideas. Bullet shielding tempted me a while ago. I thought that if one can intercept a bullet wave close to the launch point, the bullet shadow would be big enough to slide into. But that requires good knowledge of when a bullet will be fired. I guess it can be done similarly to how it's done in DrussGT, which has a tree to predict the opponent's bullets segmented on energy and distance. (At least I remember reading on the wiki about someone predicting enemy waves this way.) But my attempts to do it were not very successful.
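For reference, here is a sketch (not DrussGT's actual code) of the common energy-drop heuristic that bots use to detect an enemy bullet right after it is fired; predicting the power in advance, e.g. with a segmented tree as described above, would build on data collected this way:

  import robocode.AdvancedRobot;
  import robocode.ScannedRobotEvent;

  // Sketch of after-the-fact fire detection via enemy energy drop.
  public class FireDetectionSketch extends AdvancedRobot {
      private double previousEnemyEnergy = 100.0;

      @Override
      public void onScannedRobot(ScannedRobotEvent e) {
          double drop = previousEnemyEnergy - e.getEnergy();
          // Bullet power in Robocode is between 0.1 and 3.0, so a drop in that
          // range (not explained by our own bullet hits, wall hits or ramming)
          // usually means the enemy just fired a bullet of power equal to the drop.
          if (drop >= 0.1 && drop <= 3.0) {
              double bulletPower = drop;
              // here one would create an enemy wave and/or log a training example
              // for a bullet-power prediction model segmented on energy, distance, etc.
          }
          previousEnemyEnergy = e.getEnergy();
      }
  }

Since detection happens on the next scan, the wave has already travelled for a tick or so, which is why intercepting very close to the launch point also needs the firing time and power predicted in advance.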

By the way, could you repackage your bot with an older Java version? I am running the rumble, but it fails on your bot, complaining about:

Can't load 'kc.mega.BeepBoop 1.21' because it is an invalid robot or team.

I think the current agreement is that Java JDK 11 or lower is accepted. If you look at the rumble stats, you will see that your bot has fewer battles than many others.

Beaming (talk) 04:20, 4 January 2023

I think RoboRumble should be inclusive, which means the client should be run with the latest LTS version of Java so that bots built with more Java versions can participate. LTS versions are also meant to be more stable, which helps produce more stable results.

I also updated the guide to suggest Java 17, which is currently the latest LTS version, instead of Java 11. Would you mind upgrading the Java version of your client?

Xor (talk) 05:03, 4 January 2023

Sure, I am upgrading my clients to Java 17. It seems to be OK, except for a warning about deprecated calls:

WARNING: System::setSecurityManager has been called by net.sf.robocode.host.security.RobocodeSecurityManager (file:/home/evmik/misc/robocode-1.9.4.2/libs/robocode.host-1.9.4.2.jar)
WARNING: Please consider reporting this to the maintainers of net.sf.robocode.host.security.RobocodeSecurityManager
WARNING: System::setSecurityManager will be removed in a future release

I think this is addressed in newer Robocode versions, but the rumble still accepts only 1.9.4.2.

Beaming (talk) 00:35, 5 January 2023

It was never addressed. Also, there is currently no solution for when Java removes SecurityManager, other than sticking with Java 17 (or newer LTS versions that still ship SecurityManager). Tank Royale could be the long-term plan, but only once some cross-platform sandbox solution has been implemented.

Xor (talk) 12:58, 13 January 2023
 

Btw, BeepBoop seems to be losing APS due to inconsistency in the RoboRumble client (e.g. skipped turns).

BeepBoopDiff.png

http://literumble.appspot.com/BotCompare?game=roborumble&bota=kc.mega.BeepBoop%201.21&botb=kc.mega.BeepBoop%201.2&order=-Diff%20APS

BeepBoop runs fine on my computer, with the same results as the (previous) RoboRumble run and without skipped turns. Could you share some information about your environment, e.g. how many clients you run in parallel and whether the machine is dedicated (not running any other tasks)? This may heavily affect the reproducibility of RoboRumble results.

Xor (talk) 13:46, 13 January 2023