BeepBoop seems to be losing ~1.5 APS


[Image: BeepBoopDiff.png]

Compared to the nearly identical 1.20, you can clearly see some random drops, which I suspect are skipped turns on some old hardware.

The previous result of ~94.8 APS can be reproduced on my computer, so I think the previous results can be trusted.

@Beaming seems to be contributing the most to RoboRumble recently; perhaps we could work together to see whether something can be done to ensure the reproducibility of RoboRumble results.

Xor (talk) 09:33, 15 January 2023

I think I found a potential problem spot. One of my computers was 4 times slower, yet was using the CPU constant from a computer 4 times faster. I recalculated the CPU constant (by deleting the config file) and hope that the APS drop will resolve itself. It might also explain why a subjectively better in-development version of my bot performs worse than the old one.

It would be nice if the rumble client recalculated the CPU constant at start-up. It takes very little time and provides more stability. But I also recall a discussion that active throttling in modern hardware makes this number just an estimate.
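Roughly speaking, the constant boils down to timing a fixed workload on the local machine, which is why a value copied from a 4x faster computer starves bots of time per turn. A minimal sketch of the idea (not Robocode's actual benchmark code, just the principle):

<syntaxhighlight lang="java">
public class CpuBenchmark {
    static volatile double blackhole;   // keeps the JIT from discarding the benchmark loop

    public static long measureCpuConstantNanos() {
        long start = System.nanoTime();
        double sink = 0;
        for (int i = 1; i <= 5_000_000; i++) {
            sink += Math.sin(i) * Math.sqrt(i);   // fixed arithmetic workload
        }
        blackhole = sink;
        // Slower hardware => more elapsed time => larger constant => larger per-turn budget.
        return System.nanoTime() - start;
    }
}
</syntaxhighlight>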

By the way, in 2014 we had an interesting discussion at Thread:User talk:Beaming/Smart bots competition about allowing long calculation times for bots. Maybe it is time to revive it, since ML approaches have developed quite a bit and they are CPU intensive.

On the other hand, making a top-ranking fast bot is a challenge in itself.

Beaming (talk) 20:51, 15 January 2023

I agree. Enforcing recomputation of the CPU constant at start-up and every e.g. 100 battles is necessary, as it strongly affects results and is easy to get wrong. Recomputing periodically can also mitigate the effect of other heavy tasks on RoboRumble, without adding too much overhead.
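Structurally, something like the sketch below would be enough, assuming the client exposes a hook after each battle (class and method names are made up for illustration, and the benchmark itself is passed in as a supplier):

<syntaxhighlight lang="java">
import java.util.function.LongSupplier;

public class CpuConstantRefresher {
    private static final int RECALC_EVERY = 100;   // interval suggested above
    private final LongSupplier benchmarkNanos;
    private long cpuConstantNanos;
    private int battlesSinceRecalc;

    public CpuConstantRefresher(LongSupplier benchmarkNanos) {
        this.benchmarkNanos = benchmarkNanos;
        this.cpuConstantNanos = benchmarkNanos.getAsLong();   // measure at start-up
    }

    public long current() {
        return cpuConstantNanos;
    }

    public void onBattleFinished() {
        if (++battlesSinceRecalc >= RECALC_EVERY) {
            cpuConstantNanos = benchmarkNanos.getAsLong();    // periodic refresh
            battlesSinceRecalc = 0;
        }
    }
}
</syntaxhighlight>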

I'm also thinking about requiring some test set to pass before allowing score submission, but that would be a long-term plan.

I'll submit a PR for recomputing the CPU constant; any suggestions are welcome.

Xor (talk) 11:51, 16 January 2023
 

I'm also interested in adding a separate rumble with a long calculation time.

I'll add an option to the rumble config file to multiply the CPU constant by a factor (warning: advanced usage); then *SmartRumble* could be realized. The initial participants could be copied from GigaRumble ;)
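To make the idea concrete, here is a sketch of how the client could apply such a factor, assuming a plain properties-style config file (the property name "robocode.cpuConstantMultiplier" is hypothetical; the real key would be whatever the option ends up being called):

<syntaxhighlight lang="java">
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

public class CpuConstantScaling {
    // Reads an optional multiplier from the rumble config file and scales the
    // measured CPU constant. Defaults to 1.0 when the property is absent.
    public static long scaledCpuConstant(long measuredNanos, String configFile) throws IOException {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream(configFile)) {
            props.load(in);
        }
        double factor = Double.parseDouble(
                props.getProperty("robocode.cpuConstantMultiplier", "1.0"));
        return (long) (measuredNanos * factor);   // e.g. factor 10 => ten times the normal per-turn budget
    }
}
</syntaxhighlight>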

Xor (talk) 11:57, 16 January 2023
 

I bought a low-end PC running Linux with 4 cores @ 1.33 GHz (a 2016-era CPU), with turbo boost disabled. The CPU constant is 5x higher than on my main computer.

I tried to run the entire rumble with RoboRunner, two instances in parallel (which takes ~20x as long to complete, since I normally run 8 instances), and so far the scores look fine. So I guess what actually causes the strange scores is indeed the use of inaccurate CPU constants.

Anyway, I haven't tried running other background tasks at the same time (because I don't have such tasks to run), so I'm not sure whether that affects the score as well.

Xor (talk) 18:29, 24 January 2023
 

BeepBoop 1.21a seems to be losing only 0.1 APS now compared to 1.2 (and 0.2 APS compared to my local run).

However, there are still some pairings with weird scores:

reeder.colin.WallGuy3 1.0
hs.SimpleHBot 1.3
darkcanuck.Holden 1.13a
tobe.Saturn lambda

I'm also running the rumble client in the meantime, and couldn't find the corresponding pairings in my logs.

@Beaming Could you please also have a look at the logs to see which machine is producing the weird scores?

I suspect most of the scores should be fine now, but some weird scores may still be produced under heavy load.

Xor (talk) 04:34, 6 February 2023

Sure, but what should I look for in my logs? Are they even stored for a long time? All I see is the last uploaded file.

Also, note that there are uncertainties. 0.1 APS is not that much; battles usually have a 1-6 APS spread per pairing. Also, some bots keep logs; it might be that my setup has the longest (most complete) stored enemy visit-count logs.

Also, it is possible that the original scoring was done on a fast CPU where the time constant was in BeepBoop's favor.

But I also notice that newly submitted bots start out better, and then drop 0.5-1 APS as the rumble settles.

Beaming (talk) 06:16, 6 February 2023

I run the RoboRumble client with nohup, so I can just grep nohup.out. You can also use bash redirection to persist the log. Otherwise it's impossible to track down the weird scores.

The reason that bots drop 0.5-1 APS is that some battles produce insane results, greatly affecting the final score.

When using a controlled environment (both on the newest hardware and on the low-end 1.33 GHz processor), you get a very stable score, with less than 0.1 APS difference between 5000 battles and 20000 battles. This observation also rules out log-saving bots as the cause: logically, one battle is enough for a bot to save its data, so increasing that to 20 battles wouldn't help.

Xor (talk) 09:59, 6 February 2023
 

Look at BeepBoop against reeder.colin.WallGuy3: it gets 81.28 APS instead of 98 with 4 battles. You can consider this as 3x 98 APS and 1x 30 APS. What happened when it got 30 APS? I can only imagine a heavily loaded machine with much longer computation times than usual, where both participants skip a lot of turns due to insufficient time.
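For concreteness, the score of that one bad battle can be backed out from the pairing average (a trivial check on the numbers above, not data pulled from the rumble):

<syntaxhighlight lang="java">
public class OutlierCheck {
    public static void main(String[] args) {
        double pairingAvg = 81.28;   // reported pairing APS
        double usual = 98.0;         // typical per-battle score in a controlled environment
        int battles = 4;
        double outlier = pairingAvg * battles - usual * (battles - 1);
        System.out.printf("outlier battle ~= %.2f APS%n", outlier);   // prints ~31.12
    }
}
</syntaxhighlight>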

The problem with this is that you can never reproduce the result; it has nothing to do with uncertainties. Running BeepBoop against reeder.colin.WallGuy3 will always get ~98 APS as long as the environment is controlled. You never get to know what actually happened.

Xor (talk) 10:07, 6 February 2023

I see your point, but we should not forget that there are probabilities involved. Maybe WallGuy3 got lucky once; for example, BeepBoop was spawned in a corner very close to the opponent.

Also, looking at the rumble logs is a bit useless without the ability to correlate them with the load of the system.

Ideally, we should solve it in the Robocode client and not by running in a pristine/controlled environment. None of us can dedicate a computer to a single task. A potential solution is to have a thread which estimates CPU load (but even then there could be transient load spikes that go undetected).
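A rough sketch of that idea using only standard JDK APIs (the class name and threshold are made up for illustration; how the flag would actually be used is left open):

<syntaxhighlight lang="java">
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

// Samples the system load average and flags battles that ran while the host was
// busy, so their results could be dropped or replayed.
public class LoadMonitor implements Runnable {
    private final OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
    private volatile boolean overloaded = false;

    public boolean wasOverloaded() {
        return overloaded;
    }

    @Override
    public void run() {
        int cores = os.getAvailableProcessors();
        while (!Thread.currentThread().isInterrupted()) {
            // 1-minute load average; returns -1 if unsupported (e.g. on Windows).
            double load = os.getSystemLoadAverage();
            if (load > cores * 0.9) {   // arbitrary "host nearly saturated" threshold
                overloaded = true;
            }
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }
}
</syntaxhighlight>

Since getSystemLoadAverage() is a one-minute average, very short spikes would still slip through, as noted above.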

Beaming (talk) 03:54, 8 February 2023

Probability can be reproduced, but weird scores cannot. Even if something unlucky happened, it cannot explain these pairings:

reeder.colin.WallGuy3 1.0
hs.SimpleHBot 1.3
darkcanuck.Holden 1.13a
tobe.Saturn lambda

Why are all four of these bots "lucky" within only 4k battles, yet never lucky in my local run of over 20k battles? The deviation over 20k battles is very small, which rejects the lucky-bot presumption.

The point is to make the scores traceable, so that we can identify problems. Currently the client isn't saving logs automatically at all.

I quite understand that a dedicated computer is not feasible for most people; my setup isn't pristine either. However, I have never observed weird scores on my computer, which makes them a mystery. But those weird scores do hurt the final result a lot: weeks of dedicated work may not gain you 0.1 points, yet you sometimes lose 0.2 points for unknown reasons.

So please at least redirect the output of the RoboRumble client to files, so that we can track down what conditions produced the weird scores.

Xor (talk) 04:59, 8 February 2023

Totally agreed about logging. Will do it right away.

Beaming (talk) 05:02, 8 February 2023