

Thread title | Replies | Last modified
Hiccup in rankings | 14 | 07:49, 22 February 2023
Some scores in TwinDuel have shifted a lot | 4 | 05:49, 19 June 2021
Allowed Robocode versions notification | 5 | 13:17, 29 April 2020
Upgrade to | | 2 March 2019
What's required for a bot to have KNNPBI? | 7 | 14:38, 3 October 2018
Rankings Stable | 9 | 05:11, 20 June 2018
Faster recovery from incidentally removed participants? | 2 | 14:40, 14 June 2018
Upgrade to | | 5 April 2018
HTTP Error 500, Internal Server Error? | 5 | 09:25, 1 November 2017
Varying NUMBATTLES of RoborumbleAtHome? | 10 | 13:04, 12 October 2017
Adding opponent APS in bot comparison? | 9 | 00:27, 10 September 2017
Literumble queue size? | 1 | 08:35, 8 September 2017
weirdness in pairing | 1 | 23:36, 7 September 2017
How can I deploy roborumble client on a server? | 0 | 02:34, 21 August 2017
Preconfigured client link is down ;( | 4 | 13:40, 20 August 2017
Clear LiteRumble history | 5 | 09:35, 19 November 2015
LRP (ish) | 21 | 02:49, 6 December 2014
Starting your own LiteRumble | 10 | 06:31, 19 November 2014

Hiccup in rankings

Hi all, it seems that for some reason the participants lists for roborumble, minirumble and nanorumble had a hiccup (but not microrumble??)

This means that the rankings start again from zero, but luckily all older results come back as soon as one battle has been fought. To get the rankings up to date again, every bot has to fight every other bot once.

So, if you have the time and opportunity, please start your rumble client.

GrubbmGait (talk)18:12, 22 March 2022

It seems less bad than I thought at first sight. As soon as a bot has fought a battle, it appears in the ranking.

When all bots are present, all bots have to fight one battle to get all pairings back. Hopefully LiteRumble is smart enough to get that done in one pass.

GrubbmGait (talk)18:24, 22 March 2022

That happens once every few years ;( Maybe LiteRumble should have some check to prevent removing more than 100 bots at a time? Anyway, LiteRumble *is* smart enough to recover with only 2000+ battles (1000+ to add the bots back, 1000+ more to fix the pairings), instead of 1000000 battles.

Xor (talk)07:54, 23 March 2022

However, a bug in the rumble client actually prioritized already existing bots over bots without a ranking, except in the clean-run case. So it actually takes many more battles to add missing bots back ;(

I submitted a PR for this bug.

Bots without rank should always take highest priority, in all scenarios.

Xor (talk)09:32, 23 March 2022

Bad news. Once a bot is added, the pairings are only updated for newly added battles, e.g. a bot with 400 pairings gets to 401 when one battle is submitted. The rest of the pairings don't seem to be added back automatically. So this means 1000000 battles are needed ;(

Xor (talk)11:05, 23 March 2022

I submitted a PR to literumble. Hopefully this PR will solve the issue, and with 1000+ battles the ranking should become stable again.

Xor (talk)12:15, 23 March 2022

Merged and deployed, thanks. Should we also update Robocode versions?

Skilgannon (talk)21:17, 23 March 2022

Great! Anyway, I think we could wait for the next release, where the fix for unranked bots is merged ;)

Xor (talk)05:47, 24 March 2022

Looks like the ranking isn't recovering as fast as expected; bots with 400 pairings still go to 401 instead of 1189 after 1 battle. Any idea why this behavior persists after the PR?

Xor (talk)06:05, 24 March 2022

Note that of the 507 still-unstable bots in roborumble, 449 are also in minirumble, 421 in microrumble and 228 in nanorumble. So it seems that bots that are also in the mini/micro/nano rumbles take longer to reach a stable state.

Also note that 391 of the 507 unstable bots were last updated after the deploy of the fix.

Xor (talk)06:27, 24 March 2022

It is restoring quite fast. At this moment half of the bots in roborumble have all their pairings back. And most of the others have at least 800 pairings.

GrubbmGait (talk)14:17, 23 March 2022

That may be caused by the batch processing. However, if the batch processing is the cause, it should fully recover on its own.

Xor (talk)06:07, 24 March 2022

It looks like the problem is more serious this time.

Have a look at this bot: a lot of pairings were last battled in 2020, and a lot of pairings have only 1 battle, meaning that some data isn't recovered.

Xor (talk)14:17, 24 March 2022

Rankings in roborumble, minirumble and nanorumble are stable again; microrumble was stable from the beginning. But as Xor indicated, a lot of pairings have only 1 battle, sometimes even 1 battle from 2 years ago. Is there something we can do to retrieve the lost battles, or should we continue with the current situation?

GrubbmGait (talk)16:23, 2 April 2022

Thinking again, there may be no data loss. For every pairing to have 1 battle, we need 500000 battles, which takes continuous running for a few months. It's not surprising that a full round takes 2 years.

Most of the 1-battle pairings are from 2022.3.23 (or 2020.10, I don't remember exactly), the exact time the last hiccup happened. This strongly suggests that the data loss happens when a battle is fought for a missing pairing.
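The 500000 figure follows directly from the pairing arithmetic (a quick sketch; the roughly-1000-participant count is an assumption inferred from the numbers in this thread):

```python
def full_pairing_battles(n_bots: int) -> int:
    # with n bots there are n*(n-1)/2 distinct pairings, and each
    # pairing needs at least one battle for the table to be complete
    return n_bots * (n_bots - 1) // 2

# assuming roughly 1000 participants, as the 500000 figure suggests
print(full_pairing_battles(1000))  # 499500
```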

Xor (talk)15:54, 3 April 2022

Some scores in TwinDuel have shifted a lot

I'm noticing that some of LunarTwins' scores in TwinDuel have shifted dramatically from what they were as of RumbleArchives:TwinDuelRumble_20200126, despite the robots involved in said pairings not having been updated since. Particularly versus the following four:

  • bvh.two.Valkiries 0.44tmk3b
  • bvh.two.Ravens 0.2
  • gh.twin.GrauwuarG 0.41
  • 2.0c

which are four bots that have been unchanged since 20200126 and that LunarTwins used to win decisively against, but appears to no longer do so in the TwinDuel LiteRumble. Those bots also appear to have had their pairing count versus LunarTwins reset more recently than some others, for some unknown reason? Not sure. I'll be looking into it more some time, but it makes me wonder if this was due to a change in Robocode version.

Rednaxela (talk)21:40, 13 June 2021

To update: it seems Robocode versions through had entirely broken getTeammates/isTeammate, which breaks various TeamRumble/LiteRumble bots, including but surely not limited to LunarTwins.

This bug appears to have been introduced to as a side effect of the fix to this bug.

Version fixes a bug with getTeammates/isTeammate.

So Skilgannon, if you're reading this, we should probably update the literumble version to, and also clear all TeamRumble/TwinDuel pairing data that came from a client with one of the flawed versions. Given things appear to have gone from to in LiteRumble, it looks like it's just those results that need to be cleared from the TeamRumble/TwinDuel pairing data. :)

Rednaxela (talk)22:28, 13 June 2021

Rumble is updated to only accept. I will wipe the Team / TwinDuel; unfortunately they aren't stored by upload version.

Skilgannon (talk)23:34, 13 June 2021

Thanks for the prompt response! We should be able to get the pairings built back up again pretty quickly, I think.

Rednaxela (talk)23:37, 13 June 2021

So, while updating to fixed a badly broken TeamBot situation, it introduced a new problem that I first noticed with Tron. The precise cause is unclear to me at present, and I can't debug into a closed bot that isn't giving a stack trace, but some change between Robocode and Robocode appears to have broken bots that load data files that come preloaded in their JAR files. In the case of Tron this is used for a configuration properties file, and being unable to load it causes Tron to start in challenge/reference mode instead of normal mode.

I'm doubtful this bug only affects Tron, and removing tainted data from the rumble could be troublesome.

Bug report: here

Rednaxela (talk)05:13, 19 June 2021

Allowed Robocode versions notification

Hi Skilgannon,

Would you mind bumping this thread when you are changing the Allowed Robocode versions?

I run my clients pretty much unattended, so they try to upload rankings and fail. Unfortunately, there is no way to notify a human unless one stares at the console all the time.

But updates in the wiki thread would propagate to my rss reader quite quickly.

Happy New Year, and thanks for running the LiteRumble.

Beaming (talk)19:10, 31 December 2015

No problem. I actually only changed it about 2 hours ago, if you didn't notice I would have posted something =) I'm also going to add back those historical bots which were removed because of the compatibility issues. Have a good New Year!

Skilgannon (talk)19:16, 31 December 2015

As promised (in 2015), bump, I've upgraded to =)

Skilgannon (talk)16:44, 2 March 2019

Thanks. I see it. Time to upgrade my clients. I was still running

Beaming (talk)19:09, 2 March 2019

Hi all, Literumble has moved up to (from

GrubbmGait (talk)09:16, 29 April 2020

Ah, I forgot about this thread. Yes, I updated the required version so that it will use the HTTPS links instead of HTTP, now that the robowiki supports these.

Skilgannon (talk)13:17, 29 April 2020

Literumble is now updated to accept battles only from! Happy rumbling!

Skilgannon (talk)14:57, 2 March 2019

Upgrade to

Robocode is released, with fixed meleerumble pairings and codesize utility (for lambdas), along with other fixes. Should we upgrade now and see the changes?

Xor (talk)15:36, 16 January 2019

I've updated the accepted client version to =)

Skilgannon (talk) 20:37, 17 January 2019

has the -cp option set to a wrong value, causing codesize not to work (and pushing mega bots into the nano rumble) if it's not fixed manually. We should disable this version and wait for fnl for a fix...

I made a pull request:

Xor (talk)10:58, 18 January 2019

I've rolled back, let me know when is available with the fix =)

Skilgannon (talk)11:21, 18 January 2019

It's available now ;) And let's upgrade now, since it's already fully tested ;)

Xor (talk)14:41, 2 March 2019

What's required for a bot to have KNNPBI?

I've seen many bots with KNNPBI; however, my bot still has no KNNPBI (all zeros) after a long period of time ;(

So what's required for a bot to have KNNPBI?

Xor (talk)19:46, 20 August 2017

Finally, after getting 978 pairings, the KNNPBI is shown. Anyway, I'm still wondering what made KNNPBI all zeros.

Xor (talk)20:05, 20 August 2017
Xor (talk)20:07, 20 August 2017

You should have more pairings, but I don't know how it decides.

Dsekercioglu (talk)20:21, 20 August 2017

The computation of those values is batched; they are computed every 24 hours. One possible explanation for the older version still being all zeroes is that maybe only the latest version of a bot is considered when doing the computation? Not sure about this, though; it just makes sense from a really superficial look at the code. You can probably try to figure it out here:

Rsalesc (talk)22:46, 20 August 2017

KNNPBI (and the other batched rankings, NNP, Vote) are calculated once every 8 hours, since they can't be calculated incrementally. If you remove your bot before it is calculated, it won't be calculated, since it doesn't get recalculated on old bots.

Skilgannon (talk)21:06, 21 August 2017

Btw, is it possible for a bot to have Vote and ANPP calculated and shown correctly, while having NPP and KNNPBI unavailable?

The link below captures an instance of this weirdness.

Xor (talk)11:24, 1 October 2018

It could happen sometimes if the saving fails. It should correct itself soon though.

Skilgannon (talk)14:38, 3 October 2018

Rankings Stable

It seems that both 1v1 and melee now show "Rankings Stable" instead of "Rankings Not Stable".

I once thought that "Rankings Not Stable" was hardcoded, to show that the rankings are never stable so one should always run more battles.

But today is the first time I noticed "Rankings Stable", quite surprising.

So, what's the mechanism behind "Rankings Stable" and "Rankings Not Stable"? Is "Rankings Stable" displayed whenever every bot has full pairings?

Xor (talk)07:14, 17 June 2018

Your observation coincides with mine. Once all bots have paired with each other at least once, the ranking gets the stable status. Sometimes that does not happen for a long time because of missing bots, or some bots crashing with a newer version of robocode. This is why the participants list sometimes gets pruned.

If the ranking is unstable for a long time, I usually look at which bot is missing a pairing and search for the reason in the rumble client log.

Usually, stabilization takes about a day for each new bot.

Beaming (talk)16:57, 17 June 2018

Yeah, Monk has had an incorrect URL for nearly half a year, leaving newly updated bots missing that pairing. And in 1v1 there are more bots having problems with the current settings (robocode and Java 8).

Should we have a clean-up, or create a new rumble, to remove bots with compatibility problems, which only add noise to the rumble?

Xor (talk)04:08, 18 June 2018

I personally oscillate between "if the author does not care, why should I?" and "preserve the history". If you are in the second camp, let me remind you about my FixingParticipantLinks script, which relinks missing bots to the strange automata archive.

What is our problem with Java 8? Do we already have bots built with Java 9? Or is robocode itself not backward compatible, and you see it on a big enough robot pool?

Beaming (talk)16:35, 18 June 2018

My opinion is that as long as a bot works fine on the current settings (robocode and Java 8), we should "preserve the history". But once it produces random results (e.g. crashing half of the time), we should remove it (until the author fixes it).

Bots known to crash on some machines:

apc.Caan 1.0
dam.MogBot 2.9
sgp.JollyNinja 3.53
Xor (talk)03:47, 19 June 2018

I've been away for quite some time and I'll probably come back once I graduate. I still care about my bots, though (despite Monk being buggy as hell atm). I used to use Drive to provide the links, but I didn't know they would break after some time. What would you guys suggest I do? Is the solution proposed above (the fix script) sufficient for now?

Rsalesc (talk)00:46, 20 June 2018

Well, do not trust the modern hype, i.e. the cloud. But I guess you already know that.

If you cannot host your bot yourself, put it in the cloud; usually within a day or sooner it appears at [archive]. Then just update the link to point there. I think, as of now, it is the most reliable way. Many thanks to Rednaxela for this effort.

Beaming (talk)03:02, 20 June 2018

I actually think that hosting it myself looks way less reliable than using such a consolidated service. But yeah, I'll do what you suggested.

Rsalesc (talk)04:15, 20 June 2018

Well, in this case, the drive works totally fine ;)

Just have a look at this commit:

Xor (talk)05:11, 20 June 2018

Confirmed, it changes to Stable when all bots have full pairings.

If you find a bot that repeatedly crashes, IMO remove it from the rumble and put it in the list below. If the author has a page, make a comment and hopefully they will fix it.

Skilgannon (talk)17:53, 19 June 2018

Faster recovery from incidentally removed participants?

Updating the participants list manually fails from time to time: network issues, misoperation, etc. Removing a bot from the rumble takes O(1) time, but re-adding n bots takes O(mn) time, where m is the total number of participants.

However, in principle it takes O(1) time to re-add a bot, since no data is lost. Is it possible to tweak the literumble to support faster re-adding?

Xor (talk)07:03, 13 June 2018

Re-adding bots actually only takes 2*m, once to add the bot, and once to add all of the pairings after all of the other bots have been added. Doing something different would require updating the rumble protocol, which I'd prefer not to touch.
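As a sketch of that cost model (my reading of the 2*m description above; the 1000-participant figure is an assumption for illustration):

```python
def recovery_battles(m: int) -> int:
    # roughly one battle per bot to get the missing bots back into the
    # ranking, then one more per bot to restore the pairings once all
    # of the other bots have been added back
    return 2 * m

def from_scratch_battles(m: int) -> int:
    # worst case: re-fighting every distinct pairing from zero
    return m * (m - 1) // 2

print(recovery_battles(1000))      # 2000
print(from_scratch_battles(1000))  # 499500
```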

Skilgannon (talk)09:39, 13 June 2018

Well, thanks for explaining the mechanism. O(m) feels much better, and cleaner code is also preferable.

Xor (talk)14:40, 14 June 2018

Upgrade to

Finally is out, with Skilgannon's shorter bot list update time. Any plans to move on? Then we can use lambdas without transpilers as well.

Xor (talk)14:56, 4 April 2018

According to git, there is a bug fix (fixing SittingDuck crashes) either at or right after the release. Unfortunately git does not have the tag. Maybe we can ask fnl to make one more release, or at least clarify this part.

But I am personally not so excited about Java 10 coming forward. Debian stable has only java-9 in the distribution, and it will be a pain to switch to Java 10. I think so far we agreed that robocode clients should support Java 8, not even Java 9.

Are those lambda things so critical for bot development? I know you apparently use them, but do you really need them? Do they give you extra speed or robustness?

Beaming (talk)18:44, 4 April 2018

Yeah, Java 9/10 support is poor even today, and they didn't add anything as useful as lambdas, so sticking with Java 8 is never a big problem.

However, with lambdas, Java 8 is a completely different language. Lambdas get more optimizations (e.g. omitting unnecessary object allocation) compared to anonymous classes, and with lambdas you can avoid a lot of unnecessary boxing and unboxing, which is unreasonably slow; so yes, they give more speed. Lambdas also remove a lot of the cruft in pre-lambda Java development, resulting in cleaner and more readable code; so yes, they give robustness.

I personally rely on lambdas heavily for cleaner and more readable code, but with customized build scripts I can easily transpile my code to Java 7, so moving on is not very beneficial for me. But it's not easy to set up such a build script, so other people, especially newcomers, will never have a chance to use lambdas in robocode development. They will even be scared off and despair at the fact that their bot cannot run on roborumble, even with Java 8.

By not moving on to robocode, we are wasting a large part of the effort of moving to Java 8. With the better bot list update mechanism, I think it's definitely time to move on.

Xor (talk)03:10, 5 April 2018

HTTP Error 500, Internal Server Error?

Since today, when uploading results, this message keeps appearing in my RoboRumble client: Server returned HTTP response code: 500 for URL:

Did anybody observe similar results?

Cb (talk)20:59, 31 October 2017

Ah, this was caused by a bug due to a new check I added for dropping battles more than 24 hours old (to prevent old versions being added back by mistake). Can you try again and let me know if it is fixed for you now?

Skilgannon (talk)21:16, 31 October 2017

Yes, now it works, thanks :)

Cb (talk)01:28, 1 November 2017

It seems that when uploading outdated pairings, the client still receives HTTP 500, which causes those outdated pairings to be doubled (another bug) and re-uploaded twice each time... and then to fail again, doubling each time, which grows like crazy.

Will you change the response to something like "200, outdated pairings dropped" or so to fix this? Thanks ;)

Xor (talk)06:32, 1 November 2017

Well, that was the intention, but it seems that Python doesn't auto-convert datetimes to strings so my logging was crashing it. Should be fixed now!

Skilgannon (talk)09:25, 1 November 2017

More information: Server returned HTTP response code: 500 for URL:
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985874,SERVER abc.Shadow 3.84i,24248,6434,28 0.1,16657,4100,5 Server returned HTTP response code: 500 for URL:
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985874,SERVER abc.Shadow 3.84i,24248,6434,28 aaa.ScaledBot 0.01d,15278,3625,1 Server returned HTTP response code: 500 for URL:
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985874,SERVER abc.Shadow 3.84i,24248,6434,28 mld.DustBunny 3.8,14411,3784,0 Server returned HTTP response code: 500 for URL:
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985874,SERVER abc.Shadow 3.84i,24248,6434,28 cb.nano.Insomnia 1.0,11918,2981,1 Server returned HTTP response code: 500 for URL:
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985875,SERVER abc.Shadow 3.84i,24248,6434,28 ayk.WallHugger 1.0,8941,2568,0 Server returned HTTP response code: 500 for URL:
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985875,SERVER abc.Shadow 3.84i,24248,6434,28 yk.JahRoslav 1.1,8499,1922,0 Server returned HTTP response code: 500 for URL:
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985875,SERVER abc.Shadow 3.84i,24248,6434,28 rampancy.Durandal 2.1d,7268,1522,0

This is what I'm getting constantly (every time it uploads).

Xor (talk)06:33, 1 November 2017

It's been a while since we updated the rumble client version, and the new version brings several important fixes. I'd really appreciate it if someone set up a quick benchmark of a battle or two for each bot in the rumble, and then ran it on the old and new versions to make sure we don't have any regressions. Once this is done we can upgrade the client =)

Skilgannon (talk)21:10, 8 October 2017

As far as I know, Robocode hasn't been officially released yet. The website and GitHub still name as the latest version, and there is no download. You can only get it by building from the latest git master. What I linked to was a draft of the new changelog.

I don't think any new releases will be made until poor Fnl finishes dealing with all of the bug reports I piled onto him. What I have been doing for the past week is emptying my mental list of annoyances with Robocode onto its bugtracker.

So currently, it is still in development, and it's a bit too early to do regression testing with this new version.

What does need testing, however, is Robocode on Java 9. We already found CPU constant calculation and team JARs to be broken there, and doubtlessly there are more issues.

MultiplyByZer0 (talk)23:01, 8 October 2017

Looks like Fnl is busy as we speak. Maybe it is the time to complain about / bug-report everything on our minds.

To MultiplyByZer0: thanks for spotting so many extra bugs.

Beaming (talk)23:37, 8 October 2017

Robocode has been released.

MultiplyByZer0 (talk)22:17, 19 October 2017

Great. As soon as we have a benchmark comparison making sure no subtle score changes have crept in or tons of bots are now broken I'm happy to change the LiteRumble over!

Skilgannon (talk)22:26, 19 October 2017

Varying NUMBATTLES of RoborumbleAtHome?

Recently I noticed that more than half of the battles are dropped because the queue is full, and this doesn't stop even if I wait a few minutes. It seems that all the rumble clients upload battles periodically, and the uploads are pretty concentrated: e.g. all four of my clients upload ~200 battles within ~3 minutes, which makes the queue fill up immediately. And if I take a look at literumble/statistics, I can see that there are 5 to 7 clients uploading within 2 minutes.

It generally takes a client about 15 min to finish 50 battles, but if we vary this so the periods are distinct primes, the uploads will get more evenly distributed, reducing the concurrency spikes which cause a lot of dropped battles.
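A toy simulation of the prime-period idea (the client counts and periods here are invented; a "collision" is any time slot where two or more clients upload at once):

```python
def collision_slots(periods, horizon=1000):
    """Count time slots (e.g. minutes) in which two or more clients
    upload simultaneously, assuming each client uploads every `period`
    slots starting at t=0 (a deliberately crude model)."""
    uploads = {}
    for period in periods:
        for t in range(0, horizon, period):
            uploads[t] = uploads.get(t, 0) + 1
    return sum(1 for count in uploads.values() if count >= 2)

# four clients all on the same 15-minute cycle vs. distinct prime periods
print(collision_slots([15, 15, 15, 15]))  # every upload slot collides
print(collision_slots([11, 13, 17, 19]))  # far fewer coincidences
```

With identical periods every upload slot is a collision; with distinct primes, two clients only coincide at multiples of the product of their periods.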

Xor (talk)03:38, 8 October 2017

Reducing NUMBATTLES would probably help here too. It would also reduce the delay which is the main cause of duplicated pairings for new bots being entered. Maybe a NUMBATTLES of 20 in the main rumble would be good enough to solve the client component of this.

However, I think one of the main causes of the full queue is the batch processing for Vote/NPP/KNNPBI, since the queue needs to be paused while this is running. Because it is paused, the projected processing time goes very high, and it stops accepting new uploads. I have an idea on how to tune this; it should help a bit.

Skilgannon (talk)11:50, 8 October 2017

However, even a NUMBATTLES of 3 can't prevent most of the battles from being dropped ;/

Seems that with 8 clients running the rumble at the same time, nothing will help without stopping some clients.

Worth mentioning that I did notice dropped battles when there were 6 clients, though not frequently. Seems that with 2 more clients, the effectiveness drops considerably?

Btw, one thing that's really interesting is that duplicates of multiple versions can last for hours. Seems that some clients don't check the participants list for hours.

Xor (talk)16:56, 8 October 2017

Got it. Maybe after the queue is paused for batch tasks and then resumed, it stays near full, as there are still a lot of pairings being uploaded. Like a DoS, this decreases the ability to handle high concurrency (although the average number of pairings uploaded per minute is not very high, they come in during a short period of time and get dropped).

Xor (talk)17:53, 8 October 2017

Then I think we could increase the queue size a little after a batch task (and then decrease it to the normal size slowly, to make sure new uploads won't wait forever after some flood of uploads).

Or, we could handle uploads during the pause separately: don't let them take space in the normal queue; rather, store them in a separate queue (and cap it at normal uploads per minute * pause time).
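The separate-queue idea could look something like this (a toy sketch, not LiteRumble's actual code; all the sizes are invented):

```python
from collections import deque

class UploadQueue:
    """Toy model of the proposal: while batch processing pauses the main
    queue, incoming uploads go to a capped side queue instead of being
    dropped, and are drained back once processing resumes."""

    def __init__(self, max_size, side_cap):
        self.main = deque()
        self.side = deque()
        self.max_size = max_size
        self.side_cap = side_cap   # e.g. uploads-per-minute * pause length
        self.paused = False
        self.dropped = 0

    def submit(self, result):
        if self.paused:
            if len(self.side) < self.side_cap:
                self.side.append(result)
            else:
                self.dropped += 1
        elif len(self.main) < self.max_size:
            self.main.append(result)
        else:
            self.dropped += 1

    def resume(self):
        # drain the side queue first; the main queue may briefly grow,
        # mirroring the "increase the queue size a little after a batch
        # task" suggestion above
        self.paused = False
        while self.side:
            self.main.append(self.side.popleft())

q = UploadQueue(max_size=100, side_cap=40)
q.paused = True                # batch task running
for i in range(60):
    q.submit(i)                # 40 buffered in the side queue, 20 dropped
q.resume()
print(len(q.main), q.dropped)  # 40 20
```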

Xor (talk)18:02, 8 October 2017

I was running 8 clients, that was probably causing it. Particularly melee clients cause a huge number of uploads for the amount of processing time required by the client.

I'll save my clients for when there are fewer others running =)

Skilgannon (talk)21:13, 8 October 2017

I've been experiencing constant "queue full" messages in MeleeRumble for the past 2 hours, with 3 melee clients + 3 rumble clients. Should this really be happening this often?

Rsalesc (talk)04:19, 12 October 2017

I noticed that every time the queue is paused for batch tasks, the massive queue-full messages don't stop until I pause the clients for a few minutes.

That may be because when the queue size is near the max size, the capacity for handling high concurrency decreases dramatically, although the average processing power doesn't decrease at all.

Using a separate queue while it is paused may help, imo.

Xor (talk)06:11, 12 October 2017

Adding opponent APS in bot comparison?

Would you mind adding another column called Opponent APS to the bot comparison? When sorting by opponent APS, it could be really useful to see the difference between two bots against opponents in different APS ranges, as in the Diff Distribution graphic, but with more information, especially the bot names. This could also help us create a good test bed ;)

Xor (talk)17:46, 5 September 2017

I can take a look (although not this weekend; I'm away from home). However, what would you consider appropriate behavior for bots which have been removed from the rumble, but which are a shared pairing? The APS/diff image handles this by just ignoring those pairs, but I don't think we want to do that here. Do I put a 0.0?

Skilgannon (talk)23:36, 7 September 2017

Can we assume that APS is relatively stable? Since we can click into the detail page to see the historical APS even when that opponent is removed, can we simply put that value?

Oops, this assumption breaks when comparing ancient bots ;( Then polluting the table must be a bad idea. However, why don't we use NaN or N/A instead of 0?

Xor (talk)02:05, 8 September 2017

NaN sounds most appropriate. I don't want to have to fetch each bot object that is no longer in the rumble just to look up its last APS.

Skilgannon (talk)08:30, 8 September 2017

Done. I also added a link on the BotDetails page to find the bot on the wiki.

Skilgannon (talk)18:19, 9 September 2017

Awesome! Thanks a lot.

Since you are in a wish-granting mood, would it be possible to have an API call which returns only the summary table with APS, PWIN, etc. for a given bot in a given game? Right now I parse, but it spits out the whole comparison table, which is overkill and wastes bandwidth. All I need is the info stored in the header table.

I do it to plot APS vs bot version for my bot, but I can imagine others will be interested in this too.

Beaming (talk)21:08, 9 September 2017

WoW that's amazing! Thanks a lot!

Xor (talk)00:27, 10 September 2017

Literumble queue size?

LiteRumble says "OK. Queue full, XXX vs XXX discarded."

and it is discarding hundreds of battles :\

Xor (talk)08:14, 8 September 2017

If the queue gets too long then the priority battles have a severe lag, so the rumble gets really inefficient. Max queue size is based on projected processing time.

Skilgannon (talk)08:31, 8 September 2017

weirdness in pairing

Hi, after the recent bot removal and restore we have strange artifacts: asymmetrical pairing reports.

Have a look at the Galzxy 01 stats and the sample.Walls 1.0 stats. You can see that Galzxy 01 has 18 battles against Walls. But if you look at Walls' stats, there are no reports of these 18 battles with Galzxy 01; it is simply missing from the list of Walls' battles.

Beaming (talk)19:43, 7 September 2017

You just need to wait for Galzxy to get another battle, and it will be fixed again.

Skilgannon (talk)23:36, 7 September 2017

How can I deploy roborumble client on a server?

A thread, Thread:Talk:LiteRumble/How can I deploy roborumble client on a server?, was moved from here to Talk:RoboRumble. This move was made by Xor (talk | contribs) on 21 August 2017 at 00:34.

Preconfigured client link is down ;(

is not available now ;(

And doesn't have an archive of it ;( Does anyone have a backup of it?

Xor (talk)11:10, 20 August 2017

By the way, I'm really wondering how LiteRumble works ;) I used to think the battles all run in the cloud, but then I discovered which shows a lot of contributors with familiar names ;) How can I set battles to run on my computer and submit the results to LiteRumble? I didn't see any discussion about it.

Xor (talk)11:21, 20 August 2017

That isn't needed anymore; the newer versions of Robocode come preconfigured to support Literumble.

Just download, edit robocode/roborumble/[roborumble/meleerumble/etc].txt to have your name, and you can run battles on your computer to contribute to the rankings. The website just displays the battles that users have uploaded, in a nice way.

Skilgannon (talk)12:08, 20 August 2017

Wow, that's very convenient.

Btw, is there any plan to support robocode

Xor (talk)12:14, 20 August 2017

It should be tested a lot to be sure that there aren't any errors.

Dsekercioglu (talk)13:40, 20 August 2017

Clear LiteRumble history

I have created my own LiteRumble instance running as a Google app, as described in previous discussions. Now I want to know: is it possible to delete the battle history and the participating robots? I am experimenting with it, since we want to have a roborumble event at our office, and I want to delete my previous "testing" robots and matches and have a clean slate when we do the event.

Frbod1 (talk)16:05, 18 November 2015

You should be able to delete the data from the AppEngine web console. Otherwise you can simply make the clients upload to a differently named rumble, and the old one can be for the demo/setup bots.

Skilgannon (talk)22:22, 18 November 2015

I have tried to remove the data from the datastore by selecting all database entries and deleting them. But the data on the webpage is still there, so it must be stored somewhere else. Creating a new rumble seems like an annoying workaround :)

Frbod1 (talk)08:54, 19 November 2015

There may still be a copy in Memcache - if you clear Memcache and the datastore everything should be gone.

Skilgannon (talk)09:05, 19 November 2015

Haha, didn't reload the page before I submitted my last comment. I seem to have found the problem at the same time that you mentioned it. Thank you anyway!

Frbod1 (talk)09:35, 19 November 2015

Found the problem. Had to delete all database entries AND the memory cache.

Frbod1 (talk)09:14, 19 November 2015

LRP (ish)

One thing I really missed from the old rumble was the LRP, but without ELO/Glicko we can't really do the whole straight-line fit any more. So instead I have added a Score Distribution image on every bot's details page. The red is APS and the green is Survival (as seen in the image mouseover). The image is directly embedded in the HTML using data URIs, so if you are using IE, only 8 and later work; otherwise pretty much everything supports it. I'm also planning to add this to the BotCompare page, so you can analyse differences in score compared to opponent score for both APS and survival.
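The data-URI embedding trick works like this (a minimal sketch of the general technique; `embed_png` is a hypothetical helper, not LiteRumble's actual template code):

```python
import base64

def embed_png(png_bytes: bytes) -> str:
    # base64-encode the raw image bytes and inline them in the img tag,
    # so the chart needs no separate HTTP request to load
    encoded = base64.b64encode(png_bytes).decode("ascii")
    return '<img src="data:image/png;base64,%s"/>' % encoded

# any PNG bytes work; here just the PNG magic number as a stand-in
print(embed_png(b"\x89PNG\r\n\x1a\n"))
```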

Skilgannon (talk)00:28, 11 May 2013
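For anyone curious how the inline images work: a data URI is just the image bytes base64-encoded straight into the src attribute, so the chart needs no separate HTTP request. A minimal Python sketch (the helper name and tag layout here are illustrative, not LiteRumble's actual code):

```python
import base64

def img_data_uri(png_bytes, width=300, height=200):
    """Embed PNG bytes directly in an <img> tag via a data URI."""
    encoded = base64.b64encode(png_bytes).decode("ascii")
    return ('<img width="%d" height="%d" '
            'src="data:image/png;base64,%s"/>' % (width, height, encoded))

# Any PNG bytes work; here we just encode the 8-byte PNG signature
# to keep the example tiny.
tag = img_data_uri(b"\x89PNG\r\n\x1a\n")
```

Since the base64 text is inlined into every page, it gets re-sent on every load, which is the trade-off against a cacheable image URL.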

Ahhh, neat stuff. That's very nifty with directly embedding the image data there. For some reason the image is displaying very tiny for me though under Firefox 20.0. It gets scaled to the box around it properly under Chromium, but not Firefox.

EDIT: Nevermind... the styles.css file was being cached and that was the problem. A ctrl-r fixed it.

Rednaxela (talk)00:48, 11 May 2013

Ah yeah, the styles.css was changed so you need to do a hard-reload.

I've now added the KNNPBI to the bot-details Scores Distribution, and the bot-compare has a Diff Distribution.

Skilgannon (talk)14:50, 11 May 2013

There is something fishy with the chart in the right part, close to the end. If you look at the above CunobelinDC score distribution you will see that there are no corresponding red points for the stronger opponents, while the blue and green ones are there. This is quite a common theme for other bots as well.

Also, if you have a look at this EvBot score distribution you will see the problem with normalizing: about 1/4 of the space in the right part of the chart has no points, which is a non-optimal use of the chart space.

Beaming (talk)17:44, 18 November 2013

Is it still showing the problem? I don't see anything wrong right now. I had some issues with (I suspect) bad bytecode and versioning, but that should be fixed now.

As for the EvBot chart, that is because in meleerumble nobody gets higher than ~75%, so the top 25% is empty. Although I guess I could normalise to the top score, I'd rather have the charts consistent as better bots are released.

Skilgannon (talk)18:55, 18 November 2013

Aha, I see now why melee charts were somewhat off.

But I insist that I do not see red points for X>95% for CunobelinDC. Look at the 5 rightmost green points: I cannot locate red (APS) or blue points for the same X values. It might be an aliasing problem, or maybe the points are just on top of each other.

Beaming (talk)19:41, 18 November 2013

Green is survival, and so the X value is the average survival score of the enemy bot. The red and blue use enemy APS as the X value, not survival, and since survival scores are higher the green dots go further to the right.

I've actually thought about changing the X axis to just be enemy APS to make it easier to interpret. Or ordering the X-axis by rank instead of using APS values.

Skilgannon (talk)10:22, 19 November 2013

I've changed it so they all use APS on the X axis, so it should be clearer now.

Skilgannon (talk)12:03, 19 November 2013

Starting your own LiteRumble

Does anyone have some advice for starting up a custom and/or private LiteRumble? I've got a new batch of programming students that I'm leading through Robocode and I'd love to run a custom bracket with just my kids in it as I've done in years past.

Tkiesel (talk)19:50, 18 November 2014

Sure, it's easy enough.

  1. Create your own app on Google AppEngine
  2. Download and extract the code from bitbucket
  3. Change the app name in app.yaml to the name of the app you created
  4. Download and install the Google AppEngine python SDK
  5. Run the following in the code directory: update . && update batchratings.yaml
  6. This should give you an empty LiteRumble instance running on your app

Once you have a copy of LiteRumble running, all you need to do is modify the rumble client in roborumble.txt to point to your new server for uploads. You also need a new participants list, which you can host on appengine too if you don't mind continually re-deploying, or you can make a wiki page somewhere. The client just parses everything between the two <pre> tags.
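The participant-list parsing described above can be sketched like this (a hypothetical helper, not the actual client code; the real roborumble client is written in Java and applies its own per-line format checks):

```python
def parse_participants(html):
    """Extract participant lines from between the first <pre>...</pre>
    pair, the way the rumble client reads a wiki-hosted list."""
    start = html.index('<pre>') + len('<pre>')
    end = html.index('</pre>', start)
    # Each non-empty line has the form "package.BotName version,download-url"
    return [line.strip() for line in html[start:end].splitlines()
            if line.strip()]

page = ("<html><pre>\n"
        "mybots.Alpha 1.0,http://example.com/Alpha_1.0.jar\n"
        "</pre></html>")
bots = parse_participants(page)
```

Because only the text between the tags matters, any page that serves a `<pre>` block works as a participants source, which is why a wiki page or a plain text host are both fine.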

Have fun!

Skilgannon (talk)20:14, 18 November 2014

Excellent. I can just host participants on a Dropbox text file. Thanks for the info!

By the way, a favorite thing I do when introducing my kids to Robocode is to have a pair of them (driver and gunner) pilot sample.Interactive at a moderate simulation speed against some sample bots until they get used to it. Then they face DrussGT. Thought you'd want to know that you've caused some laughter and groans of frustration from some prospective high school coders!

Tkiesel (talk)20:23, 18 November 2014

Brilliant. I've always found the sample.Interactive very difficult to control, I don't think I'd stand a chance against DrussGT =) I bet if I set the bullet colour to something more similar to the background it would make it even harder for interactive users >:-D

Skilgannon (talk)20:30, 18 November 2014

The kicker is that they have a very, very hard time adapting to a top-of-the-line bot like DrussGT or Diamond. I've had students say it's like the bot is reading their mind. Then I drop the bomb that the bot can't see bullets, while the students can. It's a great and impactful "Math is POWERFUL" moment!

Of course, set the sim speed low enough and get a patient, non-wasteful gunner, and they will trash DrussGT, because they can dance juuust aside of each bullet. But as long as I set the sim speed to keep them on their toes, it's a rough but educational ride. Fun for spectators too!

Tkiesel (talk)20:35, 18 November 2014

I have some ideas about dealing with interactive users - closer range, not letting energy levels get below the enemy's, varying colours of dark blue and grey bullets - perhaps that should be something I work on next. I've neglected Robocode and have been working on more pure ML/AI problems instead, but this is something more on the behavioural side which AFAIK hasn't been done yet.

Skilgannon (talk)20:55, 18 November 2014

Awww, high school students have all the fun. XD

Chase05:05, 19 November 2014

The sample bot Interactive is hard to control. For 1v1, all you would really have to change in response to what you see is orbit direction, distancing, current aiming GF, and bullet power / when to fire. Everything else could be automatic 99+% of the time.

Would anyone be interested in a SuperInteractive wiki collaboration? Perhaps a challenge for driving it against DrussGT?

Sheldor (talk)06:06, 19 November 2014

I was thinking of a fairly simple "SuperInteractive" which does regular wave-surfing, but also allows you to click on enemy bullets, which it will then dodge. Targeting, I feel, would be stronger without any human intervention.

Skilgannon (talk)06:31, 19 November 2014

Tkiesel, can I contribute to your LiteRumble please?

Tmservo (talk)04:11, 19 November 2014