Talk:LiteRumble


Contents

Thread title | Replies | Last modified
Hiccup in rankings | 14 | 06:49, 22 February 2023
Some scores in TwinDuel have shifted a lot | 4 | 04:49, 19 June 2021
Allowed Robocode versions notification | 5 | 12:17, 29 April 2020
1.9.3.5 | 0 | 13:57, 2 March 2019
Upgrade to 1.9.3.4 | 4 | 13:41, 2 March 2019
What's required for a bot to have KNNPBI? | 7 | 13:38, 3 October 2018
Rankings Stable | 9 | 04:11, 20 June 2018
Faster recovery from incidentally removed participants? | 2 | 13:40, 14 June 2018
Upgrade to 1.9.3.1 | 2 | 02:14, 5 April 2018
HTTP Error 500, Internal Server Error? | 5 | 08:25, 1 November 2017
1.9.3.0 | 4 | 21:26, 19 October 2017
Varying NUMBATTLES of RoborumbleAtHome? | 10 | 12:04, 12 October 2017
Adding opponent APS in bot comparison? | 9 | 23:27, 9 September 2017
Literumble queue size? | 1 | 07:35, 8 September 2017
weirdness in pairing | 1 | 22:36, 7 September 2017
How can I deploy roborumble client on a server? | 0 | 01:34, 21 August 2017
Preconfigured client link is down ;( | 4 | 12:40, 20 August 2017
Clear LiteRumble history | 5 | 08:35, 19 November 2015
LRP (ish) | 21 | 01:49, 6 December 2014
Starting your own LiteRumble | 10 | 05:31, 19 November 2014

Hiccup in rankings

Hi all, it seems that for some reason the participants lists for roborumble, minirumble and nanorumble had a hiccup (but not for microrumble??).

This means that the rankings start again from zero, but luckily all older results come back in as soon as one battle has been fought. In order to get the rankings up to date again, every bot has to fight every other bot one time.

So, if you have the time and opportunity, please start your rumble client.

GrubbmGait (talk)17:12, 22 March 2022

It seems less bad than I thought at first sight. As soon as a bot has fought a battle, it appears in the ranking.

When all bots are present, all bots have to fight one battle to get all pairings back. Hopefully LiteRumble is smart enough to get that done in one pass.

GrubbmGait (talk)17:24, 22 March 2022
 

That happens once every few years ;( Maybe LiteRumble should have some check to prevent removing more than 100 bots at a time? Anyway, LiteRumble *is* smart enough to recover with only 2000+ battles (1000+ to add the bots back, 1000+ more to fix the pairings), instead of 1000000 battles.
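For a sense of scale, a back-of-the-envelope calculation (the ~1190 participant count is an assumption taken from the pairing figures quoted later in this thread):

    # Rough cost of a full rebuild vs. the incremental recovery described above.
    # Participant count is an assumption (~1190 bots in roborumble at the time).
    n = 1190

    full_pairings = n * (n - 1) // 2   # every bot vs. every other bot once
    incremental = 2 * n                # one pass to re-add bots, one to restore pairings

    print(full_pairings)   # 707455 battles to rebuild every pairing from scratch
    print(incremental)     # 2380 battles, the "2000+" figure mentioned above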

Xor (talk)06:54, 23 March 2022

However, a bug in the rumble client actually prioritized already existing bots over bots without a ranking, except in the clean-run case. So it actually takes many more battles to add the missing bots back ;(

I submitted a PR for this bug. https://github.com/robo-code/robocode/pull/61.

Bots without rank should always take highest priority, in all scenarios.

Xor (talk)08:32, 23 March 2022
 

Bad news. Once a bot is added back, the pairings are only updated for newly added battles, e.g. a bot with 400 pairings gets to 401 when one battle is submitted. The rest of the pairings do not seem to be added back automatically. So this means 1000000 battles are needed. ;(

Xor (talk)10:05, 23 March 2022

I submitted a PR to literumble: https://github.com/jkflying/literumble/pull/3. Hopefully this PR will solve the issue, and with 1000+ battles the rankings should become stable again.

Xor (talk)11:15, 23 March 2022

Merged and deployed, thanks. Should we also update Robocode versions?

Skilgannon (talk)20:17, 23 March 2022

Great! Anyway, I think we could wait for the next release, where the fix for unranked bots is merged ;)

Xor (talk)04:47, 24 March 2022
 

Looks like the ranking isn't recovering as fast as expected; bots with 400 pairings still go to 401 instead of 1189 after 1 battle. Any idea why this behavior persists after the PR?

Xor (talk)05:05, 24 March 2022

Note that of the 507 still-unstable bots in roborumble, 449 are also in minirumble, 421 in microrumble and 228 in nanorumble. So it seems that bots that are also in the mini/micro/nano rumbles take longer to reach a stable state.

Also note that 391 of the 507 unstable bots have been last updated since the deploy of the fix.

Xor (talk)05:27, 24 March 2022
 
 
 

It is restoring quite fast. At this moment half of the bots in roborumble have all their pairings back. And most of the others have at least 800 pairings.

GrubbmGait (talk)13:17, 23 March 2022

That may be caused by the batch processing. However, if batch processing is the cause, it should eventually recover fully.

Xor (talk)05:07, 24 March 2022
 

It looks like the problem is more serious this time.

Have a look at this bot: a lot of its pairings were last battled in 2020, and a lot of pairings have only 1 battle, meaning that some data isn't recovered. http://literumble.appspot.com/BotDetails?game=roborumble&name=nz.jdc.nano.NeophytePattern%201.1&order=-Battles

Xor (talk)13:17, 24 March 2022

Rankings in roborumble, minirumble and nanorumble are stable again; microrumble was stable from the beginning. But as Xor indicated, a lot of pairings have only 1 battle, sometimes even a single battle from 2 years ago. Is there something we can do to retrieve the lost battles, or should we continue with the current situation?

GrubbmGait (talk)15:23, 2 April 2022

Thinking about it again, there may not be any data loss. For each pairing to have 1 battle, we need 500000 battles, which takes a few months of continuous running. It's not surprising that a full round takes 2 years.

Most of the 1-battle pairings are from 2022.3.23 (or 2020.10, though I don't remember exactly), the exact times the last hiccups happened. This strongly indicates that the data loss happens when a battle is fought for a missing pairing.

Xor (talk)14:54, 3 April 2022
 
 
 
 
 

Some scores in TwinDuel have shifted a lot

I'm noticing that some of LunarTwins' scores in TwinDuel have shifted dramatically from what they were as of RumbleArchives:TwinDuelRumble_20200126, despite the robots involved in said pairings not having been updated since. Particularly versus the following four:

  • bvh.two.Valkiries 0.44tmk3b
  • bvh.two.Ravens 0.2
  • gh.twin.GrauwuarG 0.41
  • krillr.mini.JointStrikeForce 2.0c

These are four bots that have been unchanged since 20200126 and that LunarTwins used to win decisively against, but it appears to no longer do so in the TwinDuel LiteRumble. It also appears that, for some unknown reason, their pairing counts versus LunarTwins have been reset more recently than some others. Not sure. I'll be looking into it more at some point, but it makes me wonder if this was due to a change in Robocode version.

Rednaxela (talk)20:40, 13 June 2021

To update: it seems Robocode versions 1.9.3.8 through 1.9.4.1 had entirely broken getTeammates/isTeammate, which breaks various TeamRumble/LiteRumble bots, including but surely not limited to LunarTwins.

This bug appears to have been introduced to 1.9.3.8 as a side effect of the fix to this bug.

Version 1.9.4.2 fixes a bug with getTeammates/isTeammate.

So Skilgannon, if you're reading this, we should probably update the LiteRumble version requirement to 1.9.4.2, and also clear all TeamRumble/TwinDuel pairing data that came from a client with one of the flawed versions. Given things appear to have gone from 1.9.3.5 to 1.9.3.9 in LiteRumble, it looks like it's just the 1.9.3.9 results that need to be cleared from the TeamRumble/TwinDuel pairing data. :)

Rednaxela (talk)21:28, 13 June 2021

The rumble is updated to only accept 1.9.4.2. I will wipe the Team / TwinDuel data; unfortunately the results aren't stored by upload version.

Skilgannon (talk)22:34, 13 June 2021

Thanks for the prompt response! Should be able to get the pairings built back up again pretty quick I think.

Rednaxela (talk)22:37, 13 June 2021
 
 

So, while updating to 1.9.4.2 fixed a badly broken TeamBot situation, it introduced a new problem that I first noticed with Tron. The precise cause is unclear to me at present, and I can't debug into a closed-source bot that isn't giving a stack trace, but some change between Robocode 1.9.4.1 and Robocode 1.9.4.2 appears to have broken bots that load data files that come preloaded in their JAR files. In the case of Tron this is used for a configuration properties file, and being unable to load it is causing Tron to start in challenge/reference mode instead of normal mode.

I'm doubtful this bug only affects Tron, and removing tainted data from the rumble could be troublesome.

Bug report: here

Rednaxela (talk)04:13, 19 June 2021
 

Allowed Robocode versions notification

Hi Skilgannon,

Would you mind bumping this thread when you are changing the Allowed Robocode versions?

I run my clients pretty much unattended, so they try to upload rankings and fail. Unfortunately, there is no way to notify a human unless one stares at the console all the time.

But updates in the wiki thread would propagate to my rss reader quite quickly.

Happy New Year, and thanks for running the LiteRumble.

Beaming (talk)18:10, 31 December 2015

No problem. I actually only changed it about 2 hours ago; if you hadn't noticed, I would have posted something =) I'm also going to add back those historical bots which were removed because of the compatibility issues. Have a good New Year!

Skilgannon (talk)18:16, 31 December 2015
 

As promised (in 2015), bump, I've upgraded to 1.9.3.5 =)

Skilgannon (talk)15:44, 2 March 2019

Thanks, I see it. Time to upgrade my clients; I was still running 1.9.2.5.

Beaming (talk)18:09, 2 March 2019
 

Hi all, LiteRumble has moved up to 1.9.3.9 (from 1.9.3.5).

GrubbmGait (talk)08:16, 29 April 2020

Ah, I forgot about this thread. Yes, I updated the required version so that it will use the HTTPS links instead of HTTP, now that the robowiki supports these.

Skilgannon (talk)12:17, 29 April 2020
 
 

1.9.3.5

LiteRumble is now updated to accept battles only from 1.9.3.5! Happy rumbling!

Skilgannon (talk)13:57, 2 March 2019

Upgrade to 1.9.3.4

Robocode 1.9.3.4 has been released, with fixed meleerumble pairings and a codesize utility that handles lambdas, along with other fixes. Should we upgrade now and see the changes?

Xor (talk)14:36, 16 January 2019

I've updated the accepted client version to 1.9.3.4 =)

Skilgannon (talk)19:37, 17 January 2019

1.9.3.4 has its -cp option set to the wrong value, causing codesize not to work (and pushing mega bots into the nano rumble) if it's not fixed manually. We should disable this version and wait for Fnl to provide a fix...

I made a pull request: https://github.com/robo-code/robocode/pull/14

Xor (talk)09:58, 18 January 2019

I've rolled back, let me know when 1.9.3.5 is available with the fix =)

Skilgannon (talk)10:21, 18 January 2019

It's available now ;) And let's upgrade now, since it's already fully tested ;)

Xor (talk)13:41, 2 March 2019
 
 
 
 

What's required for a bot to have KNNPBI?

I've seen many bots with KNNPBI; however, my bot still has no KNNPBI (all zeros) after a long period of time ;( http://literumble.appspot.com/BotDetails?game=roborumble&name=aaa.SimpleBot%200.022d

So what's required for a bot to have KNNPBI?

Xor (talk)18:46, 20 August 2017

Finally, after reaching 978 pairings, the KNNPBI is shown. Anyway, I'm still wondering what made the KNNPBI all zeros before.

Xor (talk)19:05, 20 August 2017

You should have more pairings, but I don't know how it decides.

Dsekercioglu (talk)19:21, 20 August 2017
 

The computation of those values is batched; they are computed every 24 hours. One possible explanation for the older version still being all zeroes is that maybe only the latest version of a bot is considered when doing the computation? Not sure about this, though; it just makes sense from a really superficial look at the code. You can probably figure it out here: https://bitbucket.org/jkflying/literumble/src/38f6e71de1c6?at=default

Rsalesc (talk)21:46, 20 August 2017
 
 

KNNPBI (and the other batched rankings, NNP and Vote) are calculated once every 8 hours, since they can't be calculated incrementally. If you remove your bot before it is calculated, it won't be calculated at all, since old bots don't get recalculated.

Skilgannon (talk)20:06, 21 August 2017

Btw, is it possible for a bot to have Vote and ANPP calculated and shown correctly, while having NPP and KNNPBI unavailable?

The link below captures an instance of this weirdness: http://web.archive.org/web/20181001092218/http://literumble.appspot.com/BotDetails?game=roborumble&name=aaa.n.ScalarN%200.011d.147

Xor (talk)10:24, 1 October 2018

It could happen sometimes if the saving fails. It should correct itself soon though.

Skilgannon (talk)13:38, 3 October 2018
 
 
 

Rankings Stable

It seems that both 1v1 and melee now show "Rankings Stable" instead of "Rankings Not Stable".

I once thought that "Rankings Not Stable" was hardcoded, to show that the rankings are never stable and one should always run more battles.

But today is the first time I noticed "Rankings Stable", quite surprising.

So, what's the mechanism behind "Rankings Stable" and "Rankings Not Stable"? Is "Rankings Stable" displayed whenever every bot has full pairings?

Xor (talk)06:14, 17 June 2018

Your observation coincides with mine. Once all bots have paired with each other at least once, the ranking gets the stable status. Sometimes it does not happen for a long time because of missing bots or some bots crashing with a newer version of Robocode. This is why the participants list sometimes gets pruned.

If the ranking is unstable for a long time, I usually look at which bot is missing a pairing and search for a reason in the rumble client log.

Usually, stabilization takes about a day for each new bot.

Beaming (talk)15:57, 17 June 2018

Yeah, Monk has had an incorrect URL for nearly half a year, so newly updated bots are missing that pairing. And in 1v1 there are more bots having problems with the current settings (Robocode 1.9.2.5 and Java 8).

Should we have a clean-up, or create a new rumble, to remove bots with compatibility problems, which only add noise to the rumble?

Xor (talk)03:08, 18 June 2018

I personally oscillate between "if the author does not care, why should I?" and "preserve the history". If you are in the second camp, let me remind you about my FixingParticipantLinks script, which relinks missing bots to the strange automata archive.

What is our problem with Java 8? Do we already have bots built with Java 9? Or is Robocode itself not backward compatible, and you see it on a big enough pool of robots?

Beaming (talk)15:35, 18 June 2018

My opinion is that as long as a bot works fine on the current settings (Robocode 1.9.2.5 and Java 8), we should "preserve the history". But once it produces random results (e.g. crashing half of the time), we should remove it (until the author fixes it).

Bots known to crash on some machines:

apc.Caan 1.0
dam.MogBot 2.9
sgp.JollyNinja 3.53
Xor (talk)02:47, 19 June 2018
 

I've been away for quite some time and I'll probably come back once I graduate. I still care about my bots, though (despite Monk being buggy as hell atm). I used to use Drive to provide the links, but I didn't know they would break after some time. What would you guys suggest I do? Is the solution proposed above (the fix script) sufficient for now?

Rsalesc (talk)23:46, 19 June 2018

Well, do not trust the modern hype, i.e. the cloud. But I guess you already know that.

If you cannot host your bot yourself, put it in the cloud; usually within a day or less it appears at [archive]. Then just update the link to point there. I think, as of now, it is the most reliable way. Many thanks to Rednaxela for this effort.

Beaming (talk)02:02, 20 June 2018

I actually think that hosting it myself looks way less reliable than using such a consolidated service. But yeah, I'll do what you suggested.

Rsalesc (talk)03:15, 20 June 2018
 

Well, in this case, the drive works totally fine ;)

Just have a look at this commit: http://robowiki.net/w/index.php?title=RoboRumble/Participants/Melee&diff=52900&oldid=52879

Xor (talk)04:11, 20 June 2018
 
 
 

Confirmed, it changes to Stable when all bots have full pairings.

If you find a bot that repeatedly crashes, IMO remove it from the rumble and put it in the list below. If the author has a page, make a comment and hopefully they will fix it.

Skilgannon (talk)16:53, 19 June 2018
 

Faster recovery from incidentally removed participants?

Updating the participants list manually fails from time to time (network issues, misoperation, etc.). Removing a bot from the rumble takes O(1) time, but re-adding n bots takes O(mn) time, where m is the total number of participants.

However, in principle it takes O(1) time to re-add a bot, since no data is lost. Is it possible to tweak LiteRumble to support faster re-adding?

Xor (talk)06:03, 13 June 2018

Re-adding a bot actually only takes 2*m battles: one pass to add the bot, and one to add all of the pairings after all of the other bots have been added back. Doing something different would require updating the rumble protocol, which I'd prefer not to touch.

Skilgannon (talk)08:39, 13 June 2018

Well, thanks for explaining the mechanism. O(m) feels much better, and cleaner code is also preferable.

Xor (talk)13:40, 14 June 2018
 
 

Upgrade to 1.9.3.1

Finally 1.9.3.1 is out, with Skilgannon's shorter bot list update time. Any plan to move on? And then we can use lambdas without transpilers as well.

Xor (talk)13:56, 4 April 2018


Yeah, Java 9/10 support is poor even today, and they didn't add anything as useful as lambdas, so sticking with Java 8 is not a big problem.

However, with lambdas, Java 8 is a completely different language. Lambdas get more optimizations (e.g. omitting unnecessary object allocations) compared to anonymous classes, and with lambdas you can avoid a lot of unnecessary boxing and unboxing, which is unreasonably slow, so yes, they give more speed. Lambdas also remove a lot of the cruft of pre-lambda Java development, resulting in cleaner and more readable code, so yes, they give robustness.

I personally rely on lambdas heavily for cleaner and more readable code, but with customized build scripts I can easily transpile my code to Java 7, so moving on is not that beneficial for me. However, it's not easy to set up such a build script, so other people, especially newcomers, will never have a chance to use lambdas in Robocode development. They may even be scared off and despair at the fact that their bot cannot run in the roborumble, even with Java 8.

By not moving on to Robocode 1.9.2.6+, we are wasting a large part of the effort of moving to Java 8. With the better bot-list update mechanism, I think it's definitely time to move on.

Xor (talk)02:10, 5 April 2018
 
 

HTTP Error 500, Internal Server Error?

Since today, when uploading results, this message keeps appearing in my RoboRumble client:

java.io.IOException: Server returned HTTP response code: 500 for URL: http://literumble.appspot.com/UploadedResults

Did anybody observe similar results?

Cb (talk)19:59, 31 October 2017

Ah, this was caused by a bug in a new check I added for dropping battles more than 24 hours old (to prevent old versions being added back by mistake). Can you try again and let me know if it is fixed for you now?

Skilgannon (talk)20:16, 31 October 2017

Yes, now it works, thanks :)

Cb (talk)00:28, 1 November 2017
 

It seems that when uploading outdated pairings the client still receives HTTP 500, which causes those outdated pairings to be doubled (another bug) and re-uploaded twice each time... and then to fail again, doubling each time, which grows like crazy.

Will you change the response to something like "200, outdated pairings dropped" or so to fix this? Thanks ;)

Xor (talk)05:32, 1 November 2017

Well, that was the intention, but it seems that Python doesn't auto-convert datetimes to strings, so my logging was crashing it. Should be fixed now!
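For illustration, a minimal sketch of the behaviour described in this thread: drop results older than 24 hours but still answer with an "OK" message so clients stop retrying, and stringify the datetime before logging. This is hypothetical code, not the actual literumble handler; the function name and details are made up.

    from datetime import datetime, timedelta
    import logging

    MAX_AGE = timedelta(hours=24)

    def check_upload(battle_time):
        """Decide the response for one uploaded result (hypothetical sketch)."""
        if datetime.utcnow() - battle_time > MAX_AGE:
            # Stringify the datetime explicitly: concatenating the raw datetime
            # onto a string raises TypeError, which is presumably what crashed
            # the logging and turned the intended drop into an HTTP 500.
            logging.info("Dropping stale result from " + str(battle_time))
            return "OK. Outdated result dropped."  # a 200-style answer, so clients stop retrying
        return "OK."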

Skilgannon (talk)08:25, 1 November 2017
 

More information:

java.io.IOException: Server returned HTTP response code: 500 for URL: http://literumble.appspot.com/UploadedResults
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985874,SERVER abc.Shadow 3.84i,24248,6434,28 maribo.mini.MiniQuester 0.1,16657,4100,5
java.io.IOException: Server returned HTTP response code: 500 for URL: http://literumble.appspot.com/UploadedResults
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985874,SERVER abc.Shadow 3.84i,24248,6434,28 aaa.ScaledBot 0.01d,15278,3625,1
java.io.IOException: Server returned HTTP response code: 500 for URL: http://literumble.appspot.com/UploadedResults
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985874,SERVER abc.Shadow 3.84i,24248,6434,28 mld.DustBunny 3.8,14411,3784,0
java.io.IOException: Server returned HTTP response code: 500 for URL: http://literumble.appspot.com/UploadedResults
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985874,SERVER abc.Shadow 3.84i,24248,6434,28 cb.nano.Insomnia 1.0,11918,2981,1
java.io.IOException: Server returned HTTP response code: 500 for URL: http://literumble.appspot.com/UploadedResults
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985875,SERVER abc.Shadow 3.84i,24248,6434,28 ayk.WallHugger 1.0,8941,2568,0
java.io.IOException: Server returned HTTP response code: 500 for URL: http://literumble.appspot.com/UploadedResults
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985875,SERVER abc.Shadow 3.84i,24248,6434,28 yk.JahRoslav 1.1,8499,1922,0
java.io.IOException: Server returned HTTP response code: 500 for URL: http://literumble.appspot.com/UploadedResults
Unable to upload results meleerumble,35,1000x1000,Xor,1505812985875,SERVER abc.Shadow 3.84i,24248,6434,28 rampancy.Durandal 2.1d,7268,1522,0

This is what I'm getting constantly (every time it uploads).

Xor (talk)05:33, 1 November 2017
 
 

1.9.3.0

It's been a while since we updated the rumble client version, and the new version brings several important fixes. I'd really appreciate it if someone set up a quick benchmark of a battle or two for each bot in the rumble, and then ran it on the old and new versions to make sure we don't have any regressions. Once this is done we can upgrade the client =)

Skilgannon (talk)20:10, 8 October 2017

As far as I know, Robocode 1.9.3.0 hasn't been officially released yet. The website and GitHub still name 1.9.2.6 as the latest version, and there is no 1.9.3.0 download. You can only get it by building from the latest git master. What I linked to was a draft of the new changelog.

I don't think any new releases will be made until poor Fnl finishes dealing with all of the bug reports I piled onto him. What I have been doing for the past week is emptying my mental list of annoyances with Robocode onto its bugtracker.

So currently, it is still in development, and it's a bit too early to do regression testing with this new version.

What does need testing, however, is Robocode on Java 9. We already found CPU constant calculation and team JARs to be broken there, and doubtlessly there are more issues.

MultiplyByZer0 (talk)22:01, 8 October 2017

Looks like Fnl is busy as we speak. Maybe it is the time to complain about / report everything on our minds.

To MultiplyByZer0: thanks for spotting so many extra bugs.

Beaming (talk)22:37, 8 October 2017
 

Robocode 1.9.3.0 has been released.

MultiplyByZer0 (talk)21:17, 19 October 2017

Great. As soon as we have a benchmark comparison making sure that no subtle score changes have crept in and that tons of bots aren't now broken, I'm happy to change the LiteRumble over!

Skilgannon (talk)21:26, 19 October 2017
 
 

Varying NUMBATTLES of RoborumbleAtHome?

Recently, I noticed that more than half of the battles are dropped because the queue is full; however, this still happens even if I wait a few minutes. It seems that all the rumble clients upload battles periodically and the uploads are pretty concentrated, e.g. all four of my clients upload ~200 battles within ~3 minutes, which makes the queue fill up immediately. And if I take a look at literumble/statistics, I can see that there are 5 to 7 clients uploading within 2 minutes.

It generally takes a client about 15 minutes to finish 50 battles, but if we vary NUMBATTLES to different primes, the uploads will get more evenly distributed, reducing the concurrency spikes which cause a lot of dropped battles.
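A tiny simulation of that idea (illustrative only; the ~18 seconds per battle is derived from the "50 battles in ~15 min" figure above, and the prime batch sizes are arbitrary):

    # Upload times (seconds since start) of clients that all start together.
    SECONDS_PER_BATTLE = 18

    def upload_times(numbattles, hours=6):
        period = numbattles * SECONDS_PER_BATTLE
        return list(range(period, hours * 3600, period))

    same  = [upload_times(50) for _ in range(4)]         # four clients, identical NUMBATTLES
    mixed = [upload_times(n) for n in (47, 53, 59, 61)]  # four clients, distinct primes

    # With NUMBATTLES=50 everywhere, all four clients hit the server at the same
    # instants every cycle; with distinct primes the upload moments drift apart.
    print(same[0][:3], same[1][:3])    # [900, 1800, 2700] twice: a collision every cycle
    print(mixed[0][:3], mixed[1][:3])  # [846, 1692, 2538] vs [954, 1908, 2862]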

Xor (talk)02:38, 8 October 2017

Reducing NUMBATTLES would probably help here too. It would also reduce the delay which is the main cause of duplicated pairings when new bots are entered. Maybe a NUMBATTLES of 20 in the main rumble would be good enough to solve the client component of this.

However, I think one of the main causes of the full queue is the batch processing for Vote/NPP/KNNPBI, since the queue needs to be paused while this is running. Because it is paused, the projected processing time goes very high, and it stops accepting new uploads. I have an idea for how to tune this; it should help a bit.

Skilgannon (talk)10:50, 8 October 2017

However, even a NUMBATTLES of 3 can't prevent most of the battles from being dropped ;/

It seems that with 8 clients running the rumble at the same time, no attempt will help without stopping some clients.

Worth mentioning that I noticed dropped battles when there were 6 clients as well, though not frequently. It seems that with 2 more clients the effectiveness drops considerably?

Btw, one thing that's really interesting is that duplicates of multiple versions can last for hours. It seems that some clients don't check the participants list for hours.

Xor (talk)15:56, 8 October 2017
 

Got it. Maybe after the queue is paused for batch tasks and then resumed, it stays near full because there are still a lot of pairings being uploaded. Like a DoS, this decreases the ability to handle high concurrency (although the average number of pairings uploaded per minute is not very high, they come in during a short period of time and get dropped).

Xor (talk)16:53, 8 October 2017
 

Then I think we could increase the queue size a little after a batch task (and then decrease it back to the normal size slowly, to make sure new uploads won't wait forever after some flood of uploads).

Or, we could handle uploads during the pause separately: don't let them take a place in the normal queue; rather, store them in a separate queue (and cap it at normal uploads per minute * pause time).
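A minimal sketch of that separate-queue idea (purely illustrative, not LiteRumble's actual queue code; the class and method names are made up):

    from collections import deque

    class UploadQueue:
        """Park uploads arriving during a batch-task pause in a capped side
        queue instead of rejecting them outright."""

        def __init__(self, max_size, pause_cap):
            self.main = deque()
            self.overflow = deque()
            self.max_size = max_size
            self.pause_cap = pause_cap   # e.g. normal uploads/min * pause minutes
            self.paused = False

        def offer(self, result):
            if self.paused:
                if len(self.overflow) < self.pause_cap:
                    self.overflow.append(result)
                    return True
                return False             # only drop once the side queue is also full
            if len(self.main) < self.max_size:
                self.main.append(result)
                return True
            return False

        def resume(self):
            self.paused = False
            # drain the parked uploads back into the main queue
            while self.overflow:
                self.main.append(self.overflow.popleft())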

Xor (talk)17:02, 8 October 2017

I was running 8 clients; that was probably causing it. Melee clients in particular cause a huge number of uploads for the amount of processing time required by the client.

I'll save my clients for when there are fewer others running =)

Skilgannon (talk)20:13, 8 October 2017

I've been experiencing constant "queue full" messages in the past 2 hours in MeleeRumble, with 3 melee clients + 3 rumble clients. Should this really be happening this often?

Rsalesc (talk)03:19, 12 October 2017

I noticed that every time the queue is paused for batch tasks, the massive queue-full messages don't stop until I pause the clients for a few minutes.

That may be because when the queue size is near the max size, the capacity for handling high concurrency decreases dramatically, although the average processing power doesn't decrease at all.

Using a separate queue while it is paused may help, imo.

Xor (talk)05:11, 12 October 2017
 
 
 
 
 

Adding opponent APS in bot comparison?

Would you mind adding another column called Opponent APS in the bot comparison? When sorting by opponent APS, it could be really useful to see the difference between two bots against opponents in different APS ranges, as in the Diff Distribution graphic, but with more information, especially the bot names. This could also help us to create a good test bed ;)

Xor (talk)16:46, 5 September 2017

I can take a look (although not this weekend, I'm away from home). However, what would you consider appropriate behavior for bots which have been removed from the rumble but which are a shared pairing? The APS/diff image handles this by just ignoring those pairs, but I don't think we want to do that here. Do I put a 0.0?

Skilgannon (talk)22:36, 7 September 2017

Can we assume that APS is relatively stable? Since we can click through to the details page to see the historical APS even when that opponent is removed, can we simply put that value?

Oops, this assumption breaks when comparing ancient bots ;( Then polluting the table must be a bad idea. However, why don't we use NaN or N/A instead of 0?

Xor (talk)01:05, 8 September 2017

NaN sounds most appropriate. I don't want to have to fetch each bot object that is not in the rumble anymore to look up its last APS.
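In other words, roughly (a hypothetical sketch, not the actual LiteRumble code; the function and example values are made up):

    # Opponent APS column for the comparison page: bots still in the rumble have
    # a current APS; removed bots get NaN rather than a misleading 0.0.
    def opponent_aps_column(shared_pairings, current_aps):
        """shared_pairings: opponent names; current_aps: {name: APS} for bots in the rumble."""
        nan = float('nan')
        return {name: current_aps.get(name, nan) for name in shared_pairings}

    # Example (made-up values):
    print(opponent_aps_column(['abc.Shadow 3.84i', 'some.RemovedBot 1.0'],
                              {'abc.Shadow 3.84i': 88.5}))
    # {'abc.Shadow 3.84i': 88.5, 'some.RemovedBot 1.0': nan}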

Skilgannon (talk)07:30, 8 September 2017

Done. I also added a link on the BotDetails page to find the bot on the wiki.

Skilgannon (talk)17:19, 9 September 2017

Awesome! Thanks a lot.

Since you are in a wish-granting mood, would it be possible to have an API call which returns only the summary table with APS, PWIN, etc. for a given bot in a given game? Right now I parse http://literumble.appspot.com/BotDetails?api=true, but it spits out the whole comparison table, which is overkill and wastes bandwidth. All I need is the info stored in the header table.

I do it to plot APS vs. bot version for my bot, but I can imagine others will be interested in this too.

Beaming (talk)20:08, 9 September 2017
 

WoW that's amazing! Thanks a lot!

Xor (talk)23:27, 9 September 2017
 
 
 
 
 

Literumble queue size?

LiteRumble says "OK. Queue full,XXX vs XXX discarded."

and it is discarding hundreds of battles :\

Xor (talk)07:14, 8 September 2017

If the queue gets too long then the priority battles have a severe lag, so the rumble gets really inefficient. Max queue size is based on projected processing time.

Skilgannon (talk)07:31, 8 September 2017
 

weirdness in pairing

Hi, after the recent bot removal and restoring we have strange artifacts: asymmetrical pairing reports.

Have a look at the Galzxy 01 stats and the sample.Walls 1.0 stats. You can see that Galzxy 01 has 18 battles against Walls. But if you look at the Walls stats there are no reports of these 18 battles with Galzxy 01; Galzxy 01 is simply missing from the list of Walls' battles.

Beaming (talk)18:43, 7 September 2017

You just need to wait for Galzxy to get another battle, and it will be fixed again.

Skilgannon (talk)22:36, 7 September 2017
 

How can I deploy roborumble client on a server?

A thread, Thread:Talk:LiteRumble/How can I deploy roborumble client on a server?, was moved from here to Talk:RoboRumble. This move was made by Xor (talk | contribs) on 21 August 2017 at 00:34.

Preconfigured client link is down ;(

https://dl.dropboxusercontent.com/u/4066735/literumble-template.zip is not available anymore ;(

And archive.org doesn't have an archive of it ;( Does anyone have a backup?

Xor (talk)10:10, 20 August 2017

By the way, I'm really wondering how LiteRumble works ;) I used to think the battles all run in the cloud, but then I discovered http://literumble.appspot.com/RumbleStats which shows a lot of contributors with familiar names ;) How can I set battles to run on my computer and submit the results to LiteRumble? I didn't see any discussion about it.

Xor (talk)10:21, 20 August 2017
 

That isn't needed anymore; the newer versions of Robocode are preconfigured to support LiteRumble.

Just download 1.9.2.5, edit robocode/roborumble/[roborumble/meleerumble/etc].txt to have your name, and you can run battles on your computer to contribute to the rankings. The website just displays the battles that users have uploaded in a nice way.
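For example, the only lines you would normally touch look something like this (an excerpt reconstructed from memory, so the exact key names may differ in your roborumble.txt; the values are illustrative):

    # roborumble/roborumble.txt (excerpt, reconstructed from memory; check your copy)
    USER=YourNameHere          # the name shown on the RumbleStats contributors page
    NUMBATTLES=50              # battles fought per iteration before results are uploaded
    RESULTSURL=http://literumble.appspot.com/UploadedResults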

Skilgannon (talk)11:08, 20 August 2017

Wow, that's very convenient!

Btw, is there any plan to support Robocode 1.9.2.6?

Xor (talk)11:14, 20 August 2017

It should be tested a lot to be sure that there aren't any errors.

Dsekercioglu (talk)12:40, 20 August 2017
 
 
 

Clear LiteRumble history

I have created my own LiteRumble instance running as a Google app, as described in previous discussions. Now I want to know if it is possible to delete the battle history and the participating robots. I am experimenting with it since we want to have a RoboRumble event at our office, and I want to delete my previous "testing" robots and matches and have a clean slate when we do the event.

Frbod1 (talk)15:05, 18 November 2015

You should be able to delete the data from the AppEngine web console. Otherwise, you can simply make the clients upload to a differently named rumble, and keep the old one for the demo/setup bots.

Skilgannon (talk)21:22, 18 November 2015

I have tried to remove the data from the datastore by selecting all database entries and deleting them. But the data on the webpage is still there, so it must be stored somewhere else. Creating a new rumble seems like an annoying workaround :)

Frbod1 (talk)07:54, 19 November 2015

There may still be a copy in Memcache; if you clear Memcache and the datastore, everything should be gone.
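If you prefer doing it from code, a rough sketch using the AppEngine Python APIs (the model class here is hypothetical; substitute whatever entity kinds your datastore viewer actually shows):

    from google.appengine.api import memcache
    from google.appengine.ext import ndb

    class Bot(ndb.Model):
        """Hypothetical kind; LiteRumble's real schema will differ."""
        pass

    def wipe_rumble_data():
        # Delete every entity of this kind, then drop any cached copies so the
        # web pages stop showing stale data.
        ndb.delete_multi(Bot.query().fetch(keys_only=True))
        memcache.flush_all()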

Skilgannon (talk)08:05, 19 November 2015

Haha, didn't reload the page before I submitted my last comment. I seem to have found the problem at the same time that you mentioned it. Thank you anyway!

Frbod1 (talk)08:35, 19 November 2015
 
 

Found the problem. Had to delete all database entries AND the memory cache.

Frbod1 (talk)08:14, 19 November 2015
 
 

LRP (ish)

One thing I really missed from the old rumble was the LRP, but without ELO/Glicko we can't really do the whole straight-line fit any more. So instead I have added a Score Distribution image on every bot's details page. The red is APS and the green is Survival (as shown in the image mouseover). The image is directly embedded in the HTML using data URIs, so if you are using IE, only version 8 and later will work; otherwise pretty much everything supports it. I'm also planning to add this to the BotCompare page so you can analyse differences in score compared to opponent score for both APS and survival.
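For anyone curious, the data-URI trick looks roughly like this (a generic illustration of the technique, not the actual LiteRumble code):

    import base64

    def inline_png(png_bytes):
        """Return an <img> tag with the PNG embedded as a data URI, so the
        chart needs no separate image request from the server."""
        encoded = base64.b64encode(png_bytes).decode('ascii')
        return '<img src="data:image/png;base64,%s" alt="Score Distribution"/>' % encoded

    # usage: html += inline_png(open('distribution.png', 'rb').read())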

Skilgannon (talk)23:28, 10 May 2013

Ahhh, neat stuff. That's very nifty with directly embedding the image data there. For some reason the image is displaying very tiny for me though under Firefox 20.0. It gets scaled to the box around it properly under Chromium, but not Firefox.

EDIT: Nevermind... the styles.css file was being cached and that was the problem. A ctrl-r fixed it.

Rednaxela (talk)23:48, 10 May 2013

Ah yeah, the styles.css was changed so you need to do a hard-reload.

I've now added the KNNPBI to the bot-details Scores Distribution, and the bot-compare has a Diff Distribution.

Skilgannon (talk)13:50, 11 May 2013
 

There is something fishy with the chart in the right part, close to the end. If you look at the CunobelinDC score distribution above, you will see that there are no corresponding red points for the strongest opponents, while the blue and green points are there. This is quite a common theme for other bots as well.

Also, have a look at this EvBot score distribution: you will see a problem with normalizing, i.e. about 1/4 of the space in the right part of the chart has no points, which is a non-optimal use of the chart space.

Beaming (talk)16:44, 18 November 2013

Is it still showing the problem? I don't see anything wrong right now. I had some issues with (I suspect) bad bytecode and versioning, but that should be fixed now.

As for the EvBot chart, that is because in meleerumble nobody gets higher than ~75%, so the top 25% is empty. Although I guess I could normalise to the top score, I'd rather have the charts consistent as better bots are released.

Skilgannon (talk)17:55, 18 November 2013

Aha, I see now why melee charts were somewhat off.

But I insist that I do not see red points at X > 95% for CunobelinDC. Look at the 5 rightmost green points; I cannot locate red (APS) or blue points at the same X values. It might be an aliasing problem, or maybe the points are just on top of each other.

Beaming (talk)18:41, 18 November 2013

Green is survival, and so the X value is the average survival score of the enemy bot. The red and blue use enemy APS as the X value, not survival, and since survival scores are higher the green dots go further to the right.

I've actually thought about changing the X axis to just be enemy APS to make it easier to interpret. Or ordering the X-axis by rank instead of using APS values.

Skilgannon (talk)09:22, 19 November 2013

I've changed it so they all use APS on the X axis, so it should be clearer now.

Skilgannon (talk)11:03, 19 November 2013
 
 
 
 
 

Starting your own LiteRumble

Does anyone have some advice for starting up a custom and/or private LiteRumble? I've got a new batch of programming students that I'm leading through Robocode and I'd love to run a custom bracket with just my kids in it as I've done in years past.

Tkiesel (talk)18:50, 18 November 2014

Sure, it's easy enough.

  1. Create your own app on Google AppEngine
  2. Download and extract the code from bitbucket
  3. Change the app name in app.yaml to the name of the app you created
  4. Download and install the Google AppEngine python SDK
  5. Run the following in the code directory: appcfg.py update . && appcfg.py update batchratings.yaml
  6. This should give you an empty LiteRumble instance running on your app


Once you have a copy of LiteRumble running, all you need to do is modify the rumble client in roborumble.txt to point to your new server for uploads. You also need a new participants list, which you can host on appengine too if you don't mind continually re-deploying, or you can make a wiki page somewhere. The client just parses everything between the two <pre> tags.
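A concrete (hypothetical) example of those two changes, plus the participants-list format the client expects; the key names are reconstructed from memory, so double-check them against your roborumble.txt, and the bot entries are made-up examples:

    # roborumble.txt changes for a private rumble (illustrative values)
    RESULTSURL=http://your-app-id.appspot.com/UploadedResults
    PARTICIPANTSURL=http://example.com/my-class-participants.html

    # Participants page body; the client parses every line between the <pre> tags
    # as "package.BotName version,download-url":
    # <pre>
    # studentpkg.FirstBot 1.0,http://example.com/bots/studentpkg.FirstBot_1.0.jar
    # studentpkg.OtherBot 0.2,http://example.com/bots/studentpkg.OtherBot_0.2.jar
    # </pre>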

Have fun!

Skilgannon (talk)19:14, 18 November 2014

Excellent. I can just host participants on a Dropbox text file. Thanks for the info!

By the way, a favorite thing I do when introducing my kids to Robocode is to have a pair of them (driver and gunner) pilot sample.Interactive at a moderate simulation speed against some sample bots until they get used to it. Then they face DrussGT. Thought you'd want to know that you've caused some laughter and groans of frustration from some prospective high school coders!

Tkiesel (talk)19:23, 18 November 2014

Brilliant. I've always found the sample.Interactive very difficult to control, I don't think I'd stand a chance against DrussGT =) I bet if I set the bullet colour to something more similar to the background it would make it even harder for interactive users >:-D

Skilgannon (talk)19:30, 18 November 2014

That's always the kicker: they have a very, very hard time adapting to a top-of-the-line bot like DrussGT or Diamond. I've had students say it's like the bot is reading their mind. Then I drop the bomb that the bot can't see bullets, while the students can. It's a great and impactful "Math is POWERFUL" moment!

Of course, set the sim speed low enough and get a patient, non-wasteful gunner, and they will trash DrussGT because they can dance juuust aside of each bullet. But as long as I set the sim speed so as to keep them on their toes, it's a rough but educational ride. Fun for spectators too!

Tkiesel (talk)19:35, 18 November 2014

I have some ideas about dealing with interactive users (closer range, not letting energy levels drop below the enemy's, varying colours of dark blue and grey bullets); perhaps that should be something I work on next. I've neglected Robocode and have been working on more pure ML/AI problems instead, but this is something more on the behavioural side which AFAIK hasn't been done yet.

Skilgannon (talk)19:55, 18 November 2014
 

Awww, high school students have all the fun. XD

Chase04:05, 19 November 2014
 

The sample bot Interactive is hard to control. For 1v1, all you would really have to change in response to what you see is orbit direction, distancing, current aiming GF, and bulletpower/when to fire. Everything else could be automatic 99+% of the time.

Would anyone be interested in a SuperInteractive wiki collaboration? Perhaps a challenge for driving it against DrussGT?

Sheldor (talk)05:06, 19 November 2014

I was thinking of a fairly simple "SuperInteractive" which does regular wave-surfing, but also allows you to click on enemy bullets, which it will then dodge. Targeting, I feel, would be stronger without any human intervention.

Skilgannon (talk)05:31, 19 November 2014
 
 
 
 

Tkiesel, can I contribute to your LiteRumble please?

Tmservo (talk)03:11, 19 November 2014
 