Talk:RoboRumble/Participants
Entering rumble via infobox
You know, it would be kinda cool if people could enter their robots in the rumble via Template:Infobox Robot. Just add the appropriate data to the box and an argument that says, effectively, "Yes, enter my bot in the rumble," and boom, it gets picked up. RobertWalker 19:10, 12 December 2007 (UTC)
- That's sort of possible if, say, we added a category to the template, then had it use that category, trace the link to the bot's page, and look for its jar. However, that's a lot of extra skipping around, and it's a real strain on server resources to have to 'check' for these things. --Chase-san 20:07, 12 December 2007 (UTC)
- A couple issues with that, though not show-stopping issues: one, we have bots in the rumble with no bot pages. (Vanessa, for instance.) Two, you don't always have your latest version in the rumble, or you have to post a temporary RRGC version or something like that. With all the little caveats, the Participants page might still be the most elegant solution. --Voidious 20:13, 12 December 2007 (UTC)
Is there a reason this list isn't being used for the rumble yet? Also, is there an updated zip of the rumble bots around? (Or could someone make it? :-D) I'd like to start running battles again. -- Alcatraz 12:52, 8 December 2008 (EST)
- robowiki.net is running again, but maybe it is time to activate this participant list? --lestofante 13:40, 5 January 2009 (UTC)
- Agree. The long downtime of the old wiki can make newbies like me try many new ideas in the rumble! --Nat 11:44, 6 January 2009 (UTC)
Anyone still interested in this issue? This can be accomplished via a bot. If we had a property named rumbleLocation in the infobox, and the template automatically put any robot with that parameter into some category, I could make my soon-to-be-created bot handle that, say once per hour? For special RRGC/WSGC versions or a bot without a page, there could still be another participant list, which would be merged in when the bot runs. » Nat | Talk » 09:35, 2 May 2009 (UTC)
- Personally anyway, it seems that trying to allow entry via the infobox just overcomplicates things. I'd much prefer there be just one simple way to do it: add it to the participants page by hand. --Rednaxela 14:33, 2 May 2009 (UTC)
- I agree. The only change that I'd like to see (eventually) is to have the server maintain the participants list and bot storage. But I'm not ready to commit to programming that yet. --Darkcanuck 17:25, 2 May 2009 (UTC)
Hey man, you don't need to remove Tigger, one of us would gladly host it. Darkcanuck has posted most rumble bots to his server already, so you can make it point here if you want: http://darkcanuck.net/rumble/robots/stefw.Tigger_0.0.23.jar. Of course it's your call, but it seems a shame to remove such an old-school bot. --Voidious 14:01, 14 May 2009 (UTC)
- I wouldn't just call it an old-school bot, but also a very interesting one that gives a fair number of surfers some trouble if I remember right.. :) --Rednaxela 14:54, 14 May 2009 (UTC)
- Well, it trounces Komarious, and gives PulsarMax, Lukious, Engineer, and WinterMute a rather hard time... not sure if that's because of its unique Tile Coding things, but it might well be --Rednaxela 15:35, 14 May 2009 (UTC)
- It could be its unique stats system. Since it's a reference bot for the TC2K6, the Targeting Challenge 2K6/Results give us at least some insight into its movement. Clearly, some top bots can really zero in on Tigger, but a lot of still very strong guns have some trouble with it. And the low scores against Linear Targeting and Circular Targeting seem odd, but might mean that he always enables a flattener, or just has some anomaly or bug with a similar effect. --Voidious 15:58, 14 May 2009 (UTC)
- Well, its reference scores against linear/circular targeting don't look that weird to me. I mean, the only reference bots that do better against the simple targeting are either surfers or the tremendously well-tuned multi-mode known as GrubbmGrb --Rednaxela 16:51, 14 May 2009 (UTC)
- Tigger is a surfer, though. :-P And his score against HoT is respectable. --Voidious 17:17, 14 May 2009 (UTC)
Hey guys, should we re-enter Tigger? StefW's only reason given was about Geocities going down, so I say we do it. Any objections? --Voidious 02:04, 20 July 2009 (UTC)
- Agree. And Geocities isn't going down till October; I can still access my webpage right now. But we can use Darkcanuck's copy. I think we usually grab it from the zip files, btw. » Nat | Talk » 13:17, 20 July 2009 (UTC)
- Re-enter it. It is a decent and quite unique bot, and also a way to honour StefW for his development of the initial onPaint. --GrubbmGait 22:03, 23 July 2009 (UTC)
I just fixed a ton of them! Also, thanks to Darkcanuck for the http://darkcanuck.net/rumble/robots/ hosting of them. Now.. I hope we can keep them more fixed than they have been, as that was rather tedious :P --Rednaxela 00:25, 6 August 2009 (UTC)
I just added the fixed version of TheBrainPi; I did some testing and it didn't seem to throw any exceptions. I uploaded it to my Google site because the RoboRepository would upload it as mine and show Zyx as author, and that didn't seem right, but I don't know if it's better if Darkcanuck can host it on his server? Everything I changed in the code has a nearby comment containing the words "Unofficial fix", so it's easy to see the changes. --zyx 01:21, 5 September 2009 (UTC)
- Very cool of you to take care of that! Check out the comparison. It's little things like this that make me appreciate what a great community we have here. --Voidious 16:28, 5 September 2009 (UTC)
- Ok, the fixed version is now on my server (along with all other current 1v1 and melee bots) so you can change the link if you like. Thanks for doing this! --Darkcanuck 19:51, 5 September 2009 (UTC)
Wow, there are a lot more duplicates in the participants list than I realized. Any objections to me removing all but the highest ranking version of each of these bots?
altglass.Exterminans2oo8 alpha0328,http://d-gfx.kognetwork.ch/robocode/altglass.Exterminans2oo8_alpha0328.jar
altglass.Exterminans2oo8 Build0411,http://d-gfx.kognetwork.ch/robocode/altglass.Exterminans2oo8_Build0411.jar
am.Miedzix 2.0,http://www.robocoderepository.com/BotFiles/3383/am.Miedzix_2.0.jar
am.Miedzix 3.0,http://darkcanuck.net/rumble/robots/am.Miedzix_3.0.jar
cjk.Merkava 0.1.1,http://www.robocoderepository.com/BotFiles/2637/cjk.Merkava_0.1.1.jar
cjk.Merkava 0.2.0,http://www.robocoderepository.com/BotFiles/2640/cjk.Merkava_0.2.0.jar
cjk.Merkava 0.3.0,http://darkcanuck.net/rumble/robots/cjk.Merkava_0.3.0.jar
kurios.DOSexe .9a,http://www.kuriosly.com/roborumble/kurios.DOSexe_.9a.jar
kurios.DOSexe .9b,http://www.kuriosly.com/roborumble/kurios.DOSexe_.9b.jar
pak.Dargon 1.0b,http://www.robocoderepository.com/BotFiles/3388/pak.Dargon_1.0b.jar
pak.Dargon .2c,http://www.robocoderepository.com/BotFiles/3389/pak.Dargon_.2c.jar
paulk.PaulV3 1.7,http://www.robocoderepository.com/BotFiles/3502/paulk.PaulV3_1.7.jar
paulk.PaulV3 1.6,http://www.robocoderepository.com/BotFiles/3497/paulk.PaulV3_1.6.jar
paulk.PaulV3 1.5,http://www.robocoderepository.com/BotFiles/3496/paulk.PaulV3_1.5.jar
paulk.PaulV3 1.3,http://www.robocoderepository.com/BotFiles/3495/paulk.PaulV3_1.3.jar
zyx.micro.Ant 1.1,http://www.robocoderepository.com/BotFiles/3481/zyx.micro.Ant_1.1.jar
zyx.micro.Ant 2.1,http://sites.google.com/site/zyxsite/robocode/zyx.micro.Ant_2.1.jar
Also planning to remove "whind.StrengthBee 0.6.4", as that was just a test of Strength with CassiusClay/Bee gun. And is this "rule" actually written anywhere? (I know it's kind of a "soft rule", but I still think it's a good one, in general.)
--Voidious 22:39, 9 October 2009 (UTC)
I agree with that, it is common sense. About micro.Ant, I didn't know there were two versions of it (in life, not only in the rumble). I'm removing v2.1, but I think maybe they only share the name; there is a good chance they have no code in common, as it's been quite a while since I wrote that. --zyx
Common sense indeed. The number of participants has gone from 300 when I started to nearly 750 now, so duplicates and testbots should be removed, preferably by the author. Authors should even consider whether older bots with successors and without a 'unique' setup could be removed. (Says the man with 8 1v1 bots and 6 meleebots.) But the latter is strictly a matter for the author and not for the community. --GrubbmGait 09:51, 10 October 2009 (UTC)
Yes, that's a good idea. How about also removing bots that regularly freeze, skip many turns, take lots of memory and/or have to be stopped by Robocode? --Positive 11:05, 10 October 2009 (UTC)
Cool - I'll give this another couple days before removing any of the dups myself. zyx, that's funny =), and if they are indeed different bots, of course feel free to leave them both in. GrubbmGait, well said, I completely agree. Positive, I personally agree about bots that crash frequently (DogManSPE and SmallDevil come to mind), but IIRC, I suggested removing DogManSPE once before and met with some resistance. =) --Voidious 17:02, 10 October 2009 (UTC)
- I think you may have forgotten to do this... --Darkcanuck 03:55, 13 July 2011 (UTC)
I completely agree as well, and I will probably be removing a number of outdated or test robots of mine (such as prototype, sou, orbit, and problembot). Also duplicates and robots that crash often, have to be stopped by Robocode, or use problematic methods (such as the static reference of Advanced/Team robots). These should be removed, unless they can be repaired or some reason exists for retaining them. For example, I repaired gg.Wolverine a long time ago to keep it from losing battles due to calls to getXX, but it wasn't a simple fix. — Chase-san 14:56, 24 July 2010 (UTC)
Wow, that was a long time ago. Finally removed the ones that hadn't been already. Also skimmed the participants list and didn't notice any other duplicate versions. --Voidious 01:53, 14 July 2011 (UTC)
In the melee list, gg.Wolverine and wiki.Wolverine: wiki.Wolverine is a fixed version of gg.Wolverine (the package probably shouldn't ever have been changed from gg, now that I think about it). — Chase-san 02:10, 14 July 2011 (UTC)
Removal of crashing bots
Ugh... I was about to post here proposing the removal of DogManSPE due to its great instability, and then I read in the previous section that there was some resistance to such a removal in the past. Personally, I'm finding this one highly irritating because it always shows up with a high score diff when comparing bot versions, often a big enough difference to have a non-negligible impact on overall score. --Rednaxela 03:59, 13 January 2010 (UTC)
Looking on the old wiki for references to DogManSPE, the only mentions I see are countless complaints and comments about it being a high-PBI bot (due to it happening not to crash against them), and some mention on oldwiki:RoboRumble/RankingChat20070224, which isn't really resistance to DogManSPE specifically, I think, particularly considering that it doesn't look like it was entered by its author in the first place. --Rednaxela 04:12, 13 January 2010 (UTC)
As you may have seen in those discussions, I'm also in favor of removing DogManSPE for that reason. But I've lived with it this long... =) So whatever. --Voidious 04:33, 13 January 2010 (UTC)
I have no sympathy for DogManSPE. That said, if there were enough battles then this crashing effect would be smoothed out. There are other crashing bots stuck at the bottom of the rumble, presumably abandoned by their authors (eg. ElverionBot, Dreadknoght)... --Darkcanuck 04:51, 13 January 2010 (UTC)
Update of RoboRumble Version
What I don't get is why RoboRumble uses Robocode version 1.6.1.4 (I think, it might be slightly newer). Right now we are on release 1.7.2.0 Beta. The least we could do is use 1.7.1.6... --PiRocks
- Because the 1.7.2 line is still not stable enough. True, there are a lot of changes, but from 1.6.1.4 to 1.6.2, 1.6.2 to 1.7, and 1.7 to 1.7.1 there were a lot of very big changes, and with that it is inevitable for bugs to occur. There were at least three discussions of this, but I can't remember where they are. --Nat Pavasant 12:45, 24 June 2010 (UTC)
- Actually, unless you know of bugs that you haven't reported, 1.7.1.6 and 1.7.2.0 Beta may be stable enough (unless PiRocks' 'workaround' truly shows a security bug in the current version). I haven't had a good enough chance to fully test either, but I was testing very heavily just before the 1.6.1.4 release and I'm pretty sure 1.7.1.6/1.7.2.0 will be stable enough. Just need to test it properly. --Rednaxela 13:04, 24 June 2010 (UTC)
- Unfortunately, my workaround most definitely shows a security error. 1.7.2.0 Beta doesn't protect the main ThreadGroup, so my robot accesses it and waits for enemy robots to be scanned. Then once scanned, it calls interrupt() on their Thread and they get destroyed due to inactivity. —Preceding unsigned comment added by PiRocks (talk • contribs)
- Heh... I've created a bug report for this now. Also, I removed the bot from the rumble, because 1) it's generally not good to have exploit bots there and 2) as you noted, it's broken anyway in 1.6.1.4. Thanks for uncovering the issue. --Rednaxela 00:22, 25 June 2010 (UTC)
Removal of old bots.
I didn't want to be the one to suggest this; I dislike hanging myself in public. But I do think it is a good idea. There are so many participants now that the rankings include many very old, very poorly performing robots. We already want to remove robots which time out due to getXXXX calls, or which just don't work anymore. But consider: back in 2004 or so we only had around 350 robots in the rumble, and that number has more than doubled since.
I am all for the history, but the number of old, poorly performing, buggy or just out-of-date robots is staggering. With almost 800 participants, the number of battles needed to stabilize the rankings is rather high. All I am suggesting is a one-time trim: pick a target number and carefully select the robots we can live with dropping (and that no one complains about). I say down to at most 400.
(I don't expect many people to like this idea, however much I do)
Only time for one quick comment before bed, but... Reducing the number of bots would not decrease the number of battles required to get a stable ranking. Only removing the bots with the highest variance would do that. (Assuming you've faced everyone at least once.) Even if you are just facing one bot, it would take 1500-2000 battles to get a result as accurate as we want in the rumble. At least, I'm pretty sure that's the case. Hopefully some resident math wiz can back me up. =) --Voidious 04:35, 10 August 2010 (UTC)
I don't have time right this moment to comment on any other aspects, but I'll quickly comment on the impact of number of bots on rank stability. First off, you're most definitely correct asymptotically Voidious, in that number of participants wouldn't impact stability as the number of battles becomes very large. When the number of battles is not very large though, I can think of two effects which cause a deviation:
- The obvious one is that before pairings are complete, the accuracy/stability of the rank is reduced due to a high chance of scores against many bots not being what's expected (i.e. problembots). 500 battles with 300 participants will be more stable than 500 battles with 800 participants because of this.
- When pairings are complete, battles are not distributed randomly, and not necessarily evenly. Because each pairing is weighted equally, each additional battle added has more impact on score stability when added to a pairing with few battles so far. What this means is, say you have Robot A, which has results against Robots X, Y, and Z. If the pairings A vs X, A vs Y, and A vs Z each have 2 battles, the resulting score is more stable than if A vs X has 1, A vs Y has 3, and A vs Z has 2.
I'm not sure if factor #2 has that big an effect really, but it's non-zero. #1 is definitely a noticeable effect I'd say, though only temporary. --Rednaxela 13:16, 10 August 2010 (UTC)
- As for #2, I don't really think that matters. I agree that 1/3/2 is less stable than 2/2/2, but 2/6/4 should be the same stability as 1/1/3/3/2/2, I think. So I don't think it really pertains to number of bots. --Voidious 14:53, 10 August 2010 (UTC)
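For what it's worth, both allocation claims above can be checked with a little arithmetic, under the purely illustrative assumption that every battle's score carries independent noise of equal variance (so a pairing with n battles contributes variance s2/n to the equally weighted per-pairing average):

```python
def score_variance(battles_per_pairing, s2=1.0):
    """Variance of an APS-style score (the mean of per-pairing mean
    scores), assuming i.i.d. per-battle noise with variance s2."""
    p = len(battles_per_pairing)
    return sum(s2 / n for n in battles_per_pairing) / p ** 2

# Same six battles over three pairings, spread evenly vs. unevenly:
assert score_variance([1, 3, 2]) > score_variance([2, 2, 2])

# And the 2/6/4 vs. 1/1/3/3/2/2 comparison comes out exactly equal:
assert abs(score_variance([2, 6, 4])
           - score_variance([1, 1, 3, 3, 2, 2])) < 1e-12
```

So under this toy model, uneven battle allocation across a fixed set of pairings does reduce stability, but splitting the same allocation pattern over more pairings does not.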
Well, I imagine my opinion is pretty predictable =), but I vote not to remove old bots. Personally, I like the ridiculous diversity we have in the rumble. I like when some bot I've never heard of exposes some obscure flaw in my own bot. I like the idea of old Robocoders coming back 5 years later, checking the rumble and finding their bots still competing. And while we may have twice as many bots as 2005, our computers are 5x faster (or more) anyway.
I also don't really see the value in removing them and I just don't see a fair way to choose which bots to remove. Being old or having a low ranking doesn't seem like good criteria to me. "Buggy" is tough to gauge, and we've already removed or fixed the buggiest rumble bots (with inactive authors). If enough people are in favor, maybe we can vote. But this is not a decision to be taken lightly, so if it does come to that, I think we should give it lots of lead time (like months) to discuss and vote on it.
--Voidious 14:53, 10 August 2010 (UTC)
- I honestly do agree with you for the most part. However, I still feel there are just too many. I would much rather everyone take a gauge of their robots and remove all but the ones they want rankings for in the rumble. But considering many authors are no longer around, that isn't an entirely feasible option. — Chase-san 19:19, 10 August 2010 (UTC)
How about this: instead of choosing which robots to remove, we choose which robots to keep, which is a much less stressful and guilt-ridden endeavor. But limit it to a few dozen robots per person to add back in (nothing exact), just so no one copies the entire list and says 'all of them'. Of course, we could end up with the entire list back given enough people. — Chase-san 19:27, 10 August 2010 (UTC)
- I am in support of removing any nonworking robots (the ones that time out from getXXX or any other errors) but I think that the other older robots that still function should be kept. --Exauge ◊ talk 23:08, 10 August 2010 (UTC)
- I agree. I know that I would be a bit peeved if I came back to Robocode a few years later only to find that my bot and all the bots I had known had been removed because they were considered weak and old. On the other hand, if it *specifically* was slowing down the rumble due to crashing, timing out etc. then I wouldn't mind as much. --Skilgannon 19:03, 11 August 2010 (UTC)
- Well, to be fair, in your case I find it rather unlikely they would all be removed. Though in this case I am only suggesting a one-time trim, and if those people want, they can re-enter their robots in the rumble at that time. It might also get an older robocoder to rejoin the game (which is always nice). — Chase-san 16:45, 12 August 2010 (UTC)
- Instead of removing the robots from the lists, they could be moved to another page (old participants list). This way they are not completely gone, and we would have a kind of "history" of robots that previously took part in the rumble. Just a suggestion. --Fnl 20:42, 11 August 2010 (UTC)
- Some bots aren't downloadable, so my rumble client never uses them in fights. They're jab's bots hosted on freewebs.com and simonton's bots hosted on frozenonline.com.
- Anyone have these bots and the ability to give them a hosting home? -Tkiesel 18:00, 14 November 2010 (UTC)
Personally, I vote against sample bots in the RoboRumble. Even though right now the ranking stabilizes fast, they still add load to the server (and the server is already under very high load). Since most sample bots don't perform well in the main rumble anyway, probably beaten by most nanobots, I see no reason for sample bots in the main rumble, unlike the melee one.
I'd vote to keep them in. I think their importance as bots outweighs how much they increase the rumble size. But either way is fine with me. If there's no clear consensus, maybe we could actually have a poll... --Voidious 16:17, 4 July 2011 (UTC)
I also vote to keep them in, obviously, as I am the one who put them in. Everyone knows them, and it's funny to see how many megabots get less than 100% survival against them. And there are worse bots in the rumble and nobody complains about them. --MN 00:35, 5 July 2011 (UTC)
Unfortunately there's no "extends Robot" category in the rumble. They would be a lot more competitive in such a category. --MN 00:35, 5 July 2011 (UTC)
I'd disagree with "And there are worse bots in the rumble and nobody complains about them", because there has been complaining about those others sometimes... Overall I feel neutral about the presence of the sample bots, because they do have the redeeming quality of being decent reference points for beginners. --Rednaxela 02:59, 5 July 2011 (UTC)
I still personally think we should at some point do a major trim on the number of participants, or have a Rumble2 (with only new/er participants), but I thought they were removed for the reason Nat originally posted. If they serve as good reference points for beginners, that is fine; I won't object. — Chase-san 09:25, 5 July 2011 (UTC)
I noticed my rumble client grinds to a near halt when running gf.Centaur.Centaur against any other bot; it's slower than pitting Diamond against Shadow. Anyone else getting similar behavior? — Chase-san 06:36, 17 July 2011 (UTC)
- I haven't noticed whether or not Centaur is slow for me, but just as a note, "Diamond against Shadow" is not something I'd expect to be hugely slow anyway, since Shadow is one of the fastest high-ranking bots last I checked. --Rednaxela 07:06, 17 July 2011 (UTC)
- Diamond's not particularly slow either, at least compared to other surfers... Anti-Surfer Challenge/Pre-Chat has a good comparison of CPU speed of various surfing movements, though it might be slightly out of date (like I bet DrussGT is faster now). --Voidious 15:30, 17 July 2011 (UTC)
- Yep, Centaur makes my system crawl, even against a samplebot. Battles between two topbots can take longer, not particularly because they are slowbots, but because they perform on par, stretching battles to the last drop of energy. --GrubbmGait 16:47, 17 July 2011 (UTC)
- Sorry didn't mean to insult anyone with my reference. Two top surfers vs each other are usually the slowest battles. I just picked two I particularly liked. — Chase-san 17:07, 17 July 2011 (UTC)
- Centaur 0.6.5 is indeed very slooooow to run. If I didn't have other things to do, I would go play with it manually and see if it's skipping a lot of turns or doing other odd things. -- Skotty 02:31, 19 July 2011 (UTC)
| Thread title | Replies | Last modified |
| Down to #9 | 3 | 05:01, 9 January 2018 |
| On removing bots worse than SittingDuck | 16 | 07:37, 8 September 2017 |
| Remove bots with underscores in version numbers? | 11 | 15:37, 3 September 2017 |
| Entry Time | 2 | 16:59, 6 April 2016 |
| fix/remove missing bots | 4 | 04:24, 5 October 2015 |
| uji.SiberianKhatru 2.0 | 0 | 02:18, 29 September 2015 |
| Add a bot to the rumble | 2 | 14:59, 7 October 2013 |
| Dropbox offline? | 1 | 16:32, 30 May 2013 |
| Java 7 | 4 | 03:23, 17 December 2012 |
| Suicidal Bots | 0 | 15:15, 20 September 2012 |
| Toad vs UberBot | 0 | 15:00, 17 August 2012 |
| MogBot and Hamilton | 2 | 02:35, 6 June 2012 |
| Bots without hosting | 6 | 09:06, 11 May 2012 |
| lxx.Tomcat 3.55 vs cs.ags.Scarlet 1.1c | 21 | 15:50, 29 February 2012 |
| Bot not uploading | 0 | 21:42, 26 February 2012 |
| Missing racso bots | 0 | 00:15, 3 November 2011 |
| Trouble with Nucleii.ED4 1.0 | 1 | 22:24, 29 September 2011 |
| Supersample.SuperCrazy and supersample.SuperCrazy | 2 | 06:49, 6 September 2011 |
Wow, I look away for a year and I am knocked down to 9th place. I might have to make a new robot, if only I had more time these days.
There are a few bots doing worse than sample.SittingDuck, but IMO we should keep them, or at least not remove them all in one step.
1. Removing a lot of bots from the roborumble in a short period of time breaks the comparison between different versions of every bot in APS, Survival, PWIN, etc., as they all depend heavily on the distribution of the participants. Doing so makes the recorded APS, Survival and PWIN in version histories useless, and you can't even reload the scores, as the scores against those removed bots are still counted in inactive versions.
2. They are a great indicator of whether your bot has some serious bug that happens rarely. And we DO need enough weak bots to have the chance to trigger it.
3. They don't waste time in the roborumble, as they die too fast, making a battle finish almost immediately.
OK, reducing load is good.
Anyway, I think we should not remove any bot except your own without discussion, unless those bots are violating rules or completely broken (e.g. invalid package).
Now, maybe we could add another rule: if your bot does worse than sample.SittingDuck, it may have some serious errors, and therefore may be removed to reduce the load for everyone who runs the roborumble.
There are also some bots that try to do as badly as possible, so do worse than SittingDuck by not getting survival bonus etc.
WoW that's amazing
Now I'm considering a new bot that ranks last in the rumble: aaa.WorstBot. I think I can use ReversedWaveSurfing to make sure the enemy hits me 100% of the time, and when he is not firing or ramming, go hit the wall and hit and hit and hit the wall, while avoiding ramming into the opponent accidentally.
The reversed rumble is as interesting as the normal one, I think we should permit this.
Well, maybe I removed them somewhat hastily. But I got tired of fixing broken download links, and was looking for an excuse to trim the list. If authors do not care about their bots, why should I? Though, among the removed ones, I think only Galaxy had a broken link.
We do need a procedure to remove bots with dangling links. Without it the rating will be in an "unstable" state forever. Look at the rumble: quite a lot of bots are missing about 10 pairings right now (2017/09/07). Why? Because their opponents are not downloadable.
There are bots which the community would always take care of: former champions, open source bots, and bots which have a wiki page describing their logic, i.e. the ones from which we can learn. But if a bot has no wiki page, is closed source, and its link has expired so that it blocks rating stabilization, then I am tempted to come at it with a big eraser.
1. I would still encourage the removal of the "worst" bots from the main rumble. Everybody gets 100% against them, so they do not change the resulting APS too much, since there are only a handful of them. If we want to compete at being the "worst", we can start a separate rumble.
2. We might worry about the APS change, which is used for comparison. But it will be different anyway, because there is always (well, often) a new bot or a new version of a bot entering the competition, so the old APS slowly loses its value anyway.
1. Non-downloadable opponents are not a problem, as either we have an archive copy, or no one could have published a score against it. Therefore I think we should keep every bot as long as possible, unless we can't find any valid link for it.
2. A separate rumble for the worst bots is completely different from competing together, unless we copy all the normal rumble bots into the worst-bots rumble as well, which is an even bigger waste.
3. Even if the historical APS is losing its value anyway, it does so slowly, therefore it's not a big problem. But removing a lot of bots all at once is a BIG PROBLEM.
Let me argue that missing jars are the problem. Suppose you enter a new bot in the rumble, and suppose that the 10 weakest bots are inaccessible. Then this new bot will have an overall smaller APS than some old bot which already had a chance to pair up with the weak ones.
So your APS would be smaller, not because your bot is weaker but because bots are missing. LiteRumble will show this as "Rankings Not Stable", but it's probably not what you want.
The opposite will happen if 10 strong bots are missing.
Thus I am pushing for removal of missing bots, so the ranking is done on the same set.
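This bias is easy to illustrate with invented numbers: suppose a bot averages 60 APS against typical opponents and 95 against each of the ten weakest, while a new bot of identical strength simply cannot download those ten:

```python
# Hypothetical field of 1000 pairings; all numbers are invented.
old_scores = [60.0] * 990 + [95.0] * 10   # old bot: full pairings
new_scores = [60.0] * 990                  # new bot: weak ten missing

old_aps = sum(old_scores) / len(old_scores)
new_aps = sum(new_scores) / len(new_scores)

# The new bot looks weaker purely because jars are missing.
assert new_aps < old_aps
```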
This only happens when no one who runs the roborumble has access to those bots. And the real problem is our priority battle algorithm: it takes more than 3000 battles to guarantee full pairings, but the default battles threshold is 2000, which typically leaves new bots with missing pairings.
Unstable rankings have nothing to do with those inaccessible bots.
I suggest we tweak the priority battle algorithm a little: when no one has battles below the threshold, prioritize on pairings. Only when everyone has full pairings, run random battles.
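That prioritization could be sketched roughly like this (hypothetical code with invented names; the real roborumble@home client differs in detail):

```python
import random

def pick_battle(bots, battles, pairings, threshold=2000):
    """Pick the next pairing to run: first bots below the battle
    threshold, then missing pairings, then random battles.

    battles  -- dict bot -> total battles run so far
    pairings -- dict frozenset({a, b}) -> battles for that pairing
    """
    # 1. Any bot still below the battle threshold gets priority.
    low = [b for b in bots if battles.get(b, 0) < threshold]
    if low:
        a = min(low, key=lambda b: battles.get(b, 0))
        return a, random.choice([x for x in bots if x != a])
    # 2. Otherwise fill in missing pairings.
    missing = [(a, b) for i, a in enumerate(bots) for b in bots[i + 1:]
               if pairings.get(frozenset((a, b)), 0) == 0]
    if missing:
        return random.choice(missing)
    # 3. Full pairings everywhere: plain random battles.
    a, b = random.sample(bots, 2)
    return a, b
```

With step 2 running before random battles resume, a new bot would not sit with missing pairings once every client can download its jar.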
However, the first problem to solve is that we make sure every bot is accessible to everyone.
And NOT everybody gets 100% against the worst bots, e.g. ags.RougeDC gets around 50% against aaa.WorstBot, which must indicate a bug.
And removing ten 100%-APS bots will decrease everyone by 1.0 APS, which is considerable. Although the ranking is not affected much, doing so destroys the existing meaning of APS ranges, e.g. 90 used to be the barrier of the extremely strong bots, but if we remove ten 100%-APS bots, the barrier becomes 89. A big shift in everything is very inconvenient.
A lot of bots are missing 10 pairings simply because 10 bots were removed temporarily, and it takes a lot of time to add the pairings back.
Therefore we should really take caution when removing bots: it takes less than one second to remove one, but almost a day to add it back.
> Well, maybe I removed them somewhat hastily.
I didn't think it was hasty at all. On the other hand, I was glad that finally someone was taking the initiative.
I personally agree with Beaming's argument. However, it doesn't matter that much to me, so I won't waste breath arguing about it.
My thoughts on removing are that we should only remove if:
- We don't have a .jar in the robocode-archive
- It is broken in new JVM / Robocode versions, and is closed source so we can't try to fix it, or is open source but the fix would be really complicated, or the bot has no historical significance and nobody cares.
Shall we instruct the lite client to look at the robocode-archive automatically if the primary link is down?
I can probably cook up a patch for it rather quickly.
Maybe better would be a script which transforms the Participants page? This way we aren't hardcoding new URLs in the client that will need more maintenance, and we can easily run the script again to see which bots need fixing, and we know from the changelog of the participants page which ones don't have hosting anymore. This script could be made available on the wiki page. And it is 100% compatible with current rumble client.
It seems that the roborumble thinks SimpleBot 0.023h_knn is a bot named "SimpleBot_0.023h" with version number "knn", and fails to retire it from the rumble.
How can I remove it from the rumble anyway?
Do not worry, it is already gone. It just takes a bit of time for the rumble to process it.
By the way, you can add a flag for your bot.
No, it is still there. And it is not removed, because LiteRumble says: ERROR. name/game does not exist: aaa.SimpleBot_0.023h knn/roborumble
If you open http://literumble.appspot.com/Rankings?game=roborumble and search, there will be 3 SimpleBots, as the old ones failed to be removed.
The problem is not in roborumble@home, as the request body is version=1&game=roborumble&name=aaa.SimpleBot_0.023h_knn&dummy=NA, which is the raw data.
But the server handles version numbers with underscores badly; it thinks the bot is "aaa.SimpleBot_0.023h knn".
Hmm, yeah, LiteRumble itself seems to see the versions correctly. The rankings, anyway. I'm not sure if it's a client bug encoding the URL or the bot removal endpoint on the server that has a parsing bug (or both, even). In any case, I think we may need Skilgannon to manually retire the old bots, or maybe tell us how to do it.
Hey, sorry for the delay. This has been a longstanding problem with the rumble client. In some interfaces it replaces the spaces with underscores, and in others it doesn't and expects the server to do it. In order to be compatible I had to replace the last _ with a space in the server. It can't be the first _, because many bots have _ in their names. However I hadn't considered what would happen if we have _ in the version. Manually removing is a little complicated, I think I'd actually have to patch the code. However that doesn't prevent this from happening again.
I guess maybe I need some more intelligent logic in splitting the version and the name. Maybe this should be done as a challenge? It is a fairly 'soft' problem, and we have a bunch of input data (the rumble population) that we can verify the algorithm on. Bonus points if it is written in Python.
So the basic requirements are:
- Has the same behavior as 'replace last _ with space' for the current rumble population
- If you add an _suffix in the version number of any of the current bots, correctly detect that it is in the same prefix version, and the suffix you add is part of the version.
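A rough Python sketch of such a splitter (taking up the bonus-points suggestion). The heuristic here, splitting at the first underscore that is followed by a digit, is my own assumption, not a verified solution: it would misfire on bot names that themselves contain an underscore followed by a digit, so it would need checking against the actual rumble population before use. `split_bot` is a hypothetical name.

```python
import re

def split_bot(raw):
    """
    Split a rumble identifier like 'aaa.SimpleBot_0.023h_knn' into
    (name, version). Heuristic: the version is assumed to begin at the
    first underscore followed by a digit; everything after that
    underscore is the version, even if it contains more underscores.
    Falls back to the current 'replace last _ with space' behaviour.
    """
    m = re.search(r"_(?=\d)", raw)
    if m:
        return raw[:m.start()], raw[m.start() + 1:]
    # fallback: treat the last underscore as the name/version separator
    head, _, tail = raw.rpartition("_")
    return (head, tail) if head else (raw, "")
```

For most current bots (versions starting with a digit) this should agree with the last-underscore rule, which is requirement one; requirement two is satisfied because extra underscores inside the version stay in the version.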
However, I think this problem can be solved by simply trying all of the combinations when removing a bot fails ;) The problem seems to happen mostly when removing bots.
When removing bots, first try the canonical method; if that fails, try everything that matches the name with '_' replaced by `[ _]` (a regex that matches space or underscore).
Say, if it failed to remove "SimpleBot_0.023h knn", try /^SimpleBot[ _]0\.023h[ _]knn$/ until success ;)
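That fallback matcher could be sketched in Python like this; `removal_pattern` is a hypothetical helper name, and whether the server could actually enumerate stored names this way is an assumption about LiteRumble's internals.

```python
import re

def removal_pattern(stored_name):
    """
    Build a regex that treats every '_' or ' ' in the stored bot id as
    interchangeable, so a removal request for 'SimpleBot_0.023h_knn'
    can still match a record stored as 'SimpleBot_0.023h knn'.
    """
    parts = re.split(r"[ _]", stored_name)
    body = "[ _]".join(re.escape(p) for p in parts)
    return re.compile("^" + body + "$")
```

The server would try the canonical removal first, and only fall back to scanning stored names with this pattern on failure, so the extra cost is paid only in the rare broken cases.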
How did you even get an underscore in the version field?
The robot packager says "Please enter a version number for this robot (up to 10 word chars: letters, digits, dots, but no spaces)", and greys out the "Next" button if you type an underscore.
I never tried packaging my bot with Robocode ;) I use a utility which transpiles my bot to Java 6 (I'm using Java 8 and its features) and packages it at the same time.
> I use a utility which transpiles my bot to Java 6
You can just compile with:
javac -target 1.5 -source 1.5 [...]
How long does it usually take between listing a robot under Participants and having it participate in the RoboRumble?
It depends on whether people are running RoboRumble clients, but I can see (here) that some people are, so it should be pretty quick. I checked your bot and was able to fix the Dropbox link - it needs to be a download link directly to the JAR. Hopefully that does the trick. Good luck in the Rumble!
There is something wrong with your bot. I see the following line in the roborumble console:
Participant ignored due to invalid line: Grystrion.Grystrion2_1.0
Once you submit your bot to the wiki, it is a good idea to run the roborumble on your own computer. First of all, you can see whether the roborumble can download and execute your bot. Secondly, you would help run the competition, easing the load on other people's CPUs.
Would you please elaborate? The bots for download are provided by the rumble, so removing them locally on a client would not change that they are missing a pairing. Am I missing the idea?
Also, there are at least four new bots which are less than a week old, 2 by me and 2 by Cyragia. These are still getting pairings.
That being said, there are a couple of bots which my client is unable to download.
Sorry, I meant to remove them from the participants list (or cut/paste them to the "broken bots" part at the bottom). There are 1112 bots total. If you look at the rankings sorted by number of pairings, there are the 4 new bots by you and Cyragia at the top (with 1099 or 1104 pairings), then a few more between them and the bots with full (1111) pairings. So those bots are not getting battles vs the new bots, I guess. I thought it would be easy for whoever is actively running a client to see which bots are unable to download and go ahead and remove them. Or maybe they are available from Rednaxela's archive, if you feel like digging around to fix the URL. I'll try to get to it soon if nobody else does, but I don't have a rumble client setup right now.
I cleaned the rumble; there were a couple of bots which my client could not download. But there were also a couple which report "an invalid robot or team". I have no clue what that means, but I tend to blame my client, since they are downloadable and there were some relatively fresh pairings (a month old) for them. But I removed them anyway, since de facto I am the only one persistently running battles.
I did not change anything in melee, since, right now, it looks like every bot has a pairing with every other one.
I can't download bots from Dropbox submitted in the last few days. Is it offline?
Is Java 7 still a no no for Rumble bots? I'm about to update my development machine to Linux Mint 14, and wondering if I should go for OpenJDK 6 or 7.
Thanks for the info.
Do OpenJDK 6 and 7 play nice installed on the same system at once? I've got a game that has issues with OpenJDK 6. Maybe it's a sign that I need to just stick with Robocode over other distractions.
Oh, and having a great time with student LEGO robots running RobotC. It's been a trip adjusting to C's quirks, and oh how I miss objects! *chuckles*
I think you can just use Java 7 and just make sure Eclipse (or whatever) is set to compile with Java 6 compatibility.
Lego robots sounds pretty fun!
Whoops - apparently I triggered an actual competition for the worst-scoring bot ever... nice try on that one, conscience.Suicidal! But given that it achieves this only by wasting computing time like hell, I suggest to ban this approach, and remove conscience.Suicidal from the participants list.
(Also note that the bot doesn't actually achieve zero points; it just manages to be thoroughly disqualified, preventing its points from being included in the score table ("SYSTEM: No score will be generated"). Therefore, cli.PerfectLoser still remains unchallenged :-P)
Toad vs UberBot is the only pairing missing from General 1v1 right now. I excluded UberBot before bed because it was failing for me (guessing it's Java 7). And IIRC DrussGT vs Toad got no battles until my clients joined in yesterday. So it may be that other clients are having trouble with Toad, and I'm having trouble with UberBot, so that pairing never gets battles.
I have both bots excluded in melee/single as well. They crash my systems (Mac/Linux) every second battle or so. MogBot has a serious thread file-writing issue, and I was no longer in the mood to deal with this bot crashing my system in the middle of the night. So +1 for removing it from me.
Yeah, maybe we should just remove them. It seems like something is removing them from the rankings, as well. (I don't see them right now - was watching for when Diamond 1.7.9 would get its remaining battles vs bots I've excluded.) I've got extra.Sauce and extra.LightSauce excluded too.
Maybe I'll also take a look if they're open source and if there's something I can do to post fixed versions to the rumble. MogBot is one of the earliest Pattern Matching bots, so I'd love to keep him around - I learned about PM from the author's tutorial many moons ago. =)
Yep, I removed MogBot from the meleerumble. MogBot is open source, so if you find something to change the thread writing stuff, feel free. Nice story about MogBot; for me it was until now just a really annoying bot, but with this history, well, I changed my mind.
I figured out that if you exclude bots from the rumble and you are the main contributor, the rumble server removes these bots from the rankings until they get new battles. Otherwise there are a lot of bots in the participants list that I don't have excluded but that also don't get battles.
Hi folks. There are several bots in the 1v1 Rumble that aren't in the superpack and also seem to be hosted on domains that have been dropped. I see several at rednaxela-robocode.dyndns.org that can't be downloaded, for example. Can anyone give these bots a good home?
-Tkiesel 13:18, 3 May 2012 (UTC)
Hi mate. Have you downloaded the "Participants_20120307.zip" from Start with RoboRumble? It's under the 1vs1 database section. This one is just 2 months old and should have all the bots you need.
You are right to give the bots a new home, but till then the database will do :)
Thanks for the heads up! When I get the time, maybe I'll give the relevant bots a home on my DropBox account!
-Tkiesel 14:43, 3 May 2012 (UTC)
Sorry about how rednaxela-robocode.dyndns.org got messed up. It's on a stable server, but there were some shenanigans with the DNS. To make amends, I've created something that hopefully will be quite useful to the Robocode community, which I will ensure is very stable into the future.
http://robocode-archive.strangeautomata.com/ now provides an archive of all robots in the rumble, which automatically updates on an hourly basis. The "participants-latest.zip" file always contains all robots currently in the rumble (historical versions trimmed out), but the "robots" directory there contains historical versions as well.
That said, I'd like to note that there are 4 robots that I am missing and cannot locate in any available zip file or site:
- maribo.IotaCT 1.0
- maribo.Omicron 1.0
- extra.Sauce .01
- extra.LightSauce 0.01
Hi mate. This sounds really great. I have put some of the missing bots here - [missing Bots]. And I'll change the Start with RoboRumble links to your server later.
I have maribo.IotaCT 1.0 and maribo.Omicron 1.0 on my computer. They were put up on RobocodeRepository recently, so they aren't in the superpack. I don't know where else to upload them. Maribo
It's strange - I have no problems in the same environment: java version "1.7.0" Java(TM) SE Runtime Environment (build 1.7.0-b147) Java HotSpot(TM) 64-Bit Server VM (build 21.0-b17, mixed mode)
Can you do some more investigation (Tomcat vs Druss & Scarlet vs Druss etc., and see in which battles memory usage is higher) to detect which robot eats memory? If it's Tomcat, I will try to fix it.
Scarlet uses lots of memory. The longer the battle the more memory used. So battles against top bots will use the most.
Made some tests. I set -Xmx2g, tested the pairings below twice each and tracked how much memory the JVM allocates:
Those numbers are maximum heaps. The actual used heaps are usually between 30% to 70% of those numbers.
So it seems both Tomcat and Scarlet use a lot of memory... hmm. I can see how a battle of one vs the other could cause problems on a memory limited machine. What are the maximum heaps used when run with -Xmx512MB ?
The reason Scarlet uses so much memory is it is just two bots that have been welded together. Each side manages its own information and systems.
Why would that matter? My gun and movement don't really share any data. I might save on the size of a few class files they share, but that's negligible (the whole JAR file is a couple hundred kb).
Unless there's two movements and two guns being initialized and/or processed, but I'd be surprised if that's how you guys have it setup, and it would be easy to fix.
I also did some investigation:
- Tomcat 3.54a vs Tomcat 3.54a: ~320 mb
- Scarlet 1.1c vs Scarlet 1.1c: ~300 mb
- Druss vs Druss: ~129 mb
- Diamond vs Diamond: ~116 mb
These numbers are the smallest used memory shown by Robocode after GC. So both Tomcat and Scarlet are memory-consuming robots. I tried to optimise Tomcat, but did not have any success. Does anybody have ideas how I can detect which part of Tomcat uses the most memory?
Are you using fasttrig? If so, how big are your arrays?
Are you using multi-dimensional arrays? It is very easy to use LOTS of memory. I have many multi-dimensional arrays, but use a lazy initialisation so that memory is only used if it gets written to first.
If you have your own kd-tree, check that it is splitting how you want it to. If it only splits one point off each time, it will get deep very quickly. This uses lots of memory, as each object needs a container and a bounding box.
Check if you have static arrays/lists that could be re-initialised after each round.
Don't keep logs of data that won't get used again. Once it won't be used (e.g. at the end of the round), clear it explicitly by setting it to null or calling .clear().
Check you don't have circular dependencies. Make sure you break the dependency if you delete something.
These are my first thoughts =)
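The bots themselves are Java, but the lazy-initialisation idea from the list above (only pay for cells that are actually written to) can be illustrated with a quick Python sketch; `LazyStats` and its shape are hypothetical, not anyone's actual bot code.

```python
from collections import defaultdict

class LazyStats:
    """
    Sketch of lazy initialisation for a large multi-dimensional stat
    buffer: instead of pre-allocating, say, a 10x10x10x10x31 array up
    front, store only the cells that are actually written to.
    """
    def __init__(self):
        # a cell is allocated only on its first write
        self.cells = defaultdict(float)

    def add(self, *indices):
        self.cells[indices] += 1.0

    def get(self, *indices):
        # reads of never-written cells cost no memory
        return self.cells.get(indices, 0.0)
```

In a Java bot the analogous trick is allocating inner arrays of a jagged array only when a segment is first hit, rather than building the full grid in the constructor.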
- I use fast trig, but initially Tomcat eats less than 100 mb of memory
- I do not use multi-dimensional arrays
- I use Rednaxela's kD-tree and my own R-tree. Splitting in the R-tree works fairly well
- I keep static links only to movement & gun logs.
I tried to change double to float everywhere, but it did not have any influence on memory usage. My problem is that I have 16 logs with visit data, 16 logs with hit data & 6 logs with tick data, but I cannot find a way to reduce that without APS degradation. But anyway, thanks for the response :)
I can increase the maximum heap in my clients, although it will change RoboRumble "rules". So I would like the opinion of the people here.
Not much to add, lots of good suggestions already, but do you keep all your waves around forever? I clear the objects between rounds because all the pertinent data is in the kd-trees. I have more than a few trees in both gun and movement, too, for what it's worth. Do you have a big object as the "value" in your trees?
"static links only to movement & gun logs" and your R-tree are what jump out at me. I would try chopping off certain parts of your bot and seeing which brings your memory usage down to earth, though some kind of heap profiler (like MN suggested) would be more elegant. =)
Off-topic: Circular dependencies shouldn't matter, at least in modern JVMs. If the only references are within a circle unreachable from the root of code execution, it will be garbage collected.
No, all waves, bullets, targets etc. are dropped at the end of each round. But yes, the "value" objects are pretty big; thanks, I will think about it. I do not understand your point about my R-tree.
I don't know about the Nene side of the code, but on the RougeDC side, Scarlet's gun is somewhat heavy on memory. IIRC it's partially because of its folded single-tick pattern matching. I might look at some heap profiling to see exactly how the memory usage is broken down some time, but I'm pretty busy these days.
In general, I kind of wish Java allowed the security manager to deal with memory usage, because then the robocode engine could have a 'real' rule about memory use. Instead we're just kind of left with "not too much please".
Yeah, I was wondering if it were at all possible to limit memory per bot with the Java classloaders... I guess not? It sucks that one of these two bots is probably being punished just for using more memory than average, even though it might still be well short of its fair share of 256 megs.
Though if we used per bot limits in Melee, I wonder if a lot of us would have issues. You can probably get away with far more than your "fair share" of 51 megs in Melee, since you're usually not facing 9 other memory hungry bots.
Yeah, so far as I can tell java classloaders have no control over object initialization.
Well, for melee, do keep in mind that the default heap size in the shell script that launches robocode is larger so it would be more than 51 megs per bot (can't remember exactly what off hand)
I can only think of a dedicated JVM per bot to achieve this kind of limit. This is how it is done in Java clouds out there.
The challenge comes when you try to integrate the JVMs together to calculate battle state and skipped turns.
The robocode engine has already gone through some changes to effectively support a similar (remote-ish VM) situation for .NET bots. I imagine that past work would make it less of a problem to use per-bot VMs in Java. The biggest issue may be latency between the VMs slowing down the turns.
Is anyone else's rumble client having issues loading Nucleii.ED4 1.0? My client says it's an invalid robot, which is keeping it from doing much when there are no priority battles for robots under 2K battles.
Is the rumble client case insensitive? It can't download supersample.SuperCrazy if Supersample.SuperCrazy is already downloaded...
A compatibility failure between the downloader's 'check if jar exists' file check and Robocode's 'load given robot jar' file loader. The former is case-insensitive (probably had a good reason for it at the time), whereas the latter is not.
Now that I think about it, the former may just be a "check if robot exists in robot database", rather than a file check.