Upgrade client version
Just removed ScalarN d.160
The only related change in d.160 is experimenting with lambda expressions.
And it seems that codesize.jar cannot analyze bots that use lambda expressions, so it returns invalid results.
We should disable 1.9.3.3 and wait for the codesize fix, along with my melee pairing fix.
I submitted an issue: https://github.com/robo-code/codesize/issues/3
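For reference, any lambda, however small, compiles to an invokedynamic instruction, which is what codesize.jar stumbles over. The bot below is just a made-up minimal illustration (not the actual ScalarN source) of the kind of code that triggers the problem:

<syntaxhighlight lang="java">
import robocode.AdvancedRobot;
import robocode.ScannedRobotEvent;

import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration, not the actual ScalarN code: the single lambda
// below compiles to an invokedynamic instruction, which codesize.jar
// (built on BCEL 5.2) cannot decode when scanning the class file.
public class LambdaBot extends AdvancedRobot {
    private final List<Double> bearings = new ArrayList<>();

    @Override
    public void onScannedRobot(ScannedRobotEvent e) {
        bearings.add(e.getBearing());
        // Java 8 lambda: this is the feature the old analyzer chokes on.
        bearings.removeIf(b -> Math.abs(b) > 90);
    }
}
</syntaxhighlight>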
The codesize utility currently uses BCEL 5.2, which cannot handle Java 6, 7, and 8 features properly. And the roborumble client has a logical bug: it treats a codesize calculation failure as codesize 0, which lets the bot participate even in NanoRumble. The correct logic is to treat a codesize calculation failure as infinity, so the bot can only participate in MegaRumble.
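Something along these lines is what I mean by treating the failure as infinity. This is only a sketch with made-up class, method, and constant names, not the actual roborumble source; the weight-class bounds are the commonly used rumble limits:

<syntaxhighlight lang="java">
import java.io.File;

// Hypothetical sketch of the corrected behaviour, not the actual roborumble code.
final class CodeSizeCheck {

    // Exclusive upper bounds commonly used for the rumble weight classes (bytes).
    static final int NANO_LIMIT  = 250;
    static final int MICRO_LIMIT = 750;
    static final int MINI_LIMIT  = 1500;

    // Returns the measured code size, or Integer.MAX_VALUE ("infinity") when
    // measurement fails, so an unanalyzable bot can only qualify for MegaRumble.
    static int safeCodeSize(File botJar) {
        try {
            int size = measureCodeSize(botJar);
            return size > 0 ? size : Integer.MAX_VALUE;
        } catch (Exception e) {
            // The old behaviour effectively returned 0 here, which let the bot
            // into every weight class, including NanoRumble.
            return Integer.MAX_VALUE;
        }
    }

    static boolean eligibleForNano(int codeSize)  { return codeSize < NANO_LIMIT; }
    static boolean eligibleForMicro(int codeSize) { return codeSize < MICRO_LIMIT; }
    static boolean eligibleForMini(int codeSize)  { return codeSize < MINI_LIMIT; }

    // Placeholder for the real call into the codesize utility; assumed to throw
    // or return a non-positive value when the class files cannot be analyzed.
    static int measureCodeSize(File botJar) throws Exception {
        throw new UnsupportedOperationException("wire up to codesize here");
    }
}
</syntaxhighlight>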
Codesize 1.2 for Java 7, 8, 9 (Experimental) is ready.
Do we need to fix other stuff for RoboRumble before I make another release?
The melee pairing fix is now fully tested and available as a pull request:
https://github.com/robo-code/robocode/pull/13
discussions:
http://robowiki.net/wiki/Thread:Talk:RoboRumble/Upgrade_client_version/reply_(2)
Your pull request has now been merged, and I have assembled a beta version and put it here: https://robocode.sourceforge.io/files/robocode-1.9.3.4-Beta-setup.jar
If you are satisfied with this version, I will make the release. :-)