Congrats!

Congratulations on releasing this!

I've got it working already. Super easy.

It is moving along noticeably quicker than RoboResearch! Awesome work!

    Tkiesel 05:17, 26 July 2012

    Cool, so nice to hear! =) I think some people will miss the RoboResearch GUI, but maybe I or someone else can add one sometime. And there are still quite a few little things left on the to-do list. But I'm pretty happy with it. =)

      Voidious 05:28, 26 July 2012
       

      Hi Voidious. Nice program you've put together there, respect. I'm really excited about the melee feature. I had a couple of tries with RoboResearch but never got it to work with melee benchmarks. Easy to install and use, nice job.

      Have you thought about some sort of dynamic score output? For me it would be very useful if I could write my own benchmark scoring, because in melee it is sometimes better to get a score view over certain battle stages, like start/middle/end game, or a score against every opponent on its own. If you have, for example, Diamond :) and some sample bots together, I would like to know how much score I lose to the samples (or weaker bots in general) if a top bot is on the field.

      I will have a look at the sources, and maybe it is possible to make the scores dynamic. Maybe you have something in mind and we could share some ideas. I'm very fond of the idea of having a nice and easy melee test platform.

      The remote client feature of Jdev's Distributed_Robocode would be awesome. Unfortunately it seems to need Java 7 and is therefore out of my reach.

      Take Care

        Wompi 09:11, 28 July 2012
         

        Hey Wompi, thanks for the thoughts. The score output could definitely use a lot more features/options; it's pretty bare bones right now. You also make me realize that I don't even store per-bot scores in the data file, so I'll need to fix that first. One thing is, as much as possible, I want to make the right decision automatically about how to show scores instead of making you remember lots of settings, but in cases where different things make sense to different people, I'm OK with adding optional flags or whatever.

        So for Melee, right now we have something like:

          voidious.Diamond 1.8.4.x12 vs abc.Shadow 3.84i, sample.Crazy 1.0, positive.Portia 1.26e: 61.02, took 34.8s, avg: 59.93
        Overall score: 55.34, 1.5 seasons
        

        So maybe if there's more than one opponent, we'd add the per-bot scores, each on their own line, after that? Like:

          voidious.Diamond 1.8.4.x12 vs abc.Shadow 3.84i, sample.Crazy 1.0, positive.Portia 1.26e: 61.02, took 34.8s, avg: 59.93
            vs abc.Shadow 3.84i: 55.05 (22000 : 19000), avg: 53.70
            vs sample.Crazy 1.0: 90.1 (22000 : 2000), avg: 90.2
            vs positive.Portia 1.26e: 53.43 (22000 : 20341), avg: 54.15
        Overall score: 55.34, 1.5 seasons
        

        What do you think? Would you also like to see bullet damage / survival data? I always collect all the different fields for scoring, but for the most part I was only going to show whatever you had configured as the scoring style. But I've been thinking lately it might be nice to show bullet damage / survival too.
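
        Just to sketch what I mean (made-up class and field names, not the actual RoboRunner code), I'm picturing keeping running totals per opponent and printing one of those "vs" lines for each:

          import java.util.LinkedHashMap;
          import java.util.Map;

          // Sketch only: per-opponent running totals for the proposed "vs <bot>" lines.
          // Not actual RoboRunner code -- names and score semantics are made up.
          public class PerBotScores {

              private static class Totals {
                  double lastPercent;   // score % from the most recent battle vs this opponent
                  double myTotal;       // my bot's cumulative raw score vs this opponent
                  double theirTotal;    // the opponent's cumulative raw score
                  double sumPercent;    // sum of per-battle score percentages, for the average
                  int battles;
              }

              private final Map<String, Totals> byOpponent = new LinkedHashMap<String, Totals>();

              // Record one battle's raw scores against a single opponent.
              public void addBattle(String opponent, double myScore, double theirScore) {
                  Totals t = byOpponent.get(opponent);
                  if (t == null) {
                      t = new Totals();
                      byOpponent.put(opponent, t);
                  }
                  t.lastPercent = 100.0 * myScore / (myScore + theirScore);
                  t.myTotal += myScore;
                  t.theirTotal += theirScore;
                  t.sumPercent += t.lastPercent;
                  t.battles++;
              }

              // Print "  vs <bot>: <latest %> (<my total> : <their total>), avg: <avg %>" lines.
              public void printPerBotLines() {
                  for (Map.Entry<String, Totals> e : byOpponent.entrySet()) {
                      Totals t = e.getValue();
                      System.out.printf("  vs %s: %.2f (%.0f : %.0f), avg: %.2f%n",
                          e.getKey(), t.lastPercent, t.myTotal, t.theirTotal,
                          t.sumPercent / t.battles);
                  }
              }
          }

        (The exact fields would depend on what ends up in the data file.)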

          Voidious 14:28, 28 July 2012
           

          Oh, and about the options for mid-battle score data, that sounds like a really cool idea. Do you mean like you could write and plug in your own scoring code? I'm not really sure of the best way to set it up so you could write your scoring class and pass it to RoboRunner, but from a technical standpoint I don't think it would be too tough.
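
          If we did go the plugin route, I'm imagining a tiny interface you implement and hand to RoboRunner, roughly like this (none of these types exist yet, it's just a sketch):

            import java.util.List;

            // Rough sketch of a pluggable scorer -- not an existing RoboRunner API.
            // All of these types are placeholders for illustration.

            // Minimal view of one battle's results that a custom scorer would see.
            interface BattleResult {
                List<RobotScore> getRobotScores();
            }

            interface RobotScore {
                String getBotName();
                double getScore();          // Robocode total score
                double getBulletDamage();
                int getSurvivalRounds();
            }

            // The piece you'd implement and pass to RoboRunner.
            public interface CustomScorer {
                double score(BattleResult result);
            }

            // Example: only count score vs the sample.* bots (Wompi's "how much do I
            // lose to the weaker bots" idea). Bot names are just examples.
            class SamplesOnlyScorer implements CustomScorer {
                public double score(BattleResult result) {
                    double mine = 0, samples = 0;
                    for (RobotScore rs : result.getRobotScores()) {
                        if (rs.getBotName().startsWith("sample.")) {
                            samples += rs.getScore();
                        } else if (rs.getBotName().startsWith("voidious.Diamond")) {
                            mine += rs.getScore();
                        }
                    }
                    return (mine + samples == 0) ? 0 : 100.0 * mine / (mine + samples);
                }
            }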

          I was just thinking yesterday that it would be cool to integrate some of the stuff Rednaxela did here for collecting hit rates and other data during a battle, too.
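
          Just to illustrate the kind of in-battle data I mean (a plain bot-side sketch, not how Rednaxela's setup actually works), a bot can already count its own hits and misses with the standard Robocode events:

            import robocode.AdvancedRobot;
            import robocode.BulletHitEvent;
            import robocode.BulletMissedEvent;
            import robocode.ScannedRobotEvent;

            // Sketch: a minimal bot that counts bullet hits/misses and prints its raw hit rate.
            public class HitRateTracker extends AdvancedRobot {
                private int hits = 0;
                private int misses = 0;

                public void run() {
                    while (true) {
                        turnGunRight(360);  // keep sweeping the gun so we keep scanning
                    }
                }

                public void onScannedRobot(ScannedRobotEvent e) {
                    fire(2);  // head-on fire, just so there are bullets to track
                }

                public void onBulletHit(BulletHitEvent e) {
                    hits++;
                    printHitRate();
                }

                public void onBulletMissed(BulletMissedEvent e) {
                    misses++;
                    printHitRate();
                }

                private void printHitRate() {
                    int shots = hits + misses;
                    out.printf("hit rate: %.1f%% (%d / %d)%n", 100.0 * hits / shots, hits, shots);
                }
            }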

            Voidious 14:34, 28 July 2012