Genetic tuning


WaveSim looks great for genetically tuning your gun classification, and that would probably be the biggest improvement to Combat.

But what about movement and energy management? I wonder how people here tune them.

MN 03:43, 16 January 2013

I've never done any genetic tuning outside of WaveSim for gun classification, but I've thought about it. At first it seemed like I'd need to be able to modify some other Java code from Java code, package it into a Robocode bot, copy that bot into my RoboRunner robots dir, and then run battles. That always seemed really cumbersome to me.

But I think a much simpler approach is to have just one packaged bot that you always run: it reads the DNA string from a file and interprets it however you like, and then your genetic algorithm code just modifies that input file and runs more battles. This is the approach I'd take. Once it's done, you can convert the result to real hard-coded behavior if you like; especially if you're tuning a MiniBot (or smaller), you'd probably want to remove the file-reading code.
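Just to make that concrete, a rough sketch of the bot side (the bot name, file name, and which parameters the genes map to are all made up; the data-file approach works because Robocode lets a bot read files from its own data directory via getDataFile()):

    import robocode.AdvancedRobot;

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    // One packaged bot that never changes; only its data file does.
    public class GeneticBot extends AdvancedRobot {

        // Hypothetical tunable parameters, with safe defaults.
        private double wallMargin = 120;
        private double preferredDistance = 450;

        public void run() {
            loadDna();
            while (true) {
                // ... movement and gun code that uses wallMargin, preferredDistance, etc.
                execute();
            }
        }

        // dna.txt lives in the bot's data directory and is rewritten by the
        // external GA code between battles; here it's just comma-separated doubles.
        private void loadDna() {
            try (BufferedReader in = new BufferedReader(
                    new FileReader(getDataFile("dna.txt")))) {
                String[] genes = in.readLine().trim().split(",");
                wallMargin = Double.parseDouble(genes[0]);
                preferredDistance = Double.parseDouble(genes[1]);
            } catch (IOException | RuntimeException e) {
                // Missing or malformed file: just keep the defaults above.
            }
        }
    }

The GA side then only has to overwrite dna.txt and kick off the next batch of battles.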

Voidious 03:49, 16 January 2013


 

Unfortunately, genetic tuning is very long and tedious if you do it with a whole robot. Genetic algorithms also happen to cheat, so you need a good sample size to test against. For example, I once genetically evolved a robot that could perfectly defeat Walls from a certain starting position, but nowhere else.

Chase 19:29, 18 January 2013
 

That's pretty cool you tried it with overall robot behavior. That's definitely on another level in terms of complexity. I think that's usually the distinction between "Genetic Algorithms" (your code interprets the "DNA" string and alters its behavior in predefined ways) and "Genetic Programming" (your "DNA" string is translated into real code that itself is run as the bot's behavior).
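In code terms the difference is roughly this (a toy illustration, nothing Robocode-specific, all names made up):

    public class DnaStyles {

        // "Genetic Algorithms" in the sense above: the code is fixed and the DNA
        // string only supplies numbers that predefined behavior already uses.
        static double[] interpretAsParameters(String dna) {
            String[] genes = dna.split(",");
            double[] params = new double[genes.length];
            for (int i = 0; i < genes.length; i++) {
                params[i] = Double.parseDouble(genes[i]);
            }
            return params; // e.g. params[0] = wall margin, params[1] = preferred distance
        }

        // "Genetic Programming" in the sense above: the DNA encodes the behavior
        // itself. Here each character is a tiny instruction executed directly.
        static double runAsProgram(String dna, double input) {
            double acc = input;
            for (char op : dna.toCharArray()) {
                switch (op) {
                    case '+': acc += 1; break;
                    case '-': acc -= 1; break;
                    case '*': acc *= 2; break;
                    default: break; // ignore unknown genes
                }
            }
            return acc;
        }
    }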

Voidious 23:11, 18 January 2013
 

It was the latter. It took about 100 generations (each generation with 1000 randomly mutated samples). I didn't take into account how genetic evolution actually behaves; I thought it would evolve the way we write robots, you know: a radar, some basic back-and-forth movement. Nope, everything spun. It just happened to stumble upon a perfect speed and turning ratio to move from its start location (to dodge all of Walls' shots) and a perfect speed to spin its gun and fire to hit Walls.

But this was from specific starting positions and angles, which I had programmed into the battle to remove all the extra 'noise' so I could run shorter battles and still get a good sample. Since neither Walls nor the robot's genetics used any randomness, I figured it was safe.

After all that work, I decided not to pursue it further, since it was long and boring, and I could write a better robot in 5 minutes.

Chase 07:10, 19 January 2013

Preserving the 'noise' is useful in these situations to avoid over-fitting, although using only Walls in the fitness function will still produce a bot which is only good against it.
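Something along these lines for the fitness function keeps the GA from latching onto one opponent or one fixed setup (a rough sketch; scoreAgainst() is just a placeholder for whatever actually launches the battles, e.g. RoboRunner, and reads back the score):

    import java.util.Arrays;
    import java.util.List;
    import java.util.Random;

    public class Fitness {

        // A handful of reference bots instead of only Walls.
        static final List<String> REFERENCE_BOTS = Arrays.asList(
                "sample.Walls", "sample.SpinBot", "sample.Crazy", "sample.TrackFire");

        // Average score over several opponents and several random starting
        // positions, so a candidate can't win by overfitting one scenario.
        static double fitness(String dna, int battlesPerBot) {
            Random rng = new Random();
            double total = 0;
            int n = 0;
            for (String enemy : REFERENCE_BOTS) {
                for (int i = 0; i < battlesPerBot; i++) {
                    total += scoreAgainst(dna, enemy, rng.nextLong());
                    n++;
                }
            }
            return total / n;
        }

        // Placeholder: write the DNA string to the bot's data file, run one battle
        // with the given seed (random starting positions), return the score share.
        static double scoreAgainst(String dna, String enemy, long seed) {
            throw new UnsupportedOperationException("wire this up to your battle runner");
        }
    }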

MN 14:23, 19 January 2013