Talk:RetroGirl/Gun

Awesome concept - and probably one of the better uses for a high-speed gun testing scheme as well. I see you've given values for HOT, your static decision tree and Diamond's gun, but how about a few other common ones like circular, linear, random-linear, etc.? I'd be curious to see how DrussGT's 'fire linear before there are waves' technique holds up. --Skilgannon 05:33, 16 March 2011 (UTC)

Hey thanks! =) It does leverage WaveSim quite well - a height=9 tree goes through 1000 battles in ~30 seconds on my laptop (both cores, loading data from a RAM disk). Yes, I've also been quite curious about those comparisons... Unfortunately, WaveSim is so "wave"-oriented that I don't even have enough data in my current data sets to do linear targeting (LT) or circular targeting (CT) with bounds checking (I'd need to store enemy direction or absolute heading), and I hadn't gotten around to butchering the battle simulation to let me deduce it on the fly. I just did and got:

  • 16.06 with LT from WaveSurfingChallengeBotB
  • 16.3 with a randomized LT, multiplier = (1 - (Math.random() * Math.random()))
  • 16.71 with CT from WaveSurfingChallengeBotC
  • 16.48 with a randomized CT, multiplier = (1 - (Math.random() * Math.random()))
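
To illustrate the randomized variants above: the idea is to compute the full linear-targeting lead angle and scale it by (1 - Math.random() * Math.random()), which keeps most shots near the full linear angle while sometimes firing short of it. This is a hedged sketch with illustrative names, not the actual test code; the bullet speed formula (20 - 3 * power) is the standard Robocode rule.

```java
import java.util.Random;

public class RandomizedLinear {
    static final Random RNG = new Random();

    // Classic linear-targeting offset: lateralVelocity / bulletVelocity is
    // the sine of the lead angle; asin converts it to radians.
    static double linearOffset(double lateralVelocity, double bulletPower) {
        double bulletVelocity = 20 - 3 * bulletPower; // Robocode bullet speed rule
        return Math.asin(lateralVelocity / bulletVelocity);
    }

    // Randomized variant: scale the full linear offset by (1 - r1*r2),
    // biasing toward the full lead but occasionally undershooting.
    static double randomizedOffset(double lateralVelocity, double bulletPower) {
        double multiplier = 1 - (RNG.nextDouble() * RNG.nextDouble());
        return multiplier * linearOffset(lateralVelocity, bulletPower);
    }
}
```

The same multiplier applied to a circular-targeting prediction gives the randomized CT variant.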

So I'm only doing slightly better than CT right now, which I don't find too surprising. Hopefully I can still improve it a bit further. I'm also going to test using the tree for the first 1/2/3/... shots before switching to Diamond's main gun, to find out what's optimal there. So far, using it for up to 5 shots isn't looking too promising, though some of those results may just be unlucky runs. --Voidious 07:02, 16 March 2011 (UTC)

Will probably rename this page to be about perceptual / non-learning guns in general, but anyway. My new perceptual gun is more like KNN - 500 data points, each with an associated GuessFactor, and to aim it just uses the GuessFactor of the point nearest to the current situation. This seems more versatile, more similar to the current best aiming strategies, and I think it also lends itself to GA a lot better. When the bits are interpreted as a decision tree, changes higher up the tree cause more cascade effects; with this setup it should be able to tweak a lot further, since changes won't cause disruptions elsewhere.
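
The nearest-point lookup described above can be sketched roughly like this - 500 stored situations, each tagged with a GuessFactor, and aiming returns the GuessFactor of the single closest point. The attribute layout and distance metric here are assumptions for illustration, not Diamond's actual code.

```java
public class NearestPointGun {
    double[][] points;       // points[i] = normalized situation attributes
    double[] guessFactors;   // guessFactors[i] = GF associated with point i

    NearestPointGun(double[][] points, double[] guessFactors) {
        this.points = points;
        this.guessFactors = guessFactors;
    }

    // Squared Euclidean distance; any monotone metric works for 1-NN.
    static double dist2(double[] a, double[] b) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            sum += d * d;
        }
        return sum;
    }

    // Return the GuessFactor of the nearest stored point (1-NN).
    double aim(double[] situation) {
        int best = 0;
        double bestDist = Double.POSITIVE_INFINITY;
        for (int i = 0; i < points.length; i++) {
            double d = dist2(points[i], situation);
            if (d < bestDist) {
                bestDist = d;
                best = i;
            }
        }
        return guessFactors[best];
    }
}
```

A GA can then evolve the point coordinates and GuessFactors directly, since moving one point only affects situations in its own neighborhood.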

Only tried 200 and 500 data points so far. I got the 500-point version up to 17.636. It was still going up steadily but slowly, but I figured I'd just cut it off, try it in Diamond, and then try more points and/or more attributes. I may use accel and time since velocity changed, which aren't really perceptual, but that's permissible in Diamond. ;) --Voidious 01:57, 25 March 2011 (UTC)

I wonder... have you considered algorithms besides GA to generate the 500 data points? It seems to me that clustering all the WaveSim data with 500-means clustering would be a rather ideal way to generate them. Maybe slow to process, but I think it would reach a strong solution quicker than GA. Interestingly, googling "genetic algorithm clustering" shows that some people are applying GA techniques to clustering, though I imagine it takes quite a bit of tuning to start to match the performance of k-means. --Rednaxela 03:51, 25 March 2011 (UTC)
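
The k-means idea suggested above would look roughly like this: run Lloyd's algorithm over the logged wave situations to get k centroids, then use each centroid (paired with, say, the mean GuessFactor of its cluster) as one stored data point. This is a minimal sketch with a simple deterministic initialization, not tuned code; the data layout is an assumption.

```java
public class KMeans {
    // Lloyd's algorithm: alternate assignment and centroid-update steps.
    static double[][] cluster(double[][] data, int k, int iters) {
        int dim = data[0].length;
        double[][] centroids = new double[k][];
        // Deterministic init: pick evenly spaced points from the data set.
        for (int i = 0; i < k; i++)
            centroids[i] = data[i * data.length / k].clone();
        int[] assign = new int[data.length];
        for (int it = 0; it < iters; it++) {
            // Assignment step: each point goes to its nearest centroid.
            for (int p = 0; p < data.length; p++) {
                double best = Double.POSITIVE_INFINITY;
                for (int c = 0; c < k; c++) {
                    double d = dist2(data[p], centroids[c]);
                    if (d < best) { best = d; assign[p] = c; }
                }
            }
            // Update step: each centroid becomes the mean of its points.
            double[][] sums = new double[k][dim];
            int[] counts = new int[k];
            for (int p = 0; p < data.length; p++) {
                counts[assign[p]]++;
                for (int j = 0; j < dim; j++) sums[assign[p]][j] += data[p][j];
            }
            for (int c = 0; c < k; c++)
                if (counts[c] > 0)
                    for (int j = 0; j < dim; j++)
                        centroids[c][j] = sums[c][j] / counts[c];
        }
        return centroids;
    }

    static double dist2(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) { double d = a[i] - b[i]; s += d * d; }
        return s;
    }
}
```

With k=500 this would place the stored points where the wave data is densest, which is the intuition behind preferring it over a GA for the initial layout.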
