User talk:Pedersen/Ugluk

I enjoy your enthusiasm. <removed obsolete comments> If you want a good indication of Ugluk's melee strength, you might want to start with some classic well-rounded competitors: gg.Wolverine, intruder.PrairieWolf, emp.Yngwie and tzu.TheArtOfWar come to mind. Cheers. --Corbos

I was running Ugluk 0.4.0 against Stampede2 1.1.0 and I noticed that when Ugluk is entered second into the battle, it doesn't fire at all for the first 2-3 rounds. This happens consistently, but only against Stampede2 (as far as I know). I ran Ugluk against several other bots with the same setup but could not recreate the results. Anyway, just letting you know about a potential bug. EDIT: I ran a few more tests; it looks like it has something to do with the fact that Stampede2 only moves when fired at. --wcsv

I noticed some odd reluctance to fire as well, and eventually I tracked it down to a line that basically says if the expected hit percentage is too low, don't take the shot. After commenting out that line Ugluk was firing once again. Last night I uncommented the same code and Ugluk is still firing, but at a better hit rate. I recently added energy management statistics and overhauled the internal movement prediction engine, but at some point I dropped from second place against the Freya line of bots (Freya in 1st) to dead last. I am trapped in a cycle of playing with movement combinations and other settings. Also known as polishing a turd. -- Martin Alan Pedersen

I found the bug causing Ugluk not to fire against a non-moving target. Three factors went into it. I was dividing a long by a double of the same value when I had a 100% virtual bullet success rate, which evidently was the root of the issue. Also, against a sitting duck there are only two guns of mine that won't hit 100% of the time: the tangential oscillation and mirror guns. In 0.4.0 those guns were disabled to reduce computation time. If they hadn't been, one would have fired at some point and made you move, which would have snowballed into all guns being fully operational (and rather over-confident). I've a few more bugs to ferret out before I release 0.5 though. -- Martin Alan Pedersen

While refreshing my memory of the virtual bullet engine, recently refactored to better complement the wave system I'll use for guess factor targeting, I noticed a nasty redundancy that I thought had been cleaned up in an earlier fix. The short of it is that my targeting statistics and wave surfing data were garbage. With a quick repair I turned Ugluk's performance around and made him the star performer I knew he should be. While 0.4.3 was trailing about 100 ranking points compared to 0.4.0, 0.4.4 is leaps ahead. --Martin Alan Pedersen

Found another bug, this time in my linear and circular mean targeting methods. Maybe they won't suck so much now. I overhauled my firepower selection mechanism to shoot at the best product of hit percentage and damage. Should make for higher powered shots (but not much higher) in the opening ticks of melee battles where things are packed together, and higher velocity shots as things thin out. Testing it now against the bvh.* crew. I've also earmarked my present virtual bullet system as 'virtual', and shots actually fired as 'actual'. My virtual bullets will feed my regular battery, which will fire actual bullets which will feed my fired bullet stats, which will feed my guess-factor gun (which will also be part of the virtual gun array). After a little while the guess factor gun may become favored over the other guns and take over. Then your bot will be a smouldering heap of rubble. --Martin Alan Pedersen
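
A minimal sketch of the "best product of hit percentage and damage" selection described above; estimateHitChance is a hypothetical stand-in for whatever statistics the bot actually keeps:

// hypothetical stand-in for whatever hit-percentage statistics the bot keeps
static double estimateHitChance(double power, double distance) {
    return Math.min(1.0, 200.0 / distance); // crude placeholder, not a real model
}

// try a range of powers and keep the one with the best (hit chance * damage) product
static double selectFirePower(double distance) {
    double bestPower = 0.1;
    double bestExpectedDamage = 0.0;
    for (double power = 0.1; power <= 3.0; power += 0.1) {
        // Robocode bullet damage: 4 * power, plus 2 * (power - 1) for power above 1
        double damage = 4 * power + Math.max(0, 2 * (power - 1));
        double expected = estimateHitChance(power, distance) * damage;
        if (expected > bestExpectedDamage) {
            bestExpectedDamage = expected;
            bestPower = power;
        }
    }
    return bestPower;
}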

It was nice to see Ugluk taking first place in melee battles in the Rumble for a change, though I'd seen many 2nd place finishes before. 2nd place is 1st place for losers. --Martin Alan Pedersen

Freya - Deity or no, I shall eventually crush you under my heel. Thus far you have eluded me, but there are only so many blind spots to hide in. As the veils of my ignorance are lifted, your destiny may be more clearly seen.
Not before the "Götterdämmerung" if I have it my way. So I will take up the challenge. I guess I have at least 20 'blind spots' left in melee to hide in. ;) --Loki

I haven't been actively coding lately, having purchased a new machine that took about 4 days to get running. I'm also low on ideas of how to improve Ugluk's performance. I made a significant leap with Ugluk's movement but the guess-factor-like gun has not been the gun of choice against any opponents thus far. I think my next step is to make the bin divisions parameter driven so that I can have multiple guns and eventually dynamically segment (or implement some other catch-phrase). -- Martin Alan Pedersen

Well, I made an innocuous adjustment in melee and it went awry, but my dueling tweaks were an astounding success (v0.6.1). -- Martin Alan Pedersen

Melee is hard to grasp; small improvements tend to make you lose 10 places. One-on-one is a lot easier to improve. By the way, what do you mean by 'Domination'? -- GrubbmGait

Well, each rumble has a different number of opponents, so my getting 38th in melee looks a lot better than 123rd in duels, though they are about the same in terms of what percentage of the competition I am doing better than, which is my 'domination' figure. My best melee bot (so far) can beat out 75% of the melee rumble entries, and my best one-on-one bot can beat out 69% of the opponents. When I hit 100%, it means I've finally beaten Ascendant (or whoever beat me to it). So.. it is just my way of putting Ugluk's ranking into perspective. -- Martin Alan Pedersen
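
For reference, the 'domination' figure described above boils down to the fraction of the field a bot finishes ahead of, which is what makes ranks from different-sized rumbles comparable:

// fraction of the other entrants that a bot at this rank finishes ahead of
static double domination(int rank, int totalEntrants) {
    return (double) (totalEntrants - rank) / (totalEntrants - 1);
}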

Ugluk 0.6.6 simply does nothing in most matches on my system... So every result comes back with Ugluk losing 6,000-something to 0 (which I think just discounts the match completely because of the zero). No errors printed to the console or anything. I'm using Java 1.5 to run Robocode, FYI. (I actually just noticed a match vs TheBrainPi where it didn't get 0, but that was the first I noticed that wasn't.) -- Voidious

Part of that is me being mean. He won't even try if the battle is not 35 rounds long. It hampers exhaustive tests against him, while still allowing him to compete in the rumbles. Evidently I failed to test my recent modifications to the kill switch implementation. Doh. Ah well. It was pretty funny to barely beat SittingDuck. -- Martin Alan Pedersen

Hm.. I removed 3 guns that I didn't think were all that necessary .. and it turns out I was wrong... good thing I make regular backups of my code. -- Martin Alan Pedersen

Re: "me being mean..." Martin, if you ever do make it to #1 in these rankings, I'm going to dub you "the Bill Gates of Robocode". :) -- Voidious

A small price to pay.
I fixed some errors and added the mostly useless pattern matcher. -- Martin Alan Pedersen

"Momentum = 1.9273471707492718E-13" .. I'd say that's pretty stable. -- Martin

Congratulations for breaching the 1700-barrier, next goal 1800?? or maybe 1700 in melee?? -- GrubbmGait

Thanks. I'm looking to get in the top 20% of melee, which would be ~1666. Not sure how it's gonna happen though. I still only rarely beat Freya 0.31 in my test melee battles. -- Martin

Martin, I always get a NullPointerException in "setPerformanceStatisticsTracking" with Ugluk 0.7.2 (in melee). But as it is currently ranked 45 in melee, it has to do with my set-up (Java 1.5.0) and not with Ugluk. Any ideas? --Loki

Heh .. just me being a jerk again, though I wasn't being as clean about it. I've fixed the code so Ugluk will just sit there doing nothing and not throw the exception. Ugluk won't participate in battles that aren't 35 rounds long! He's sneaky that way. -- Martin

  • mmmm, so besides a fighting strategy you have also included a strategy to hinder me from developing new ideas and testing them... ;) Well, now I understand why in my normal testing (50 runs of a 35-round battle) I got normal scores for Ugluk, but when I ran some 1000-round battles today with Robocode to watch my new movement I saw a rather unconvincing 'sitting duck'. --Loki
    • Well, in terms of testing against someone you will be competing with in the Rumble, a 1000-round battle gives you tainted information. My guns have more data to work with and will be firing more accurately, as will yours (though I've found yours are better than mine in 1000-round battles), so the scores may be more stable but don't reflect scores of Rumble battles. I set my battles-per-bot to 1200 so I'll fight all my opponents three times (on average) before I mark down a 'final' rating. Three fights leaves a lot of wiggle room.

I guess the Roborumble isn't everything though. I should be flattered that you want to secure dominance over Ugluk. Then again, I've got the same ideas about your bots... -- Martin

I refactored my bearing offset gun segmentation to represent the segments as objects instead of a multidimensional array. This was a brain-draining undertaking, but it should help the cleanliness of trying out new segments. I'm submitting v0.7.3 as a baseline for the changes, and v0.7.4 will likely have new segments that show promise. -- Martin
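
A rough sketch of what "segments as objects instead of a multidimensional array" can look like; the class and field names are illustrative, not Ugluk's actual code:

import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only: each Segment object knows how to slice one aspect of
// the firing situation, and bin arrays are looked up by the combined slice key.
class Segment {
    final String name;
    final int slices;
    Segment(String name, int slices) { this.name = name; this.slices = slices; }

    // map a raw value in [min, max] onto one of this segment's slices
    int slice(double value, double min, double max) {
        int s = (int) ((value - min) / (max - min) * slices);
        return Math.max(0, Math.min(slices - 1, s));
    }
}

class SegmentedStats {
    private final Map<String, double[]> binsByKey = new HashMap<String, double[]>();

    double[] binsFor(String combinedKey, int binCount) {
        double[] bins = binsByKey.get(combinedKey);
        if (bins == null) {
            bins = new double[binCount];
            binsByKey.put(combinedKey, bins);
        }
        return bins;
    }
}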

Hey, welcome back from your brief absence. By the way, I'm honored that you find my bots to be a worthy challenge. --wcsv

I remember getting Stampede2 from the repository when I was just getting started with Robocode, and I didn't realize how strong it was because I was running melee battles. When I decided to work on duels I did some 25-round tests against Stampede2 and Chomsky. Once in a while I'd get one or two wins against Chomsky, but Stampede2 shut me out every time. I can beat Stampede 1.3.3 now, but your other two bots are leaving welts on my hindside. (I can also nearly beat Chomsky 1.5, but Chalk is out of my league for now.) - Martin

Ya know, I had a dream last night where I saw some kind of sign post labeled "Ugluk"... And I thought to myself, in the dream, "Oh, that's weird! That's the same name as Martin's tank." Maybe it's time to spend less time on this wiki... ;) -- Voidious

Added a screenshot from World of Warcraft. Now you can dream of Ugluk himself. -- Martin

Fighting battle 21 ... pedersen.Ugluk 0.7.5,kawigi.sbf.FloodHT 0.9.2
RESULT = pedersen.Ugluk 0.7.5 wins 2623 to 2518

Cool. I grabbed this version and ran a few battles with it. I'm pretty impressed; 0.7.5 seems to be much better than previous versions in duels. FloodHT is a really strong bot, keep up the good work! --wcsv

Thanks. I introduced two new segments to my bearing offset gun with 0.7.4 but nothing happened. With 0.7.5 I added some decay to the movement stats (introduced with Butterfly 2.0) and a new wave surfing movement that waits until the last moment to go to the bearing offset, rather than going immediately and waiting for the wave. Evidently it works. I also did some tuning against the lowest ranked bots in the rumble to squeeze the most points I could out of them, but I doubt that was a significant portion of the boost to Ugluk's rating. -- Martin

Top 100 with over 500 battles! Congratulations! --wcsv

Welcome to the RobocodeHigh School by entering the top-100! You sneaked up on me while I was busy with melee, no hard feelings for that, and I intend to improve GrypRepetyf shortly so be warned. You can easily compare two versions of Ugluk to see the impact of your changes. Click http://rumble.fervir.com/rumble/RatingDetailsComparison?game=roborumble&name1=pedersen.Ugluk%200.7.5&name2=pedersen.Ugluk%200.7.4 to see the difference between 0.7.5 and 0.7.4. -- GrubbmGait

That link (modified) will come in handy soon. I just placed two 'new' versions of Ugluk in the Rumble. Ugluk v0.7.5a has some additional tuning against below-average bots. Ugluk v0.7.5x has all opponent-specific tuning turned off. It is 60% of the size of the tuned version. This comparison should tell me two things: how much my rating is improved / inflated by the tuning, and what tuning is obsolete due to advancements in the bot since I started. -- Martin

So far the results are unsettling. The ratings are within 5 points of one another after over 300 battles each. I'll see if there is any information to glean from the comparison of 1200+ battles each, but so far it looks like I need to retest each of the opponents (unless I am already crushing them) with each advancement in movement or targeting. Ah well. -- Martin

Ultimately, I don't know if a bot-specific approach is scalable. Better to work your ass off figuring out a generalized strategy. I noticed your 'profile' classes and wonder if it's worth it. Bot-specific logic is rigid. Seems like the holy grail is ultimate flexibility. --Corbos

If by scalable you mean that as time marches on the bot can adapt .. well .. if I'm not paying attention I don't much care how well I am doing. Also, advances in movement and targeting over time mean your bot is going to fall by the wayside if you aren't developing it (eventually).
My approach to the tuning takes manual repetition of running battles, adjusting, and running more battles. For each opponent. It is time-intensive, and the Rumble rating gap is telling me it ain't fricken worth the trouble. I am guessing that my primary boost came from the dynamic elimination of inferior movement techniques. The more adaptive Ugluk gets, the less of a need for hard-coding.
That said, I still plan to revisit pre-loaded enemy data when I implement a crib sheet or something like it that I can use to tune a statistical gun with x rounds of test data. -- Martin

I'm sure the data is valuable. Still, hard-coding == bad ju-ju. ;) --Corbos

Did some crazy wide-scale seat-of-the-pants refactoring to allow my guns to target Ugluk. I also added a movement method intended to dodge all incoming virtual bullets fired with those guns. It is not very mature, but my score climbed back up to the enemy profile version's, though profiles are presently not used. I will probably leave them disabled until I get a real statistical wave surfing movement, rather than just the two styles of random offset go-to's I have now. Right now it feels like I'm doing everything I can to put off my next overhaul of my statistical guns, though the recent changes are groundwork for it. -- Martin

I've been making tweaks to my melee movement and nothing is having a tangible impact on my rating. It is quite frustrating. -- Martin

I know the feeling. The one-on-one changes to Gruwel should not have an impact on its melee performance, but it dropped from 29 (0.2) to 40 and I never got it back to that level (yet). -- GrubbmGait

I'm gonna be working on a redesign of my bearing offset gun for a few more days. I am designing it to allow the segmentation granularity to be adjustable as the population increases. At this point I do not know how I'm going to decide when to make that adjustment. I'll probably have to do some measurements akin to those discussed on the entropy page. -- Martin

I've introduced some nifty stuff in terms of my bearing offset gun with v0.8.0. I am hoping it can tackle some of the trickier movement algorithms out there, and maybe catch some napping surfers. Time will tell... -- Martin

Whoah, some weird stuff going on with Ugluk's ranking... He's "at 1990", but underperforming by 10+ points against everyone in his list but 1. (That just doesn't make sense.) Any idea what's up? -- Voidious

I'm more curious than anything, as I know it will stabilize. Maybe the first match was vs FlamingKombat, and skyrocketed the ranking immediately? -- Voidious

First two matches were Shiva and micro.Freya, uploaded with a 2012 rating, both 20 points in the red. I dunno how the rating thing makes the original ratings, but it's pretty nuts in the beginning. -- Martin

Weird indeed! I thought it started you at 1600 and adjusted from there? *shrug* -- Voidious

New bots should start with 1600, updated bots should start with the rating of its previous version. Alas there is some flaw somewhere, so Ugluk started with the rating of the old Pugilist or Dookious or so. If the momentum is high ( > 15) positive or negative, the rating has not stabilized yet. If you have a momentum of -300, your next battle will let your rating drop by 3 points! -- GrubbmGait

By the way, twice throughout the night, Ugluk crashed with an "Out of Memory" error, and I had to restart my RR@Home client. -- Voidious

I also saw this error last night, but I couldn't get it to happen again so I ignored it. --wcsv

Hm.. sorry about that. He's started logging more wave data, but in tests he didn't start having heap errors until around 200 battles. I have my batch files allowing 512 megs instead of 256, so maybe that's why I haven't seen it crash any of my Roborumble machines. I'll see what I can do about managing the memory better .. -Martin

It turns out I was logging all waves instead of just the non-virtual ones, so that's about 100x as many as I was expecting. 0.8.1 will address this problem (after some more tuning). -- Martin

Hmm, sorry to nitpick, but... Isn't it a max of 16 ticks between bullets at maximum fire rate? =) -- Voidious

A while ago I reworked my virtual bullets and waves so that all bullets (virtual or not) are just a firing angle attached to a wave. This is really smooth for non-virtual waves because the firing solution that is selected for the shot becomes the wave and then all targeting systems are repolled with an exact bullet velocity constraint, each targeting system returns one (or no) firing angle that it would use, and all of those angles (and a reference to the source targeting system) are attached to the wave. With virtual bullets each targeting system can have tens of firing solutions each round, none of which are coordinated with one another (sharing a wave). Typically I've got 9 targeting systems, so let's say each produces 25 firing solutions for 16 ticks. That's 3600 waves with one firing angle attached. One of those is selected (in tick 16) as the primary and flagged as non-virtual. So really I've reduced my storage to 1/3600th, not 1/100th, but I didn't feel like doing the math at the time. It's an area I could tidy up a bit, but I don't think it will affect my rating. -- Martin
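
A minimal sketch of that scheme (illustrative names, not Ugluk's actual classes): one wave per shot, with each targeting system attaching the single firing angle it would have used:

import java.util.HashMap;
import java.util.Map;

// One wave per shot; every targeting system attaches the single firing angle it
// would have used, keyed by the system's name. The real shot is flagged non-virtual.
class FiringWave {
    final double originX, originY;  // where the shot left the gun
    final double bulletVelocity;    // 20 - 3 * power
    final long fireTime;
    final boolean virtual;          // false for the bullet actually fired
    final Map<String, Double> firingAngles = new HashMap<String, Double>();

    FiringWave(double x, double y, double power, long time, boolean virtual) {
        originX = x;
        originY = y;
        bulletVelocity = 20 - 3 * power;
        fireTime = time;
        this.virtual = virtual;
    }

    void attachAngle(String targetingSystem, double absoluteFiringAngle) {
        firingAngles.put(targetingSystem, absoluteFiringAngle);
    }

    double radius(long now) {
        return (now - fireTime) * bulletVelocity;  // how far the wave has travelled
    }
}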

Ah, I see what you mean now. Yeah, I use a similar integration of Waves and VirtualGuns, except I've only got 2 guns each with 1 firing solution attached to each wave ;) -- Voidious

Finally.. I win! http://home.comcast.net/~kokyunage/robocode/asif.jpg

Well, I screwed something up with 0.8.2. I removed the breakdown of hit percentages by bullet flight time and started playing with rolling averages. Oh, and I revamped the targeting engine. And altered the functionality of two guns. Hmm.. well, good thing I make regular backups... -- Martin

With 0.8.2 I revamped the targeting engine, affected the stats feeding my two Mean guns, did some tweaking to my latest movement, and eliminated all virtual wave creation / processing. (I still have waves based on actual shots taken.) It cost me 8 points, but I lost them really really fast. I'm sure the point loss is a combination of the tweaked guns and movement, complicated by hard-coded enemy profiles. The main reason for the rating plummet with 0.8.2 was that I'd eliminated the timeToTarget range aspect of my statistics, lumping them all together. It was a bad move, and one I had to manually restore from a backup.
I'm presently tuning a new movement for melee and I need to debug / improve the implementation of my top secret (until it works) organically segmented bearing offset gun (which paid an unsatisfying 10 rating points upon introduction). -- Martin

Have you done something special to upload only single results in melee? I always have, and still am, uploading the double results. This means I generate 18 results per bot per battle. I assume you are running melee battles today, as Ugluk is the last updated melee bot. -- GrubbmGait

Yeah, whatever version I have of the RoboRumble@Home is doing what I suggested a while back .. only reporting battles of the present bot and the bots it has beaten, so the first bot reports 9, the second 8, and so on. #10 doesn't report anything. I do not know if it is something I downloaded or if I made the change in the source code myself. I can zip up the compiled version (I think I subsequently doinked the source code with further tweaks) and make it available if you like.
On an unrelated note, my scores really took a nose dive even though I tested a lot against GrubbmGrb to get some kinks out of my gun (and enhance it). I did fine against GrubbmGrb in the Rumble, but something went awry elsewhere. I suspect I've messed up my hit statistics, making them always favor my new gun, when simple targeters work better against other opponents. -- Martin

Evidently Ugluk has gone Zoolander .. he cannot fire left. He can aim left, but won't fire. Shouldn't be too hard to track down .. though it has taken me hours of other testing to notice it. -- Martin
Follow-up: I once again fell victim to the comparison of two angles when one is always positive (turret heading) and one is relative to another heading. My position.getBearing( position ) function returns a value in [-pi, pi]. Bah. -- Martin
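
The usual fix for that class of bug is to normalize the difference of the two angles into the -pi..pi range before comparing; something along these lines (gunIsLinedUp is a hypothetical helper):

// compare an absolute gun heading (0..2*PI) with an absolute target angle by
// normalizing their difference first; targets to the left then compare correctly
static boolean gunIsLinedUp(double gunHeadingRadians, double targetAngleRadians, double tolerance) {
    double offset = robocode.util.Utils.normalRelativeAngle(targetAngleRadians - gunHeadingRadians);
    return Math.abs(offset) < tolerance;
}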

While Ugluk's rating is not stellar, and is a few points shy of his highest, he is presently (v0.8.5) doing it all with 1 gun and 3 movement options. His melee movements are different, but he uses the same single gun (which isn't working out for melee). There are also no enemy profiles active. I'm surprised by the relative success so far. -- Martin

That's really cool; nice work. As I've continued to do Robocode, I've found that the KISS principle really can improve performance significantly in addition to its other benefits (cleaner code, easier testing, more consistent performance). I think sometimes a more complex solution ends up getting in the way of your simpler solutions as much as it is helping to augment them in other areas. -- Voidious

Ugluk has been floundering somewhat lately, so I am going to focus on getting him to a satisfactory level of achievement in duels, then leave duels behind for a while and focus on the highly disturbing game of the free-for-all. -- Martin

I expect v0.8.8 to have a low initial rating but sharp improvement over time, depending on how many different machines are processing him. If I were running all of the battles from one machine I'm pretty sure he'd reach 1800. Then again, recent changes to Ugluk have been disappointing (aside from losing really really fast now). -- Martin

What kind of stuff are you saving? I was under the impression that data-saving never yielded much more than about 10 points onto a rating. -- Curious Voidious

Well, when I was looking at data storage I was always looking at storing targeting information. Yet I realized when I was doing my tuning that the real gains in score ratio came from finding the movements that my opponents are weak against. So at present I'm just storing a list of identifiers for movements that suck against an opponent. Over time I eliminate underperforming movements, making Ugluk harder to hit, which buys more time to return fire, which increases my bullet damage and decreases theirs. The next release of Ugluk will begin with the persisted data of the former version, so the initial performance should get better over time. Basically I am doing similar performance tuning in an automated fashion. Rather than taking about 20 minutes a bot to test, code, and retest profiles, I can run the Rumble (or League, but I ran into the file i/o bug) to get similar results. Eventually I'll limit the gun selection through the same process, but for now I just went with my latest pimp sauce gun. All the discussion of virtual guns jinxed mine, and I haven't managed to figure out what is making the stats go haywire. -- Martin
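
A hedged sketch of that kind of per-opponent persistence using Robocode's data-file API; the file layout and names are illustrative guesses, not Ugluk's actual format:

import java.io.*;
import robocode.AdvancedRobot;
import robocode.RobocodeFileOutputStream;

// keep a list of movement identifiers that under-performed against this enemy,
// one small file per opponent in the robot's data directory
class MovementBlacklist {
    static void save(AdvancedRobot bot, String enemyName, java.util.List<String> badMovements) {
        try (PrintStream out = new PrintStream(
                new RobocodeFileOutputStream(bot.getDataFile(enemyName + ".dat")))) {
            for (String id : badMovements) {
                out.println(id);
            }
        } catch (IOException e) {
            bot.out.println("could not save movement data: " + e);
        }
    }

    static java.util.List<String> load(AdvancedRobot bot, String enemyName) {
        java.util.List<String> ids = new java.util.ArrayList<String>();
        File f = bot.getDataFile(enemyName + ".dat");
        if (f.length() == 0) return ids;   // nothing learned about this opponent yet
        try (BufferedReader in = new BufferedReader(new FileReader(f))) {
            String line;
            while ((line = in.readLine()) != null) {
                ids.add(line.trim());
            }
        } catch (IOException e) {
            bot.out.println("could not load movement data: " + e);
        }
        return ids;
    }
}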

Right on, I'll be interested to see how it turns out. I'm pretty sure you can use the "canoncaches=false" argument in your RoboLeague.bat to avoid the file i/o problem, if it's that security one that you're talking about. -- Voidious

Unfortunately I've seen entries disappearing out of the log, so something is going on to make Ugluk lose data. It needs some more testing. -- Martin

I've added some fault tolerance and reworked the file reading portion of my code (I'll update the subsection later). I spent a few hours ferreting out the bugs in the process, and I think this version should be pretty solid. Time will tell. I'm predicting this basic bot (with bug corrections if they are found) will be my 1800 mark breaker. I am going to have a few hard-coded profiles that I will turn on (all are presently off) before I 'go gold'. Ram bots take some special mojo to stay clear of, but I have a working formula. GrubbmGait's GrubbmThree still gives me a good fight though. -- Martin Alan Pedersen

As there are only three adequate RamBots around, they would not harm your ranking that much. I believe that if your close-range fighting is OK, you should be able to handle rambots also. If you can score 55% against rambots, I would not put any effort into it. There are more important issues to handle if you want to reach your goal. Every now and then using the KISS-principle can let you focus on the real problems instead of the unimportant details. See you soon on my side of the 1800 mark! -- GrubbmGait

CassiusClay does a couple of things when it's facing a rammer. Like evading steeply. And never evaluating the stop position in the WaveSurfing. And shortening the BlindMansStick in the WallSmoothing. Depending on the movement system, the counter measures will vary I guess. But one thing that's pretty general is to detect when you are facing a rammer. I keep track of the rolling average of the enemy bot's ApproachingVelocity. Like so:

enemyApproachVelocity = PUtils.rollingAvg(enemyApproachVelocity, enemyVelocity * -Math.cos(e.getHeadingRadians() - enemyAbsoluteBearing), Math.min(scans, 5000));

And I decide it is a rammer if this average goes above 4.5.

Of course, Grub is right about it not being a problem one should generally bother with. For me it is that I am quite emotional about my robots and I hate seeing them get abused. =)

-- PEZ
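
For reference, a rolling-average helper along those lines typically looks something like this (the exact PUtils.rollingAvg signature may differ):

// weight the existing average as 'depth' samples and fold in the new reading
static double rollingAvg(double oldValue, double newValue, double depth) {
    return (oldValue * depth + newValue) / (depth + 1);
}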

I have viewed a few battles in the past and CC has a very elegant way of evading rambots. CC (and Ali) are the only bots that score 70+% against GrubbmThree, but now I know how to counteract! I will set the max velocity to 4 and start ramming then! ;-) -- GrubbmGait

Hehe, please do. =) -- PEZ

Won't work for me - I keep track of the ratio of ticks that you are heading right at me vs not right at me to detect a rammer ;) And I just fire power 3 if you are a rammer, and that's enough to get Dookious over 60%. -- Voidious

Ooooh, now I get it! =) I should just not comment on anything until I've eaten lunch... -- Voidious

My virtual guns array is really performing poorly. As I have not found any outright bugs in it yet, coupled with the observation that the loss of hit percentages by bullet flight time really hurt the bot, I suspect that my removal of all virtual waves is to blame. It really sped up Ugluk to no longer create and process them, but they provided stats by range to fall back on when the real bullets had not fired at that range yet. At present I am relying on a bearing offset gun to kill all my (unprofiled) enemies, so I'm not killing nearly as fast as I could be. I think I can re-introduce a different implementation of virtual bullets / waves that I can turn off after the first round, if not sooner. -- Martin

After several releases Ugluk is finally pushing his rating past 1750 again. I've added weapon selection to the tuning system, which means Ugluk will get gradually faster, processing only 3 targeting systems instead of 10. v0.9.0 saw the return of virtual bullets, redesigned and no longer using the wave system (which was overdesigned since they don't share waves). I also abandoned rolling averages in my virtual gun hit stats, though it seems like it makes sense. Then again, if you want to ride a wave you have to be ahead of it, not chasing it. My approach to movement, using a variety of crappy systems rather than one good one, relies on three tuning systems: hard-coding (fixed), learning (between battles), and performance (mid-battle). My guns have the same tuning methods, but movement performance has a more dramatic impact on rating than gun performance. -- Martin

I don't know if you noticed (now you do), but Ugluk has 4 more wins than GrubbmGrb in the PL. You are going in the right direction, but still have 50 points to catch up on. -- GrubbmGait

  • I'm bumped down to only one spot above GrubbmGrb. I have managed to beat all of your bots again, which I hope you take more as a sign of respect than an insult. I've nearly got all of Loki's as well but Friga was only fought once and I only managed 47%. I've got another ace waiting up my sleeve for the next Ugluk, which may take me that much closer to 1800. I don't know if it will help me in the PL. For some reason I don't really pay much attention to the PL ranking, even though it appeals to my 'player vs. player' online gaming background. (In PvP it's not about style, it's about who is left standing.)

I'm not sure rolling stats makes sense in targeting. It would only make sense against MultiMode and AdaptiveMovement like WaveSurfing. And that's quite few bots. And to actually make a difference you need really fast rolling stats. If you have VirtualGuns then it's better to have one gun just remembering the last sample and one that accumulates the stats without rolling. Aiming for the middle ground just lowers your performance in both ends I think. -- PEZ

I've been working on a bullet-avoiding-but-otherwise-random movement mechanism. It has gone through some evolutions, but still has some unfinished features and bugs to work out. I also maintain a list of people for which ramming (and firing) gets me a better relative score. It is not a dynamic decision. -- Martin

Fixed some long-standing bugs with this cut. Also worked out some (but not all) kinks in my new bullet-avoiding-but-otherwise-random movement. I may do a release that uses the movement exclusively, along with my not-fully-segmented-because-I'm-still-working-on-collecting-the-necessary-data bearing offset gun. -- Martin

I managed to boost Ugluk's gun performance by a hair under 50% from a few versions ago. I don't know if it is the bug fixes or the most recent (unreleased) segmentation. Hopefully it is the segmentation .. because the bug fixes don't seem to help the roborumble rating much. -- Martin

Nice job on your gun work there - that is no small jump you made! Especially against the surfers. -- Voidious

Thanks. Turns out I also bumped up my wave surfing challenge scores, but only by a little over 25%. I'm expanding my debugging code to figure out where the bugs are, though it could come down to a simple difference of targeting algorithms... -- Martin

I'm getting "file not found" for this version of Ugluk, and I think I may have for the last version, too... -- Voidious

Judging by my rating it looks like I broke something too. -- Martin

Well, it was finally time to revamp my physics engine, which I've done, and now I have 76 JUnit tests backing it up. Next I need to track down an error with my radar, so I'll probably reinforce those systems with JUnit tests as well. Eventually I'll get tests set up for each of the targeting methods so I'll have more confidence making changes in that area. The testing is a bit of overkill, but I've been fairly frustrated with the performance of my targeting methods and with ensuring that they are behaving as intended. My attempts to emulate the Robocode client environment have not gotten me anywhere. -- Martin

Can you publish your unit tests? I am always planning to make a bed of unit tests for CassiusClay, but I never know how to go about it. If you can't publish it then maybe you could give your thoughts and advice on the matter on the UnitTesting page or some such. -- PEZ

I can't download Ugluk, what's happened? -- DemetriX

The main problem with learning about opponents through saving data is that the data is spread out among all of the machines running the robot. From time to time I run Ugluk from only one machine (in this case from work over the weekend) and this is why you can't get to the .jar file. It isn't really there. The .jar file is in my work machine's roborumble/robots directory though, so it doesn't need to look for it. The alternative is to make about 5 versions of Ugluk to collect the same data, creating a lot of unnecessary processing work for everyone running the rumble. -- Martin

Ugluk v0.10.1 features the return of the fully segmented bearing offset gun and some bug fixes. v0.10.2 rams some specific bots rather than being a good sport. So far the score difference is negligible. Yet, it's still worth it. =) -- Martin

A comparison between these two bots shows quite a lot of variance, even though the only difference is that there are maybe 9 opponents that .2 rams that .1 does not: http://rumble.fervir.com/rumble/RatingDetailsComparison?game=roborumble&name1=pedersen.Ugluk%200.10.1&name2=pedersen.Ugluk%200.10.2 Perhaps it is time to get RoboLeague working with file i/o so I can have a more controlled test bed, with a consistent number of battles fought per opponent, and always facing all opponents. I can appreciate the effort it took to create RoboLeague, yet it's a pain in the ass to set up battles, especially when my setup has it picking up teams and roborumble class files as potential bots. -- Martin

I rarely select the bots through RoboLeague itself - I manually create .xml files a lot with RoboLeague. For instance, I have "dooki086_mc2k6_base.xml", an MC2K6 template for my current dev version, and then I just make a copy of it and name it "dooki086_mc2k6_20060306_001.xml" to run a set of seasons with a certain build. And I do the same for whatever is my current testbed. I use RoboLeague a heck of a lot in my development, and my whole process would be a lot different without it! (It's pretty easy to edit the XML files, except that it's usually all on one line.) -- Voidious

A while back I considered having a cross between partial linear targeting and my bearing offset gun. My partial linear gun defaults to 70% of present velocity, and for some opponents it is my best gun. For some it is the best gun at a different fraction of velocity. It would be great if it turned out to be a formidable gun against many folks, if only I could teach it how to calibrate itself. If I can figure out a practical means of determining after the fact what percentage range would have hit my opponent and track it like bearing offsets, I'll have what I want. I don't know how well it will do, but it's one more option in the battery. -- Martin
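
For the curious, a partial linear projection of the kind described above can be sketched like this (illustrative names, arbitrary iteration count):

// project the target along its current heading at a fraction of its current
// velocity and iterate until the bullet flight time and the projection agree
static double partialLinearAngle(double myX, double myY,
                                 double enemyX, double enemyY,
                                 double enemyHeading, double enemyVelocity,
                                 double fraction, double bulletPower) {
    double bulletSpeed = 20 - 3 * bulletPower;
    double v = enemyVelocity * fraction;          // e.g. fraction = 0.7 for the default case
    double predictedX = enemyX, predictedY = enemyY;
    for (int i = 0; i < 20; i++) {
        double time = java.awt.geom.Point2D.distance(myX, myY, predictedX, predictedY) / bulletSpeed;
        predictedX = enemyX + Math.sin(enemyHeading) * v * time;
        predictedY = enemyY + Math.cos(enemyHeading) * v * time;
    }
    return robocode.util.Utils.normalAbsoluteAngle(Math.atan2(predictedX - myX, predictedY - myY));
}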

The actual code wouldn't be much different than your standard GuessFactor gun - you would just make GF1 = linear aim for each wave instead of GF1 = max escape angle. Then GF0.7 would be 70% of linear aim, GF0 would still be head on, and so on. I think the idea of using linear aim as the "reference angle", instead of direct aim, in a GuessFactor gun might have some merit, too. (Like GF0 = linear aim. I think I've seen a mention of that somewhere else on the wiki.) -- Voidious
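
In code, that remapping is just a change of reference angle when converting an observed offset into a bin index; a minimal sketch, assuming symmetric bins:

// convert an observed bearing offset into a 'guess factor' where GF1 means the
// full linear-aim offset (so GF0.7 is 70% of linear aim, GF0 is head-on)
static double offsetToGuessFactor(double observedOffset, double linearAimOffset) {
    if (linearAimOffset == 0) {
        return 0; // target sitting still: every offset collapses to head-on
    }
    return observedOffset / linearAimOffset; // can exceed 1 if the target out-runs linear aim
}

// clamp to [-1, 1] and map onto a bin index
static int guessFactorToBin(double gf, int bins) {
    gf = Math.max(-1, Math.min(1, gf));
    return (int) Math.round((gf + 1) / 2 * (bins - 1));
}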

You basically just described the idea behind Stampede's gun, Voidious. --wcsv

Oh yeah? Which idea, GF1 = linear aim, or GF0 = linear aim? -- Voidious

It uses GF1 = linear aim, GF0.5 = 50% linear aim etc. It uses virtual bullets though, so I could probably improve on it by converting it to use waves. --wcsv

100% linear aim would have to assume the target hits the gas this turn, or you wouldn't be able to hit people who accelerate from 4.0 to 8.0, for example. I could do it with virtual bullets for a discrete set of bins. I was hoping to have a wave-like means of determining after the fact what linear projection percentages would have worked. It's probably a lot simpler than I am making it out to be, but I keep getting too distracted by other things to focus on it. -- Martin

Well, if you assume a maximum velocity, you're pretty close to how "traditional" GuessFactorTargeting already defines GF1, which is just the maximum angle they could possibly get to before your bullet intercepts them. (Assuming they accelerate to 8.0 doesn't seem like "LinearTargeting" to me, it seems like "GF1", but it's really just a label anyway.) -- Voidious
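
For reference, the "maximum angle they could possibly get to" that traditional GuessFactorTargeting uses as GF1 is commonly approximated as:

// classic max escape angle approximation: top speed 8.0 over the bullet's speed
static double maxEscapeAngle(double bulletVelocity) {
    return Math.asin(8.0 / bulletVelocity);
}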

I don't really use guess factors, so I'm not the expert, but it seemed to me that those represented a fraction of movement tangential to the firing position. I'm looking at the target maintaining heading, going forward or reverse. The problem with not normalizing to the maximum velocity is that your percentages can easily exceed +/- 100. If they are stopped and hit the gas you've got an infinite ratio. It would be nice to scale it off of whatever the present velocity is, perhaps with a cap (like 800%). That might be better handled by segmentation. I dunno. I just want to get a prototype working. I've been sketching it out and I think I can make it work. I just need to check if I remember the Law of Sines correctly. -- Martin

You're right that it wouldn't quite be a normal GuessFactor, but it'd be pretty close, especially in duels where perpendicular movement is the norm. Good points about zero velocity, though, and you are probably looking for a gun useful in melee battles, too (throwing perpendicular-ness out the window). In any case, good luck, I look forward to seeing how it turns out (and then hearing the explanation, maybe =)). -- Voidious

I've tried several times now to get a profiler working with Eclipse, and evidently I need a lot of hand holding with it because I've tried 4 tools and none of them have worked. They keep throwing errors about missing files or otherwise calling me a schmuck. So.. I've decided to do the 'brute force' method of creating a stopwatch mechanism for timing subsections of my tank. I decided to use System.nanoTime(), introduced in JRE 1.5.0, to get more accuracy than the millisecond alternative. A cross section of my tank's operation looks like so:

Stopwatch combatantOperate:        87.910341747 seconds
Stopwatch combatantOperateGun:     4.756166431 seconds
Stopwatch combatantOperateChassis: 29.445992399 seconds
Stopwatch combatantOperateTurret:  53.434751403 seconds
Stopwatch combatantOperateScanner: 0.96735465 seconds

Indicating that my targeting is the main culprit of my speed issues, but movement is not far behind. I'll add similar timer support to the base classes to get a per-implementation profile. -- Martin
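
A bare-bones version of that stopwatch approach, accumulating System.nanoTime() deltas per subsystem (names are illustrative):

class Stopwatch {
    private long startedAt;
    private long totalNanos;

    void start() { startedAt = System.nanoTime(); }
    void stop()  { totalNanos += System.nanoTime() - startedAt; }
    double seconds() { return totalNanos / 1.0e9; }
}

// usage in the main loop, e.g.:
//   turretWatch.start();  operateTurret();  turretWatch.stop();
//   out.println("Stopwatch combatantOperateTurret: " + turretWatch.seconds() + " seconds");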

Once again I've mangled my rating by removing range segmentation on my hit statistics. It's pretty crude but obviously effective. Still, I'm not happy with it, and I am going to try implementing segmentation like I have in my bearing offset gun. -- Martin

Well, I've tinkered with quite a few things lately, and done a few overhauls. I'm happy with the new segmentation of hit statistics. It worked out pretty smoothly. -- Martin

Ugluk v0.10.5 is purely an opponent tuning change compared to v0.10.4. All existing profiles are on, with many new ram opponents, including all rambots. I compared Ugluk v0.10.4 to Banzai v0.5, but that version of Banzai does not have the new segmented targeting statistics. I do not know how the new statistics will help or hinder Banzai... -- Martin

For some reason ramming wasn't as effective for Ugluk as for Banzai, using the same code. But I'll have to track it down later since I've already reworked yet another section, dealing with waves. - Martin

It's very early to speculate, but Ugluk v0.10.7 has been climbing steadily and after 40 battles is at 1774 with 108 momentum. In addition to fixing an issue with Ugluk's ram mode, I re-implemented an old feature in a new way, which I suspect is more to blame for the promising rating ..
Ugluk v0.10.7 peaked at 1789 before his momentum finally reversed, after 90 battles. I expect it will flatten out somewhere between 1780 and 1800. -- Martin

Good to see that you finally entered the top-75. Next is to beat the 1800 (top-50) barrier! -- GrubbmGait

Ugluk was really struggling to make any progress. Part of it was that I'd remove stuff that didn't work well but I wouldn't replace it. I'm still missing some 'features' that were slow, but the reduced functionality allowed me to focus on better performance of the systems that mattered. Right now I'm trying to squeeze the rest of the rating points (to 1800) out of performance tuning. My attempt to ram rammers to minimize my losses didn't work out well enough, so I'll be fiddling with anti-rammer stuff as well. -- Martin

I'm working on the new gun and I've hit a slight snag, though I think I know a way around it. Here are some diagrams I threw together:

http://home.comcast.net/~kokyunage/robocode/diagram1.jpg

The target begins at known position [1] and has a known direction of travel (factoring in negative velocity). Ugluk has a bead on it (bearing offset 0). It travels to position [2], and the hit arc is projected through the original direction-of-travel line. Points A and B are scaled to the percentage of maximum velocity that would have taken the vehicle to those points, giving you your range, and you can drop the interval into bins. As you collect more data you get a smoother picture of where the target tends to be on that scale. Walls would tend to show up at 100%, but also produce some lower range values due to stopping at corners.

http://home.comcast.net/~kokyunage/robocode/diagram2.jpg

Above is one of the problems with this technique. The points of intersection are behind the firing position because the direction of travel has changed drastically. You may have noticed that point [A] is further from position [1] than point [B], so we can detect this situation and may be able to accommodate it, and even use the data for future firing solutions.


-- Martin

Man, it's scary to think how that gun would react to Tron's wiggling... -- Voidious

It's gone through a few revamps already and still isn't working as well as my Linear and Partial Linear (fixed percent) guns. I think I'll have to add some graphics to debug it. It looks like I'm storing and returning legitimate percentages (of straight linear), so I'm either not projecting correctly or my partial linear gun is using max velocity. I forgot to bring in the code to work today so it will have to wait. -- Martin

I've backed out some code changes, including the gun I was working on (which was ineffective). I am also working on using virtual waves again, but only when they prove more effective (such as against opponents that don't take their enemy's firing into account). -- Martin

Ugluk finally penetrates the 1800 barrier. -- Martin

Congratulations, you're getting closer and closer to v1.0, only three bots left to beat. As for melee, your current version has a better 'domination' than your best so far. Due to the late flood of (sub)top melee bots, the ratings are dropping and the former top-20 bots (like PulsarMax, Troodon, Ares) have a hard time staying in the top-30. I'll try to put some more effort into my WaveSurfing, just to stay ahead of you in OneOnOne. -- GrubbmGait

My client can't download the newest version of your robot. -- Kinsen

Yeah, sorry Kinsen. I am 'teaching' Ugluk what movement and targeting systems don't work against rumble opponents. It's best if I only allow one machine to process the battles, otherwise I just have to keep making new bots because of the 1200-battle limit. If many people are processing battles I'm still only learning on one machine, where I'll grab the results. I'd do this with RoboLeague but I've never figured out how to get the canonCaches thing disabled with it. So for now 0.13.1 and one or two more versions will just be available to my machine. Then he'll be public again.
One thing I've observed this morning is Ugluk has challenged bayen.Squirrel and stelo.WangRobot about 30 times each but keeps scoring 100%, so the score is invalid. I wish they'd hit me just once. -- Martin
edit: and Grubbm, your wave surfing intimidates me. I will have to achieve my stated goals for v1.0 before you have time to polish it.

  • I know my surfing is quite impressive, that is the result when the best parts of BasicSurfer and my own bots are put together ;-) -- GrubbmGait
    • I haven't actually seen your surfing in action yet. I just know you have a highly competitive bot without it. I expect when you add in some guess-factor style targeting you are going to make me cry. -- Martin

By the way, I have a line in my roboleague/launcher.properties file that says: user.cmdline=-Dsun.io.useCanonCaches\=false -Xmx512M ... Florent tipped me off to it, and I have used RoboLeague to collect data like that (for that Dookious version testing pre-loaded data). -- Voidious

I have spent quite a bit of time developing an object-based bearing offset gun with dynamic resegmentation. As more data became available, the segments would determine if it was time to increase the 'buckets' by 1, and reprocess all previous scenarios and resulting enemy positions. At first there are maybe 9 total buckets, accounting for combinations of positive-only and positive-and-negative segments. After growing sufficiently, there may be hundreds of buckets. The premise was that initially you have very little data and you want to be able to take an educated guess as soon as possible. What I'm finding out, though, is that slicing your data finely isn't really making your guesses more educated. At least not in a 35-round battle.
I also moved my virtual gun statistics to use the same dynamic segmentation.
I think I'm going to make the gun a lot simpler, cut way down on the segments, and add in some randomness in situations where the odds of picking the right angle are slim to begin with. -- Martin

Definitely agreed that more segments is not always good, and finding the sweet spot can be tough. I've been wondering for a while - why would you use bearings instead of GuessFactors in your statistical guns? To me, GuessFactors seem like an evolution of bearing offsets, and I see no advantage in using the latter. (GF1 at bullet power = 3 is a much different angle than at bullet power = .1, but they will both hit the same type of movement, generally.) -- Voidious

Gut feeling. And it's easier. And I don't have a good mechanism for wall smoothing, so I cannot predict how an opponent will behave in that situation. And I'm presently not using rolling averages.
For post-0.13 I've gutted a good chunk of the code, including inferior guns, segmentation, and my dynamically recalibrating bearing offset gun. I've a fresh, simpler gun with fixed segments, and only 4 of them so far, with 625 elements, each holding a bearing offset bin array. I'm going to toy around with implementing (and debugging) various segments and try them out. To support this effort I'll also need a means of collecting bin hits across hundreds of battles to see if there are any consistent patterns for each segment. That should give a relative worth of considering that aspect of an opponent's situation in the gun. I think I'll revisit the Entropy discussion next.. -- Martin
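
One plausible reading of "4 segments ... 625 elements" is five slices per segmentation dimension (5^4 = 625), each bucket holding its own bearing-offset bin array; the dimension names below are guesses, not Ugluk's actual segments:

static final int SLICES = 5;
static final int BINS = 31;
// e.g. [lateral velocity][advancing velocity][distance][wall proximity][bearing offset bin]
static double[][][][][] stats = new double[SLICES][SLICES][SLICES][SLICES][BINS];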

I didn't intend to hide my posting of 3 bots at once, but after 4 failed attempts to save the participants page, I started fiddling and it worked as a minor edit. /shrug
This selection of Ugluk bots all have preloaded profiles disabled, 5 movement options, and one gun, with the same segmentations. The way the bearing offset is determined is the only difference.

  • v.1a - the bin with the dominant visit count is selected.
  • v.1b - a random bin is picked, but bins with higher visit counts have proportionately higher chance of being selected.
  • v.1c - a shallow rolling average is used to track previous successful bearing offsets.

-- Martin

I'm not sure where the "rule" might be written, but RoboRumble/ParticipantsChat has a bit of talk of only having one version of a tank in the rumble at a time. Not that I'm one to talk, with my daily Komarious releases, but it was kinda uncool to see "Ugluk...a vs Ugluk...c" come up as the first match in my RR@H client. -- Voidious

You are right that you are not one to talk. It isn't very cool, but since I entered them at 10:30 at night and had 3 cpu's cranking the rumble throughout the night, it didn't impact much. And now there are three cpu's cranking through the daily Voidious release... -- Martin

Indeed, it didn't cause much of a clog, so no harm no foul... but if there is a "rule" about not having multiple versions in at once, I think it should be followed. I do post releases often, but between them I also run lots and lots of tests, instead of letting the RR be my testing grounds. In other news, with my three clients as well as yours, this version of Komarious might stabilize faster than any bot I've ever seen in the rumble. -- Voidious

Komarious is a pretty fast bot. I give it two hours. -- Martin

I just noticed that my MC2K6 rating against the Circular Targeting bot (C) is second only to Engineer. Linear Targeting (B) is still very high. I dunno why I am getting nailed by the head-on bot so much. Ah well. -- Martin

Indeed an impressive score against botC, but I still remain the best non-surfer in the MC2K6. Btw, I released my first WaveSurfer with a GF-gun; I propose that it does not count for v1.0 of Ugluk. -- GrubbmGait

The very first undertaking with Ugluk (v0.0.1) was a wall collision avoidance algorithm. His movement was fairly simple (mostly straight with a little sine wave added to throw off targeting) but he stopped short of running into walls. Sometime later I revamped the movement engine to use vectors, and I have used wall repellant vectors ever since. I also added robot repellant fields and some others I toyed with. Subsequent movement methods have all boiled down to a suggested movement vector, and the sum of all vectors was where Ugluk wanted to go. With the wall repellant fields in place, I ditched the costly early wall collision avoidance algorithm.
It has worked fairly well for a long time. One thing I didn't have, though, which became very apparent when I introduced a grid go-to system, was a way to determine the fastest way to get from point A to B for any initial heading. For example, if Ugluk was heading north at a velocity of 8.0 and wanted to arrive 40.0 units to the east, Ugluk would bank right and go in a circle until the point ended up behind him, at which time he'd hit reverse and try backing into the position. He'd rock his way to the destination for as long as it took.
Well, no more. I finally figured out how to find the shortest path to my destination, and I'm pretty proud of it. I also take into account the location of walls that I may run into along the way, and avoid them fairly well - but not well enough yet. I still need to work out some kinks in that department. My most recent test had about 1.5 wall hits per round. I need to check the speed of those hits to see if they are really relevant. If I am hitting at less than 2.0 velocity then it doesn't really matter. Edit: about 80% of the hits are at 1.0 velocity. Must be a simple algorithm issue. -- Martin

I've worked out the kinks in the 'least time from point A to B' movement, complete with wall avoidance and my own version of smoothing. My score isn't stellar, and not my highest, but Ugluk is damned fast and getting it done with very few tools. I'll be reviewing the poorly performing movement styles of the past (i.e. every one I've made) and trying to make them work as I envisioned. -- Martin

Got some decent wave surfing results finally. Hopefully this transfers to the Roborumble... -- Martin

  • While the wave surfing challenge scores were much improved, the other challenges saw sharp declines. Consequently, the rating didn't climb much overall. -- Martin

I figured I'd add some eye candy related to my movement engine overhaul. The first two screens are related, the third is from a different round. After the first wave, options get more limited. I haven't begun to tackle multiple wave surfing yet. Yet..

http://home.comcast.net/~kokyunage/robocode/ugluksg001.jpg http://home.comcast.net/~kokyunage/robocode/ugluksg002.jpg http://home.comcast.net/~kokyunage/robocode/ugluksg003.jpg

The green box is the destination I am trying to reach. The orange box is my present direction of travel's intercept with the wall. The green wavy lines are my 'optimal' tangential paths of travel. Given my wall avoidance I probably need to make more than one pass after determining that my ending direction of travel is rather different than the tangent. The grey lines denote the original dead-center bearing as well as the present maximum clockwise and counterclockwise rotations (as calculated by the tangential path prediction). The rings are my wave representation (as previously described on the RobocodeSG page). -- Martin

Nice eye candy. Debugging graphics are enabled in Phoenix too, you can turn them to draw projected clockwise / ccw motion, enemy waves, values of various surfing components, and other goodies. =) --David Alves

With regards to the WSC scores / RoboRumble correlation, I would add that I think the score vs BotA (HeadOnTargeting) is by far the most important one. (Komarious gets relatively bad scores against BotB and BotC, but still gets a lot of rating points out of her WaveSurfing.) Nice graphics, too, they really do help a lot in the WaveSuffering process. -- Voidious

Ugluk v0.14.6 kicked Grishnak's ass 2:1 in my first (and so far only) test of 35 rounds. It lost to GrubbmGrb and GrypRepetyf but not nearly as badly as it has been. I expect some decent results (as always). -- Martin

Nice graphics, I should try to visualize some more than just the wave. It seems that 0.14.6 is your strongest Ugluk so far, being relatively better against the high-end and midrange bots, well at least better than GrubbmGrb does. That bot is not exceptionally strong with about 100 losses, but will ruthlessly exploit any weak spot in movement and/or targeting. Examining the details, it almost looks like a combination of Grishnakh and Ugluk would be able to get into the top-50. -- GrubbmGait

At present Ugluk only has one gun, which was a few hours of re-writing my bearing offset gun nearly from scratch to be far simpler. It is still my most effective gun in terms of the targeting challenges (which is odd since I haven't even begun to toy with the segmentation) but I don't have any simpler targeters backing it up. The simple targeters tend to kill the simple bots far more quickly. It is possible that re-introducing some simple targeting methods will nudge me into the top 50, but right now there's still some nagging movement behavior that I'd like to iron out. For example, my Grid movement should never get hit by head-on targeting, especially against a stationary bot. Yet sometimes it just crashes right through the middle of one. I need to figure out just what is happening. Once I get that nailed down, I'll add some guns that reflect the LT and CT wave surfing challenge bot targeting algorithms, and I will produce a bot that gets a 100% score in the wave surfing challenge (even if it is just a novelty bot). While coding to distract me from staring at the Roborumble client churning my results, I added a pre-turn-30 movement style that gets me out of pinned positions, and that should help my wave surfing challenge scores across the board. -- Martin Edit: And yes, Ugluk v0.14.6 is my best bot to date. Somehow I keep treading water in terms of "domination", though my rating is increasing. People are turning out strong bots lately. -- Martin

Recently all of my bot design has been focused on movement. Some of the design is similar, but there has been a lot of overhaul and rewriting of the core system. A while back I made a grid walking movement that had some flaws and was terribly slow, so I shelved it. I recently got it working fairly well and then got it working really fast. And through it I've managed to track down some annoying issues. There are still some more to go though. Ultimately I expect the Grid movement style to get 100% across the board in the wave surf challenge, even though it isn't doing any learning like other wave surfers do. -- Martin

Well, in playing with the collision detection I have managed to disable the registration of hits against my opponents, causing my gun to act as a head-on targeting gun, hence the dismal scores of 0.14.7's brief stint. I'll fix it tonight... -- Martin

I got the gun somewhat functional, but there's some strange behavior going on. For example, I'm determining all the points at which the wave (circle) intersects the target hit box (square), which should yield 0 or 2 to 4 points. (A wave could intersect the firing tank's hit box at as many as 8 points.) I am getting some results of 1 intersection point, which should be extremely rare. Anyway .. more work to be done. I was making general changes to the bullet / wave collision detection system and focused on testing movement, so I didn't notice the gun failure. -- Martin
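
For reference, a minimal sketch of that kind of intersection test (a hypothetical helper, not Ugluk's actual code): intersect the wave circle with each of the four edges of an axis-aligned 36x36 hit box and collect the crossing points. Corner crossings and tangencies are the usual sources of unexpected counts.

 import java.awt.geom.Point2D;
 import java.util.ArrayList;
 import java.util.List;

 class WaveGeometry {
     // Points where a wave (circle of radius r centered at c) crosses an
     // axis-aligned 36x36 hit box centered on the target.
     static List<Point2D.Double> waveBoxIntersections(Point2D.Double c, double r,
                                                      Point2D.Double target) {
         double half = 18.0;
         double[] xs = { target.x - half, target.x + half };
         double[] ys = { target.y - half, target.y + half };
         List<Point2D.Double> hits = new ArrayList<Point2D.Double>();
         // Vertical edges: x is fixed, solve (x-cx)^2 + (y-cy)^2 = r^2 for y.
         for (double x : xs) {
             double disc = r * r - (x - c.x) * (x - c.x);
             if (disc < 0) continue;
             double root = Math.sqrt(disc);
             double[] cand = (root == 0) ? new double[] { c.y }
                                         : new double[] { c.y - root, c.y + root };
             for (double y : cand) {
                 if (y >= ys[0] && y <= ys[1]) hits.add(new Point2D.Double(x, y));
             }
         }
         // Horizontal edges: y is fixed, solve for x.
         for (double y : ys) {
             double disc = r * r - (y - c.y) * (y - c.y);
             if (disc < 0) continue;
             double root = Math.sqrt(disc);
             double[] cand = (root == 0) ? new double[] { c.x }
                                         : new double[] { c.x - root, c.x + root };
             for (double x : cand) {
                 if (x >= xs[0] && x <= xs[1]) hits.add(new Point2D.Double(x, y));
             }
         }
         // Note: a crossing exactly at a corner is counted once per edge, and a
         // tangency yields a single point, so odd-looking counts are possible.
         return hits;
     }
 }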

I was getting some nice test results with my gun modification and the work day was nearly done so I packaged it. A nice 10 point boost. I have some more gun tricks to refine that should push Ugluk even further along the path to glory. I'll run some targeting challenge tests tomorrow. -- Martin

Ugluk took a performance hit with 0.15.2, but his targeting challenge scores are getting a nice boost. As I get higher on the targeting challenge list it is more and more evident that my movement really needs improvement. My initial attempt at learning wave surfing was less than satisfactory and has been disabled. I need to tune it some more so that it looks like it is really surfing. As always, there is more work to be done... -- Martin

It seems that 0.15.7 is the 'Release-Candidate' for v1.0, if you can improve that 49.9% score against Mjolnir! And you are threatening my position }:-\ -- GrubbmGait

  • I was expecting some goodness, but I expect that every release =/ I ended up defeating Mjolnir enough in the second battle to bring the average to 52.2%. I ran Ugluk overnight and didn't see any battles until this morning. He's in 57th place, right behind GrubbmGrb at 56th. No PL ranking yet. It's going to be a good morning. And yes, I think I shall release this bot as Ugluk v1.0, over a year in the making.
  • Congrats! =) -- Voidious
  • Thanks. Upon reflection while driving into work, there is a wave surfing movement style that I've been developing but just haven't gotten to work right yet. If I switch to melee now (per my stated goals of about 8 months ago) it will collect dust, since wave surfing has little business in melee (until it comes down to 1v1). That, and how could I accept being right behind GrubbmGrb? Freya may be my melee white whale, but GrubbmGrb is surely my standard for duelling excellence. Ugluk v1.0 should be a reality within a week or two though...

I have been having a problem with graphical debugging. I was building lists of line segments to draw, but by the time they are drawn they are one turn late. So with a few tweaks this morning I made the onPaint method process my robot's turn (and if it wasn't called for that turn the normal run() cycle will process it, and execute the instructions). Snag #2 is that onPaint is called before any events are processed. So now I am remodelling the event handlers so that I grab them off the queue manually at the beginning of the round and process them all before making any decisions. This will have the side effect of improving the handling of onDeath and onWin events.
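
Roughly what that restructuring can look like (a sketch assuming a recent Robocode API, with the handler bodies left as placeholders; this is not Ugluk's actual code): drain the event queue by hand before making any decisions, rather than relying on the individual on...() handlers firing in their own order.

 import robocode.*;

 public class ManualEventBot extends AdvancedRobot {
     public void run() {
         while (true) {
             processQueuedEvents();
             // ... movement, targeting and debug-graphics bookkeeping here ...
             execute();
         }
     }

     private void processQueuedEvents() {
         for (Event e : getAllEvents()) {
             if (e instanceof ScannedRobotEvent) {
                 // update enemy state, waves, etc.
             } else if (e instanceof HitByBulletEvent) {
                 // update movement / surfing statistics
             } else if (e instanceof DeathEvent || e instanceof WinEvent) {
                 // end-of-round bookkeeping
             }
         }
         clearAllEvents(); // so the normal on...() handlers don't repeat the work
     }
 }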

With my new movement I have nearly nailed my goal in the wave surfing challenge. But .. about 4 rounds in 500 I get killed. I dunno what the situation is that causes it, and watching for something that occurs once every 125 rounds on average is not really practical. I guess I'll just have to store some notes about my starting position, the opponent's first scan position, and what my energy was at the end of the round. On the whole though I've really made some progress in getting out of tight situations at the beginning of matches, allowing for more flexibility as the round progresses. Still some work to be done though, it seems. -- Martin

Ugluk v0.15.8 does not contain the wave surfing movement that I have been developing. I am trying to get the grid walking nearer to perfect. I should probably add segmentation to it, which will be a single line change. -- Martin

  • The RoboRumble sure is a frustrating proving ground. Down 18 points. I think I may know some of the issue though. 0.15.9 will be along shortly...

Ugluk v1.0 released. Already I've found an error with the 2K6 challenge mode .. I misspelled the (rather large) Cassius Clay version numbers. I don't think anyone will be doing that testing for me so it's fairly moot. -- Martin

I'm getting serious about tuning my gun now. The trouble is I have 4 styles of gun and 3 segmentations. (Really there are 4, but two are the same type of measurement.) So in order to get a baseline for tuning I need to test all guns with no segmentation and with each segmentation individually, for 16 total runs. I am using the Fast Learning style test, so 5250 rounds per test, ~2.7 hours. It's gonna take a few days. It is really long overdue though. I rewrote the base gun to be very simple a while ago (compared to my dynamically re-segmenting one) and took an educated guess at some practical segmentations and how many to use, with no attempt to verify that they made sense. The gun has performed pretty well, but it will almost certainly benefit from some tuning. -- Martin

After achieving my goals for v1.0, I am now forcing myself to get back on the melee horse. I modified my Grid movement (really it isn't a grid any longer, but several rings of potential positions) to be a minimum risk system. It is good for getting out of the middle and keeping distance from other bots, but one drawback I am observing is that I will often get on a wall between two other bots and start moving back and forth along the wall. I soak up a lot of damage from that. I tried making a few tweaks to my melee radar but now I am back to the original. I've toyed with targeting a bit, and guess-factor style targeting just really doesn't do well. I think there are three major factors. The least of them is that your radar has to spin more to collect updates on everyone in the battle, which will cause you to miss wave collision data. Next is that as the battlefield becomes less cluttered, your opponents' movement styles change. But the biggest problem is that you've got to learn how 9 opponents move, and there's simply not enough time to learn it. I've had the most success with dead-on targeting. No learning involved. As a test bed I am using bots from the 9 top authors. I can usually place in the top 3. That hasn't translated to dominant ratings in the rumble, though. I've got a few more tricks to try, and I imagine more will come to me. -- Martin
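
For anyone unfamiliar with the idea, a bare-bones sketch of minimum risk movement (hypothetical helper, much simpler than Ugluk's ring-based version): generate candidate destination points, score each by summed inverse-square distance to the known enemies, and move toward the lowest-risk point.

 import java.awt.geom.Point2D;
 import java.util.List;

 class MinRiskMovement {
     // Pick the candidate point with the least total "risk" from all known enemies.
     static Point2D.Double pickDestination(List<Point2D.Double> candidates,
                                           List<Point2D.Double> enemies) {
         Point2D.Double best = null;
         double bestRisk = Double.POSITIVE_INFINITY;
         for (Point2D.Double p : candidates) {
             double risk = 0;
             for (Point2D.Double e : enemies) {
                 risk += 1.0 / p.distanceSq(e);  // closer enemies weigh more heavily
             }
             if (risk < bestRisk) {
                 bestRisk = risk;
                 best = p;
             }
         }
         return best;
     }
 }

Real implementations usually add more terms to the risk function (walls, corners, who is targeting you), which is also where the oscillation-along-a-wall problem discussed below tends to get fixed.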

I had the same problem, oscillating along the wall and getting targeted by the two bots in the closest corners. When I introduced perpendicularity proportional to danger (distance, and whether a bot is targeting me) in Griezel, I jumped from rank 30 to rank 10. Performing well against top bots is important, but performing well against a bunch of mediocre and 'not-so-good' bots is just as important, maybe even more so. Pit Gruwel against 9 top-20 bots and it will end dead-last. Pit Gruwel against 9 (very) low-ranked bots (see SampleBotMeleeChallenge) and it will slaughter them faster than nearly anyone. Melee battles will almost always consist of one or two top bots, some mediocre ones and a few bad bots, so being good against everybody does pay off. -- GrubbmGait

I know I have a tendency to repeat what the person before me said, but, well, I like to outline things: your bot has to beat most of the bots in the rumble, and all of the bots below it, to gain the 1st-rank position, or at least do better against them than the current 1st placer does. So your bot might beat all the top bots all the time, but if it beats the top 10 bots and nothing else, then it only beats 10 bots out of over 200. Beating the top bots means nothing in a division-based ranking system. --Chase-san

Well, I think it is about time to develop a qualitative measurement of bearing offset profiles. I don't want to go into detail about the targeting system I've been developing, but I've decided that I need to dynamically choose segmentations, since tests in the Rumble have shown that different segmentations can have significant impact on individual opponents. Rather than choosing the best overall combination, it would be nifty to pick them on the fly. Certainly not a new concept, but one I think I can attain, for better or worse. -- Martin

Not an easy task, but if you manage it, awesome (has anyone ever managed that?) --Chase-san

Toad has working DynamicSegmentation and does fairly well, though it's kinda slow. Albert says Virus has some type of DynamicSegmentation, too. Definitely a tough task to get it working well at a reasonable speed, best of luck! -- Voidious

Toad still kicks royal butt in the PL. Maybe this is why: it's tuned to 'win' against every opponent. --Chase-san


With traditional segmentation, various conditions are recorded when a shot is aimed and a wave is created; as the wave crashes over the opponent, more data is recorded to note what shot(s) would have hit the opponent. When a similar condition is encountered later, the targeting can make an educated guess as to where the opponent is likely to be, and fire in that direction.

The main issues with this approach are what data to record and what granularity to use. The more data you use and the more finely you slice it, the more precisely you can measure how alike two conditions are; but you are then less likely to find a matching condition, so the gun is slower to learn and adapt.

As far as I know, all bots to date treat each combination of data as a unique condition and have storage bins specific to that combination, though there are also bin-smoothing techniques to blur the hard edges introduced by segmentation.

My latest approach is to treat data categories independently from one another. I record wave hits for the categories separately, and when my gun requests an angle to shoot at (with a collection of targeting condition data), I grab the hit count bins from each data category, layer them together as a composite, and find the best angle (i.e. the most commonly hit angle).

http://home.comcast.net/~kokyunage/robocode/layers.jpg

In the debug screenshot above, you can see six bins lining the bottom. Above is a composite of some of those bins. If I don't think that a data category (e.g. distance to wall) profile has much to offer, I don't include it for that shot.

A traditional segmentation array would not see the similarity between, for example, a bot circling you counterclockwise near a wall vs. far from a wall, because the wall distance would make it fall into an entirely different bin. With my approach, the counterclockwise-circling data is read by both conditions and recorded to by both. This means that each data category learns more quickly.

The hope is that giving each category an equal say in the final composite will still result in a reasonable prediction of future position.

- Martin
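
As a rough illustration of that layering (hypothetical names and bin count, not Ugluk's actual data structures): each data category keeps its own one-dimensional array of bearing-offset bins; at fire time the chosen arrays are normalized and summed, and the peak of the composite is the firing bin.

 class CompositeTargeting {
     static final int BINS = 47;  // hypothetical bin count

     // Each row is one category's hit-count profile of length BINS,
     // e.g. one for lateral velocity, one for distance to wall, etc.
     static int bestBin(double[][] selectedCategories) {
         double[] composite = new double[BINS];
         for (double[] category : selectedCategories) {
             double total = 0;
             for (double v : category) total += v;
             if (total == 0) continue;                 // nothing learned yet, skip it
             for (int i = 0; i < BINS; i++) {
                 composite[i] += category[i] / total;  // equal say for every category
             }
         }
         int best = BINS / 2;                          // default to head-on (offset 0)
         for (int i = 0; i < BINS; i++) {
             if (composite[i] > composite[best]) best = i;
         }
         return best;
     }
 }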

Well, any single buffer (multi-dimensional array) treats each combination as a unique situation, but many bots use multiple buffers and layer them like you do here. Usually, each buffer still has multiple segments, but I do have a couple with zero or one segment in parts of Dookious. I think this is a good idea (and like what PEZ was doing in CrowdTargeting), but I think you have so much data at hand with every-tick wave collection in a gun that it seems like you could afford more than one attribute in each buffer. -- Voidious

http://home.comcast.net/~kokyunage/robocode/debug03.jpg

There is a frustrating issue with bullet collisions. When a bullet collides with a tank, it shifts position. When watching battles, you can see it yourself: the explosion on the tank occurs near the point of impact, but not where the bullet seems to have struck the tank's exterior, or even along its path. The bullet's location in the related events matches up with the explosion's center (seen as yellow rings above), but it is not along the bullet path (seen as gray lines above). I was trying to match up the position, heading, and velocity of the bullet with a known enemy firing position, but it wasn't matching up. Eventually I took another route to reach the same end, but it irks me that the event was useless. -- Martin

First thing I would check is that your gun is actually firing where you think it is - are the bullets following your painted lines? -- Skilgannon

  • It's true that robocode does this. But knowing the bullet power and approximate impact location gives you enough information to eliminate all but the correct Wave / EnemyWave --Simonton
  • Correct. I actually use the bullet's getHeadingRadians method and apply it to all known inbound waves' firing positions to see which wave intersects my bot's position at the time of impact (after bullet movement, before tank movement); roughly the matching sketched below this thread. I could use the velocity of the bullet as an extra check, but I don't really need to. At this point the only bullets I can't match up are the ones fired when the opponent hits a wall in the same turn, throwing off the energy drop. -- Martin
  • Thanks for bringing this up =). I would have assumed that this would be correct. This could be what's throwing off my otherwise pixel-perfect wavesurfing. I think I'll file a bug report =) -- Skilgannon
    • Well, if your bot would be hit at the yellow circle but not along the line segment, do you get hit? I think not (and don't see any examples above), but I can't say for certain. If it's just the reported spot that's off, it shouldn't affect your surfing much at all - for most bots, the two spots would resolve to the same GF bin anyway. -- Voidious
      • Every pixel counts =). I found that out with DrussGT. 6 pixels off in my precise prediction was more than 40 points in the Rumble. And I've got 151 bins, so at a distance of 400 that's 2.11 pixels per bin (0.8*400/151). So a few pixels off will definitely change things. -- Skilgannon
        • Something seemed wrong with that calculation. It's 4.22 pixels per bin, because the MEA is in each direction. =) -- Skilgannon
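
A minimal sketch of that wave matching (the EnemyWave fields and the error cutoff are hypothetical; it assumes each stored wave knows its firing position, fire time and bullet power): project each candidate wave's bullet along the reported heading and keep the wave whose bullet would be at, or very near, your position on the turn of impact.

 import java.awt.geom.Point2D;
 import java.util.List;
 import robocode.HitByBulletEvent;
 import robocode.Rules;

 // Hypothetical per-wave bookkeeping.
 class EnemyWave {
     Point2D.Double fireLocation;
     long fireTime;
     double bulletPower;
 }

 class WaveMatcher {
     // Find the stored wave that best explains a reported bullet hit.
     static EnemyWave matchWave(HitByBulletEvent e, List<EnemyWave> waves,
                                Point2D.Double myLocation, long time) {
         EnemyWave best = null;
         double bestError = 50;  // arbitrary cutoff; wall-hit energy drops can still confuse it
         for (EnemyWave w : waves) {
             double speed = Rules.getBulletSpeed(w.bulletPower);
             double traveled = (time - w.fireTime) * speed;  // off-by-one details may need tuning
             // Where this wave's bullet would be, fired along the reported heading.
             double x = w.fireLocation.x + Math.sin(e.getHeadingRadians()) * traveled;
             double y = w.fireLocation.y + Math.cos(e.getHeadingRadians()) * traveled;
             double error = myLocation.distance(x, y);
             if (error < bestError) {
                 bestError = error;
                 best = w;
             }
         }
         return best;
     }
 }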

Thank you for the report at SourceForge. So, if I fix this issue with Robocode, would this break anything with existing robots? --Fnl

  • Well, there is this to consider: if a bullet segment clips a corner off of a tank, or if the bullet segment starts within the tank and exits (because the tank moved into the bullet during the tank movement phase of the previous turn, after bullet collisions are detected), the bullet's ending location will be outside of the tank's body. If that location is used for the explosion, it will look odd. Now that I am thinking about it, I've noticed that a bullet's lifespan lasts a few ticks after the collision and its position follows the tank's; I conclude that this is because the bullet's position is used throughout the course of the explosion graphics. Basically, in order to correct the problem you would have to mark the bullet's initial point of impact on the robot, whether that is on the exterior (bullet begins outside and enters the hit box) or interior (bullet begins inside and passes through the hit box). Because the hit box doesn't rotate or even match the tank's graphics, it will still appear to be an interior hit. It's probably not really worth it to change this, but it should be documented on the Wiki and in the API that a bullet's position after colliding with a tank will not be exact. -- Martin
  • I would think it'd be fairly easy to use a different coordinate for the graphic than the one that gets reported via the RobotHitEvent, wouldn't it? But I do not know if changing what's reported via the event would affect some existing bot. -- Simonton
  • No, unfortunately it is not that easy, as the Bullet class gets its data dynamically from an internal BulletPeer class. The problem is that the bullet is not a snapshot (i.e. a deep copy) taken when it collides. I have changed this with a special test version of Robocode 1.4.5 here. Please try out this version to see if it fixes the problem and report back. :-) --Fnl
  • I'll look at it in the morning. I'm trying to get Tomcat, MySQL, and an old Java web app installed and running. I enjoy programming, but I sure hate setting up the environment. -- Martin
  • Is it possible to work around this (for accurately detecting enemy bullets) by projecting from the fire position with the bullet's heading? The heading of the bullet isn't changed on impact, is it? -- Skilgannon
  • If you know the coordinates, heading, and velocity of the bullet, then it should be quite easy to figure out. When the bullet hits the tank, the only things that are changed for the explosion part are the x and y coordinates and the state of the bullet = "bullet_hit_robot", which means that the bullet is no longer active on the battle field. ;-) --Fnl
  • Ok, thanks. It should be as simple as Point2D.Double realLocation = project(wave.fireLocation, bulletHeading, wave.distanceTraveled); (a minimal project helper is sketched after this thread). -- Skilgannon
  • Yes, exactly. ;-) --Fnl
  • Skilgannon: I take my words back! I found out that a deep copy would not do, as the Bullet would become a new object, which you could not compare to the bullets you fired yourself, e.g. when you hit another robot. Thus, I have changed it so that the rendering x and y coordinates for the explosion part are independent of where the bullet hit a robot or another bullet. I have now made a new version, which I have tested, and it seems to work now. But I should still like you to try it out. A new test/debug version, Robocode 1.4.5 (02), can be found here. I tested it by putting all bullets received in bullet events such as onHitByBullet() into a list, and then painting all the bullets in that list using onPaint().
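
For completeness, the project helper assumed in the snippet above is just the standard polar-projection utility found in many bots (a sketch; names vary from bot to bot):

 import java.awt.geom.Point2D;

 class Projection {
     // Project from a source point along an absolute angle (radians, 0 = north,
     // as in Robocode) for a given distance.
     static Point2D.Double project(Point2D.Double source, double angle, double distance) {
         return new Point2D.Double(source.x + Math.sin(angle) * distance,
                                   source.y + Math.cos(angle) * distance);
     }
 }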

Ugluk is presently undergoing a code transplant. I started a new project, rewrote the physics engine building blocks, and am slowly refurbishing the old Ugluk guts as they are moved to the new body. It is a fairly intimate refresher course in the bot. I'm also culling obsolete functionality (dependencies of features long abandoned) and changing some of the flow. I'm sure Ugluk will need some physical therapy following the procedure, but he should be a lot healthier at the end. -- Martin

Cool! Welcome back. -- Voidious

I've decided to deliberately introduce some imprecision to my targeting waves. I now treat an opponent as a circle with radius ~25.5 rather than a 36x36 square. If at some point my bot is so awesome that switching back to a square is the only practical means of topping himself, hopefully I can retire and leave it at that. -- Martin
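
Under that change, the hit test for a targeting wave reduces to an angular tolerance (a sketch assuming the ~25.5 radius above; not necessarily how Ugluk implements it): a candidate firing angle "hits" if it lies within asin(radius / distance) of the bearing to the target's center.

 import robocode.util.Utils;

 class CircularHitTest {
     static final double TARGET_RADIUS = 25.5;

     // Does a candidate firing angle fall inside the target disc's angular width?
     static boolean hits(double angleToCenter, double candidateAngle, double distance) {
         if (distance <= TARGET_RADIUS) return true;   // overlapping, always a hit
         double halfWidth = Math.asin(TARGET_RADIUS / distance);
         double offset = Math.abs(Utils.normalRelativeAngle(candidateAngle - angleToCenter));
         return offset <= halfWidth;
     }
 }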