Talk:RougeDC

Hey, did you copy this from Template:Bot Page, or are you using subst:? » Nat | Talk » 06:52, 19 May 2009 (UTC)

  • Using subst:. I thought I'd try it, didn't get around to cleaning up after though :) --Rednaxela 07:15, 19 May 2009 (UTC)

Bad Results (willow)

Version "willow" may be throwing exceptions or possibly running out of memory. In addition to the battle you pointed out on the RR server talk page, there are other examples: [1], [2] (Simonton's client), [3] (Voidious), [4] (Rednaxela, no opponent bullet dmg). --Darkcanuck 21:12, 24 August 2009 (UTC)

It's certainly more memory efficient than the old version. Throwing exceptions is a possibility, but I don't see how and cannot reproduce this at all... If anyone sees it throw an exception it would be nice to hear about it... =\ --Rednaxela 21:23, 24 August 2009 (UTC)

I see this in my RoboRumble client:

Fighting battle 7 ... ags.rougedc.RougeDC willow,voidious.Dookious 1.573c
Preparing battle...
Exception in thread "Battle Thread" java.lang.OutOfMemoryError: Java heap space
	at java.io.ObjectOutputStream$BlockDataOutputStream.<init>(ObjectOutputStream.java:1711)
	at java.io.ObjectOutputStream.<init>(ObjectOutputStream.java:221)
	at robocode.util.ObjectCloner.deepCopy(Unknown Source)
	at robocode.robotpaint.Graphics2DProxy.copyOf(Unknown Source)
	at robocode.robotpaint.Graphics2DProxy.create(Unknown Source)
	at robocode.battle.snapshot.RobotSnapshot.<init>(Unknown Source)
	at robocode.battle.snapshot.TurnSnapshot.<init>(Unknown Source)
	at robocode.battle.Battle.finalizeTurn(Unknown Source)
	at robocode.battle.Battle.runTurn(Unknown Source)
	at robocode.battle.Battle.runRound(Unknown Source)
	at robocode.battle.Battle.run(Unknown Source)
	at java.lang.Thread.run(Thread.java:619)

Dookious uses its fair share of memory, so I can't squarely place the blame on RougeDC, but I don't recall ever seeing/hearing about this for Dooki before. My client is set to 512 MB. I'll try running some battles manually and hunting for exceptions later tonight. --Voidious 21:31, 24 August 2009 (UTC)

You don't think it could be something to do with your new tree? Maybe add something to the benchmark which monitors the maximum memory consumption of the trees as well... --Skilgannon 14:52, 25 August 2009 (UTC)

Monitoring the maximum memory usage would be difficult to do, I think... however, running the benchmark in a profiler earlier indicated to me that its memory usage was just as small as that of the Simonton and Voidious trees. There is only one way I can think of that RougeDC uses more memory than before: it has a 'time' segment, and its ever-increasing nature would likely cause any kd-tree to become ridiculously unbalanced given enough time. I'm quite doubtful that is the issue though, since this didn't occur with the old kd-tree, which branched in a very similar way and was far less efficient in many ways. --Rednaxela 15:09, 25 August 2009 (UTC)
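
(For reference, one rough way to approximate the trees' memory footprint inside a benchmark. This is only a sketch with hypothetical names, not part of any actual benchmark code, and since gc() is just a hint to the JVM the numbers are estimates rather than exact measurements:)

    // Rough estimate of bytes retained by a batch of tree insertions.
    // gc() is only a request, so treat the result as approximate.
    public final class HeapEstimate {
        private static long usedHeap() {
            Runtime rt = Runtime.getRuntime();
            rt.gc();
            return rt.totalMemory() - rt.freeMemory();
        }

        public static long bytesRetainedBy(Runnable insertBatch) {
            long before = usedHeap();
            insertBatch.run();           // e.g. add all sample points to the tree
            return usedHeap() - before;  // approximate growth caused by the batch
        }
    }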

(double edit conflicts) My last 100-repetition test ran with the default heap space (i.e. no -Xmx), so the usage should be under 128M. Red, I think you should record the search/add requests from RougeDC vs. Dookious and run the benchmark again... » Nat | Talk » 15:13, 25 August 2009 (UTC)
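
(A minimal sketch of what that kind of instrumentation could look like; the class and hook points are assumptions, not RougeDC's or the benchmark's actual code:)

    // Hypothetical counters for comparing tree workloads between bots.
    public final class TreeStats {
        public static long adds;
        public static long searches;

        public static void reset() {
            adds = 0;
            searches = 0;
        }
    }
    // Increment TreeStats.adds inside the tree's add/insert call and
    // TreeStats.searches inside its nearest-neighbour search, then print
    // both totals at the end of each battle to compare the two bots' loads.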

Hmm... well... running RougeDC vs Dookious and running a Java memory profiler indicates that the Kd-Tree is not a significant memory user in the slightest, and that the big memory users are some other things in Dookious and RougeDC that have never changed. So... 1) I can't duplicate running out of memory at 512MB like Voidious had, and 2) even if it were running out of memory, it wouldn't be due to the new Kd-Tree. So unless these bad results were flukes that could have just as easily happened to the old RougeDC, something is very very weird... --Rednaxela 06:47, 28 August 2009 (UTC)

Unless there is a bug in the new Kd-Tree which makes it consume crazy amounts of memory only when it is actually triggered; in that case, unless the profiled run hits it, it wouldn't show in the profile. I haven't seen, run or otherwise tested the new tree; I'm just trying to come up with a possible way the problem is the new tree without conflicting with your profiler results. Like the bug you mentioned once, hitting an infinite loop when there were many points at the same location: you could have profiled that version and (un)luckily gotten results showing the tree was very fast, while it was still a source of skipping turns forever. Anyway, hope you find the problem soon. --zyx 08:42, 28 August 2009 (UTC)
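
(To illustrate the kind of latent bug being described: a bucket kd-tree that tries to split on a dimension where every point has the same value makes no progress, and a naive implementation can loop or recurse until memory runs out. A hedged sketch of a guard, using an illustrative data layout rather than the tree's real one:)

    // Split guard sketch: refuse to split a bucket whose points are all equal
    // on the chosen dimension, since the split would separate nothing.
    private static boolean splitMakesProgress(double[][] bucket, int dim) {
        double first = bucket[0][dim];
        for (double[] point : bucket) {
            if (point[dim] != first) {
                return true;   // at least two distinct values, splitting helps
            }
        }
        return false;          // degenerate bucket: better to leave it oversized
    }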

Hit another error... (Same stack trace, so feel free to delete it.)

Fighting battle 19 ... darkcanuck.Holden 1.12a,ags.rougedc.RougeDC willow
Preparing battle...
Exception in thread "Battle Thread" java.lang.OutOfMemoryError: Java heap space
	at java.io.ObjectOutputStream$BlockDataOutputStream.<init>(ObjectOutputStream.java:1711)
	at java.io.ObjectOutputStream.<init>(ObjectOutputStream.java:221)
	at robocode.util.ObjectCloner.deepCopy(Unknown Source)
	at robocode.robotpaint.Graphics2DProxy.copyOf(Unknown Source)
	at robocode.robotpaint.Graphics2DProxy.create(Unknown Source)
	at robocode.battle.snapshot.RobotSnapshot.<init>(Unknown Source)
	at robocode.battle.snapshot.TurnSnapshot.<init>(Unknown Source)
	at robocode.battle.Battle.finalizeTurn(Unknown Source)
	at robocode.battle.Battle.runTurn(Unknown Source)
	at robocode.battle.Battle.runRound(Unknown Source)
	at robocode.battle.Battle.run(Unknown Source)
	at java.lang.Thread.run(Thread.java:619)

This instance of Java is indeed at 576 megs (via Activity Monitor). Maybe there is some obscure memory leak in Robocode that your new tree is exposing? Are you doing your testing with 1.6.1.4? Maybe try some RoboResearch / 1.6.1.4 long runs and see if you hit it? --Voidious 21:44, 28 August 2009 (UTC)
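
(For what it's worth, the heap ceiling a client is actually running with can be printed from inside the JVM; Activity Monitor shows the whole process, which also includes non-heap memory such as permgen, thread stacks and native allocations, so it will always read higher than the -Xmx setting:)

    // Prints the JVM's configured heap limit in megabytes.
    long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
    System.out.println("Max heap: " + maxHeapMb + " MB");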

I've also been experimenting with Rednaxela's new tree and got this memory leak (more of an explosion) problem. In the first few ticks of a battle Robocode freezes with an out-of-heap-memory error. I traced it down to one of my dimensions having some infinity values. I also saw an array out of bounds error at line 177, probably also related to strange values in some of my dimensions; I'll see if I can figure out what's going on. --ABC 23:58, 28 August 2009 (UTC)
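
(A hedged sketch of the sort of guard that would catch this before the point ever reaches the tree; the method name and addPoint-style usage are assumptions for illustration, not the tree's actual interface:)

    // Reject points containing NaN or infinite coordinates; such values can
    // break split and distance computations and lead to runaway tree growth.
    private static boolean isFinitePoint(double[] coords) {
        for (double c : coords) {
            if (Double.isNaN(c) || Double.isInfinite(c)) {
                return false;
            }
        }
        return true;
    }

    // usage (hypothetical API): if (isFinitePoint(coords)) tree.addPoint(coords, data);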