Effectiveness

Revision as of 23 September 2012 at 15:14.


While I have only tested a little, I think this has lots of room for improvement in robots. I tried using the simplistic "bandwidth = rawHitPercentage", capped at upper and lower bounds, and it leads to less than 60 bullet damage from HawkOnFire over 1000 rounds. I wonder what bugs I have in my surfing...
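
A concrete sketch of that simplistic scheme (the class name, method, and cap values are illustrative assumptions, not code from an actual bot):

```java
// Sketch of "bandwidth = rawHitPercentage" capped at assumed bounds.
// All names and the cap values here are illustrative, not from a real bot.
public class SimpleBandwidth {
    static final double MIN_BANDWIDTH = 0.02; // assumed lower cap, in guess-factor units
    static final double MAX_BANDWIDTH = 0.20; // assumed upper cap, in guess-factor units

    static double bandwidth(int shotsLogged, int hitsLogged) {
        if (shotsLogged == 0) {
            return MAX_BANDWIDTH; // no data yet: stay maximally cautious
        }
        double rawHitPercentage = (double) hitsLogged / shotsLogged;
        // Clamp the raw hit rate into [MIN_BANDWIDTH, MAX_BANDWIDTH].
        return Math.max(MIN_BANDWIDTH, Math.min(MAX_BANDWIDTH, rawHitPercentage));
    }
}
```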

    AW 22:37, 22 September 2012

    Hmm, interesting. With regards to the "wavesurfing without explicit dive protection" part though, I feel that Waves/Precise_Intersection covers that perfectly well, and no additional "dynamic bandwidth" is needed for that purpose. Really, the only reason I can think of for any kind of "dynamic bandwidth" besides that provided by precise intersection, is for the case of small data set size near the start of a round, where the uncertainty is greater.

      Rednaxela 06:35, 23 September 2012

      I'm not sure about precise intersection negating dive protection, because the precise intersection only helps determine what part of the wave you would be hit by, not how dangerous that part of the wave may be.

      I agree that variable bandwidth isn't a replacement for some sort of dive protection, though, because they really solve different problems - variable bandwidth doesn't take into account the reduced reaction time to enemy bullets, escaping rambots etc. But I think that variable bandwidth certainly has applications. Particularly with multi-modal distributions on the wave, the relative danger of particular locations can change considerably if the bandwidth is changed - what was considered to be a safe point between two logged GF hits with a small bandwidth might be considered more dangerous than either of the peaks with a large bandwidth.
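
      To illustrate that last point: with hits logged at GF -0.5 and +0.5, an unnormalized Gaussian kernel density estimate rates the midpoint between them as the safest spot under a narrow bandwidth, but as more dangerous than either peak under a wide one (a sketch only; the kernel choice and numbers are illustrative):

```java
// Unnormalized Gaussian kernel density estimate over logged hit GFs.
// Illustrative sketch only; a real surfer would also weight and normalize waves.
public class BandwidthEffect {
    static double danger(double gf, double[] hitGFs, double bandwidth) {
        double sum = 0;
        for (double hit : hitGFs) {
            double z = (gf - hit) / bandwidth;
            sum += Math.exp(-0.5 * z * z); // Gaussian kernel, unnormalized
        }
        return sum;
    }
}
```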

        Skilgannon 11:12, 23 September 2012

        Er, precise intersection most certainly handles "how dangerous that part of the wave may be", at least if you take full advantage of the data it returns, to integrate the danger over the full range it returns.

          Rednaxela 15:10, 23 September 2012

          The actual precise intersection just gives the angle range of the wave which could contain a bullet and hit you. It's the stats system which says how likely a bullet is to be within this range, and the danger function which adds other information such as wave weighting.

          Precise intersection isn't any better at dive protection than just dividing by distance-to-enemy in your danger function. What it's better at is dodging bullets in tight situations where you're reasonably sure where the bullet is =)

            Skilgannon 15:44, 23 September 2012

            In my view, the wider range of angles at which bullets may hit you is the only real "current danger" to include in the danger function. Now, it may be appropriate to also add a "future danger" for how a closer distance may affect waves not yet in existence, but that should be:

            1. additive on top of the normal danger (because it's not additional risk from current waves; it's a risk due to being close during later waves)
            2. based on the distance-to-enemy at the end of the last wave currently being surfed (all of the future risk comes after that point in time)

            Dividing by distance-to-enemy is really kind of a fudge factor that IMO doesn't reflect the real dynamics of the situation very well.
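
            A sketch of that additive structure (the 1/distance form of the future term and the weight constant are assumptions for illustration, not a claim about how it should be tuned):

```java
// Sketch: keep "current danger" and "future danger" as separate additive terms.
// The 1/distance model and FUTURE_WEIGHT are illustrative assumptions.
public class DangerModel {
    static final double FUTURE_WEIGHT = 50.0; // tuning constant, illustrative

    // currentWaveDanger: integrated stats danger over the precise intersection
    // ranges of the waves currently being surfed (computed elsewhere).
    // distanceAtLastWaveEnd: predicted distance to the enemy when the last
    // currently-surfed wave passes, since all future risk comes after that.
    static double totalDanger(double currentWaveDanger, double distanceAtLastWaveEnd) {
        double futureDanger = FUTURE_WEIGHT / distanceAtLastWaveEnd;
        return currentWaveDanger + futureDanger; // additive, not a divisive fudge
    }
}
```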

              Rednaxela 16:13, 23 September 2012

              I tried something like this for gun, but it didn't really work out. I ended up getting much better results using a square kernel. From Wikipedia, if you assume that your distribution is Gaussian then the optimal bandwidth would be:

              <math>h = \left(\frac{4\hat{\sigma}^5}{3n}\right)^{\frac{1}{5}} \approx 1.06 \hat{\sigma} n^{-1/5}</math>, where <math>\hat{\sigma}</math> is the standard deviation of the samples and <math>n</math> is the number of samples.

              Perhaps this would work well for movement, where there is much less data to work with. It might also be necessary to add some sanity checks to the calculated h value in case there is only 1 or 2 samples, etc.
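
              A sketch of that rule of thumb with such sanity checks (the bounds and the small-n fallback are assumed values):

```java
// Silverman's rule-of-thumb bandwidth, h = 1.06 * sigma * n^(-1/5),
// with assumed sanity bounds and a fallback for tiny sample counts.
public class SilvermanBandwidth {
    static final double MIN_H = 0.02; // assumed lower bound (GF units)
    static final double MAX_H = 0.50; // assumed upper bound (GF units)

    static double bandwidth(double[] samples) {
        int n = samples.length;
        if (n < 3) {
            return MAX_H; // too few samples to estimate sigma meaningfully
        }
        double mean = 0;
        for (double s : samples) mean += s;
        mean /= n;
        double var = 0;
        for (double s : samples) var += (s - mean) * (s - mean);
        double sigma = Math.sqrt(var / (n - 1)); // sample standard deviation
        double h = 1.06 * sigma * Math.pow(n, -0.2);
        return Math.max(MIN_H, Math.min(MAX_H, h));
    }
}
```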

              Of course, I'm fairly sure our distributions are not at all Gaussian, or even uni-modal, so this formula might not be relevant at all.

                Skilgannon 11:00, 23 September 2012

                I've considered something like this before, but when you have fewer than 20 samples or so, your estimate of the standard deviation itself is going to have a large amount of uncertainty. I suspect one would need to determine the "typical" standard deviation for most bots, use that as the initial value, and slowly transition to a value calculated from the data.
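
                One way to sketch that transition (the prior value and blend weight are guesses, not measured "typical" values):

```java
// Shrink the measured standard deviation toward an assumed "typical bot"
// prior when the sample count is small, transitioning smoothly to the
// data-driven estimate as samples accumulate. Constants are illustrative.
public class BlendedSigma {
    static final double TYPICAL_SIGMA = 0.2;  // assumed prior (GF units)
    static final double BLEND_SAMPLES = 20.0; // weight of the prior, in samples

    static double blendedSigma(double measuredSigma, int n) {
        double w = n / (n + BLEND_SAMPLES); // 0 with no data, approaches 1 with lots
        return w * measuredSigma + (1 - w) * TYPICAL_SIGMA;
    }
}
```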

                Regarding the distribution not being Gaussian, indeed it wouldn't be... but I think that formula may still be roughly applicable if we apply a correction factor for how the uncertainty shrinks near the maximum escape angle, perhaps modeled on the behavior of the binomial distribution near the edges.

                  Rednaxela 15:26, 23 September 2012

                  I actually think this formula might not adapt to multimodal distributions very well at all, because it will try to adapt the bandwidth to fit to one big, centered danger, which may not be a reasonable assumption to make, considering things like segmented VCS buffers and different ways of measuring similar attributes (vel+offset vs latvel+advvel comes to mind) in the gun systems we're trying to dodge.

                  Maybe clustering the logs into different groups based on their GFs, then using this formula on each of those, with each set of logs having its own bandwidth for the final kernel density estimate would be more effective.
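
                  A minimal sketch of that clustering step, splitting logged GFs wherever there is a large gap (the gap threshold is an arbitrary assumption; real clustering could be fancier):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Split sorted hit GFs into clusters wherever consecutive values are far
// apart; each cluster could then get its own bandwidth (e.g. via the
// rule-of-thumb formula discussed earlier). The GAP threshold is an assumed value.
public class ClusteredBandwidth {
    static final double GAP = 0.25; // assumed minimum gap between clusters (GF units)

    static List<double[]> cluster(double[] gfs) {
        double[] sorted = gfs.clone();
        Arrays.sort(sorted);
        List<double[]> clusters = new ArrayList<>();
        int start = 0;
        for (int i = 1; i <= sorted.length; i++) {
            if (i == sorted.length || sorted[i] - sorted[i - 1] > GAP) {
                clusters.add(Arrays.copyOfRange(sorted, start, i));
                start = i;
            }
        }
        return clusters;
    }
}
```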

                    Skilgannon 15:58, 23 September 2012

                    Hmm... at least with sufficient data I'd agree that applying bandwidth estimation separately to different clusters would make sense. The cases where the appropriate bandwidth changes most significantly would be when there's limited data though (each additional data point doesn't change uncertainty as much once there are already many data points).

                      Rednaxela 16:05, 23 September 2012