Great and very detailed article!
Algorithmic optimisation will pretty much always outperform low-level optimisations such as optimising memory access for cache misses.
You really shouldn't need to worry about cache misses in Java!
Optimising from O(n) to O(log n) will give a big performance benefit!
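Just to illustrate the point with a throwaway sketch (my own example, not anything from the article): a linear scan of a sorted array costs about n comparisons, while a binary search costs about log2(n).

    // Hypothetical illustration: finding the insertion point for a value
    // in a sorted array, O(n) vs O(log n).
    import java.util.Arrays;

    public class LookupSketch {
        // O(n): walk the array until we pass the target.
        static int linearIndex(double[] sorted, double target) {
            int i = 0;
            while (i < sorted.length && sorted[i] < target) {
                i++;
            }
            return i;
        }

        // O(log n): binary search halves the range each step.
        static int binaryIndex(double[] sorted, double target) {
            int idx = Arrays.binarySearch(sorted, target);
            return idx >= 0 ? idx : -idx - 1; // insertion point if not found
        }

        public static void main(String[] args) {
            double[] sorted = {0.1, 0.4, 0.9, 1.7, 2.5, 3.0};
            System.out.println(linearIndex(sorted, 1.0)); // 3
            System.out.println(binaryIndex(sorted, 1.0)); // 3
        }
    }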
Yes, I agree with that. It just happens that in a scenario where you get on average one scan every 4-5 ticks, and the average BFT is 50 or less, even the theoretical improvement becomes negligible. But I'm a guy who likes to have the worst case situations nicely covered :P
The interesting question for me is: "does this make my bot run faster?"
I don't have that answer; I only know that this helps me not skip turns because of odd worst-case situations.
Well, I think the worst case is not about BFT, but about the entire round time. BFT is too small to make you skip a turn, but a bug most bot authors make could make the worst case round-long.
The catch is: how do you handle data from a different round?
Yeah, it is; my gun was pretty slow in melee. I'm not sure I got what you mean, though. Can you clarify?
e.g. you store the information of the next round right after the first round, and when the scans of the first round aren't enough to get a hit, you continue searching scans from the next round, going from time = 0 up to time = movie start time + BFT.
If you store time as global time, this only gives slightly inaccurate results, which can be smoothed out by the KDE. But if you store round time, it causes the data of the entire round to be iterated.
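To make sure we're talking about the same thing, here's a minimal sketch of the two indexing choices (the Scan fields and method names are just mine for illustration, not anyone's actual gun code):

    // Hypothetical sketch of the round-boundary issue when replaying a
    // recorded "movie" of scans forward by one bullet flight time (BFT).
    import java.util.ArrayList;
    import java.util.List;

    class Scan {
        long globalTime;  // ticks since the battle started
        long roundTime;   // ticks since the current round started
        double bearing;   // whatever data the gun actually needs
    }

    public class MovieReplay {
        // Global time: the window is always [start, start + bft], so at most
        // ~bft scans are touched even when the movie crosses a round boundary.
        static List<Scan> windowByGlobalTime(List<Scan> scans, long movieStart, long bft) {
            List<Scan> window = new ArrayList<>();
            for (Scan s : scans) {
                if (s.globalTime >= movieStart && s.globalTime <= movieStart + bft) {
                    window.add(s);
                }
            }
            return window;
        }

        // Round time: once the movie spills into the next round, "continue from
        // time = 0 up to movieStart + bft" matches nearly every scan of that
        // round when the movie started late, so the whole round gets iterated.
        static List<Scan> windowByRoundTime(List<Scan> nextRoundScans, long movieStartRoundTime, long bft) {
            List<Scan> window = new ArrayList<>();
            for (Scan s : nextRoundScans) {
                if (s.roundTime >= 0 && s.roundTime <= movieStartRoundTime + bft) {
                    window.add(s);
                }
            }
            return window;
        }
    }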
Of course it is not only about the BFT time, we still have a kd-tree and the other components, but when we are talking about milliseconds it helps a lot.
Well, that's only true for large enough n... For small n, such as in our case, the constant factor is dominant.
Btw, memory access is WAAAY more expensive than basic calculations, so for small n the gain from optimised memory access often outperforms textbook algorithms that don't access contiguous memory in order.
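To make that concrete, here's a throwaway sketch of my own (not a rigorous benchmark; JIT warm-up and GC make Java micro-benchmarks tricky) comparing a walk over a contiguous primitive array with a walk over a LinkedList of boxed Doubles:

    // Rough illustration only: the array walk is sequential and
    // prefetch-friendly; the list walk chases pointers and unboxes.
    import java.util.LinkedList;

    public class LocalityDemo {
        public static void main(String[] args) {
            int n = 1_000_000;
            double[] contiguous = new double[n];
            LinkedList<Double> scattered = new LinkedList<>();
            for (int i = 0; i < n; i++) {
                contiguous[i] = i;
                scattered.add((double) i);
            }

            long t0 = System.nanoTime();
            double sumA = 0;
            for (int i = 0; i < n; i++) {
                sumA += contiguous[i];   // contiguous memory, cache-friendly
            }
            long t1 = System.nanoTime();
            double sumB = 0;
            for (double v : scattered) {
                sumB += v;               // node-by-node pointer chasing
            }
            long t2 = System.nanoTime();

            System.out.printf("array: %.1f ms, list: %.1f ms (sums %.0f / %.0f)%n",
                    (t1 - t0) / 1e6, (t2 - t1) / 1e6, sumA, sumB);
        }
    }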