scale standard deviations to 1
Fragment of a discussion from Talk:Firestarter/Version History
Well, that's the easiest way I can think of... And it really surprised me that the standard deviation is almost the same for most bots. If so, then I think normalising the standard deviation may not be that useful, since we already tune the weights very carefully by hand. But for neural networks I think it's invaluable: there we don't tune the weights manually, and the scale of the inputs affects the learning speed a lot.
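
For concreteness, here's roughly what I mean by scaling to unit standard deviation, just a sketch and not Firestarter's actual code (the class and method names below are made up): keep running sums per input feature, then subtract the mean and divide by the estimated standard deviation before the value goes into the network.

import java.util.Arrays;

public class FeatureNormalizer {
    private final double[] sum;
    private final double[] sumSq;
    private long count = 0;

    public FeatureNormalizer(int numFeatures) {
        sum = new double[numFeatures];
        sumSq = new double[numFeatures];
    }

    // Accumulate running statistics from one observed feature vector.
    public void observe(double[] features) {
        for (int i = 0; i < features.length; i++) {
            sum[i] += features[i];
            sumSq[i] += features[i] * features[i];
        }
        count++;
    }

    // Return a copy of the features with mean subtracted and divided by the
    // estimated standard deviation, so each input dimension has roughly unit variance.
    public double[] normalize(double[] features) {
        double[] out = Arrays.copyOf(features, features.length);
        for (int i = 0; i < out.length; i++) {
            double mean = sum[i] / count;
            double variance = sumSq[i] / count - mean * mean;
            double stdDev = Math.sqrt(Math.max(variance, 1e-12)); // guard against zero variance
            out[i] = (out[i] - mean) / stdDev;
        }
        return out;
    }
}

For a hand-tuned gun the division just gets folded into the weights anyway, which is why it may not buy much there; for a network trained by gradient descent it keeps all inputs on a comparable scale, so no single weight needs a wildly different learning rate.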