Neural Networks and Genetic Algorithms
Source: Eralp
Hello, I am trying to create a very basic evolution simulation, but there are a few things I don't understand clearly.
I am creating "worms" and "mushrooms" and putting them into my virtual world. At each frame I give every worm the coordinates of the nearest mushroom, and the worm updates its direction and position accordingly.
So the input to each worm is 4 doubles: the nearest mushroom's x and y coordinates and the worm's own x and y coordinates. The ideal behaviour would be to pick the direction with the atan2 function applied to the coordinate differences (y2 - y1, x2 - x1), so in essence I am trying to teach my neural network to compute the atan2 function.
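For reference, this is the target function I want the network to approximate (a minimal C# sketch; the variable and method names are just illustrative):

```csharp
using System;

static class DirectionHelper
{
    // Ground-truth direction from the worm toward the mushroom, mapped into [0, 2*pi).
    public static double TargetDirection(double wormX, double wormY,
                                         double mushX, double mushY)
    {
        // Math.Atan2 takes (y, x) and returns a value in (-pi, pi].
        double angle = Math.Atan2(mushY - wormY, mushX - wormX);
        if (angle < 0.0)
            angle += 2.0 * Math.PI; // shift into [0, 2*pi)
        return angle;
    }
}
```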
I am using NeuronDotNet in C#, but I don't know how much of this is common across NN libraries, since this is the first one I have used.
I created a LinearLayer as the input layer and sigmoid layers for the hidden and output layers. The input has 4 neurons and the output has 1 neuron; these are fixed (I guess). I experimented with 1, 2, and 3 neurons in the hidden layer, but I couldn't spot a big difference.
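In code, the setup looks roughly like this (a sketch based on the NeuronDotNet samples; I am not certain the class names and calls match every version of the library):

```csharp
using NeuronDotNet.Core;
using NeuronDotNet.Core.Backpropagation;

// 4 linear input neurons, a small sigmoid hidden layer, 1 sigmoid output neuron.
LinearLayer inputLayer = new LinearLayer(4);
SigmoidLayer hiddenLayer = new SigmoidLayer(3);
SigmoidLayer outputLayer = new SigmoidLayer(1);

// Fully connect the layers.
new BackpropagationConnector(inputLayer, hiddenLayer);
new BackpropagationConnector(hiddenLayer, outputLayer);

BackpropagationNetwork network = new BackpropagationNetwork(inputLayer, outputLayer);
network.Initialize(); // sets the initial weights and biases
```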
So my question is: would it be more successful if I decreased the input neurons to 2 and gave the difference pair (x2 - x1, y2 - y1) as input, since that is all the atan2 function needs? I assume, though, that no matter what, after training my NN should be able to discover the relation between those pairs on its own.
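As a sketch, the two input encodings I am comparing would look like this (the positions are made up for illustration):

```csharp
// Illustrative positions, not from my actual simulation.
double wormX = 10.0, wormY = 5.0;
double mushX = 3.0, mushY = 8.0;

// 4-input encoding: raw coordinates; the network must discover the subtraction itself.
double[] rawInputs = { mushX, mushY, wormX, wormY };

// 2-input encoding: feed the differences that atan2 actually needs.
double[] diffInputs = { mushX - wormX, mushY - wormY };
```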
Also, how can I decide which type of layer I should be using? The direction is between 0 and 2*pi, so I assumed a sigmoid output layer would work and that multiplying the output by 2*pi would do the job. However, linear layers are giving better results, strangely (or expectedly?).
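To be concrete, this is the scaling I mean (a sketch; ToDirection is just an illustrative name):

```csharp
using System;

static class OutputScaling
{
    // A sigmoid output neuron yields a value in (0, 1);
    // stretch it to cover the full circle of directions.
    public static double ToDirection(double sigmoidOutput)
    {
        return sigmoidOutput * 2.0 * Math.PI; // now in (0, 2*pi)
    }
}
```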
About the genetic algorithm part: I first randomly assign weights to all neurons, but I couldn't find anything in the documentation about the range of the weight and bias values. Should I keep them within a unit range? So far I have been assigning values between -20 and +20, but every time I test a random chromosome it gives me an output either very close to 0 or very close to 1. Is that because of the range I use, or is it just very likely for a sigmoid output to be that "thin" with random weights?
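To show what I mean by "thin", here is a small stand-alone sketch (plain C#, not NeuronDotNet) of a single sigmoid neuron with random weights; the inputs are assumed to be scaled to roughly [0, 1], and all names and numbers are made up:

```csharp
using System;

static class SaturationDemo
{
    static readonly Random Rng = new Random();

    static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

    // One sigmoid neuron with weights and bias drawn uniformly from [-range, +range].
    static double RandomNeuronOutput(double[] inputs, double range)
    {
        double sum = (Rng.NextDouble() * 2.0 - 1.0) * range; // bias
        foreach (double x in inputs)
            sum += x * (Rng.NextDouble() * 2.0 - 1.0) * range;
        return Sigmoid(sum);
    }

    static void Main()
    {
        double[] inputs = { 0.7, 0.2, 0.4, 0.9 }; // coordinates scaled to [0, 1]

        for (int i = 0; i < 5; i++)
        {
            // With +-20 weights the weighted sum is large, so the sigmoid
            // saturates and the output lands essentially at 0 or 1.
            double wide = RandomNeuronOutput(inputs, 20.0);
            // With +-1 weights the sum stays moderate and the output varies more.
            double narrow = RandomNeuronOutput(inputs, 1.0);
            Console.WriteLine($"range 20: {wide:F6}   range 1: {narrow:F6}");
        }
    }
}
```

With unscaled screen coordinates as inputs, the weighted sums get even larger, which pushes the sigmoid further into saturation.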