# Neat-Ex
This project provides the means to define, simulate, and serialize artificial neural networks (ANNs), as well as the means to evolve them through use of the NeuroEvolution of Augmenting Topologies ([NEAT](http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf)) algorithm created by Dr. Kenneth Stanley.
Unlike backpropagation, neuroevolution works with recurrent neural networks as easily as feed-forward ones, and with fitness functions instead of just training data. Additionally, since NEAT augments topologies as it evolves, all the engine needs to start is the input/output layout and a fitness function.
## Installation
This project only requires [Elixir](http://elixir-lang.org/).
### Add as a dependency
Add `{:neat_ex, "~> 1.0.0"}` to your list of dependencies in `mix.exs` (as shown below), then run `mix deps.get`.
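For reference, the dependency goes in the `deps` list of `mix.exs`; a minimal sketch (the surrounding `deps/0` function is the standard Mix convention):

```elixir
defp deps do
  [
    {:neat_ex, "~> 1.0.0"}
  ]
end
```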
### Or clone a copy
To clone and install, do:
```
git clone https://gitlab.com/onnoowl/Neat-Ex.git
cd Neat-Ex
mix deps.get
```
## Documentation
The latest documentation can be found at http://hexdocs.pm/neat_ex/index.html. For example usage, see below.
## Example usage
Here's a simple example that shows how to set up an evolution that evolves neural networks to act like binary XORs, where -1s stand in for 0s (and 1s are still 1s). The expected behavior is listed in `dataset`, and neural networks are assigned a fitness based on how closely they match it. After 50 or so generations, or about 10 seconds of computation, the networks exhibit the expected behavior.
```elixir
dataset = [{{-1, -1}, -1}, {{1, -1}, 1}, {{-1, 1}, 1}, {{1, 1}, -1}] # {{in1, in2}, output} -> the expected behavior

fitness = fn ann ->
  sim = Ann.Simulation.new(ann)
  error = Enum.reduce(dataset, 0, fn {{in1, in2}, out}, error ->
    # Node 3 is a "bias node", always fed 1.0; node 4 is the output.
    result = Dict.get(Ann.Simulation.eval(sim, [{1, in1}, {2, in2}, {3, 1.0}]).data, 4, 0)
    error + abs(result - out)
  end)
  :math.pow(8 - error, 2) # max total error is 8, so max fitness is (8 - 0)^2 = 64
end

# Make a new network with inputs [1, 2, 3] and outputs [4].
neat = Neat.new_single_fitness(Ann.new([1, 2, 3], [4]), fitness)

# Then evolve it until it reaches fitness level 63 (this fitness function's max is 64).
{ann, fitness} = Neat.evolveUntil(neat, 63).best

IO.puts Ann.json(ann) # display a JSON representation of the ANN
```
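If you want to keep the evolved network around, the JSON string can be written straight to disk (a minimal sketch; `File.write!/2` is standard Elixir, and the file name is just an example):

```elixir
# Persist the JSON representation of the best network for later use.
File.write!("xor_ann.json", Ann.json(ann))
```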
### XOR
```
mix xor.single
```
This command runs the sample XOR code, evolving a neural network to act as an XOR logic gate. The resulting network can be viewed by running `./render xor` and then opening `xor.png`.
### FishSim
```
mix fishsim [display_every] [minutes_to_run]
```
This evolves neural networks to act like fish that run away from a shark. Fitness is based on how long a fish survives. The simulation is displayed as ASCII art, where the `@` sign is the shark and each digit represents the relative concentration of fish at that location (higher numbers mean more fish).
The simulation prints only every `display_every` generations (default 1, meaning every generation). Setting it to 5, for example, evolves 5 generations between displays, which is far faster.
The evolution lasts `minutes_to_run` minutes (default is 60).
```
mix fishsim [display_every] [minutes_to_run] [file_to_record_to]
```
If a file name is included, the simulation records visualization data to that file instead of displaying ASCII art; `display_every` comes in handy for limiting the size of the recording. To view the recording after it's made, use Jonathan's project found [here](https://gitlab.com/Zanthos/FishSimVisualAid), passing the file as the first argument.
When the process finishes, you can view the best fish's network by running `./render bestFish` and then opening `bestFish.png`.
## Testing (for contributors)
```
mix test
```