# macula-tweann
**Topology and Weight Evolving Artificial Neural Networks for Erlang**
[Hex package](https://hex.pm/packages/macula_tweann) · [Documentation](https://hexdocs.pm/macula_tweann/) · [License](https://github.com/macula-io/macula-tweann/blob/main/LICENSE)
A TWEANN library that evolves both network topology and synaptic weights, now with **Liquid Time-Constant (LTC) neurons** for adaptive temporal processing. Based on DXNN2 by Gene Sher.
## Highlights
- **First TWEANN library with LTC neurons** in Erlang/OTP
- **CfC closed-form approximation** - ~100x faster than ODE-based LTC
- **Hybrid networks** - Mix standard and LTC neurons in the same network
- **Production ready** - Comprehensive logging, error handling, and process safety
## Quick Start
Add the dependency to your `rebar.config`:

```erlang
{deps, [{macula_tweann, "~> 0.10.0"}]}.
```

Then create and evolve a network:

```erlang
%% Create and evolve a standard network
genotype:init_db(),
Constraint = #constraint{morphology = xor_mimic},
{ok, AgentId} = genotype:construct_agent(Constraint),
genome_mutator:mutate(AgentId).

%% Use LTC dynamics directly
{NewState, Output} = ltc_dynamics:evaluate_cfc(Input, State, Tau, Bound).
## LTC Neurons
Liquid Time-Constant neurons enable **adaptive temporal processing** with input-dependent time constants:

```erlang
%% CfC evaluation (fast, closed-form)
{State1, _} = ltc_dynamics:evaluate_cfc(1.0, 0.0, 1.0, 1.0),
{State2, _} = ltc_dynamics:evaluate_cfc(1.0, State1, 1.0, 1.0).
%% State persists between evaluations - temporal memory!
```
Key equations:
- **LTC ODE**: `dx/dt = -[1/τ + f(x,I,θ)]·x + f(x,I,θ)·A`
- **CfC**: `x(t+Δt) = σ(-f)·x(t) + (1-σ(-f))·h` (~100x faster)
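To make the closed-form update concrete, here is a minimal, language-agnostic sketch of the CfC step (illustrating the equation above, not the library's Erlang implementation; the names `cfc_step`, `f`, and `h` follow the equation's symbols):

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def cfc_step(x: float, f: float, h: float) -> float:
    """One closed-form CfC update: x' = sigma(-f)*x + (1 - sigma(-f))*h.

    x: current neuron state
    f: input-dependent gating pre-activation
    h: target value the state is pulled toward
    """
    gate = sigmoid(-f)
    return gate * x + (1.0 - gate) * h

# The state persists across steps: repeated updates relax x toward h,
# with the gate controlling the effective time constant.
x = 0.0
for _ in range(50):
    x = cfc_step(x, f=1.0, h=1.0)
```

Because `sigmoid(-f)` is always in (0, 1), each step is a convex blend of the old state and the target, so the state is bounded and converges smoothly without numerically integrating the ODE.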
See the [LTC Neurons Guide](https://hexdocs.pm/macula_tweann/ltc-neurons.html) for details.
## Documentation
- **[Installation](https://hexdocs.pm/macula_tweann/installation.html)** - Add to your project
- **[Quick Start](https://hexdocs.pm/macula_tweann/quickstart.html)** - Basic usage
- **[LTC Neurons](https://hexdocs.pm/macula_tweann/ltc-neurons.html)** - Temporal dynamics
- **[LTC Usage Guide](https://hexdocs.pm/macula_tweann/ltc-usage-guide.html)** - Practical examples
- **[Architecture](https://hexdocs.pm/macula_tweann/architecture.html)** - System design
- **[Full Documentation](https://hexdocs.pm/macula_tweann/)** - All guides and module docs
## Features
### Neural Network Evolution
- **Topology Evolution**: Networks add/remove neurons and connections
- **Weight Evolution**: Synaptic weights optimized through selection
- **Speciation**: Behavioral diversity preservation (NEAT-style)
- **Multi-objective**: Pareto dominance optimization
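As a concept illustration (a generic sketch of Pareto dominance assuming maximization on every objective, not the library's own selection code):

```python
def dominates(a, b):
    """True if fitness vector `a` Pareto-dominates `b`: at least as good
    on every objective and strictly better on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(population):
    """Return the non-dominated subset of a list of fitness vectors."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q != p)]

# (2.0, 2.0) dominates every other vector, so it alone survives.
front = pareto_front([(1.0, 2.0), (2.0, 1.0), (0.5, 0.5), (2.0, 2.0)])
```

Selection by dominance keeps candidates that trade off objectives differently, rather than collapsing them onto a single scalar fitness.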
### LTC/CfC Neurons (NEW in 0.10.0)
- **Temporal Memory**: Neurons maintain persistent internal state
- **Adaptive Dynamics**: Input-dependent time constants
- **CfC Mode**: ~100x faster than ODE-based evaluation
- **Hybrid Networks**: Mix standard and LTC neurons
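Conceptually, a hybrid network mixes stateless units with LTC units that carry state between evaluations. A hypothetical Python sketch of the distinction (not the library's process-based Erlang implementation):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def standard_neuron(inp):
    """Stateless unit: output depends only on the current input."""
    return math.tanh(inp)

def ltc_neuron(inp, state):
    """Stateful CfC-style unit: returns (new_state, output). Past inputs
    influence future evaluations through the persisted state."""
    gate = sigmoid(-inp)
    new_state = gate * state + (1.0 - gate) * inp
    return new_state, new_state

# Feed the same input twice: the standard unit repeats its output,
# while the LTC unit's output drifts as its state accumulates.
o1 = standard_neuron(1.0)
o2 = standard_neuron(1.0)
s, l1 = ltc_neuron(1.0, 0.0)
s, l2 = ltc_neuron(1.0, s)
```

In the library, standard and LTC neurons coexist in one genotype; the sketch only shows why their outputs differ on identical input sequences.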
### Production Quality
- **Process Safety**: Timeouts and crash handling
- **Comprehensive Logging**: Structured logging throughout
- **Rust NIF (optional)**: High-performance network evaluation
- **Mnesia Storage**: Persistent genotype storage
## Architecture

Process-based neural networks with evolutionary operators. See [Architecture Guide](https://hexdocs.pm/macula_tweann/architecture.html) for details.
## Testing
```bash
rebar3 eunit # Unit tests (801 tests)
rebar3 dialyzer # Static analysis
rebar3 ex_doc # Generate documentation
```
## Academic References
### TWEANN/NEAT
- **Sher, G.I.** (2013). [*Handbook of Neuroevolution Through Erlang*](https://www.springer.com/gp/book/9781461444626). Springer.
- Primary reference for DXNN2 architecture and Erlang implementation patterns.
- **Stanley, K.O. & Miikkulainen, R.** (2002). [Evolving Neural Networks through Augmenting Topologies](http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf). *Evolutionary Computation*, 10(2), 99-127.
- Foundational NEAT paper introducing speciation and structural innovation protection.
- **Stanley, K.O. & Miikkulainen, R.** (2002). [Efficient Evolution of Neural Network Topologies](http://nn.cs.utexas.edu/downloads/papers/stanley.cec02.pdf). *Proceedings of the 2002 Congress on Evolutionary Computation (CEC)*.
- Complexity analysis and efficiency improvements for topology evolution.
### LTC/CfC Neurons
- **Hasani, R., Lechner, M., et al.** (2021). [Liquid Time-constant Networks](https://ojs.aaai.org/index.php/AAAI/article/view/16936). *Proceedings of the AAAI Conference on Artificial Intelligence*, 35(9), 7657-7666.
- Introduces adaptive time-constant neurons with continuous-time dynamics.
- **Hasani, R., Lechner, M., et al.** (2022). [Closed-form Continuous-time Neural Networks](https://www.nature.com/articles/s42256-022-00556-7). *Nature Machine Intelligence*, 4, 992-1003.
- CfC closed-form approximation enabling ~100x speedup over ODE-based LTC.
### Weight Initialization
- **Glorot, X. & Bengio, Y.** (2010). [Understanding the difficulty of training deep feedforward neural networks](http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf). *Proceedings of AISTATS*.
- Xavier initialization theory used for network weight initialization.
### Evolutionary Algorithms
- **Holland, J.H.** (1975). *Adaptation in Natural and Artificial Systems*. University of Michigan Press.
- Foundational text on genetic algorithms.
- **Yao, X.** (1999). [Evolving Artificial Neural Networks](https://ieeexplore.ieee.org/document/784219). *Proceedings of the IEEE*, 87(9), 1423-1447.
- Comprehensive survey of neuroevolution approaches.
### ONNX Export
- **ONNX Consortium** (2017-present). [Open Neural Network Exchange](https://onnx.ai/).
- Open standard for neural network interoperability enabling cross-platform inference.
## Related Projects
### Macula Ecosystem
- **[macula](https://hex.pm/packages/macula)** - HTTP/3 mesh networking platform with NAT traversal, Pub/Sub, and async RPC. Enables distributed neuroevolution across edge devices.
- **[macula_neuroevolution](https://hex.pm/packages/macula_neuroevolution)** - Population-based evolutionary training engine that orchestrates neural network evolution using this library.
### Inspiration & Related Work
- **[DXNN2](https://github.com/CorticalComputer/DXNN2)** - Gene Sher's original TWEANN implementation in Erlang, the foundation for this library.
- **[NEAT-Python](https://neat-python.readthedocs.io/)** - Popular Python implementation of NEAT.
- **[SharpNEAT](http://sharpneat.sourceforge.net/)** - High-performance C# NEAT implementation.
- **[PyTorch-NEAT](https://github.com/uber-research/PyTorch-NEAT)** - Uber's PyTorch-based NEAT implementation.
- **[LTC/CfC Reference Implementation](https://github.com/raminmh/liquid_time_constant_networks)** - MIT/ISTA reference implementation of LTC networks.
## License
Apache License 2.0 - See [LICENSE](https://github.com/macula-io/macula-tweann/blob/main/LICENSE)
## Credits
Based on DXNN2 by Gene Sher. Adapted with LTC extensions by [Macula.io](https://macula.io).