---
tags:
- evolutionary-algorithms
- neural-turing-machines
- meta-learning
- classics-revival
- experimental
license: apache-2.0
library_name: pytorch
---

# Evolutionary Turing Machine - The Classics Revival

**Neural Turing Machines That Evolve Their Own Architectures**

**Experimental Research Code** - Functional but unoptimized, expect rough edges

## What Is This?

Evolutionary Turing Machine combines Neural Turing Machines with evolutionary algorithms to create memory-augmented networks that evolve their own architectures. Instead of hand-designing memory systems, populations of NTMs compete and evolve optimal memory configurations.

**Core Innovation**: NTMs that mutate their memory slots, read/write heads, and controller architectures through evolutionary pressure, discovering novel memory patterns.

## Architecture Highlights

- **Self-Evolving Memory**: Memory slots and dimensions adapt through evolution
- **Adaptive Read/Write Heads**: Number and behavior of memory heads evolve
- **Controller Architecture Search**: LSTM controller dimensions discovered evolutionarily
- **Task-Driven Fitness**: Evolution guided by performance on memory tasks
- **Population Diversity**: Maintains genetic diversity in memory architectures
- **Crossover Operations**: Recombines successful memory strategies
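
To make the evolvable surface concrete, here is a minimal sketch of what an architecture genome and a crossover operator might look like. The `NTMGenome` fields and the uniform-crossover strategy are illustrative assumptions, not this repo's actual API.

```python
import random
from dataclasses import dataclass

# Hypothetical genome for one NTM individual; field names are
# illustrative and do not come from evolutionary_turing.py.
@dataclass
class NTMGenome:
    controller_dim: int   # LSTM controller hidden size
    memory_slots: int     # number of memory rows
    memory_dim: int       # width of each memory row
    read_heads: int
    write_heads: int

def crossover(a: NTMGenome, b: NTMGenome, rng: random.Random) -> NTMGenome:
    """Uniform crossover: each architectural gene is inherited
    from one parent chosen at random."""
    pick = lambda x, y: x if rng.random() < 0.5 else y
    return NTMGenome(
        controller_dim=pick(a.controller_dim, b.controller_dim),
        memory_slots=pick(a.memory_slots, b.memory_slots),
        memory_dim=pick(a.memory_dim, b.memory_dim),
        read_heads=pick(a.read_heads, b.read_heads),
        write_heads=pick(a.write_heads, b.write_heads),
    )

parent_a = NTMGenome(128, 64, 20, 2, 1)
parent_b = NTMGenome(256, 128, 32, 4, 2)
child = crossover(parent_a, parent_b, random.Random(0))
```

Uniform crossover keeps each gene intact rather than blending values, which preserves working memory configurations while still recombining strategies across parents.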

## Quick Start
```python
from evolutionary_turing import EvolutionaryTuringMachine, EvolutionaryTuringConfig

# Create evolutionary NTM system
config = EvolutionaryTuringConfig(
    population_size=50,
    input_dim=8,
    output_dim=8,
    max_generations=100
)

evolution = EvolutionaryTuringMachine(config)

# Evolve population on memory tasks
history = evolution.run_evolution()

# Get the best evolved model
best_ntm = evolution.get_best_model()
```

## Current Status
- **Working**: Population evolution, architecture mutations, memory task evaluation, crossover operations
- **Rough Edges**: No distributed evolution, limited task variety, basic fitness functions
- **Still Missing**: Advanced mutation operators, multi-objective optimization, neural architecture search integration
- **Performance**: Functional on toy problems, needs scaling for complex tasks
- **Memory Usage**: High due to population storage, optimization needed
- **Speed**: Sequential evaluation, parallelization would help significantly

## Mathematical Foundation
The evolutionary process optimizes the NTM architecture space through genetic algorithms:
```
Fitness(NTM) = Performance(copy_task) × 0.5 + Performance(recall_task) × 0.3 + Efficiency × 0.2
```
Mutations modify:
- Controller dimensions: `d_new = d_old + N(0, σ_controller)`
- Memory parameters: `M_slots_new ~ U[16, 256]`, `M_dim_new ~ U[8, 64]`
- Head configurations: `heads_read_new ~ U[1, 4]`, `heads_write_new ~ U[1, 3]`

Selection pressure favors architectures that balance task performance with computational efficiency.
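
The mutation operators and fitness formula above can be sketched as follows. The genome representation (a plain dict), the per-operator mutation rate, and the value of `sigma_controller` are assumptions for illustration; only the bounds and weights come from this section.

```python
import random

SIGMA_CONTROLLER = 16  # assumed sigma_controller for the Gaussian mutation

def mutate(genome: dict, rng: random.Random, rate: float = 0.3) -> dict:
    """Apply each mutation operator independently with probability `rate`."""
    g = dict(genome)
    if rng.random() < rate:
        # Controller dimension: d_new = d_old + N(0, sigma_controller)
        g["controller_dim"] = max(8, int(g["controller_dim"] + rng.gauss(0, SIGMA_CONTROLLER)))
    if rng.random() < rate:
        # Memory parameters resampled from U[16, 256] and U[8, 64]
        g["memory_slots"] = rng.randint(16, 256)
        g["memory_dim"] = rng.randint(8, 64)
    if rng.random() < rate:
        # Head counts resampled from U[1, 4] and U[1, 3]
        g["read_heads"] = rng.randint(1, 4)
        g["write_heads"] = rng.randint(1, 3)
    return g

def fitness(copy_score: float, recall_score: float, efficiency: float) -> float:
    """Weighted fitness from the formula above (all inputs in [0, 1])."""
    return 0.5 * copy_score + 0.3 * recall_score + 0.2 * efficiency
```

Keeping mutations independent per gene group lets evolution adjust one aspect of the architecture (say, head counts) without disturbing a controller size that already works.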

## Research Applications
- **Neural architecture search for memory systems**
- **Adaptive memory allocation strategies**
- **Meta-learning through evolutionary computation**
- **Automated machine learning for sequence tasks**
- **Evolutionary neural network design**

## Installation
```bash
pip install torch numpy
# Download evolutionary_turing.py from this repo
```

## The Classics Revival Collection

Evolutionary Turing Machine is part of a larger exploration of foundational algorithms enhanced with modern neural techniques:

- **Evolutionary Turing Machine** ← You are here
- Hebbian Bloom Filter
- Hopfield Decision Graph
- Liquid Bayes Chain
- Liquid State Space Model
- Möbius Markov Chain
- Memory Forest

## Citation
```bibtex
@misc{evolutionaryturing2025,
  title={Evolutionary Turing Machine: Self-Evolving Memory Architectures},
  author={Jae Parker 𓅸 1990two},
  year={2025},
  note={Part of The Classics Revival Collection}
}
```