Quick Start • Architecture • Genetic Algorithm • Benchmarks
This repository implements NeuroEvolution for the Flappy Bird game. It is a research platform for applying the principles of biological evolution to train artificial neural networks, without backpropagation or the need for labeled data.
The system employs a Genetic Algorithm (GA) to modify the weights and biases of a fixed-topology neural network. The agents (birds) learn to avoid obstacles purely through survival of the fittest, i.e., evolution.
- Adaptive Evolutionary Parameters: Mutation rates vary dynamically with population diversity to avoid stagnation.
- Pluggable Architecture: The modular design lets you swap crossover strategies (Uniform, Single-Point, Arithmetic) and selection methods easily.
- Real-time Diagnostics: Watch the "brain" of the best-performing agent live.
- Serialization: Save and reload the full state of the best neural networks (in JSON format) for later inspection or transfer learning, as sketched below this list.
- Headless Training: The game logic is decoupled from rendering, which allows for high-speed training (configurable).
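The sketch below shows what JSON serialization of a network's parameters could look like. It is a minimal illustration: the function names and the JSON schema are assumptions, not the repo's actual format (see `data/models/best_bird.json` for the real one).

```python
import json

import numpy as np


def save_network(path, weights, biases):
    """Serialize evolved parameters to JSON (illustrative schema)."""
    state = {
        "weights": [w.tolist() for w in weights],  # one matrix per layer
        "biases": [b.tolist() for b in biases],    # one vector per layer
    }
    with open(path, "w") as f:
        json.dump(state, f)


def load_network(path):
    """Restore parameters saved by save_network."""
    with open(path) as f:
        state = json.load(f)
    return ([np.array(w) for w in state["weights"]],
            [np.array(b) for b in state["biases"]])
```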
Each agent is controlled by a feed-forward neural network. The topology is fixed; only the weights and biases evolve.
- Input Layer (4 Nodes):
  - Bird Y (normalized 0-1)
  - Bird Velocity (normalized 0-1)
  - Distance to Next Pipe (X) (normalized 0-1)
  - Vertical Distance to Gap (Y) (normalized, centered at 0.5)
- Hidden Layers: Fully connected layers. Default configuration: `[6, 4]` neurons.
- Output Layer (1 Node): Jump probability.

Activation functions:

- Hidden Layers: Hyperbolic Tangent (`tanh`), chosen for its zero-centered range `[-1, 1]`, which allows stronger negative inhibition signals than Sigmoid.
- Output Layer: Sigmoid, which maps the final aggregation to a probability in `[0, 1]`. A value `> 0.5` triggers a jump.
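To make the topology concrete, here is a minimal sketch of the forward pass under the defaults above (4 inputs, hidden layers `[6, 4]`, `tanh` hidden activations, sigmoid output). It uses numpy and illustrative names; it is not the actual API of `src/ai/neural_network.py`.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class FeedForwardNetwork:
    """Fixed-topology network: weights/biases are evolved, never trained."""

    def __init__(self, layer_sizes=(4, 6, 4, 1), rng=None):
        rng = rng or np.random.default_rng()
        # One weight matrix and bias vector per connection between layers.
        self.weights = [rng.standard_normal((m, n))
                        for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
        self.biases = [rng.standard_normal(n) for n in layer_sizes[1:]]

    def forward(self, inputs):
        a = np.asarray(inputs, dtype=float)
        for i, (w, b) in enumerate(zip(self.weights, self.biases)):
            z = a @ w + b
            # tanh on hidden layers, sigmoid on the output layer
            a = sigmoid(z) if i == len(self.weights) - 1 else np.tanh(z)
        return a  # jump probability in [0, 1]


# Usage: jump when the output exceeds 0.5
net = FeedForwardNetwork()
obs = [0.5, 0.4, 0.7, 0.5]  # bird_y, velocity, pipe_dx, gap_dy (normalized)
if net.forward(obs)[0] > 0.5:
    print("jump")
```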
The evolution engine drives the learning process through the following lifecycle:
1. Evaluation: Each agent plays the game until collision.
2. Fitness Function:
   $F = t_{\text{survival}} + (100 \times N_{\text{pipes}}) - 50 \cdot \mathbb{1}_{\text{crash}}$
   where $\mathbb{1}_{\text{crash}}$ is 1 on collision, i.e., a flat 50-point crash penalty.
3. Selection: A subset of parents is chosen to reproduce.
   - Default: Tournament Selection (k=3). Robust against outliers.
4. Crossover: Genetic material (weights/biases) is mixed.
   - Default: Uniform Crossover. Each attribute is taken from either parent with equal probability, preserving genetic diversity better than single-point crossover for neural weights.
5. Mutation: Random perturbations are applied to weights.
   - Default: Gaussian Mutation. Small values drawn from a normal distribution are added to weights.
6. Adaptive Logic: If population diversity drops below a threshold $\delta$, the mutation rate $\mu$ is boosted.
7. Elitism & Immigrants:
   - The top $N$ performers carry over unchanged (Elitism).
   - 10% of the new population are randomized "Immigrants" that inject fresh genetic material.
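For concreteness, here is a compact sketch of one generation built from the defaults above (tournament k=3, uniform crossover, Gaussian mutation, adaptive rate, elitism, 10% immigrants). Genomes are flattened numpy vectors; all names and the `sigma`, `delta`, and `boost` values are illustrative assumptions, not the API of `src/ai/genetic_algorithm.py`.

```python
import numpy as np

rng = np.random.default_rng()


def fitness(survival_time, pipes_passed, crashed):
    """F = t_survival + 100 * N_pipes - 50 * crash (as defined above)."""
    return survival_time + 100 * pipes_passed - 50 * int(crashed)


def tournament_select(pop, fits, k=3):
    """Return the fittest of k randomly drawn contenders."""
    idx = rng.choice(len(pop), size=k, replace=False)
    return pop[max(idx, key=lambda i: fits[i])]


def uniform_crossover(a, b):
    """Take each gene from either parent with equal probability."""
    mask = rng.random(a.shape) < 0.5
    return np.where(mask, a, b)


def gaussian_mutate(genome, rate, sigma=0.1):
    """Perturb a random subset of genes with small normal noise."""
    mask = rng.random(genome.shape) < rate
    return genome + mask * rng.normal(0.0, sigma, genome.shape)


def adaptive_rate(pop, base_rate=0.1, delta=0.05, boost=2.0):
    """Boost the mutation rate when per-gene diversity falls below delta."""
    diversity = np.mean(np.std(np.stack(pop), axis=0))
    return base_rate * boost if diversity < delta else base_rate


def next_generation(pop, fits, elite_count=5, immigrant_frac=0.10):
    order = np.argsort(fits)[::-1]
    new_pop = [pop[i].copy() for i in order[:elite_count]]   # elitism
    rate = adaptive_rate(pop)                                # adaptive logic
    n_immigrants = int(immigrant_frac * len(pop))
    while len(new_pop) < len(pop) - n_immigrants:
        child = uniform_crossover(tournament_select(pop, fits),
                                  tournament_select(pop, fits))
        new_pop.append(gaussian_mutate(child, rate))
    while len(new_pop) < len(pop):                           # immigrants
        new_pop.append(rng.standard_normal(pop[0].shape))
    return new_pop
```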
- Python 3.8 or higher
- pip (Python Package Manager)
```bash
# 1. Clone the repository
git clone https://github.com/antilneeraj/geneticalgorithm.git
cd geneticalgorithm

# 2. Install dependencies
pip install -r requirements.txt
```

1. Watch the AI Learn (Training Mode)

This is the default mode, where you see evolution in action.

```bash
python main.py --mode ai_training --population 50 --fps 60
```

- Use `--no-sound` to speed up processing slightly.

2. Play as Human

Challenge yourself against the game physics.

```bash
python main.py --mode human
```

3. Run Best Trained Model

Load the best-performing bird from previous runs.

```bash
python main.py --mode ai_play
```

Hyperparameters are located in `src/utils/constants.py`. Tweak these to experiment with evolutionary dynamics:
| Parameter | Default | Description |
|---|---|---|
| `POPULATION_SIZE` | `150` | Number of agents per generation. Higher = more diversity but slower. |
| `MUTATION_RATE` | `0.1` | Base probability of a gene mutating. |
| `ELITE_COUNT` | `5` | Number of top agents preserved unchanged. |
| `NN_HIDDEN_NODES` | `[6, 4]` | Topology of the "Brain". |
| `ACTIVATION` | `tanh` | Activation function for hidden layers. |
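Based on the table above, the relevant entries in `src/utils/constants.py` might look like this (an illustrative excerpt; check the actual file for the full set of constants):

```python
# src/utils/constants.py (excerpt, illustrative)
POPULATION_SIZE = 150     # agents per generation
MUTATION_RATE = 0.1       # base probability of a gene mutating
ELITE_COUNT = 5           # top agents preserved unchanged
NN_HIDDEN_NODES = [6, 4]  # hidden-layer topology
ACTIVATION = "tanh"       # hidden-layer activation function
```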
Want to break the AI? Experiment with the evolutionary dynamics in `src/utils/constants.py`:

- The "Chaos" Test: Set `MUTATION_RATE = 0.8`. Watch as the birds struggle to retain knowledge between generations.
- Minimalist Brain: Change `NN_HIDDEN_NODES = [2]`. Can the agent solve the game with only 2 "neurons"?
- Hyper-Selection: Change `ELITE_COUNT` to 50. Watch how genetic diversity collapses as the "top" birds dominate the gene pool.
Typical convergence behavior observed with default parameters:
- Gen 0-5: Pure random behavior. Most birds crash immediately.
- Gen 10-20: "Wall-following" or "Floor-hugging" strategies emerge.
- Gen 30-50: Discovery of the gap. Agents begin to pass 1-5 pipes.
- Gen 500+: Mastery. Agents can play indefinitely.
Note: Convergence speed is highly dependent on `POPULATION_SIZE` and `MUTATION_RATE`. Larger populations generally converge in fewer generations but require more computation time per generation.
As the population evolves, the mean fitness (survival time) increases exponentially before plateauing as the solution space is optimized.
Figure 1: Fitness progression over 100 generations showing rapid convergence (Blue) followed by elite optimization (Green).
Note: You can generate your own local stats plot by running the analysis script `plot_results.py` (see below), which parses the `data/statistics/evolution_stats.json` file.
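If you want a quick look at the stats without the bundled script, a minimal sketch like the following can plot the file. It assumes `evolution_stats.json` holds per-generation `best_fitness` and `mean_fitness` lists; the actual schema may differ, and `plot_results.py` is the authoritative parser.

```python
import json

import matplotlib.pyplot as plt

with open("data/statistics/evolution_stats.json") as f:
    stats = json.load(f)

# Assumed schema: {"best_fitness": [...], "mean_fitness": [...]}
plt.plot(stats["best_fitness"], label="best")
plt.plot(stats["mean_fitness"], label="mean")
plt.xlabel("Generation")
plt.ylabel("Fitness")
plt.legend()
plt.savefig("my_training_results.png")
```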
```
GeneticAlgorithm
├── assets
│   ├── images
│   │   └── All graphics used in the game (bird, pipe, background, base)
│   └── sounds
│       └── All sound effects (flap, point, hit, die)
├── collab.ipynb
├── CONTRIBUTING.md
├── data
│   ├── high_score.txt
│   ├── models
│   │   └── best_bird.json
│   └── statistics
│       ├── evolution_stats.json
│       └── training_results.png
├── diagnostic_ai_debug.py
├── LICENSE
├── main.py
├── plot_results.py
├── readme.md
├── requirements.txt
├── src
│   ├── ai
│   │   ├── crossover.py
│   │   ├── fitness.py
│   │   ├── genetic_algorithm.py
│   │   ├── mutation.py
│   │   ├── neural_network.py
│   │   ├── population.py
│   │   └── selection.py
│   ├── game
│   │   ├── bird.py
│   │   ├── game_engine.py
│   │   ├── pipe.py
│   │   └── renderer.py
│   └── utils
│       ├── asset_loader.py
│       └── constants.py
├── validateNN.py
└── validate_ai_constants.py
```
We welcome contributions from the research and open-source community!
- Fork the repository.
- Create a Feature Branch (`git checkout -b feature/NewSelectionMethod`).
- Commit your changes.
- Push to the branch.
- Open a Pull Request.
Please ensure you run diagnostics before submitting:
```bash
python diagnostic_ai_debug.py
```

This project is licensed under the MIT License - see the LICENSE file for details.