NeuroTrack is a Unity project, written in C#, in which cars learn to drive using evolutionary neural networks and reinforcement-inspired feedback. It explores how AI can adapt driving strategies through generations of trial and error.
The long-term goal is to evolve AI agents that can navigate complex tracks with minimal human intervention, much like how real-world self-driving cars refine their control systems through data and simulation.
NeuroTrack currently supports:
- **Neural Network Inputs:**
- **Network Outputs:**
- **Evolutionary Design:** Neural network weights are encoded as “genes” that mutate across generations, enabling the population to gradually improve driving ability.
- **Reinforcement-Inspired Feedback:** Cars are rewarded for progress, speed, and survival while being penalized for collisions.
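The reward scheme above can be sketched as a single scoring function. The term names and weights here are illustrative assumptions, not NeuroTrack's exact reward shaping:

```csharp
// Sketch of the reinforcement-inspired fitness signal: reward progress,
// speed, and survival; penalize collisions. All coefficients are
// illustrative placeholders, not NeuroTrack's tuned values.
public static class Fitness
{
    public static float Score(float progress, float avgSpeed, float aliveTime, int collisions)
    {
        return 1.0f * progress      // distance covered along the track
             + 0.2f * avgSpeed      // reward faster driving
             + 0.1f * aliveTime     // reward staying on the road longer
             - 2.0f * collisions;   // penalize hitting walls or obstacles
    }
}
```

In practice the relative weights decide behavior: a large speed term breeds reckless drivers, while a large collision penalty breeds overly cautious ones.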
- **Neural Network Implementation:** Building and testing the feedforward multilayer perceptron (MLP) that processes sensor and velocity inputs.
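A minimal sketch of the feedforward pass such an MLP performs, assuming one hidden layer and tanh activations (the layer sizes, weight layout, and class name are assumptions for illustration, not NeuroTrack's actual implementation):

```csharp
using System;

// Minimal feedforward MLP sketch: one hidden layer, tanh activations.
public class Mlp
{
    readonly float[,] w1; // input  -> hidden weights
    readonly float[,] w2; // hidden -> output weights

    public Mlp(float[,] w1, float[,] w2) { this.w1 = w1; this.w2 = w2; }

    // Map sensor/velocity inputs to control outputs (e.g. steering, throttle).
    public float[] Forward(float[] inputs)
    {
        float[] hidden = Layer(inputs, w1);
        return Layer(hidden, w2);
    }

    static float[] Layer(float[] x, float[,] w)
    {
        int outDim = w.GetLength(1);
        var y = new float[outDim];
        for (int j = 0; j < outDim; j++)
        {
            float sum = 0f;
            for (int i = 0; i < x.Length; i++) sum += x[i] * w[i, j];
            y[j] = (float)Math.Tanh(sum); // squash to [-1, 1]
        }
        return y;
    }
}
```

Tanh outputs in [-1, 1] map naturally onto symmetric controls such as steering left/right or brake/accelerate.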
- **Genetic Algorithm Integration:** Evolving network weights across generations to improve driving strategies.
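One generation of such an evolutionary loop might look like the following sketch: rank genomes by fitness, keep the top performers, and refill the population with mutated copies. The selection scheme, elite fraction, and mutation rates are assumptions, not NeuroTrack's exact design:

```csharp
using System;
using System.Linq;

// One-generation sketch of the evolutionary loop over weight "genes".
public static class Evolution
{
    public static float[][] NextGeneration(float[][] genomes, float[] fitness, Random rng)
    {
        int n = genomes.Length;
        int elites = Math.Max(1, n / 5); // keep the best 20% unchanged

        // Sort genome indices by descending fitness.
        var ranked = Enumerable.Range(0, n)
                               .OrderByDescending(i => fitness[i])
                               .ToArray();

        var next = new float[n][];
        for (int i = 0; i < elites; i++)
            next[i] = (float[])genomes[ranked[i]].Clone();

        // Fill the rest with mutated copies of randomly chosen elites.
        for (int i = elites; i < n; i++)
        {
            var parent = genomes[ranked[rng.Next(elites)]];
            var child = (float[])parent.Clone();
            for (int g = 0; g < child.Length; g++)
                if (rng.NextDouble() < 0.05)                             // mutation rate
                    child[g] += (float)(rng.NextDouble() * 2 - 1) * 0.5f; // mutation strength
            next[i] = child;
        }
        return next;
    }
}
```

Elitism guarantees the best driver is never lost between generations, while mutation keeps injecting variation for selection to act on.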
- **Track Generalization:** Training cars to adapt to new tracks rather than overfitting to a single course.
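One common way to encourage generalization is to score each genome on several tracks and average the results, so selection favors generalists over single-track specialists. A sketch under that assumption (the `evaluateOnTrack` delegate is a placeholder for running one simulated episode):

```csharp
using System;
using System.Linq;

// Sketch of multi-track evaluation: a genome's fitness is its mean score
// across all tracks, which penalizes overfitting to any one course.
public static class Generalization
{
    public static float Fitness(
        float[] genome,
        Func<float[], int, float> evaluateOnTrack, // placeholder episode runner
        int trackCount)
    {
        return Enumerable.Range(0, trackCount)
                         .Average(t => evaluateOnTrack(genome, t));
    }
}
```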
- **Visualization Tools:** Adding real-time fitness metrics, genome visualization, and replay analysis.
This project is inspired by *A.I. Learns to Drive From Scratch in Trackmania*, where Yosh trained an AI model to drive in Trackmania. He demonstrated how neural networks can learn driving techniques such as cornering and acceleration timing through iterative training.
NeuroTrack applies a similar concept in Unity, but with a top-down driving model that balances physics realism with accessible experimentation. The aim is to explore how cars can evolve intelligent driving strategies using nothing but simulated sensors and feedback.