An Artificial Intelligence course visualization using Manim, by Zhu, Xu, and Lu, completed in December 2024.
---
This project was designed to visualize and animate the neural network section of the “Fundamentals of Artificial Intelligence” course. Built on ManimML, it turns abstract AI concepts into intuitive animations, covering single neurons, multilayer perceptrons, CNNs, RNNs, and optimization techniques.
- Neuron Visualization – Animates how weights and activation functions affect the response of a single neuron.
- MLP Structure – Shows the architecture and working principle of multilayer perceptrons.
- MNIST Demo – Visualizes the full process of handwritten digit recognition.
- Universality of NNs – Demonstrates why neural networks are powerful approximators.
- Gradient Descent – Animates optimization strategies and learning rate effects.
- Vanishing Gradient – Explains the vanishing-gradient problem and how activation functions like ReLU mitigate it.
- CNN Visualization – Shows convolution kernels, feature maps, and pooling.
- YOLO Convolution – Illustrates how YOLO applies convolution in object detection.
- Classical AI Examples – Includes animations of linear regression, eight-puzzle solving, and clause partitioning.
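The gradient-descent and vanishing-gradient animations above illustrate numerical effects that can be sketched in a few lines. Below is a minimal pure-Python illustration (an assumption-laden sketch, not code from this repository): gradient descent on a simple quadratic shows how the learning rate controls convergence, and comparing the sigmoid's derivative with ReLU's shows why sigmoid gradients vanish for large inputs.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of sigmoid: s(x) * (1 - s(x)); shrinks toward 0 for large |x|
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    # Derivative of ReLU: 1 for positive inputs, so gradients do not vanish
    return 1.0 if x > 0 else 0.0

def descend(lr, steps=50, w=0.0):
    # Gradient descent on f(w) = (w - 3)^2, whose gradient is 2 * (w - 3)
    for _ in range(steps):
        w -= lr * 2.0 * (w - 3.0)
    return w

print(descend(0.1))       # small learning rate: converges near the minimum w = 3
print(descend(1.1))       # too-large learning rate: each step overshoots and diverges
print(sigmoid_grad(5.0))  # tiny slope: the vanishing-gradient effect
print(relu_grad(5.0))     # ReLU keeps a full-strength gradient
```

The animations visualize exactly these two phenomena: the overshooting trajectory for a large learning rate, and the near-flat sigmoid tails that starve deep layers of gradient signal.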
---
Check the files to watch the animations!