Learn how GPT and Transformer models work through 75+ interactive visualizations. A free 10-week curriculum covering tokenization, attention mechanisms, backpropagation, training loops, and inference — with 90 quizzes and Google Colab coding labs.
Prerequisites: Linear Algebra, Probability & Information Theory, Backpropagation
Available in English and Turkish.