Roadmap

AI Learning Path

From programming basics to building production LLM applications — a structured path through the concepts that matter.

01

Programming Basics

Variables, loops, functions, data structures, control flow.

Why

Every AI system is code. You need the foundation before touching models.

Build

Build a command-line number guessing game with user input and loops.
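A minimal sketch of what that game could look like; the 1–100 range and the prompt wording are arbitrary choices:

```python
import random

def guessing_game(low=1, high=100):
    """Pick a secret number and loop until the player guesses it."""
    secret = random.randint(low, high)
    attempts = 0
    while True:
        guess = int(input(f"Guess a number between {low} and {high}: "))
        attempts += 1
        if guess < secret:
            print("Too low.")
        elif guess > secret:
            print("Too high.")
        else:
            print(f"Correct! You took {attempts} attempts.")
            break

if __name__ == "__main__":
    guessing_game()
```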

02

Python

Python syntax, OOP, list comprehensions, file I/O, virtual environments.

Why

Python is the dominant language for ML, data science, and AI engineering.

Build

Build a CSV data analyser that computes basic statistics.
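One possible starting point using only the standard library; the file name and the 'amount' column in the usage comment are placeholders:

```python
import csv
import statistics

def analyse_column(path, column):
    """Read one numeric column from a CSV file and report basic statistics."""
    with open(path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f) if row[column]]
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
        "min": min(values),
        "max": max(values),
    }

# Example usage (assumes a file sales.csv with a numeric 'amount' column):
# print(analyse_column("sales.csv", "amount"))
```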

03

NumPy

Vectors, matrices, broadcasting, dot products, element-wise ops.

Why

NumPy is the computational foundation for ML. Understand arrays before tensors.

Build

Implement matrix multiplication and softmax from scratch with NumPy.
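One way the from-scratch versions might look; the loop-based matmul is deliberately naive so you can check it against NumPy's @ operator:

```python
import numpy as np

def matmul(A, B):
    """Matrix multiplication via explicit loops, to see what @ is doing."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    out = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            out[i, j] = np.sum(A[i, :] * B[:, j])  # dot product of row i and column j
    return out

def softmax(x, axis=-1):
    """Numerically stable softmax: subtract the max before exponentiating."""
    shifted = x - np.max(x, axis=axis, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=axis, keepdims=True)

A, B = np.random.randn(3, 4), np.random.randn(4, 2)
assert np.allclose(matmul(A, B), A @ B)
print(softmax(np.array([2.0, 1.0, 0.1])))  # sums to 1
```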

04

Linear Algebra

Vectors, dot products, matrix multiplication, eigenvalues, SVD.

Why

Linear algebra is the mathematics of neural networks and embeddings.

Build

Implement a 2D transformation visualiser showing rotations and scaling.
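A rough sketch, assuming matplotlib for plotting; the rotation angle and scale factors are arbitrary:

```python
import numpy as np
import matplotlib.pyplot as plt

def rotation(theta):
    """2x2 rotation matrix for angle theta (radians)."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def scaling(sx, sy):
    """2x2 scaling matrix."""
    return np.array([[sx, 0.0], [0.0, sy]])

# Unit square as column vectors; apply scaling then rotation and plot both shapes.
square = np.array([[0, 1, 1, 0, 0],
                   [0, 0, 1, 1, 0]], dtype=float)
transformed = rotation(np.pi / 6) @ scaling(1.5, 0.5) @ square

plt.plot(square[0], square[1], label="original")
plt.plot(transformed[0], transformed[1], label="rotated + scaled")
plt.axis("equal")
plt.legend()
plt.show()
```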

05

Probability & Statistics

Probability, distributions, Bayes' theorem, entropy, KL divergence.

Why

LLMs are probabilistic systems. Loss functions, sampling strategies, and logits all build on these concepts.

Build

Implement a Naive Bayes text classifier from scratch.
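A compact from-scratch sketch using multinomial Naive Bayes with add-one (Laplace) smoothing; the four training sentences are toy placeholders:

```python
import math
from collections import Counter, defaultdict

class NaiveBayes:
    """Multinomial Naive Bayes over whitespace-tokenised text."""

    def fit(self, texts, labels):
        self.word_counts = defaultdict(Counter)   # label -> word frequencies
        self.class_counts = Counter(labels)       # label -> number of documents
        for text, label in zip(texts, labels):
            self.word_counts[label].update(text.lower().split())
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, text):
        words = text.lower().split()
        total_docs = sum(self.class_counts.values())
        best, best_score = None, -math.inf
        for label, doc_count in self.class_counts.items():
            # log P(label) + sum of log P(word | label) with add-one smoothing
            score = math.log(doc_count / total_docs)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for w in words:
                score += math.log((self.word_counts[label][w] + 1) / denom)
            if score > best_score:
                best, best_score = label, score
        return best

clf = NaiveBayes().fit(["great movie", "loved it", "terrible film", "waste of time"],
                       ["pos", "pos", "neg", "neg"])
print(clf.predict("great film"))  # expected: pos
```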

06

Machine Learning

Supervised/unsupervised learning, train/val/test split, overfitting, regularisation.

Why

Neural networks are a subset of ML. You need the broader context.

Build

Train a logistic regression model on a binary classification dataset.
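One possible version, assuming scikit-learn and using its built-in breast-cancer dataset purely as an example binary task:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Built-in binary classification dataset, used only as a stand-in.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=5000)  # higher max_iter helps convergence on raw features
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```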

07

Neural Networks

Perceptrons, dense layers, activations (ReLU, Sigmoid, Tanh), forward pass.

Why

The core building block of all modern AI systems.

Build

Build a 2-layer neural network from scratch using only NumPy.
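A minimal forward-pass-only sketch; the layer sizes and the ReLU/sigmoid pairing are arbitrary choices:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TwoLayerNet:
    """Dense -> ReLU -> Dense -> Sigmoid, forward pass only."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        self.h = relu(X @ self.W1 + self.b1)        # hidden activations
        return sigmoid(self.h @ self.W2 + self.b2)  # output probabilities

net = TwoLayerNet(n_in=2, n_hidden=8, n_out=1)
X = np.random.randn(4, 2)
print(net.forward(X).shape)  # (4, 1)
```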

08

Backpropagation

Chain rule, gradient computation, vanishing/exploding gradients.

Why

Backprop is how networks learn. Understanding it separates practitioners from users.

Build

Implement backpropagation manually for a 2-layer network and verify it against numerical gradients.
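A condensed sketch of what that involves: analytic gradients for a ReLU + sigmoid network with cross-entropy loss, checked against a central-difference estimate on a single weight. The toy data is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(16, 3))
y = (X[:, :1] > 0).astype(float)          # toy binary labels

W1, b1 = rng.normal(0, 0.1, (3, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.1, (8, 1)), np.zeros(1)

def forward(params):
    W1, b1, W2, b2 = params
    z1 = X @ W1 + b1
    h = np.maximum(0.0, z1)                    # ReLU
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
    return loss, (z1, h, p)

def backward(params, cache):
    W1, b1, W2, b2 = params
    z1, h, p = cache
    n = X.shape[0]
    dlogits = (p - y) / n                      # d loss / d (pre-sigmoid logits)
    dW2 = h.T @ dlogits
    db2 = dlogits.sum(axis=0)
    dh = dlogits @ W2.T
    dz1 = dh * (z1 > 0)                        # ReLU gradient
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)
    return [dW1, db1, dW2, db2]

params = [W1, b1, W2, b2]
loss, cache = forward(params)
grads = backward(params, cache)

# Numerical check on one entry of W1 using central differences.
eps, i, j = 1e-5, 0, 0
params[0][i, j] += eps
plus, _ = forward(params)
params[0][i, j] -= 2 * eps
minus, _ = forward(params)
params[0][i, j] += eps
print("analytic:", grads[0][i, j], "numeric:", (plus - minus) / (2 * eps))
```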

09

Optimisation

SGD, momentum, Adam, learning rate schedules, batch size effects.

Why

Choosing the right optimiser and hyperparameters directly determines model quality.

Build

Compare SGD vs Adam on a noisy regression task. Plot loss curves.
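A stripped-down comparison that implements both update rules by hand on a synthetic linear-regression problem; it uses full-batch gradients for simplicity, and the two loss lists can be fed straight into matplotlib for the curves:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.5 * rng.normal(size=200)   # noisy regression targets

def loss_and_grad(w):
    err = X @ w - y
    return np.mean(err ** 2), 2 * X.T @ err / len(y)

def run_sgd(lr=0.05, steps=200):
    w, losses = np.zeros(5), []
    for _ in range(steps):
        loss, g = loss_and_grad(w)
        losses.append(loss)
        w -= lr * g
    return losses

def run_adam(lr=0.05, steps=200, b1=0.9, b2=0.999, eps=1e-8):
    w, m, v, losses = np.zeros(5), np.zeros(5), np.zeros(5), []
    for t in range(1, steps + 1):
        loss, g = loss_and_grad(w)
        losses.append(loss)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g ** 2
        m_hat, v_hat = m / (1 - b1 ** t), v / (1 - b2 ** t)   # bias correction
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return losses

print("final loss  SGD:", run_sgd()[-1], " Adam:", run_adam()[-1])
```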

10

Transformers

Attention mechanism, multi-head attention, positional encoding, encoder/decoder.

Why

Every major LLM — GPT, Claude, Gemini — is a Transformer variant.

Build

Implement scaled dot-product attention in pure Python/NumPy.
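A direct translation of softmax(QK^T / sqrt(d_k)) V into NumPy; the optional mask argument and the toy shapes are illustrative:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(x)
    return e / np.sum(e, axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)   # (..., seq_q, seq_k)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)        # block masked positions
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

seq, d = 4, 8
Q = K = V = np.random.randn(seq, d)    # self-attention over a 4-token sequence
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape, w.sum(axis=-1))       # (4, 8); each row of weights sums to 1
```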

11

RAG

Embeddings, vector databases, semantic search, chunk selection, context injection.

Why

RAG is the dominant pattern for building LLM apps that work with private/recent data.

Build

Build a local Q&A system that retrieves from a document collection using embeddings.
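A rough sketch of the retrieval half, assuming the sentence-transformers package and the all-MiniLM-L6-v2 model (any embedding model would do); the documents are placeholders and the final LLM call is left to you:

```python
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "The mitochondria is the powerhouse of the cell.",
    "Paris is the capital of France.",
    "Transformers use attention to weigh relationships between tokens.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
doc_vectors = model.encode(documents)                 # one embedding per chunk

def retrieve(question, k=1):
    """Return the k chunks most similar to the question by cosine similarity."""
    q = model.encode([question])[0]
    sims = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
    return [documents[i] for i in np.argsort(-sims)[:k]]

question = "What do transformers use to relate tokens?"
context = retrieve(question)
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)   # pass this prompt to any LLM to complete the Q&A loop
```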

12

AI Agents

Tool use, function calling, multi-step reasoning, agent loops, scratchpad.

Why

Agents are the next layer above RAG — AI that can take actions, not just answer questions.

Build

Build an agent that can use a calculator and a web search tool to answer maths questions.
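A skeleton of the agent loop with a safe calculator tool, a stubbed web search, and a fake_llm stand-in that mimics a tool-calling model; in a real build, fake_llm is replaced by an actual chat-completions API:

```python
import ast
import operator as op

# --- Tools the agent can call -------------------------------------------
OPS = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul,
       ast.Div: op.truediv, ast.Pow: op.pow}

def calculator(expression: str) -> str:
    """Safely evaluate basic arithmetic by walking the AST (no eval)."""
    def ev(node):
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](ev(node.left), ev(node.right))
        raise ValueError("unsupported expression")
    return str(ev(ast.parse(expression, mode="eval").body))

def web_search(query: str) -> str:
    return f"(stubbed search results for: {query})"   # swap in a real search API

TOOLS = {"calculator": calculator, "web_search": web_search}

# --- Fake model, standing in for a real tool-calling LLM ----------------
def fake_llm(messages):
    """Returns either a tool call or a final answer, like a function-calling model."""
    last = messages[-1]
    if last["role"] == "user":
        return {"tool": "calculator", "input": "12 * (3 + 4)"}
    return {"answer": f"The result is {last['content']}."}

# --- The agent loop ------------------------------------------------------
def run_agent(question, max_steps=5):
    messages = [{"role": "user", "content": question}]
    for _ in range(max_steps):
        decision = fake_llm(messages)                # ask the model what to do next
        if "answer" in decision:
            return decision["answer"]
        result = TOOLS[decision["tool"]](decision["input"])
        messages.append({"role": "tool", "content": result})   # scratchpad entry
    return "step limit reached"

print(run_agent("What is 12 * (3 + 4)?"))   # -> The result is 84.
```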

13

LLM Product Engineering

Prompt engineering, streaming APIs, SSE, FastAPI, eval frameworks, cost management.

Why

Knowing the theory is not enough. Production AI products require full-stack engineering.

Build

Deploy a streaming RAG API with FastAPI and connect it to a Next.js frontend.
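A bare-bones sketch assuming FastAPI and uvicorn; retrieval and token generation are stubbed so the focus stays on the Server-Sent Events streaming shape:

```python
import asyncio

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

def retrieve(question: str) -> str:
    # Stub: plug in the embedding-based retriever from the RAG step.
    return "retrieved context goes here"

async def generate_tokens(question: str, context: str):
    # Stub: replace with a streaming call to your LLM provider.
    for token in f"Answering '{question}' using: {context}".split():
        yield token + " "
        await asyncio.sleep(0.05)

@app.get("/ask")
async def ask(q: str):
    context = retrieve(q)

    async def event_stream():
        async for token in generate_tokens(q, context):
            yield f"data: {token}\n\n"      # SSE framing: one data line per event
        yield "data: [DONE]\n\n"

    return StreamingResponse(event_stream(), media_type="text/event-stream")

# Run with: uvicorn main:app --reload
# A Next.js frontend can consume /ask with EventSource or fetch + ReadableStream.
```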