DSA in Python — Course Track
Learn to think like a computer scientist. This track takes you from raw memory all the way to dynamic programming, building genuine intuition at every step.
Why This Track Exists
Every time you use Google Maps, Spotify’s shuffle, or Instagram’s feed, data structures and algorithms are running underneath. The code that makes those products fast isn’t magic — it’s the same ideas you’ll learn here, applied with care.
This track is structured like a course with a deliberate progression:
- Start with memory and arrays — understand how data actually lives in RAM before touching any abstraction.
- Move to linear structures — linked lists and queues that flex where arrays are rigid.
- Build recursion intuition — train your brain to think in sub-problems.
- Master sorting — the proving ground for algorithmic thinking.
- Climb to advanced territory — trees, graphs, heaps, hashing, and dynamic programming.
Each section answers three questions: What is the data layout? Which operations are cheap? What trade-off is being made?
Sections
- Introduction — What is a data structure? What is an algorithm? Why do they power the modern world?
- Algorithm Analysis — Big O notation, time vs space complexity, and how to reason about performance mathematically.
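To give a first taste of the analysis mindset, here is a tiny sketch of counting work instead of guessing at it (the function name `linear_search_steps` is illustrative, not part of the course material):

```python
def linear_search_steps(items, target):
    """Count comparisons a linear scan makes: O(n) in the worst case."""
    steps = 0
    for value in items:
        steps += 1
        if value == target:
            return steps
    return steps

# Worst case: target absent, so every element gets compared.
assert linear_search_steps(list(range(10)), -1) == 10
assert linear_search_steps(list(range(20)), -1) == 20  # double n, double work
```

Counting steps like this is the intuition that Big O notation makes precise.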
Arrays
The foundation of almost every other data structure. Understand arrays and you understand memory.
- Overview
- RAM — How your computer actually stores data at the byte level.
- Static Arrays — Fixed-size, blazing fast, and the basis for everything else.
- Dynamic Arrays — How Python lists grow themselves automatically.
- Stacks — Last in, first out — the secret behind undo buttons and function calls.
- Kadane’s Algorithm — Find the maximum-sum subarray in a single pass.
- Sliding Window — Fixed Size — Efficiently process every window of size k.
- Sliding Window — Variable Size — Expand and shrink a window to meet a condition.
- Two Pointers — Solve array problems with two indices moving toward each other.
- Prefix Sums — Answer range-sum queries in O(1) after O(n) setup.
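As a preview, the prefix-sum pattern from the last bullet fits in a few lines (a minimal sketch; function names are illustrative):

```python
def build_prefix(nums):
    """prefix[i] holds sum(nums[:i]); the extra leading 0 keeps queries uniform."""
    prefix = [0]
    for x in nums:
        prefix.append(prefix[-1] + x)
    return prefix

def range_sum(prefix, left, right):
    """Sum of nums[left:right+1] in O(1) after the O(n) build."""
    return prefix[right + 1] - prefix[left]

prefix = build_prefix([2, 4, 1, 3])
assert range_sum(prefix, 1, 2) == 5  # 4 + 1
```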
Linked Lists
Flexible structures where elements can live anywhere in memory, connected by pointers.
- Overview
- Singly Linked Lists — Chains of nodes, each pointing forward.
- Doubly Linked Lists — Chains that can walk backwards too.
- Queues — First in, first out — how printers and task schedulers work.
- Fast and Slow Pointers — Detect cycles and find midpoints with two pointers at different speeds.
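The fast-and-slow pointer idea above can be sketched as Floyd's cycle detection (a minimal illustration, not the course's own code):

```python
class Node:
    def __init__(self, val):
        self.val = val
        self.next = None

def has_cycle(head):
    """Floyd's tortoise and hare: fast moves two steps, slow moves one.
    If there is a cycle, fast eventually laps slow and they meet."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False

# Build a -> b -> c -> a (cycle) and a short straight list.
a, b, c = Node(1), Node(2), Node(3)
a.next, b.next, c.next = b, c, a
assert has_cycle(a) is True

d = Node(1)
d.next = Node(2)
assert has_cycle(d) is False
```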
Recursion
The art of solving a problem by solving a smaller version of itself.
- Overview
- Factorial — The classic entry point into recursive thinking.
- Fibonacci Sequence — Where naive recursion meets its first performance wall.
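That performance wall, and the way past it, fits in a few lines (a sketch using the standard library's cache decorator):

```python
from functools import lru_cache

def fib_naive(n):
    """Exponential time: the same sub-problems are recomputed over and over."""
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """Caching each sub-result brings the cost down to O(n)."""
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

assert fib_naive(10) == fib_memo(10) == 55
```

Try `fib_naive(35)` versus `fib_memo(35)` to feel the difference yourself.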
Sorting
Rearranging data is one of the most studied problems in computer science — and for good reason.
- Overview
- Insertion Sort — Simple, intuitive, and surprisingly fast on small inputs.
- Merge Sort — Divide, conquer, and merge — reliably O(n log n).
- Quick Sort — The partition-based algorithm behind many real-world sort implementations.
- Bucket Sort — When you know something about your data, you can beat O(n log n).
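As a preview of the divide-and-conquer style, here is a compact merge sort sketch (illustrative, not the course's reference implementation):

```python
def merge_sort(nums):
    """Split in half, sort each half, merge: O(n log n) on every input."""
    if len(nums) <= 1:
        return nums
    mid = len(nums) // 2
    left, right = merge_sort(nums[:mid]), merge_sort(nums[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]  # one side is empty at this point

assert merge_sort([5, 2, 9, 1, 5]) == [1, 2, 5, 5, 9]
```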
Binary Search
Find anything in a sorted collection in O(log n) — eliminate half the possibilities each step.
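The halving idea can be sketched in a few lines (a minimal version; Python's built-in `bisect` module does this for you in practice):

```python
def binary_search(sorted_nums, target):
    """Return the index of target in a sorted list, or -1 if absent.
    Each comparison discards half the remaining range: O(log n)."""
    lo, hi = 0, len(sorted_nums) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_nums[mid] == target:
            return mid
        if sorted_nums[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search([1, 3, 5, 7, 9], 4) == -1
```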
Trees
Hierarchical structures that power databases, file systems, and autocomplete.
- Overview
- Binary Tree — Each node has at most two children.
- Binary Search Tree — A sorted tree you can search in O(log n) when it stays balanced.
- BST Insert and Remove
- Depth-First Search — Explore a tree going deep before going wide.
- Breadth-First Search — Explore layer by layer, level by level.
- BST Sets and Maps — How Python’s set and dict are built.
- Trie — A tree for storing strings — the backbone of autocomplete.
- Union-Find — Track connected components with near-O(1) union and find.
- Segment Tree — Range queries and updates in O(log n).
- Iterative DFS — DFS without recursion using an explicit stack.
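To preview the BST material above, here is a minimal insert plus an in-order traversal, which visits a BST's values in sorted order (a sketch with illustrative names):

```python
class TreeNode:
    def __init__(self, val):
        self.val = val
        self.left = self.right = None

def insert(root, val):
    """Standard BST insert: smaller values go left, larger (or equal) go right."""
    if root is None:
        return TreeNode(val)
    if val < root.val:
        root.left = insert(root.left, val)
    else:
        root.right = insert(root.right, val)
    return root

def inorder(root):
    """Depth-first, left-root-right: yields a BST's values in sorted order."""
    return inorder(root.left) + [root.val] + inorder(root.right) if root else []

root = None
for v in [5, 2, 8, 1, 3]:
    root = insert(root, v)
assert inorder(root) == [1, 2, 3, 5, 8]
```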
Backtracking
Explore every possibility systematically — and prune dead ends early.
- Overview
- Tree Maze — Navigate a maze by trying every path and retreating when stuck.
- Subsets — Generate every subset of a set systematically.
- Combinations — Choose k items from n — without repetition.
- Permutations — Every possible ordering of a set of elements.
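The choose-explore-undo rhythm of backtracking shows up clearly in the subsets problem (a minimal sketch; names are illustrative):

```python
def subsets(nums):
    """At each index, branch twice: include nums[i] or skip it."""
    result, path = [], []

    def backtrack(i):
        if i == len(nums):
            result.append(path[:])  # record a copy of the current choices
            return
        path.append(nums[i])   # choose nums[i]
        backtrack(i + 1)
        path.pop()             # undo the choice, then explore the other branch
        backtrack(i + 1)       # skip nums[i]

    backtrack(0)
    return result

assert sorted(subsets([1, 2])) == [[], [1], [1, 2], [2]]
```

The `path.pop()` line is the "retreat when stuck" step that gives backtracking its name.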
Heap / Priority Queue
Always see the minimum (or maximum) element in O(1) and remove it in O(log n). Used in Dijkstra’s and task scheduling.
- Overview
- Heap Properties
- Push and Pop
- Heapify — Build a heap from an array in O(n).
- Two Heaps — Use a min-heap and max-heap together to track medians in real time.
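The two-heaps trick from the last bullet can be sketched with the standard library's `heapq` (min-heap only, so the low half stores negated values; the class name is illustrative):

```python
import heapq

class MedianTracker:
    """Max-heap of the low half (negated values) + min-heap of the high half."""

    def __init__(self):
        self.low, self.high = [], []

    def add(self, num):
        heapq.heappush(self.low, -num)
        # Keep every low element <= every high element.
        heapq.heappush(self.high, -heapq.heappop(self.low))
        # Keep the halves balanced (low may hold one extra).
        if len(self.high) > len(self.low):
            heapq.heappush(self.low, -heapq.heappop(self.high))

    def median(self):
        if len(self.low) > len(self.high):
            return -self.low[0]
        return (-self.low[0] + self.high[0]) / 2

m = MedianTracker()
for x in [5, 1, 3]:
    m.add(x)
assert m.median() == 3
```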
Hashing
The secret behind Python’s dict and set — O(1) average-case lookup via hash functions.
- Overview
- Hash Usage — Using hash maps and sets to solve problems fast.
- Hash Implementation — How collision resolution actually works inside.
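A quick taste of hash usage: the classic two-sum problem drops from O(n²) to O(n) with one dictionary (a sketch; names are illustrative):

```python
def two_sum(nums, target):
    """Return indices of two numbers summing to target, in one pass.
    The dict maps value -> index, so each lookup is O(1) on average."""
    seen = {}
    for i, num in enumerate(nums):
        if target - num in seen:
            return [seen[target - num], i]
        seen[num] = i
    return []

assert two_sum([2, 7, 11, 15], 9) == [0, 1]
assert two_sum([3, 2, 4], 6) == [1, 2]
```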
Graphs
The most general structure — social networks, maps, and the internet are all graphs.
- Overview
- Intro to Graphs — Nodes, edges, directed vs undirected.
- Matrix DFS
- Matrix BFS
- Adjacency List — The most efficient graph representation for sparse graphs.
- Dijkstra’s — Shortest path in a weighted graph using a priority queue.
- Prim’s — Build a minimum spanning tree greedily, edge by edge.
- Kruskal’s — Build an MST by sorting edges and avoiding cycles.
- Topological Sort — Order tasks so every dependency comes before the task that needs it.
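Dijkstra's algorithm ties several of these ideas together: an adjacency list for the graph and a priority queue for the frontier. A compact sketch (graph shape and names are illustrative):

```python
import heapq

def dijkstra(graph, start):
    """Shortest distances from start in a weighted graph.
    graph is {node: [(neighbor, weight), ...]} with non-negative weights."""
    dist = {start: 0}
    pq = [(0, start)]  # (distance so far, node)
    while pq:
        d, node = heapq.heappop(pq)
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(pq, (nd, nbr))
    return dist

graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
assert dijkstra(graph, "a") == {"a": 0, "b": 1, "c": 3}
```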
Dynamic Programming
Solve complex problems by caching the results of overlapping sub-problems.
- Overview
- 1-Dimension DP — Single-variable state: climbing stairs, coin change.
- 2-Dimension DP — Grid problems and string comparisons.
- 0 / 1 Knapsack — Pick items with weights and values — each item used at most once.
- Unbounded Knapsack — Same as knapsack but items can be reused unlimited times.
- LCS — Find the longest sequence common to two strings.
- Palindromes — Detect and build palindromic substrings with DP.
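The 1-D DP flavor is easiest to see in climbing stairs: the count for step n depends only on the two previous steps, so two variables replace an entire table (a sketch; names are illustrative):

```python
def climb_stairs(n):
    """Ways to climb n steps taking 1 or 2 at a time.
    ways(n) = ways(n - 1) + ways(n - 2), kept in two rolling variables."""
    one_back, two_back = 1, 1
    for _ in range(n - 1):
        one_back, two_back = one_back + two_back, one_back
    return one_back

assert climb_stairs(2) == 2  # 1+1 or 2
assert climb_stairs(5) == 8
```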
Bit Manipulation
Use binary operations directly — the lowest level of computation, and often the fastest.
- Overview
- Bit Operations — AND, OR, XOR, shifts — and why they matter.
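The core operations fit on a few lines; binary literals make the bit patterns visible:

```python
x, y = 0b1010, 0b0110  # 10 and 6

assert x & y == 0b0010    # AND: bits set in both
assert x | y == 0b1110    # OR: bits set in either
assert x ^ y == 0b1100    # XOR: bits set in exactly one
assert x << 1 == 0b10100  # left shift: multiply by 2
assert x >> 1 == 0b0101   # right shift: floor-divide by 2
assert x & (x - 1) == 0b1000  # classic trick: clear the lowest set bit
```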