Chapter 1: The First Step - Climbing Stairs

Every journey begins with a single step. For our journey into Dynamic Programming, that first step will be LeetCode's "Climbing Stairs" problem. It is a classic for a reason: it's simple enough to grasp intuitively, yet it perfectly demonstrates the core principles of DP and the transition from an inefficient recursive solution to a highly optimized one.

The Problem:

You are climbing a staircase. It takes n steps to reach the top. Each time you can either climb 1 or 2 steps. In how many distinct ways can you climb to the top?

For example:

  • If n = 2, there are two ways: (1 step + 1 step) and (2 steps).

  • If n = 3, there are three ways: (1 + 1 + 1), (1 + 2), and (2 + 1).

Let's break this down.

Approach 1: The Recursive Solution

How do we solve this? Let's think about the very last step we take to reach the top (n). We could have either stepped from stair n-1 or jumped from stair n-2. There are no other possibilities.

This insight is the key! The total number of ways to reach stair n is simply the number of ways to reach stair n-1 plus the number of ways to reach stair n-2.

This gives us a beautiful recursive formula: ways(n) = ways(n-1) + ways(n-2)

We also need base cases to stop the recursion:

  • ways(1) = 1 (Only one way: take 1 step)

  • ways(2) = 2 (Two ways: 1+1 or 2)

Let's translate this directly into Python code.

def climbStairs_recursive(n: int) -> int:
    # Base cases
    if n == 1:
        return 1
    if n == 2:
        return 2

    # Recursive step
    return climbStairs_recursive(n - 1) + climbStairs_recursive(n - 2)

# Trying it out
# print(climbStairs_recursive(5)) # Output: 8
# print(climbStairs_recursive(40)) # This will be VERY slow!

If you run climbStairs_recursive(40), you'll notice it takes a very long time to complete. Why? Because we are re-calculating the same values over and over again. For instance, climbStairs_recursive(5) will call climbStairs_recursive(4) and climbStairs_recursive(3). But the call to climbStairs_recursive(4) will also call climbStairs_recursive(3). We are solving the same subproblem multiple times, leading to an exponential time complexity. This is where DP shines.
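To see this blow-up concretely, here is a small sketch that counts how many calls the naive recursion makes (the counter and the function name `counted` are purely illustrative):

```python
call_count = 0

def counted(n: int) -> int:
    """Same logic as climbStairs_recursive, but counts every call made."""
    global call_count
    call_count += 1
    if n <= 2:
        return n  # ways(1) = 1, ways(2) = 2
    return counted(n - 1) + counted(n - 2)

counted(30)
print(call_count)  # over 1.6 million calls just for n = 30
```

There are only 30 distinct subproblems here, yet the naive recursion visits them over a million times in total. Caching each subproblem's answer, as we do next, collapses that count to just 30.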

Approach 2: Memoization with a Decorator (Top-Down DP)

The problem with our recursive solution is re-computation. The solution is simple: let's store, or cache, the result of a calculation the first time we perform it. The next time we need that same result, we can just look it up in our cache instead of re-computing it. This technique is called memoization.

In Python, we can achieve this with incredible ease using a built-in decorator: @functools.lru_cache. This decorator wraps our function and automatically handles the caching for us.

Let's apply it to our recursive solution.
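A sketch of what this looks like — identical to the recursive version apart from the import and the decorator line:

```python
import functools

@functools.lru_cache(maxsize=None)  # maxsize=None means the cache never evicts entries
def climbStairs_memoized(n: int) -> int:
    # Base cases
    if n == 1:
        return 1
    if n == 2:
        return 2

    # Recursive step -- each distinct n is now computed only once
    return climbStairs_memoized(n - 1) + climbStairs_memoized(n - 2)

# print(climbStairs_memoized(40))  # Returns almost instantly now
```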

By adding a single line of code, we've transformed our slow, exponential solution into a highly efficient one. This is the power of top-down dynamic programming. We kept the same recursive logic but eliminated the redundant work.

How Does the Cache Decorator Work?

It feels like magic, but it's a core feature of Python. A decorator is a function that takes another function as an argument, adds some functionality to it, and returns a new, modified function. The @ symbol is just a clean way to apply this transformation.

So how does @functools.lru_cache work? Under the hood, it maintains a dictionary (or a hash map) to store the results of function calls. The keys of this dictionary are the arguments passed to the function, and the values are the corresponding return values.

When our decorated climbStairs_memoized function is called:

  1. The decorator first checks if the argument (n) already exists as a key in its cache.

  2. If it does, it immediately returns the value from the cache, skipping the actual function execution.

  3. If it doesn't, it calls our original function, stores the result in the cache with n as the key, and then returns that result.

This ensures that the expensive computation is only performed once for any given n.

Building Our Own Simple Cache Decorator

To make this concept crystal clear, let's write our own basic caching decorator.

Now, let's apply our homemade decorator to the original recursive function.
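A minimal sketch of both steps together (the decorator name `simple_cache` is just illustrative):

```python
def simple_cache(func):
    cache = {}  # maps an argument to its previously computed result

    def wrapper(n):
        if n in cache:        # cache hit: skip the real computation
            return cache[n]
        result = func(n)      # cache miss: compute...
        cache[n] = result     # ...store...
        return result         # ...and return

    return wrapper

@simple_cache
def climbStairs(n: int) -> int:
    if n == 1:
        return 1
    if n == 2:
        return 2
    return climbStairs(n - 1) + climbStairs(n - 2)

# print(climbStairs(40))  # Fast, just like with lru_cache
```

Note that the recursive calls inside climbStairs go through the decorated name, so they benefit from the cache too — that is exactly why the decorator pattern works so well here.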

As you can see, our simple decorator achieves the same goal! functools.lru_cache is, of course, far more advanced. It includes logic for managing the cache size (the "LRU" stands for "Least Recently Used," meaning it discards old items when the cache gets full), it's written in C for speed, and it handles more complex argument types. However, at its core, it operates on the same principle we've just demonstrated.

Now, let's look at another way to solve this problem that avoids recursion entirely.

Approach 3: Iteration (Bottom-Up DP)

Memoization is a great way to optimize a recursive algorithm. But we can also solve the problem without any recursion at all. Instead of starting from n and working our way down (top-down), we can start from our base cases and build our way up to n (bottom-up). This is known as tabulation.

  1. We know the number of ways to reach step 1 is 1.

  2. We know the number of ways to reach step 2 is 2.

  3. We can calculate the ways to reach step 3 by adding the previous two: ways(3) = ways(1) + ways(2) = 1 + 2 = 3.

  4. We can continue this pattern until we reach n.

Let's implement this iteratively.
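One way to write this, keeping only the two most recent results so that space stays constant:

```python
def climbStairs_iterative(n: int) -> int:
    if n <= 2:
        return n  # ways(1) = 1, ways(2) = 2

    prev, curr = 1, 2  # ways to reach steps 1 and 2
    for _ in range(3, n + 1):
        # ways(i) = ways(i-1) + ways(i-2)
        prev, curr = curr, prev + curr
    return curr

# print(climbStairs_iterative(5))  # Output: 8
```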

This bottom-up approach is often the most efficient. It runs in linear time, O(n) (a single loop from 3 to n), and constant space, O(1) (we only ever store the two previous results).

Conclusion

In this chapter, we took a simple problem and solved it in three different ways:

  1. Pure Recursion: Great for defining the problem, but inefficient due to re-computation.

  2. Memoization (Top-Down DP): We kept the recursive structure and used a cache (via a decorator) to store results, making it efficient.

  3. Iteration (Bottom-Up DP): We started from the base cases and built the solution iteratively, resulting in the most efficient code.

This pattern of thinking—from recursion to memoization to iteration—is a powerful framework that you will use to solve many dynamic programming problems. In the next chapter, we will tackle a new problem and apply these same ideas.
