## § LCS DP: The speedup is from filtration

• I feel like I finally see where the power of dynamic programming lies.
• Consider the longest common subsequence problem over arrays $A$, $B$ of size $n$, $m$.
• Naively, there are $2^n \times 2^m$ pairs of subsequences, and we would have to examine each of them.
• How does the LCS DP solution manage to solve this in $O(nm)$?
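To make the naive search space concrete, here is a brute force (a sketch; the function names are mine, not standard) that literally walks all $2^n \times 2^m$ pairs:

```python
from itertools import combinations

def subsequences(xs):
    """All subsequences of xs, as tuples -- there are 2^len(xs) of them."""
    return [tuple(xs[i] for i in idxs)
            for r in range(len(xs) + 1)
            for idxs in combinations(range(len(xs)), r)]

def lcs_brute_force(A, B):
    """Examine every pair (s, t) in 2^n x 2^m; keep the longest matching pair."""
    return max(len(s)
               for s in subsequences(A)
               for t in subsequences(B)
               if s == t)
```

Even for modest $n$, $m$ this is hopeless; the question is how the DP avoids it.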
• Key idea 1: create a "filtration" of the problem, $F_{i, j} \subseteq 2^n \times 2^m$. At step $(i, j)$, consider the "filter" $F_{i, j}$ containing all pairs of subsequences $(s \in 2^n, t \in 2^m)$ with $\mathrm{maxix}(s) \leq i$ and $\mathrm{maxix}(t) \leq j$, where $\mathrm{maxix}$ denotes the largest index the subsequence uses.
• The filters of the filtration nest into one another: $F_{i, j} \subseteq F_{i', j'}$ iff $i \leq i'$ and $j \leq j'$.
• Key idea 2: The value $\mathrm{maxLCS}(F_{i,j})$ (the longest common subsequence length among pairs in the filter) is (a) monotone in the filtration order, and (b) efficiently computable from the values of the filters immediately below. So we have a monotone map from the poset of filters to the solution space, and this map is efficiently computable given the values of the filters below it in the filtration.
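Writing $L(i, j)$ for the value of the filter $F_{i, j}$ (my notation), the efficiently computable monotone map is the familiar LCS recurrence:

```latex
L(0, j) = L(i, 0) = 0, \qquad
L(i, j) = \max\bigl( L(i-1, j),\; L(i, j-1),\; L(i-1, j-1) + [A_i = B_j] \bigr)
```

where $[\cdot]$ is the Iverson bracket: the three arguments of the max are exactly the values of the three filters directly below $F_{i, j}$.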
• This gives us a recurrence: start from the bottom filter $F_{0, 0}$ and build upward.
• See that this really has nothing to do with recursion. It has to do with problem decomposition: we decompose the space $2^n \times 2^m$ cleverly via the filtration $F_{i, j}$ so that $\mathrm{maxLCS}(F_{i, j})$ is efficiently computable.
• To find a DP, think of the entire state space, then look for a filtration such that the solution function becomes a monotone map, efficiently computable given the values of the filters below it.
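The whole scheme in code, as a minimal sketch (the function name and table layout are mine): compute the filter values bottom-up, each cell from the three filters directly below it.

```python
def lcs_filtration(A, B):
    """Bottom-up DP: L[i][j] holds maxLCS(F_{i,j}), the best LCS length
    over pairs of subsequences drawn from A[:i] and B[:j]."""
    n, m = len(A), len(B)
    # L[0][*] = L[*][0] = 0: the bottom filters contain only empty matches.
    L = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Monotone map: value of F_{i,j} from the three filters below it.
            L[i][j] = max(L[i - 1][j],
                          L[i][j - 1],
                          L[i - 1][j - 1] + (A[i - 1] == B[j - 1]))
    return L[n][m]  # the top filter F_{n,m} is the whole problem
```

Each of the $nm$ cells is filled in $O(1)$, which is where the $O(nm)$ bound comes from.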