§ Edward Kmett's list of useful math
- I use Bayesian statistics constantly for probabilistic programming and neural networks. Calculus gave me the background to understand automatic differentiation, which gets used in everything I do. Real analysis doesn't come up often, but intermediate value thm. style arguments are handy.
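The dual-number trick at the heart of forward-mode automatic differentiation fits in a few lines of Haskell. This is a toy sketch (names like `Dual` and `diff` are illustrative; real libraries such as `ad` handle reverse mode, higher derivatives, and transcendental functions):

```haskell
-- Forward-mode AD via dual numbers: carry a value together with its
-- derivative, and let overloaded arithmetic apply the chain rule.
data Dual = Dual Double Double  -- (value, derivative)
  deriving Show

instance Num Dual where
  Dual a a' + Dual b b' = Dual (a + b) (a' + b')
  Dual a a' * Dual b b' = Dual (a * b) (a' * b + a * b')  -- product rule
  negate (Dual a a')    = Dual (negate a) (negate a')
  abs    (Dual a a')    = Dual (abs a) (a' * signum a)
  signum (Dual a _)     = Dual (signum a) 0
  fromInteger n         = Dual (fromInteger n) 0  -- constants have derivative 0

-- Differentiate f at x by seeding the derivative component with 1.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = let Dual _ d = f (Dual x 1) in d
```

For example, `diff (\x -> x*x + 3*x) 2` evaluates the derivative of x² + 3x at 2.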
- I don't use classic geometry, but when I was doing graphics I used projective geometry, and that served as a gateway to understanding category theory's duality principle, and I use category theory to organize most of the code I write.
- I took one course on differential geometry. The knowledge I gained from it probably led to about half of my income to date. I've made a career out of weaponizing "obscure" math, e.g. using toric varieties of rational functions of Zhegalkin polynomials to schedule instructions...
- Differential equations? Physics? Well, I use Hamiltonian methods for most of the sampling I do in the world of probabilistic programming. So understanding symplectic integrators is a big step, and I have to move frictionless particles subject to a Hamiltonian, so there's diff eq.
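The core of a symplectic integrator is tiny. Here is a sketch of the leapfrog (Störmer–Verlet) kick-drift-kick step for a separable Hamiltonian H(q,p) = p²/2 + V(q), the workhorse inside Hamiltonian Monte Carlo (function names are my own, not from any particular library):

```haskell
-- One leapfrog step: half-kick on momentum, full drift on position,
-- half-kick again. Symplectic, so energy error stays bounded.
leapfrog :: (Double -> Double)   -- dV/dq, gradient of the potential
         -> Double               -- step size
         -> (Double, Double)     -- (position q, momentum p)
         -> (Double, Double)
leapfrog gradV eps (q, p) =
  let pHalf = p - 0.5 * eps * gradV q
      q'    = q + eps * pHalf
      p'    = pHalf - 0.5 * eps * gradV q'
  in (q', p')

-- Iterate n steps by composing the stepper with itself.
simulate :: Int -> (Double -> Double) -> Double
         -> (Double, Double) -> (Double, Double)
simulate n gradV eps = foldr (.) id (replicate n (leapfrog gradV eps))
```

For the harmonic oscillator V(q) = q²/2 (so `gradV = id`), the energy (q² + p²)/2 stays near its initial value over thousands of steps instead of drifting the way it does under naive Euler integration.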
- Fourier analysis, heat equations? Well, if I want to approximate distances on a space/distribution, there's the heat method: http://ddg.math.uni-goettingen.de/pub/GeodesicsInHeat.pdf
- Learning group theory "bottom up" from monoids and the like has been useful, because I use monoids basically everywhere as a functional programmer to aggregate data. It led to the work I did at S&P Capital IQ, and later to basically my entire niche as a functional programmer.
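A tiny illustration of monoidal aggregation: the pair of two monoids is again a monoid (combining componentwise), so one `foldMap` fuses several summaries into a single traversal (`summarize` is an illustrative name, not a library function):

```haskell
import Data.Monoid (Sum(..))

-- Aggregate once, get several summaries: here the running sum and the
-- element count, computed in one pass over the data.
summarize :: [Double] -> (Double, Int)
summarize xs =
  let (Sum s, Sum n) = foldMap (\x -> (Sum x, Sum (1 :: Int))) xs
  in (s, n)
```

Because the combining operation is associative, the same `foldMap` parallelizes or distributes for free — which is most of the point when aggregating at scale.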
- But understanding more of the surrounding landscape has been useful as well, as I use all sorts of lattices and order theory when working with propagators. And I use regular and inverse semigroups (note, not groups!) when working with all sorts of fast parsing techniques.
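A minimal propagator-flavored lattice, as a sketch: a cell's contents only ever gain information, so merging two facts is a lattice join. For intervals ordered by precision, the join is intersection (this `Interval` type is purely illustrative):

```haskell
-- An interval of possible values. Merging two intervals keeps only
-- what both allow; lo > hi would signal a contradiction.
data Interval = Interval Double Double
  deriving (Eq, Show)

-- (<>) is the lattice join in the information order: associative,
-- commutative, and idempotent, which is exactly what lets a propagator
-- network merge facts in any order, any number of times.
instance Semigroup Interval where
  Interval lo1 hi1 <> Interval lo2 hi2 =
    Interval (max lo1 lo2) (min hi1 hi2)
```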
- Complex analysis? Understanding Möbius transformations means I can understand how continued fractions work, which leads to models for computable reals that don't waste computation and are optimally lazy. Knowing analytic functions lets me understand complex step differentiation.
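Complex-step differentiation itself is almost a one-liner: for analytic f, f'(x) ≈ Im f(x + ih)/h. There is no subtraction of nearby values, hence no cancellation, so h can be absurdly small (sketch; `complexStep` is an illustrative name):

```haskell
import Data.Complex (Complex(..), imagPart)

-- Complex-step differentiation: evaluate f a tiny step off the real
-- axis and read the derivative out of the imaginary part. Unlike
-- finite differences, accuracy is not limited by cancellation.
complexStep :: (Complex Double -> Complex Double) -> Double -> Double
complexStep f x = imagPart (f (x :+ h)) / h
  where h = 1e-20
```

Compare with a forward difference at the same h, which would return garbage: here the h² truncation error is far below machine epsilon.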
- Linear algebra is in everything I've done since high school. Learning a bit of geometric algebra, and playing around with Plücker coordinates led to me licensing tech to old game companies for computational visibility back before it was a "solved" problem.
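For flavor, here are Plücker coordinates of a 3D line and the permuted inner product whose sign tells you which way two lines wind past each other — the primitive behind line-based visibility queries. A bare-bones sketch using tuples:

```haskell
type V3 = (Double, Double, Double)

-- Plücker coordinates of the line through points p and q:
-- direction d = q - p, moment m = p × q.
plucker :: V3 -> V3 -> (V3, V3)
plucker (px,py,pz) (qx,qy,qz) =
  ( (qx-px, qy-py, qz-pz)
  , (py*qz - pz*qy, pz*qx - px*qz, px*qy - py*qx) )

-- Permuted inner product of two lines: zero iff the lines meet (or are
-- parallel); otherwise the sign gives the sense in which they cross.
side :: (V3, V3) -> (V3, V3) -> Double
side ((d1x,d1y,d1z),(m1x,m1y,m1z)) ((d2x,d2y,d2z),(m2x,m2y,m2z)) =
  d1x*m2x + d1y*m2y + d1z*m2z + m1x*d2x + m1y*d2y + m1z*d2z
```

A ray stabs a triangle exactly when it has consistent `side` signs against the triangle's three edge lines — no plane equations needed.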
- Wandering back a bit, Gröbner bases wind up being useful for comparing circuits modulo 'don't care' bits for impossible situations, and all sorts of other simplification tasks.
- Let's go obscure. Padé approximants? Good rational approximations, not polynomial ones. Sounds academic, but computing exp and log is expensive, and fast rational approximations can be conservative, monotone and have nice derivatives, speeding NN-like designs a lot.
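Concretely, the [2/2] Padé approximant of exp matches its Taylor series through x⁴ using a couple of multiplies and one divide (a sketch; a production version would pick the order carefully and do range reduction first):

```haskell
-- [2/2] Padé approximant of exp: a ratio of quadratics agreeing with
-- exp to fourth order at 0, with O(x^5) error -- far tighter near 0
-- than the degree-4 Taylor polynomial at the same cost.
expPade22 :: Double -> Double
expPade22 x = (1 + x/2 + x*x/12) / (1 - x/2 + x*x/12)
```

Note the symmetry of numerator and denominator: it gives `expPade22 x * expPade22 (-x) = 1` exactly, mirroring the identity eˣ·e⁻ˣ = 1 — one of those "nice derivative/structure" properties polynomials don't offer.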
- Weird number systems => Data structures. Category theory acts as a rosetta stone for so many other areas of math it isn't even funny. You can understand almost all of the essential bits of quantum computing just by knowing category theory.
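One half of the number-systems/data-structures analogy (Okasaki's "numerical representations"): incrementing a little-endian binary number is the same carry cascade as inserting into a binomial heap, where a carry becomes "link two equal-sized trees." Here is just the number side, as a sketch:

```haskell
-- A little-endian list of bits. Incrementing either flips a low 0 to 1
-- or carries: flip 1s to 0s until a 0 absorbs the carry. A binomial
-- heap insert follows exactly this recursion, linking trees instead of
-- flipping bits.
inc :: [Bool] -> [Bool]
inc []           = [True]
inc (False : bs) = True : bs
inc (True  : bs) = False : inc bs   -- the carry / tree-link case

-- Read the bits back as an ordinary Int, to check the analogy's bookkeeping.
toInt :: [Bool] -> Int
toInt = foldr (\b n -> fromEnum b + 2 * n) 0
```

Skew binary gives O(1) increment, and correspondingly skew binary random access lists give O(1) cons — the data structure inherits the arithmetic's cost model.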
- Logic. Well, which logic? You run into a modal logic in a philosophy class some time and think of it as a clever hack, but monads in FP are basically a modality. Modal logics for necessity/possibility model effect systems well. Substructural logics to manage resource usage...
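Concretely, `Maybe` reads as a possibility-style modality: a `Maybe a` is "an `a`, possibly absent," and do-notation is the discipline for working under the modality (a toy example; `safeDiv` and `calc` are illustrative names):

```haskell
-- Division under the "possibly undefined" modality.
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv a b = Just (a `div` b)

-- Sequencing under the modality: the whole computation is defined
-- only if every step is; a Nothing anywhere short-circuits the rest.
calc :: Int -> Int -> Int -> Maybe Int
calc a b c = do
  x <- safeDiv a b
  safeDiv x c
```

Swap `Maybe` for another monad and the same do-block models a different modality — state, nondeterminism, I/O — which is the modal-logic observation wearing FP clothes.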
- I don't use a lot of number theory. There. Mind you, this is also one of those areas where I'm just standing on weak foundations, so I don't know what I don't know. I just know it's hard for me to do all sorts of things that sound easy, and I muddle through w/ my limited background.