Show HN: The Hessian of tall-skinny networks is easy to invert (github.com)
rahimiali · Thursday, January 15, 2026

It turns out the inverse of the Hessian of a deep net is easy to apply to a vector. Doing this naively takes a number of operations cubic in the number of layers (so impractical), but it's possible to do it in time linear in the number of layers (so very practical)!
This is possible because the Hessian of a deep net has a matrix polynomial structure that factorizes nicely. The Hessian-inverse-product algorithm that takes advantage of this is similar to running backprop on a dual version of the deep net. It echoes an old idea of Pearlmutter's for computing Hessian-vector products.
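The post itself doesn't include code, but the Pearlmutter trick it echoes is standard, so here's a minimal JAX sketch of a Hessian-vector product to show the primitive the inverse-product algorithm is built in analogy to. The toy "tall-skinny" loss, layer count, and shapes are my own illustration, not anything from the repo:

```python
import jax
import jax.numpy as jnp

def hvp(loss_fn, params, v):
    # Pearlmutter's trick: differentiate the gradient along direction v
    # (forward-over-reverse), at roughly the cost of two gradient passes.
    return jax.jvp(jax.grad(loss_fn), (params,), (v,))[1]

# Toy tall-skinny net: many layers, each only 4 units wide.
def loss_fn(params):
    x = jnp.ones(4)
    for W in params:
        x = jnp.tanh(W @ x)
    return jnp.sum(x ** 2)

params = [0.5 * jnp.eye(4) for _ in range(16)]  # 16 narrow layers
v = [jnp.ones((4, 4)) for _ in range(16)]       # direction to multiply H by
Hv = hvp(loss_fn, params, v)                    # list of (4, 4) arrays
```

Applying the *inverse* Hessian is the harder direction; the post's claim is that the same layer-by-layer structure that makes Hv cheap also makes H⁻¹v cheap, rather than requiring a generic cubic-cost solve.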
Maybe this idea is useful as a preconditioner for stochastic gradient descent?
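If the inverse-Hessian product really is linear in depth, the obvious experiment is a Newton-style step that descends along p ≈ H⁻¹g instead of g. A minimal sketch, where `hivp` is a hypothetical stand-in for the post's linear-time Hessian-inverse-vector product (not implemented here):

```python
import jax

def preconditioned_step(loss_fn, params, hivp, lr=1e-2):
    # One preconditioned step: move along p ≈ H^{-1} g instead of g.
    # `hivp(loss_fn, params, g)` is a placeholder for a fast
    # Hessian-inverse-vector product; it is NOT defined here.
    g = jax.grad(loss_fn)(params)
    p = hivp(loss_fn, params, g)  # hypothetical solve: H p = g
    return jax.tree_util.tree_map(lambda w, d: w - lr * d, params, p)
```

In practice one would probably need damping or a trust region, since the Hessian of a nonconvex deep net is generally not positive definite.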