Show HN: Self-growing neural networks via a custom Rust-to-LLVM compiler
pierridotite · Sunday, December 28, 2025

Hi HN,
I built NOMA (Neural-Oriented Machine Architecture), a systems language where reverse-mode autodiff is a compiler pass (lowered to LLVM IR).
My goal is to treat model parameters as explicit, growable memory buffers. Because NOMA compiles to standalone native binaries (no Python runtime), weight buffers can be realloc'd mid-training, which makes "self-growing" architectures a system primitive rather than a complex framework hack.
I just pushed a reproducible benchmark (Self-Growing XOR) to validate the methodology: it compares NOMA against PyTorch and C++, specifically testing how preserving optimizer state (Adam moments) during growth affects convergence.
I am looking for contributors! If you are into Rust, LLVM, or SSA, I’d love help on the harder parts (control-flow AD and memory safety).
Repo: https://github.com/pierridotite/NOMA