
Show HN: Self-growing neural networks via a custom Rust-to-LLVM compiler

pierridotite Sunday, December 28, 2025

Hi HN,

I built NOMA (Neural-Oriented Machine Architecture), a systems language where reverse-mode autodiff is a compiler pass (lowered to LLVM IR).
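For readers who haven't seen reverse-mode AD outside a framework: the pass statically emits the same adjoint computation that a runtime tape would record. Here is a minimal Rust tape as an analogue of what such a pass generates; the Tape/Op names are mine, not NOMA's IR.

    // Minimal reverse-mode AD tape in Rust. NOMA does this as a static
    // compiler pass over SSA; this runtime tape is just an analogue of
    // the gradient code that pass would emit. All names are hypothetical.

    #[derive(Clone, Copy)]
    enum Op { Leaf, Add(usize, usize), Mul(usize, usize) }

    struct Tape { vals: Vec<f64>, ops: Vec<Op> }

    impl Tape {
        fn leaf(&mut self, v: f64) -> usize {
            self.vals.push(v); self.ops.push(Op::Leaf); self.vals.len() - 1
        }
        fn add(&mut self, a: usize, b: usize) -> usize {
            self.vals.push(self.vals[a] + self.vals[b]);
            self.ops.push(Op::Add(a, b)); self.vals.len() - 1
        }
        fn mul(&mut self, a: usize, b: usize) -> usize {
            self.vals.push(self.vals[a] * self.vals[b]);
            self.ops.push(Op::Mul(a, b)); self.vals.len() - 1
        }
        // Backward sweep: walk the tape in reverse, accumulating adjoints.
        fn grad(&self, out: usize) -> Vec<f64> {
            let mut adj = vec![0.0; self.vals.len()];
            adj[out] = 1.0;
            for i in (0..self.ops.len()).rev() {
                match self.ops[i] {
                    Op::Leaf => {}
                    Op::Add(a, b) => { adj[a] += adj[i]; adj[b] += adj[i]; }
                    Op::Mul(a, b) => {
                        adj[a] += adj[i] * self.vals[b];
                        adj[b] += adj[i] * self.vals[a];
                    }
                }
            }
            adj
        }
    }

    fn main() {
        let mut t = Tape { vals: vec![], ops: vec![] };
        let a = t.leaf(2.0);
        let b = t.leaf(3.0);
        let ab = t.mul(a, b);
        let y = t.add(ab, a);  // y = a*b + a
        let g = t.grad(y);
        println!("dy/da = {}, dy/db = {}", g[a], g[b]); // 4, 2
    }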

My goal is to treat model parameters as explicit, growable memory buffers. Because NOMA compiles to standalone native binaries (no Python runtime), it can realloc weight buffers mid-training, which makes "self-growing" architectures a system primitive rather than a framework hack.
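To make that concrete, here is a rough Rust sketch of the buffer semantics (the Params type and grow method are hypothetical illustrations, not NOMA's actual runtime): old weights keep their values across the realloc, and only the new slots get a fresh init.

    // Sketch of a growable parameter buffer. NOMA lowers this idea to a
    // plain realloc on a native heap buffer; this Rust struct is only an
    // illustration of the semantics.

    struct Params {
        weights: Vec<f32>, // backed by a single heap allocation
    }

    impl Params {
        fn new(n: usize) -> Self {
            Params { weights: vec![0.0; n] }
        }

        // Grow the buffer mid-training: old weights keep their learned
        // values, new slots get a small init.
        fn grow(&mut self, extra: usize, init: f32) {
            let new_len = self.weights.len() + extra;
            self.weights.resize(new_len, init); // reallocs if needed
        }
    }

    fn main() {
        let mut p = Params::new(4);
        p.weights.copy_from_slice(&[0.5, -0.3, 0.8, 0.1]); // "trained" weights
        p.grow(2, 0.01);                                    // add 2 new slots
        assert_eq!(p.weights.len(), 6);
        assert_eq!(p.weights[0], 0.5); // old weights preserved across the grow
    }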

I just pushed a reproducible benchmark (Self-Growing XOR) to validate the methodology: it compares NOMA against PyTorch and C++, specifically testing how preserving optimizer state (Adam moments) during growth affects convergence.
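The bookkeeping under test, roughly: when the weight buffer grows, Adam's moment buffers must grow with it, keeping the existing moments and step count rather than resetting them. A hedged Rust sketch of my reading of that (not the repo's actual code):

    // Sketch of preserving Adam state across a growth event. The struct
    // and method names are hypothetical; the benchmark compares keeping
    // m/v and the step count against resetting them.

    struct Adam {
        m: Vec<f32>, // first-moment estimates
        v: Vec<f32>, // second-moment estimates
        step: u64,   // shared timestep for bias correction
    }

    impl Adam {
        fn new(n: usize) -> Self {
            Adam { m: vec![0.0; n], v: vec![0.0; n], step: 0 }
        }

        // Grow optimizer state alongside the weights: existing moments
        // survive; only the new slots start cold at zero.
        fn grow(&mut self, extra: usize) {
            self.m.resize(self.m.len() + extra, 0.0);
            self.v.resize(self.v.len() + extra, 0.0);
            // self.step is deliberately left untouched.
        }
    }

    fn main() {
        let mut opt = Adam::new(4);
        opt.step = 100;  // pretend we've trained for a while
        opt.m[0] = 0.2;
        opt.grow(2);     // weights grew by 2; state grows with them
        assert_eq!(opt.m.len(), 6);
        assert_eq!(opt.m[0], 0.2); // moments preserved
        assert_eq!(opt.step, 100); // bias-correction clock preserved
    }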

I am looking for contributors! If you are into Rust, LLVM, or SSA, I’d love help on the harder parts (control-flow AD and memory safety).

Repo: https://github.com/pierridotite/NOMA
