
Show HN: CPU-based Neural Net. Zero floats. Returns "I don't know"

kwojno Monday, February 02, 2026

I just built a pattern-matching neural network with zero IEEE 754 floats. All arithmetic uses integer ratios (5/8 instead of 0.625), compared via cross-multiplication. No infinity anywhere. The key feature: when confidence is below threshold, it returns "I don't know" instead of hallucinating an answer. Tested on medical diagnosis (1179 diseases) — 50% correct, 20% wrong-but-related, 30% honest "I don't know".
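The float-free ratio comparison described above can be sketched in a few lines of Rust. This is a minimal illustration of the idea, not the repo's actual `Ratio` type; the names and the `u128` widening are my assumptions:

```rust
// Hypothetical sketch of a float-free confidence value: an integer ratio
// compared by cross-multiplication, so no division and no IEEE 754 anywhere.
#[derive(Clone, Copy, Debug)]
struct Ratio {
    num: u64,
    den: u64, // invariant: den > 0, so "infinity" is unrepresentable
}

impl Ratio {
    fn new(num: u64, den: u64) -> Ratio {
        assert!(den > 0, "denominator must be positive");
        Ratio { num, den }
    }

    // Compare a/b < c/d via a*d < c*b. Widening to u128 avoids overflow.
    fn lt(self, other: Ratio) -> bool {
        (self.num as u128) * (other.den as u128)
            < (other.num as u128) * (self.den as u128)
    }
}

fn main() {
    let confidence = Ratio::new(5, 8); // 5/8 = 0.625, never computed as a float
    let threshold = Ratio::new(2, 3);  // 2/3, likewise exact
    // 5*3 = 15 < 2*8 = 16, so confidence is below threshold.
    println!("below threshold: {}", confidence.lt(threshold));
}
```

Because both sides stay integers, equality of 1/2 and 2/4 also falls out exactly, with no rounding to worry about.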

Core ideas:

- Every operation costs "budget" — network dies before seeing everything
- Confidence is Ratio(n, d), not a float — division only for human display
- Rare symptoms weighted higher (entropy from information theory)
- 500 lines of Rust, formal spec in Coq (44 files)
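Two of these ideas can be sketched with integers only. The names below (`Budget`, `int_log2_weight`) and the specific weighting formula are my assumptions for illustration, not the repo's actual API:

```rust
// Hedged sketch of two core ideas: a finite observation budget, and an
// integer-only entropy weight that makes rare symptoms count more.

// Every observation spends from a finite budget; once it is gone, the
// network must answer with what it has seen -- it never "sees everything".
struct Budget(u32);

impl Budget {
    fn spend(&mut self, cost: u32) -> bool {
        if self.0 >= cost {
            self.0 -= cost;
            true
        } else {
            false // out of budget: observation refused
        }
    }
}

// Approximate the information content -log2(k/n) of a symptom present in
// k of n diseases by repeated doubling: ceil(log2(n/k)), pure integers.
fn int_log2_weight(k: u64, n: u64) -> u32 {
    assert!(1 <= k && k <= n);
    let mut weight = 0;
    let mut x = k;
    while x < n {
        x *= 2; // each doubling is one bit of "surprise"
        weight += 1;
    }
    weight
}

fn main() {
    let mut budget = Budget(2);
    println!("first look allowed: {}", budget.spend(1));
    println!("second look allowed: {}", budget.spend(1));
    println!("third look allowed: {}", budget.spend(1)); // false: budget spent

    // A symptom shared by 1 of 1024 diseases outweighs one shared by 512.
    println!("rare symptom weight: {}", int_log2_weight(1, 1024));   // 10
    println!("common symptom weight: {}", int_log2_weight(512, 1024)); // 1
}
```

The doubling loop is the integer analogue of `-log2(p)`: a symptom seen in 1 of 1024 diseases carries ten bits of information, one seen in half of them carries one bit.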

Emerged from a critique of Pascal's Wager: what if observation itself has finite cost?

Oh, and it runs on a CPU, not a GPU. And it runs fast as hell. How? That's a bit of a mystery to me too...

However, what's most important is that it is inherently capable of saying "well, I dunno...".

No binary dictate of YES or NO: there is always the option to choose UNCERTAINTY. And there is nothing more natural than uncertainty in real life.
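That three-valued output can be sketched as an enum; the names and the threshold check below are illustrative assumptions, not the repo's actual types:

```rust
// Illustrative three-valued verdict: the network may abstain instead of
// being forced into binary YES/NO.
#[derive(Debug, PartialEq)]
enum Verdict {
    Yes,
    No,
    Uncertain, // "well, I dunno..."
}

// Confidence n/d is compared against threshold tn/td by cross-multiplying,
// so no float division is ever performed.
fn decide(n: u64, d: u64, tn: u64, td: u64, positive: bool) -> Verdict {
    if n * td < tn * d {
        Verdict::Uncertain // below threshold: admit ignorance
    } else if positive {
        Verdict::Yes
    } else {
        Verdict::No
    }
}

fn main() {
    // 5/8 confidence against a 2/3 threshold: 5*3 = 15 < 2*8 = 16, abstain.
    println!("{:?}", decide(5, 8, 2, 3, true)); // Uncertain
}
```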

Well, in the repository you will find not only the network but also a hefty collection of Coq files defining the ontology of finitary math, which I wrote with AI during my long stay in a hospital bed with cancer. The cancer is receding, the code is compiling, and the network is learning. Try it out.
