
Show HN: Omni-Glass – Rust app that turns screen pixels into MCP tool calls

goshtasb Sunday, February 22, 2026

Omni-Glass is an open-source macOS app (Rust/Tauri) that sits in your menu bar. You draw a box around anything on your screen — a terminal error, a data table, a foreign-language doc — and it runs local OCR, sends the text to an LLM, and gives you a menu of executable actions in under a second. Not explanations. Actions. It fixes the error, exports the CSV, creates the GitHub issue, runs the command.

The LLM layer supports Claude Haiku, Gemini Flash, or Qwen-2.5 running locally via llama.cpp (fully offline, nothing leaves your machine).

The part I'm most excited about: it's built on MCP (Model Context Protocol). Anyone can write a plugin — a standard MCP server in Node.js or Python — and their actions show up in the menu automatically. The app translates raw OCR text into structured JSON arguments matching your tool's schema. You just write the API call.

Every plugin runs inside a kernel-level macOS sandbox (sandbox-exec). Your entire home directory is walled off unless you explicitly approve access. Environment variables are filtered. Commands require confirmation.
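If you want to check that sandbox claim for yourself, here is a minimal probe sketch. It is not part of Omni-Glass; the paths and the expected failure modes are my assumptions about how a deny-by-default sandbox-exec profile behaves. Run it from inside a plugin process and see what leaks:

    # sandbox_probe.py - test sketch to run inside a plugin process; not part of Omni-Glass.
    import os
    import pathlib

    PROBE_PATHS = [
        pathlib.Path.home() / ".ssh" / "id_rsa",       # should be walled off
        pathlib.Path.home() / ".aws" / "credentials",  # likewise
        pathlib.Path("/etc/hosts"),                    # may or may not be readable
    ]

    def probe(path: pathlib.Path) -> str:
        try:
            with open(path, "rb") as f:
                f.read(16)                             # read a few bytes, discard them
            return "READABLE (possible sandbox escape)"
        except PermissionError:
            return "blocked (PermissionError)"
        except FileNotFoundError:
            return "not found"
        except OSError as exc:
            return f"blocked ({exc.strerror})"

    if __name__ == "__main__":
        # A filtered environment should not expose secrets either.
        leaked = [k for k in os.environ if "KEY" in k or "TOKEN" in k or "SECRET" in k]
        print("suspicious env vars visible:", leaked or "none")
        for p in PROBE_PATHS:
            print(f"{p}: {probe(p)}")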

Looking for help with:

- Build a plugin. Jira, Slack, Notion, Linear, Datadog — if it has an API, it can be an Omni-Glass action. Most plugins are under 100 lines (a sketch follows this list).
- Break the sandbox. If you can read ~/.ssh/id_rsa from a plugin process, I want to know.
- Windows and Linux. The code compiles on Windows but hasn't been tested on real hardware. Linux needs Tesseract OCR and Bubblewrap sandbox work.
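To make "under 100 lines" concrete, here is a hedged sketch of a plugin as a standard MCP server, written with the MCP Python SDK's FastMCP helper. The tool name, parameters, and the GitHub call are illustrative assumptions, not taken from the repo; per the post, Omni-Glass fills in the arguments from the OCR'd text to match the tool's schema, so typed parameters and a clear docstring are presumably what it keys off.

    # github_issue_plugin.py - hypothetical Omni-Glass plugin: a standard MCP server
    # exposing one tool. Tool name, parameters, and the API call are illustrative only.
    import json
    import os
    import urllib.request

    from mcp.server.fastmcp import FastMCP  # MCP Python SDK (assumed installed)

    mcp = FastMCP("github-issues")

    @mcp.tool()
    def create_github_issue(repo: str, title: str, body: str) -> str:
        """Create a GitHub issue. `repo` is 'owner/name'; `title` and `body`
        are filled in from the OCR'd text by the host app."""
        # The env filter must explicitly allow this variable through to the plugin.
        token = os.environ.get("GITHUB_TOKEN")
        if not token:
            return "GITHUB_TOKEN is not available to this plugin"
        req = urllib.request.Request(
            f"https://api.github.com/repos/{repo}/issues",
            data=json.dumps({"title": title, "body": body}).encode(),
            headers={
                "Authorization": f"Bearer {token}",
                "Accept": "application/vnd.github+json",
            },
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["html_url"]

    if __name__ == "__main__":
        mcp.run()  # serves over stdio by default

If plugin registration works the way the post describes, pointing Omni-Glass at this server should surface create_github_issue in the action menu, with the title and body pre-filled from whatever you boxed on screen.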

GitHub: https://github.com/goshtasb/omni-glass

Summary
Omni-Glass is an open-source macOS menu-bar app (Rust/Tauri): draw a box around anything on screen, it runs local OCR and uses an LLM (Claude Haiku, Gemini Flash, or a local Qwen-2.5 via llama.cpp) to offer executable actions such as fixing an error, exporting a CSV, or creating a GitHub issue. It is built on the Model Context Protocol, so third-party plugins written as standard MCP servers in Node.js or Python show up as actions automatically, and every plugin runs inside a macOS sandbox-exec sandbox.