Show HN: Execute local prompts in SSH remote shells
tgalal · Tuesday, March 10, 2026

Instead of giving LLM tools SSH access or installing them on a server, the following command:
$ promptctl ssh user@server
makes a set of locally defined prompts "magically" appear within the remote shell as executable command-line programs.

For example, I have locally defined prompts for `llm-analyze-config` and `askai`. Then on (any) remote host I can:
$ promptctl ssh user@host
# Now on remote host
$ llm-analyze-config /etc/nginx.conf
$ cat docker-compose.yml | askai "add a load balancer"
The prompts behind `llm-analyze-config` and `askai` execute on my local computer (even though they're invoked remotely), via the LLM of my choosing.

This way, LLM tools are never granted SSH access to the server, and nothing needs to be installed on the server. In fact, the server does not even need outbound internet access.
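To illustrate how invoke-remotely-but-run-locally can work in principle, here is a minimal sketch of the underlying pattern (hypothetical names; this is not promptcmd's actual implementation): the local side serves prompt executions over a Unix socket, and the remote-side stub command merely relays its arguments and stdin to that socket. With SSH remote forwarding (`ssh -R remote.sock:local.sock user@server`) the same socket appears on the server, so the model only ever runs on the local machine.

```python
import os
import socket
import tempfile
import threading

# Hypothetical sketch, run here in a single process for clarity. In the real
# setup the server side would live on the local machine and the stub on the
# remote host, connected through an SSH-forwarded Unix socket.
SOCK = os.path.join(tempfile.mkdtemp(), "prompt.sock")

def run_prompt(request: str) -> str:
    # Stand-in for invoking a locally configured LLM.
    return f"[local model] {request}"

def local_server() -> None:
    # Local side: accept one request, answer it with a local "LLM" call.
    srv = socket.socket(socket.AF_UNIX)
    srv.bind(SOCK)
    srv.listen(1)
    conn, _ = srv.accept()
    request = conn.makefile().read()          # read until client closes its write side
    conn.sendall(run_prompt(request.strip()).encode())
    conn.close()
    srv.close()

def remote_stub(args: str, stdin: str = "") -> str:
    # What a generated command like `askai` would do on the server:
    # send the invocation through the forwarded socket, return the reply.
    c = socket.socket(socket.AF_UNIX)
    c.connect(SOCK)
    c.sendall(f"{args}\n{stdin}".encode())
    c.shutdown(socket.SHUT_WR)                # signal end of request
    reply = c.makefile().read()
    c.close()
    return reply

t = threading.Thread(target=local_server)
t.start()
print(remote_stub("add a load balancer"))     # prints: [local model] add a load balancer
t.join()
```

The remote stub never touches a model or an API key; it is just a relay, which is why the server needs neither credentials nor outbound internet access.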
GitHub: https://github.com/tgalal/promptcmd/