Show HN: Bring your own prompts to remote shells

tgalal Monday, March 09, 2026

Instead of giving LLM tools SSH access or installing them on a server, running:

  promptctl ssh user@server

makes a set of locally defined prompts "appear" within the remote shell as executable command-line programs.

For example:

  # on remote host
  analyze-config --help
  Usage: analyze-config [OPTIONS] --path <path>

  Prompt inputs:
        --all
        --path <path>
        --opt
        --syntax
        --sec
would render and execute the following prompt:

  You are an expert sysadmin and security auditor analyzing the configuration
  file {{path}}, with contents:
  
  {{cat path}}
  
  Identify:
  
  {{#if (or all syntax) }}- Syntax Problems{{/if}}
  {{#if (or all sec) }}- Misconfigurations and security risks{{/if}}
  {{#if (or all opt) }}- Optimizations{{/if}}
  
  For each finding, state the setting, the impact, a fix, and a severity
  (Critical/Warning/Info).
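The template's conditionals map CLI flags onto sections of the final prompt. As a minimal sketch of that rendering logic in Python (illustrative only; promptcmd's actual templating uses Handlebars-style helpers and will differ in implementation):

```python
from pathlib import Path

def render_prompt(path: str, flags: dict) -> str:
    """Render the analyze-config prompt for a config file.

    `flags` keys mirror the CLI flags: "all", "syntax", "sec", "opt".
    This is a hand-rolled illustration, not promptcmd's renderer.
    """
    contents = Path(path).read_text()  # the {{cat path}} step

    # Each {{#if (or all <flag>)}} block becomes a conditional section.
    sections = []
    if flags.get("all") or flags.get("syntax"):
        sections.append("- Syntax Problems")
    if flags.get("all") or flags.get("sec"):
        sections.append("- Misconfigurations and security risks")
    if flags.get("all") or flags.get("opt"):
        sections.append("- Optimizations")

    return (
        f"You are an expert sysadmin and security auditor analyzing the "
        f"configuration file {path}, with contents:\n\n"
        f"{contents}\n\n"
        "Identify:\n\n" + "\n".join(sections) + "\n\n"
        "For each finding, state the setting, the impact, a fix, and a "
        "severity (Critical/Warning/Info)."
    )
```

So `analyze-config --path httpd.conf --syntax` would include only the syntax section, while `--all` enables all three.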
Nothing gets installed on the server, API keys never leave your computer, and you have full control over the context given to the LLM.

Github: https://github.com/tgalal/promptcmd/

Documentation: https://docs.promptcmd.sh/
