hey-comma
Run shell commands using natural language
About
Use natural language to run shell commands using GPT-3. Just say what you want to do and hey, will generate the command for you.
Features
- use natural language to run shell commands
- explain files, scripts, or any other piped data using GPT-3
- cache successful commands to speed up future runs
Why?
Shell commands are powerful, but only if you know how to use them. hey, makes them easier to use by letting you describe what you want in natural language.
Always forget the command to pack a directory into a tarball? Just say it:
hey, create a tarball with all files in the current directory, except javascript files
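Depending on your shell and the model's output, the generated command might look something like this (an illustrative result, not guaranteed; the archive name is a placeholder):
tar --exclude='*.js' -czf archive.tar.gz .   # illustrative output only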
Install
hey, requires Node.js v16 or higher.
npm i -g hey-comma
Note: pnpm does not like the comma, so only the hey alias is available. You can add the alias manually if you want to:
alias hey,=hey
Setup
OpenAI API key
hey, uses OpenAI's API to generate the commands. You need to sign up for an OpenAI account and create an API key.
Then, run:
hey, setup
and follow the instructions. This will create a .hey-comma folder in your home directory and store your API key there.
If you're not comfortable saving your API key as plain text, you can also set it as an environment variable and configure hey, to read it from there:
export YOUR_ENV_VAR_NAME=sk-...
hey, config set openai_api_key "env:YOUR_ENV_VAR_NAME"
Usage
hey, currently has two modes: run and explain. Most of the time you don't need to specify the mode explicitly, as hey, will automatically detect it based on whether you pipe data to it or not.
hey, run
hey, run is the default mode. It will convert your instruction to a shell command and run it. It will always ask for confirmation before running the command.
hey, create a tarball with all files in the current dir, except js files
You can explicitly specify the mode:
hey, run: initialize a next.js project in ./my-app
(colon is optional)
hey, explain
hey, explain will explain the data you pipe to it.
[!IMPORTANT] Note: The piped data will be sent to OpenAI's servers, so you should only pipe data to hey, explain that you are comfortable sharing with OpenAI.
cat mysterious.sh | hey, is this safe to run
You can explicitly specify the mode:
cat script.sh | hey, explain: what does this do
(colon is optional)
Special characters
To pass special characters, pass your instruction to hey, as a quoted string:
hey, "what is the most recent file in ~/Documents?"
Configuration
You can configure hey, using the hey, config command, or by editing config.toml directly. To get the path to the config file, run:
hey, config path
For example, ~/.hey-comma/config.toml
Available options:
- openai_api_key: your OpenAI API key
- openai_model: the OpenAI model to use (e.g. gpt-3.5-turbo or gpt-4) (default: gpt-3.5-turbo)
- temperature: the temperature to use when generating commands (default: 0.2)
- max_tokens: the maximum number of tokens to generate (default: 256)
- run_prompt: the prompt to use when generating commands (see Custom prompts)
- explain_prompt: the prompt to use when explaining data (see Custom prompts)
- cache.max_entries: the maximum number of entries to cache (default: 50)
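For reference, a ~/.hey-comma/config.toml using these options might look roughly like this (a sketch based on the option names and defaults listed above; the exact layout may differ and the API key is a placeholder):
openai_api_key = "sk-..."   # placeholder, use your own key
openai_model = "gpt-3.5-turbo"
temperature = 0.2
max_tokens = 256
cache.max_entries = 50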
Use a different OpenAI model (e.g. GPT-4)
By default, hey, uses GPT-3.5 (gpt-3.5-turbo). If you want to use another model, like GPT-4, you can set the openai_model option:
hey, config set openai_model gpt-4
You can also use gpt-4 for a single command:
hey, "what is the most recent file in ~/Documents?" --gpt4
[!NOTE] Note that gpt-4 is significantly more expensive and quite a bit slower than gpt-3.
Custom prompts
You can customize the prompts used by hey, by setting the run_prompt and explain_prompt options. See prompts.ts for the default prompts.
[!IMPORTANT] Make sure to add the placeholders (e.g. %INSTRUCTION%) to your custom prompts.
The following placeholders are available:
- %INSTRUCTION%: the instruction that is passed to hey, run or hey, explain
- %SHELL%: the current shell (e.g. bash or zsh) (only available for hey, run)
- %INPUT%: the data that is piped to hey, explain (only available for hey, explain)
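For example, you could set a custom run prompt like this (the prompt wording below is just an illustration; only the config set command, the run_prompt option, and the placeholders come from above):
hey, config set run_prompt "You are an expert %SHELL% user. Write a single %SHELL% command for this instruction: %INSTRUCTION%. Output only the command."   # example wording only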
Data sent to OpenAI
hey, will send the following data to OpenAI:
- The command you want to run
- The data you pipe to hey, explain
- Your current shell (e.g. bash or zsh)
More usage examples
hey, what are the largest files in my download directory
cat salaries.csv | hey, what is the average salary of people with a PhD
cat script.sh | hey, explain
Contributing
Development
This project uses bun as package manager & bundler.
If you don't have bun installed, run:
curl -fsSL https://bun.sh/install | bash
Install dependencies
bun install
Build
bun run build
Commit messages
This project uses semantic-release for automated versioning and releases, so commits in this project follow the Conventional Commits guidelines. I recommend using commitizen to help write properly formatted commit messages.
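For example, commit messages in this style look like the following (hypothetical changes, shown only to illustrate the Conventional Commits format):
feat: add --gpt4 flag for one-off GPT-4 runs
fix: handle empty piped input in explain mode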