how-cli

Ask your terminal (AI) about CLI commands

Yes, it's just an LLM wrapper. It saves me a lot of time. It will for you too.

Installation

Run this command to install how:

curl -fsSL https://raw.githubusercontent.com/kynnyhsap/how/main/scripts/install.sh | bash

This will fetch and run the script located in ./scripts/install.sh.
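If you prefer not to pipe a remote script straight into bash, the same install can be done in two steps: download, review, then run. A sketch of that pattern, using a local stand-in script so the example is self-contained (substitute the real install.sh URL for the printf line):

```shell
# Stand-in for the curl download step; in practice you'd curl install.sh to this path.
printf '#!/bin/sh\necho installed\n' > /tmp/install-demo.sh
bash -n /tmp/install-demo.sh   # syntax-check (and a chance to read it) before running
bash /tmp/install-demo.sh      # prints: installed
```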

Usage

Make sure to set the API key first with the --key flag:

how --key

The default provider is openai. You can change it with the --provider flag; see Providers below for more info.

Now you can ask how about CLI commands:

how to [prompt...]

Providers

The default provider is openai, but you can change it with the --provider flag:

how --provider

Changing the provider means you also need to update the API key with the --key flag.

Supported providers:

  • openai - OpenAI GPT models (default)
  • anthropic - Anthropic models
  • groq - Groq models
  • ollama - Ollama models, on-device inference, no API key required
  • custom - Custom provider script

Config

The API key and provider info are stored in ~/.how/config.json, along with other options. You can view it with:

how --config
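For a feel of what lives in that file, here is a sketch. The field names below are assumptions for illustration only (run `how --config` to see the actual schema); the snippet writes to a scratch path so it doesn't touch the real ~/.how/config.json:

```shell
# Hypothetical config shape -- field names are assumptions, not the real schema.
cat > /tmp/how-config-example.json <<'EOF'
{
  "provider": "openai",
  "apiKey": "sk-..."
}
EOF
# Pretty-print it to confirm it's valid JSON
python3 -m json.tool /tmp/how-config-example.json
```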

Help

To see all available commands and options:

how --help

Examples

how to create a git branch
how to convert video to gif with ffmpeg
how to compile a c file

Development

You will need bun for this.

To install dependencies:

bun install

To run from source:

bun how [arguments...]

To compile executable from source:

bun compile-dev
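`bun how` and `bun compile-dev` refer to script names defined in the project's package.json. The actual script bodies aren't shown in this README; a hypothetical sketch of what such entries could look like (entry-point path and exact commands are assumptions):

```json
{
  "scripts": {
    "how": "bun run src/index.ts",
    "compile-dev": "bun build src/index.ts --compile --outfile how"
  }
}
```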

Benchmarks

What I observed using different models:

  • groq (llama3-8b) is the fastest, average response time is under half a second.
  • ollama (llama3-8b) is the slowest, average response time is about 3 seconds. There is also a cold start that takes about 8 seconds (I guess the model loads itself into RAM).
  • openai (gpt-4o) and anthropic (claude-3-5-sonnet) are in between, average response time is about 2 seconds. They also seem to have better results than llama3-8b.

Cross-Compile

There is a compile.sh script to cross-compile for multiple platforms. You can run it with:

./scripts/compile.sh

Releases

I do releases when I feel like it. There is a script to automate it in scripts/release.sh.

Later I will add a command to upgrade the CLI to the latest version.

License

MIT, you can go nuts with it.
