npcsh is a command line tool for integrating LLMs into everyday workflows

Project description

npcsh logo with sibiji the spider

npcsh

Welcome to npcsh, the perfect tool for integrating AI into your workflows. npcsh lets users interact with LLMs from the comfort of their terminal: no more switching windows to copy and paste inputs and outputs, and no more being dragged down by the cost of context switching. npcsh is meant to be a drop-in replacement for shells like bash, zsh, or PowerShell, and allows the user to operate their machine directly through the LLM-powered shell.

npcsh introduces a new paradigm of programming for LLMs: users set up NPC profiles (e.g. npc_profile.npc) that define an NPC's primary directive, the tools the NPC may use, and its other properties. NPCs can interact with each other, and their primary directives and properties make these relationships explicit through Jinja references.
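A profile might look something like the following sketch (the field names and values here are illustrative, not the exact schema npcsh uses):

```
# analyst.npc -- hypothetical NPC profile sketch
name: analyst
primary_directive: "Summarize and sanity-check incoming data files before analysis."
model: llama3.2
provider: ollama
tools:
  - sql_executor
```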

With npcsh, we can more seamlessly stick together complex workflows and data processing tasks to form NPC Assembly Lines where pieces of information are evaluated in a sequence by different NPCs and the results are passed along to the next NPC in the sequence.

Dependencies

  • ollama
  • python >3.10

The default model is currently llama3.2. Download it by running:

ollama run llama3.2

Any Hugging Face model can also be used. When setting a model from there, use the full link to the model, e.g. https://huggingface.co/caug37/TinyTim.

Inference via OpenAI and Anthropic is also supported. To use them, create a ".env" file in the folder where you are working and set the API keys there, or export the environment variables in your shell:

export OPENAI_API_KEY="your_openai_key"
export ANTHROPIC_API_KEY="your_anthropic_key"

The user can change the default model by setting the environment variable NPCSH_MODEL in their ~/.npcshrc to the desired model name, and change the provider by setting NPCSH_PROVIDER to the desired provider name.

The provider must be one of ['ollama', 'openai', 'anthropic'] and the model must be one available from those providers.
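For example, to keep the defaults of a local llama3.2 served through ollama, the two variables could be set like this in ~/.npcshrc (the values shown are just one valid model/provider pairing):

```shell
# In ~/.npcshrc: pick the default model and its provider.
# Any model available from the chosen provider may be substituted.
export NPCSH_MODEL="llama3.2"
export NPCSH_PROVIDER="ollama"
```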

Linux install

sudo apt-get install espeak
sudo apt-get install portaudio19-dev python3-pyaudio
sudo apt-get install alsa-base alsa-utils
sudo apt-get install libcairo2-dev
sudo apt-get install libgirepository1.0-dev
sudo apt-get install ffmpeg
pip install npcsh

Mac install

brew install portaudio
brew install ffmpeg
brew install ollama
brew services start ollama
brew install pygobject3
pip install npcsh

Usage

After it has been pip installed, npcsh can be used as a command line tool. Start it by typing:

npcsh

Once in npcsh, you can use bash commands or write natural-language queries and commands. You can also switch between the modes described below, compile a network of NPCs, or use the macro tools we have developed.
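A session in the base shell might look like the following (the prompts and responses here are illustrative, not actual program output):

```
npcsh> ls -la
npcsh> summarize the contents of README.md
npcsh> /whisper
```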

compilation

Each NPC can be compiled to accomplish its primary directive, and any issues faced will be recorded and associated with the NPC so that it can reference them later through vector search. In any mode where a user requests input from an NPC, the NPC will include RAG search results before carrying out the request.

Base npcsh

In the base npcsh shell, inputs are processed by an LLM. The LLM first determines what kind of request the user is making and decides which of the available tools or modes will best enable it to accomplish the request.

Built-in NPCs

Built-in NPCs offer broad utility to the user and serve as building blocks for more complicated NPCs. They make it easy to carry out many common data-processing tasks, run commands, and execute and test programs.

Other useful tools

Vixynt: your visual assistant

Type

/vixynt <your_prompt_here> 

to get a response from the diffusion model. Only dall-e-3 is confirmed to work at the moment; support for other models is coming.

whisper mode

Type

/whisper

to enter a voice-control mode. It will calibrate for silence so that it processes your input once you've finished speaking, then read the LLM's response aloud via text-to-speech.

spool mode

Spool mode allows users to have threaded conversations in the shell, i.e. conversations where context is retained over the course of several turns. Users can speak with a specific NPC in spool mode by typing

/spool <npc_name>

and can exit spool mode by doing

/exit

Commands

The LLM or a specific NPC will take the user's request, write a command or script to accomplish the task, then attempt to run it and tweak it until it works or the number of retries (default: 5) is exceeded.

Use the Command NPC by typing /cmd <command>. Chat with the Command NPC in spool mode by typing /spool cmd. Use the Command NPC in the profiles of other NPCs by referencing it like {{cmd}}.
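For example, another NPC's profile could delegate shell work by referencing the Command NPC through a Jinja reference (the field names in this sketch are illustrative, not the exact schema):

```
# builder.npc -- hypothetical profile that delegates to the Command NPC
name: builder
primary_directive: >
  When a build is requested, hand the shell steps to {{cmd}}
  and report the result back to the user.
```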

Question NPC

The user can submit a one-shot question to a general LLM or to a specific NPC. Use it like npcsh> /sample <question> <npc_name> or npcsh> /sample <question>.

Over-the-shoulder

Over-the-shoulder lets the user select an area of the screen; that area is passed to a vision LLM, and the user can then ask about the image or request help with it. Use it by typing

npcsh> /ots

It will invoke your desktop's native screenshot tool and allow you to select an area. That area will be saved to ~/.npcsh/screenshots/, and you will then be prompted for a question about the image. You can also use it on existing files/images by typing

npcsh> /ots filename

and it will also prompt in the same way.

data

Data mode makes it easy to investigate data and ingest it into a local database for later use and inspection.
Begin data mode by typing

npcsh> /data

then load data from a file like

data> load from filename as table_name

If it's a tabular file like a CSV, you can then perform SQL- and pandas-like operations on table_name.
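A data-mode session might look like the following (the file name, table name, and exact query syntax here are illustrative):

```
npcsh> /data
data> load from sales.csv as sales
data> select * from sales limit 5
data> sales.describe()
```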


Download files

Download the file for your platform.

Source Distribution

npcsh-0.1.31.tar.gz (48.5 kB view details)

Built Distribution

npcsh-0.1.31-py3-none-any.whl (43.3 kB view details)

File details

Details for the file npcsh-0.1.31.tar.gz.

File metadata

  • Download URL: npcsh-0.1.31.tar.gz
  • Upload date:
  • Size: 48.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for npcsh-0.1.31.tar.gz
  • SHA256: 221daa3f124a18255543c18a3ab75dbc16b2846fc6cc0d2f4af871d08ea3941b
  • MD5: 2f58147c517be25afa16d0b5eef35610
  • BLAKE2b-256: 1be0070e7ab54d813f6c2efac45d0b45fb5aba412e7e9d152d36f07f5c778d47

File details

Details for the file npcsh-0.1.31-py3-none-any.whl.

File metadata

  • Download URL: npcsh-0.1.31-py3-none-any.whl
  • Upload date:
  • Size: 43.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for npcsh-0.1.31-py3-none-any.whl
  • SHA256: e265de55b30529f51c6109acb8f46aa0fe1fc178e5d0e6502e38999c016059c2
  • MD5: c7af0f75d9be83080944a226284843ea
  • BLAKE2b-256: 34f80e36902461ae97a533dd8a8a6026da0ebec97955513d1486f2c72b9f0031
