talk-codebase is a tool for querying and analyzing codebases with LLMs.

Project description

talk-codebase: Tool for chatting with your codebase and docs using OpenAI, LlamaCpp, and GPT4All

Description

Talk-codebase is a tool that lets you chat with your codebase, using LLMs to answer your questions about it. It supports offline processing with GPT4All, so your code never leaves your machine, or you can use OpenAI if privacy is not a concern. It is intended for educational purposes only and is not recommended for production use.

Installation

To install talk-codebase, you need Python 3.9 and an OpenAI API key. Additionally, if you want to use the GPT4All model, you need to download the ggml-gpt4all-j-v1.3-groovy.bin model. If you prefer a different model, you can download it from GPT4All and specify its path in the configuration. If you want some files to be ignored during indexing, add them to .gitignore.
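
For example, entries like the following in your project's .gitignore would keep bulky or generated directories out of the index (the patterns shown are purely illustrative, not required by the tool):

# illustrative .gitignore entries; adjust to your project
node_modules/
build/
*.log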

To install talk-codebase, run the following command in your terminal:

pip install talk-codebase
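
If you want to keep the tool isolated from your system Python, you can install it into a virtual environment first; this is standard Python tooling, not something talk-codebase requires:

python -m venv .venv
source .venv/bin/activate
pip install talk-codebase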

Once talk-codebase is installed, you can use it to chat with your codebase by running the following command:

talk-codebase chat <path-to-your-codebase>
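
For example, to chat with the repository in the current working directory (the path here is only an example):

talk-codebase chat .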

To create or edit the configuration, run:

talk-codebase configure

You can also edit the configuration manually in the ~/.config.yaml file. If you cannot find the configuration file, just run the tool; it prints the path to the configuration file when it starts.

# The OpenAI API key. You can get it from https://beta.openai.com/account/api-keys
api_key: sk-xxx
# maximum overlap between chunks. It can be nice to have some overlap to maintain some continuity between chunks
chunk_overlap: '50'
# maximum size of a chunk
chunk_size: '500'
# number of document chunks retrieved from the index as context for each query
k: '4'
# maximum tokens for the LLMs
max_tokens: '1048'
# name of the OpenAI model (used only when model_type is openai)
model_name: gpt-3.5-turbo
# path to the llm file on disk.
model_path: models/ggml-gpt4all-j-v1.3-groovy.bin
# type of the LLM model. It can be either local or openai
model_type: openai
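
For example, to run fully offline with GPT4All instead of OpenAI, the two model fields above would be changed along these lines (a minimal sketch, assuming the model file has already been downloaded to the path shown):

model_type: local
model_path: models/ggml-gpt4all-j-v1.3-groovy.bin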

Supported file extensions:

  • .csv
  • .doc
  • .docx
  • .epub
  • .md
  • .pdf
  • .txt
  • popular programming languages

Contributing

Contributions are always welcome!

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

talk_codebase-0.1.28.tar.gz (5.9 kB)

Built Distribution

talk_codebase-0.1.28-py3-none-any.whl (7.5 kB)
