talk-codebase is a powerful tool for querying and analyzing codebases.
Project description
talk-codebase
- Simple configuration in just a couple of steps
- Talk-codebase is a tool that lets you converse with your codebase, using LLMs (Large Language Models) to answer your questions.
- It supports offline code processing with GPT4All, so your code is never shared with third parties; alternatively, you can use OpenAI if privacy is not a concern for you.
- Talk-codebase is still under development. It is recommended for educational use only, not for production.
Installation
To install talk-codebase, you need to have:
- Python 3.9
- An OpenAI API key (not needed if you use a local GPT4All model)
```bash
# Install talk-codebase
pip install talk-codebase
```

If you want some files to be ignored, add them to your .gitignore (see the example entries below).

Once `talk-codebase` is installed, you can chat with the codebase in the current directory by running:

```bash
talk-codebase chat .
```
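For example, if your project contains directories that should not be indexed (virtual environments, build output, logs, and so on), you can list them in .gitignore before starting a chat. The entries below are purely illustrative; use whatever patterns match your project:

```
# Example .gitignore entries (illustrative only)
venv/
__pycache__/
build/
*.log
```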
Reset configuration
If you want to reset the configuration, run:

```bash
talk-codebase configure
```
Advanced configuration
You can also edit the configuration manually by editing the ~/.config.yaml file. If for some reason you cannot find the configuration file, just run the tool; it prints the path to the configuration file at startup.
```yaml
# The OpenAI API key. You can get it from https://beta.openai.com/account/api-keys
api_key: sk-xxx
# Configuration for chunking
chunk_overlap: 50
chunk_size: 500
# Configuration for sampling
k: 4
max_tokens: 1048
# Configuration for the LLM model
# Type of model to use. You can choose between `openai` and `local`.
model_type: openai
# Name of the OpenAI model to use (when model_type is `openai`).
openai_model_name: gpt-3.5-turbo
# Name of the local model (when model_type is `local`). You can find the list of available models at https://gpt4all.io/models
local_model_name: orca-mini-7b.ggmlv3.q4_0.bin
# Path to the local model. If you want to use a local model, you need to specify the absolute path to it.
model_path: 'absolute path to local model'
```
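For example, a configuration that uses a local GPT4All model instead of OpenAI might look like the following sketch. The model name and path are illustrative only; substitute the model you have actually downloaded:

```yaml
# Minimal sketch of a local (GPT4All) configuration -- no OpenAI API key required.
model_type: local
# Example model name; pick any model listed at https://gpt4all.io/models
local_model_name: orca-mini-7b.ggmlv3.q4_0.bin
# Illustrative absolute path to the downloaded model file
model_path: /home/user/models/orca-mini-7b.ggmlv3.q4_0.bin
chunk_size: 500
chunk_overlap: 50
k: 4
max_tokens: 1048
```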
Supports the following extensions:

- `.csv`
- `.doc`
- `.docx`
- `.epub`
- `.md`
- `.pdf`
- `.txt`
- popular programming languages
Contributing
- If you find a bug in talk-codebase, please report it on the project's issue tracker. When reporting a bug, please include as much information as possible, such as the steps to reproduce the bug, the expected behavior, and the actual behavior.
- If you have an idea for a new feature for Talk-codebase, please open an issue on the project's issue tracker. When suggesting a feature, please include a brief description of the feature, as well as any rationale for why the feature would be useful.
- You can contribute to talk-codebase by writing code. The project is always looking for help with improving the codebase, adding new features, and fixing bugs.
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution

- talk_codebase-0.1.38.tar.gz (6.6 kB)

Built Distribution

- talk_codebase-0.1.38-py3-none-any.whl

Hashes for talk_codebase-0.1.38-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 6e8c3b198cd049f529502b0a37e98bc29d29d55574b04cad249da350d6995bee
MD5 | 04d134943bd1286a4621b437dbaf63f7
BLAKE2b-256 | 71649b5bffb07b695b17bb2fdbe7dcc2dfd3e5f01150686c9f328933d3549d67
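If you download a distribution manually, you can check it against the digests above. A minimal sketch, assuming a Linux shell with `sha256sum` available:

```bash
# Download the wheel (without dependencies) and compare its SHA256
# digest with the value published above.
pip download talk-codebase==0.1.38 --no-deps --only-binary :all:
sha256sum talk_codebase-0.1.38-py3-none-any.whl
```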