# 🧠 interro - AI-Powered Code Understanding Tool

Ask natural language questions about your codebase and get intelligent answers!
## Features

- 🔍 **Smart Code Search**: Combines keyword and semantic search
- 🧠 **AI Explanations**: Optional LLM-powered code explanations via Ollama
- 🎨 **Beautiful Output**: Rich terminal formatting with syntax highlighting
- 🚀 **Fast Indexing**: Efficient AST-based parsing for Python, generic chunking for other languages
- ⚙️ **Configurable**: Extensive configuration options
- 📦 **Easy Install**: Simple pip installation
## Quick Start

```bash
# Install
pip install interro

# Ask questions about your code
interro ask "Where is the main function?"
interro ask "What handles user authentication?"
interro ask "Explain the database connection logic"

# Index a specific directory
interro ask "How does error handling work?" --path ./src

# Use AI explanations (requires Ollama)
interro ask "What does this class do?" --llm --model llama3
```
## Installation

```bash
pip install interro
```

For development:

```bash
git clone <repo>
cd interro
poetry install
```
## Configuration

Create a `.interro.yaml` file in your project root:

```bash
interro config --init
```

This creates a default configuration you can customize:
```yaml
indexing:
  file_extensions: [".py", ".js", ".ts", ".java", ".cpp"]
  exclude_dirs: ["__pycache__", ".git", "node_modules"]
  chunk_size: 1000
  chunk_overlap: 200

retrieval:
  max_results: 10
  use_semantic_search: true
  similarity_threshold: 0.7

llm:
  enabled: false
  model: "llama3"
  max_tokens: 500

output:
  format: "rich"  # rich, plain, json
  highlight_syntax: true
  show_line_numbers: true
```
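Settings from `.interro.yaml` only need to override the keys you care about; the rest fall back to defaults. A minimal sketch of how such a recursive merge could work (illustrative only; interro's actual `config.py` may differ):

```python
# Sketch: overlay user-supplied config keys onto defaults, recursing into
# nested sections so a partial .interro.yaml leaves other defaults intact.
DEFAULTS = {
    "indexing": {"chunk_size": 1000, "chunk_overlap": 200},
    "retrieval": {"max_results": 10, "similarity_threshold": 0.7},
}

def merge(defaults: dict, overrides: dict) -> dict:
    """Return defaults with overrides applied, merging nested dicts."""
    result = dict(defaults)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(result.get(key), dict):
            result[key] = merge(result[key], value)
        else:
            result[key] = value
    return result

# e.g. a parsed .interro.yaml that only changes one value:
user_cfg = {"retrieval": {"max_results": 5}}
cfg = merge(DEFAULTS, user_cfg)
```

With this shape, `cfg["retrieval"]["max_results"]` reflects the override while the untouched `similarity_threshold` keeps its default.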
## Usage Examples

### Basic Questions

```bash
# Find specific functionality
interro ask "Where is data loaded from files?"
interro ask "Show me all the API endpoints"
interro ask "What handles authentication?"

# Understand code structure
interro ask "Explain the main application flow"
interro ask "How are errors handled?"
interro ask "What external libraries are used?"
```
### Advanced Usage

```bash
# Use AI explanations
interro ask "Explain this algorithm" --llm

# Search specific directory
interro ask "Database queries" --path ./backend

# JSON output for tooling
interro ask "Find all classes" --format json

# Limit results
interro ask "HTTP handlers" --max-results 5
```
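The JSON output is intended for tooling, so a script can invoke the CLI and post-process results. A hedged sketch (the exact JSON schema is an assumption; field names here are illustrative):

```python
import json
import subprocess

def ask_json(question: str) -> dict:
    """Run the interro CLI and parse its JSON output.
    (Assumes the JSON schema has a top-level "results" list.)"""
    proc = subprocess.run(
        ["interro", "ask", question, "--format", "json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(proc.stdout)

def matched_files(payload: dict) -> list:
    """Collect the distinct files that matched, in sorted order."""
    return sorted({r["file_path"] for r in payload.get("results", [])})
```

For example, `matched_files(ask_json("HTTP handlers"))` would give a deduplicated file list suitable for feeding into other tools.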
### Programmatic Usage

```python
from interro import Interro

# Initialize
interro = Interro()

# Index codebase
interro.index_path("./my_project")

# Ask questions
result = interro.ask("Where is the config loaded?")
print(f"Found {len(result['results'])} matches")

for match in result['results']:
    print(f"{match.chunk.file_path}:{match.chunk.start_line}")
    print(match.chunk.content)
```
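Results with this shape can be post-processed however you like; for instance, grouped by file before rendering. A self-contained sketch using stub classes that mirror the result structure shown above (real interro objects would be used in practice):

```python
from collections import defaultdict
from dataclasses import dataclass

# Stubs mirroring match.chunk.file_path / start_line / content from the
# snippet above, so this sketch runs standalone.
@dataclass
class Chunk:
    file_path: str
    start_line: int
    content: str

@dataclass
class Match:
    chunk: Chunk

def group_by_file(matches):
    """Group matches by source file, e.g. to render one section per file."""
    grouped = defaultdict(list)
    for m in matches:
        grouped[m.chunk.file_path].append(m)
    return dict(grouped)

matches = [
    Match(Chunk("app.py", 10, "def main(): ...")),
    Match(Chunk("db.py", 3, "def connect(): ...")),
    Match(Chunk("app.py", 42, "def run(): ...")),
]
by_file = group_by_file(matches)
```

Here `by_file["app.py"]` holds both `app.py` matches, keeping output tidy when one file matches repeatedly.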
## LLM Integration

Interro supports local LLM explanations via Ollama:

1. Install Ollama
2. Pull a model: `ollama pull llama3`
3. Enable it in the config, or pass the `--llm` flag

```bash
# Enable LLM explanations
interro ask "Explain this function" --llm --model llama3
```
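Under the hood, talking to a local Ollama instance is a plain HTTP call; Ollama serves `POST /api/generate` on `localhost:11434` by default. A sketch of how such a call could look (interro's actual `llm_agent.py` may differ; `explain` requires a running Ollama server):

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def explain(code: str, model: str = "llama3") -> str:
    """Ask a local Ollama model to explain a snippet of code."""
    body = json.dumps(build_payload(model, f"Explain this code:\n{code}")).encode()
    req = request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        # Non-streaming responses carry the full answer in "response".
        return json.loads(resp.read())["response"]
```

With `stream: False` the whole completion arrives in one JSON object, which keeps the client simple at the cost of latency on long answers.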
## Supported Languages

- **Python**: Full AST parsing for functions, classes, imports
- **JavaScript/TypeScript**: Generic chunking with smart boundaries
- **Java, C++, C**: Generic chunking
- **Others**: Basic text chunking
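For the non-Python cases, generic chunking with overlap (the `chunk_size` / `chunk_overlap` settings from the configuration above) can be sketched as follows; this is an illustrative fixed-window version, whereas interro's indexer also snaps to smart boundaries:

```python
def chunk_text(text: str, chunk_size: int = 1000, chunk_overlap: int = 200):
    """Split text into fixed-size windows that overlap, so content that
    straddles a chunk boundary still appears whole in at least one chunk."""
    step = chunk_size - chunk_overlap  # how far each new window advances
    chunks = []
    for start in range(0, max(len(text), 1), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks
```

With the defaults, consecutive chunks share their last/first 200 characters, which is what lets keyword and semantic matches survive arbitrary split points.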
## Architecture

```
interro/
├── cli.py        # Command-line interface
├── indexer.py    # Code parsing and chunking
├── retriever.py  # Search (keyword + semantic)
├── llm_agent.py  # LLM integration via Ollama
├── formatter.py  # Output formatting
└── config.py     # Configuration management
```
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests
5. Submit a pull request
## License

MIT License - see LICENSE file for details.
## FAQ

**Q: How does semantic search work?**
A: It uses sentence-transformers to create embeddings of code chunks, enabling similarity-based matching beyond keywords.

**Q: Can I use it without LLM features?**
A: Yes! Keyword and semantic search work without any LLM integration.

**Q: What LLM models are supported?**
A: Any model available through Ollama (llama3, gemma, phi3, etc.).

**Q: How large a codebase can it handle?**
A: Tested on codebases of 100k+ lines. It uses efficient indexing and configurable limits.
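The similarity matching described in the FAQ boils down to cosine similarity between embedding vectors, compared against the `similarity_threshold` from the configuration. A dependency-free sketch (real embeddings would come from sentence-transformers; the tiny vectors here are illustrative):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def filter_matches(query_vec, chunk_vecs, threshold=0.7):
    """Keep the indices of chunks whose similarity clears the threshold."""
    return [i for i, v in enumerate(chunk_vecs)
            if cosine_similarity(query_vec, v) >= threshold]
```

A query vector identical to a chunk vector scores 1.0, an orthogonal one scores 0.0, and the 0.7 default threshold keeps only fairly close matches.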
## File details

Details for the file `interro-0.1.3.tar.gz`.

### File metadata

- Download URL: interro-0.1.3.tar.gz
- Upload date:
- Size: 17.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.6

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `e5966d08967aaf5e4e48ee749eeb0badf6a87b167b5ed3500bfe719063ab5432` |
| MD5 | `92424502ba505282bee8c0737496c0fb` |
| BLAKE2b-256 | `a33c97b62e733bb67c8e70b530979139b15a88cd825e96c5effc8f19422e1dd7` |
## File details

Details for the file `interro-0.1.3-py3-none-any.whl`.

### File metadata

- Download URL: interro-0.1.3-py3-none-any.whl
- Upload date:
- Size: 17.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.6

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `7cba01ebfa86a871813de459b9239ad2b4476cd73255b486cd1a18bd7da37444` |
| MD5 | `df99d13709adbc78d8f05cacfa86c5f9` |
| BLAKE2b-256 | `09f850578ad930fea174afb7f106fbc1f008f54265a000532a3f09154301a493` |