insightflow
Automated research tool that searches and analyzes web content, papers, blogs, and social media to generate comprehensive reports for any query.
Leverages multiple LLM providers (OpenAI, Google, Perplexity, xAI) via OpenRouter to automate multi-perspective research on any topic.
How It Works
Topic
|
[1] Aspect Extraction (openai/gpt-4.1-mini)
|
+-- Aspect A --> [2] Parallel Web Search (perplexity/sonar-reasoning-pro) --> Report A
+-- Aspect B --> [2] Parallel Web Search --> Report B
+-- Aspect C --> [2] Parallel Web Search --> Report C
|
[3] Report Synthesis (google/gemini-3-flash-preview)
|
Final Report (Markdown + Citations)
1. Aspect Extraction - Identifies key perspectives of the topic using an LLM
2. Parallel Search - Searches each aspect concurrently via web-connected models
3. Report Synthesis - Merges all findings into a single, cited Markdown report
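The three-stage flow can be sketched with stubbed stages. The stub bodies below are placeholders for illustration only; the real library calls OpenRouter-hosted models at each step:

```python
import asyncio

# Stubs standing in for the three LLM-backed stages.
async def extract_aspects(topic: str) -> list[str]:
    return [f"{topic}: aspect {i}" for i in range(3)]

async def search(aspect: str) -> str:
    await asyncio.sleep(0)  # placeholder for a web-connected model call
    return f"findings for {aspect}"

async def compose(reports: list[str]) -> str:
    return "\n".join(reports)

async def research(topic: str, concurrency: int = 3) -> str:
    aspects = await extract_aspects(topic)  # [1] aspect extraction
    sem = asyncio.Semaphore(concurrency)    # cap concurrent searches

    async def bounded_search(aspect: str) -> str:
        async with sem:
            return await search(aspect)

    # [2] fan out one search per aspect, gather all results
    reports = await asyncio.gather(*(bounded_search(a) for a in aspects))
    return await compose(list(reports))     # [3] synthesis

print(asyncio.run(research("quantum computing")))
```

The semaphore mirrors the `concurrency` setting documented below: searches fan out in parallel but never exceed the cap.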
Prerequisites
- Python 3.12+
- OpenRouter API key
Installation
pip install insightflow
Extras
| Extra | Description | Command |
|---|---|---|
| cli | CLI interface (Typer) | pip install "insightflow[cli]" |
| api | REST API server (FastAPI) | pip install "insightflow[api]" |
| mcp | MCP server (Claude Code, etc.) | pip install "insightflow[mcp]" |
| all | All extras | pip install "insightflow[all]" |
Setup
Set your OpenRouter API key as an environment variable:
export OPENROUTER_API_KEY="sk-or-v1-..."
Or create a .env file in your project root:
OPENROUTER_API_KEY=sk-or-v1-...
Usage
Python Library
import asyncio
from insightflow.core import research
from insightflow.models import LLMConfig
report = asyncio.run(research(
    topic="Recent trends in quantum computing",
    aspect_model=LLMConfig(model="openai/gpt-4.1-mini"),
    search_model=LLMConfig(model="perplexity/sonar-reasoning-pro"),
    report_model=LLMConfig(model="google/gemini-3-flash-preview"),
))

print(report.content)    # Markdown report
print(report.citations)  # List of citations
print(report.metadata)   # Elapsed time, model used, etc.
Individual functions are also available:
import asyncio
from insightflow.core import extract_aspects, search, compose
from insightflow.models import LLMConfig

# Aspect extraction only
aspects = asyncio.run(extract_aspects(
    topic="ML optimization techniques",
    config=LLMConfig(model="openai/gpt-4.1-mini"),
))

# Single search query
result = asyncio.run(search(
    query="Python packaging best practices",
    config=LLMConfig(model="perplexity/sonar-reasoning-pro"),
))
CLI
pip install "insightflow[cli]"
# Full research
insightflow research "Recent trends in quantum computing"
# Aspect extraction only
insightflow aspects "ML optimization techniques"
# Single search
insightflow search "Python packaging best practices"
# With options
insightflow research "AI safety" \
    --language english \
    --max-aspects 3 \
    --search-model perplexity/sonar-pro \
    --json \
    -o report.json
Run insightflow --help for all available options.
REST API Server
pip install "insightflow[api]"
python -m uvicorn insightflow.interfaces.api:app
Swagger UI is available at http://localhost:8000/docs.
curl -X POST http://localhost:8000/research \
-H "Content-Type: application/json" \
-d '{"topic": "quantum computing", "language": "english"}'
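The same request can be built from Python with only the standard library. This is a sketch of a client call, assuming the server is running locally on port 8000; the shape of the response object is not specified here:

```python
import json
import urllib.request

# Build the same POST request as the curl example above.
payload = {"topic": "quantum computing", "language": "english"}
req = urllib.request.Request(
    "http://localhost:8000/research",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment once the server is running:
# with urllib.request.urlopen(req) as resp:
#     report = json.load(resp)
#     print(report)
```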
MCP Server (Claude Code Integration)
Use insightflow as an MCP tool in Claude Code so that Claude can autonomously run research.
1. Install
pip install "insightflow[mcp]"
2. Add to Claude Code config
Add to ~/.claude.json (global) or .mcp.json in your project root:
{
"mcpServers": {
"insightflow": {
"command": "python",
"args": ["-m", "insightflow.interfaces.mcp"],
"env": {
"OPENROUTER_API_KEY": "sk-or-v1-..."
}
}
}
}
With uvx (no local install required):
{
  "mcpServers": {
    "insightflow": {
      "command": "uvx",
      "args": ["--from", "insightflow[mcp]", "python", "-m", "insightflow.interfaces.mcp"],
      "env": {
        "OPENROUTER_API_KEY": "sk-or-v1-..."
      }
    }
  }
}
3. Use from Claude Code
Once configured, Claude Code recognizes the research tool. Ask Claude to research a topic in conversation and it will call insightflow automatically.
Tool parameters:
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| topic | string | Yes | - | Research topic |
| aspect_model | string | No | openai/gpt-4.1-mini | Model for aspect extraction |
| search_model | string | No | perplexity/sonar-reasoning-pro | Model for search |
| report_model | string | No | google/gemini-3-flash-preview | Model for report generation |
| language | string | No | japanese | Output language |
| max_aspects | integer | No | 5 | Maximum number of aspects |
| concurrency | integer | No | 3 | Maximum concurrent searches |
Configuration
Configurable via environment variables or .env file:
| Variable | Required | Default | Description |
|---|---|---|---|
| OPENROUTER_API_KEY | Yes | - | OpenRouter API key |
| DEFAULT_ASPECT_MODEL | No | openai/gpt-4.1-mini | Default aspect extraction model |
| DEFAULT_SEARCH_MODEL | No | perplexity/sonar-reasoning-pro | Default search model |
| DEFAULT_REPORT_MODEL | No | google/gemini-3-flash-preview | Default report generation model |
| DEFAULT_LANGUAGE | No | japanese | Default output language |
| DEFAULT_MAX_ASPECTS | No | 5 | Default number of aspects |
| DEFAULT_CONCURRENCY | No | 3 | Default concurrent searches |
Development
git clone https://github.com/sync-dev-org/insightflow.git
cd insightflow
uv sync --all-extras
# Test
uv run pytest
# Lint
uv run ruff check src/
uv run ruff format src/
License
MIT License - See LICENSE for details.