# FutureProof
Career intelligence agent that gathers professional data, searches job boards, analyzes career trajectories, and generates ATS-optimized CVs — all through conversational chat. Built with LangChain, LangGraph, and ChromaDB. Supports OpenAI, Anthropic, Google, Azure, and Ollama.
## What It Does
```
You:   Gather all my career data
Agent: [gathers LinkedIn, portfolio, CliftonStrengths → indexes to ChromaDB]

You:   Analyze my skill gaps for Staff Engineer
Agent: [runs skill gap analysis using your data + market trends]

You:   Search for remote Python developer jobs in Europe
Agent: [queries 7 job boards + Hacker News hiring threads]

You:   Generate my CV targeting that Staff Engineer role
Agent: [generates ATS-optimized CV in Markdown + PDF]
```
One agent, 40 tools, 12 MCP clients. Data sources: LinkedIn CSV export, GitHub (live MCP), GitLab (glab CLI), portfolio websites, CliftonStrengths PDF, 7 job boards, Hacker News, Dev.to, Stack Overflow, Tavily search.
## Architecture
```mermaid
graph LR
    User <-->|Rich UI, HITL| Chat[Chat Client]
    Chat <--> Agent[Single Agent<br/>40 tools]
    Agent --> Gather[Gatherers]
    Agent --> MCP[12 MCP Clients]
    Agent --> Analysis[Career Analysis]
    Agent --> Gen[CV Generator]
    Gather -->|LinkedIn CSV, Portfolio,<br/>CliftonStrengths| ChromaDB[(ChromaDB)]
    MCP -->|GitHub, 7 job boards,<br/>HN, Tavily, Dev.to, SO| Agent
    Analysis --> LLM[Multi-Provider LLM<br/>Fallback Chain]
    Gen -->|Markdown + PDF| Output[CV Output]
    ChromaDB -->|RAG search| Agent
    ChromaDB -->|Episodic memory| Agent
    LLM -->|Purpose-based routing| Agent
```
**Key design decisions:**
- Single agent — multi-agent handoffs failed with GPT-4.1 (over-delegation, lost context). One agent with all tools is simpler and more reliable.
- Database-first pipeline — gatherers return `SectionNamedTuple`s and index directly to ChromaDB. No intermediate files, no markdown header roundtrip.
- Two-pass synthesis — GPT-4o genericizes analysis responses regardless of prompt engineering. `AnalysisSynthesisMiddleware` lets the agent do tool calling, then replaces its generic response with focused synthesis from a reasoning model.
- Multi-provider fallback — supports OpenAI, Anthropic, Google, Azure, Ollama, and the FutureProof proxy. Provider-specific fallback chains with automatic rate-limit recovery and purpose-based routing (agent/analysis/summary/synthesis).
- HITL confirmation — destructive or expensive operations (CV generation, full data gathering, knowledge clearing) require user approval via LangGraph's `interrupt()`.
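The fallback-chain-plus-routing idea can be sketched in plain Python. This is a hypothetical illustration, not the actual `FallbackLLMManager`: the chain contents, function names, and provider stubs below are all assumptions.

```python
# Hypothetical sketch of purpose-based routing over provider fallback chains.
# Chain order is illustrative; the real FallbackLLMManager differs.
class RateLimited(Exception):
    """Raised by a provider stub to simulate a 429-style error."""

PURPOSE_CHAINS = {
    "agent":     ["openai", "anthropic", "ollama"],
    "analysis":  ["anthropic", "openai", "ollama"],
    "summary":   ["google", "openai", "ollama"],
    "synthesis": ["anthropic", "openai", "ollama"],
}

def invoke_with_fallback(purpose: str, prompt: str, providers: dict) -> str:
    """Try each provider in the purpose's chain; fall through on rate limits."""
    errors = []
    for name in PURPOSE_CHAINS[purpose]:
        try:
            return providers[name](prompt)
        except RateLimited as exc:
            errors.append((name, exc))
    raise RuntimeError(f"all providers failed for {purpose!r}: {errors}")

def rate_limited_provider(prompt: str) -> str:
    raise RateLimited("429 Too Many Requests")

# Stub providers: the first in the "agent" chain is rate-limited,
# so the call falls through to the next provider.
providers = {
    "openai": rate_limited_provider,
    "anthropic": lambda prompt: f"anthropic:{prompt}",
    "ollama": lambda prompt: f"ollama:{prompt}",
}
print(invoke_with_fallback("agent", "hello"[:5], providers))  # → anthropic:hello
```

The point of the routing table is that each purpose can prefer a different first provider while sharing the same recovery behavior.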
## Quick Start
Tier-1 package support today is Debian/Ubuntu on amd64.
```bash
curl -fsSL https://juanmanueldaza.github.io/fu7ur3pr00f/fu7ur3pr00f-archive-keyring.gpg | \
  sudo tee /usr/share/keyrings/fu7ur3pr00f-archive-keyring.gpg >/dev/null
echo "deb [arch=amd64 signed-by=/usr/share/keyrings/fu7ur3pr00f-archive-keyring.gpg] \
  https://juanmanueldaza.github.io/fu7ur3pr00f stable main" | \
  sudo tee /etc/apt/sources.list.d/fu7ur3pr00f.list >/dev/null
sudo apt update
sudo apt install fu7ur3pr00f
fu7ur3pr00f
```
On first launch, the `/setup` wizard prompts you to configure an LLM provider: OpenAI, Anthropic, Google, Azure, Ollama, or the FutureProof proxy. Settings are saved to `~/.fu7ur3pr00f/.env`. Everything happens inside the chat — use `/help` to see all commands.
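For orientation, a saved `~/.fu7ur3pr00f/.env` might look roughly like this. The key names below are the standard environment variables for each provider's SDK, but exactly which variables the wizard writes is an assumption:

```
# ~/.fu7ur3pr00f/.env (illustrative; actual contents depend on the provider
# chosen in /setup — values here are placeholders, not real keys)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```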
### Install via apt (Debian/Ubuntu, amd64)
The apt package is self-contained: installation places a ready-to-run Python runtime and CLI under `/opt/fu7ur3pr00f`, with no `pip install` and no network bootstrap during `apt install`.

The package bundles `github-mcp-server`. Some optional integrations rely on extra system packages and degrade gracefully if they are not present.
### Optional extras
- GitLab tools: `sudo apt-get install glab`
- CliftonStrengths PDF import: `sudo apt-get install poppler-utils`
- PDF generation (CV export): `sudo apt-get install libpango-1.0-0 libpangoft2-1.0-0 libcairo2 libfontconfig1 libgdk-pixbuf-2.0-0`
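Graceful degradation of this kind usually comes down to feature detection at startup. A minimal sketch with Python's standard library — the function name and feature keys are hypothetical, not the project's actual code:

```python
import shutil

def detect_optional_features() -> dict[str, bool]:
    """Probe PATH for optional system dependencies; a missing binary
    simply disables the corresponding feature instead of crashing."""
    return {
        # glab CLI enables the GitLab tools
        "gitlab": shutil.which("glab") is not None,
        # pdftotext ships with poppler-utils; needed for CliftonStrengths import
        "pdf_import": shutil.which("pdftotext") is not None,
    }

for name, available in detect_optional_features().items():
    if not available:
        print(f"optional feature disabled: {name}")
```

Probing with `shutil.which` keeps the check cheap and side-effect free, which is why it suits install-time-optional dependencies.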
## Other Platforms
apt is the first polished native channel. rpm, AUR, and Homebrew support are planned from the same packaged runtime model, but they are not yet supported installation paths.

For contributor workflows and unsupported platforms, `pipx install fu7ur3pr00f` still works as a development/testing fallback.
## Project Structure
```text
src/fu7ur3pr00f/
├── agents/
│   ├── career_agent.py   # Single agent: create_agent(), 4 middlewares, singleton cache
│   ├── middleware.py     # Dynamic prompt, synthesis, tool repair, summarization
│   ├── orchestrator.py   # LangGraph Functional API for analysis workflows
│   ├── helpers/          # Orchestrator support (data pipeline, LLM invoker)
│   └── tools/            # 40 tools by domain (profile, gathering, analysis, market, settings)
├── chat/                 # Streaming client, HITL interrupt loop, Rich UI, /setup wizard
├── gatherers/            # LinkedIn CSV, CliftonStrengths PDF, portfolio scraper, market data
├── generators/           # CV generation (Markdown + PDF via WeasyPrint)
├── llm/                  # FallbackLLMManager: multi-provider fallback, purpose-based routing
├── memory/               # ChromaDB (knowledge RAG + episodic), chunker, profile, embeddings
├── mcp/                  # 12 MCP clients: GitHub, Tavily, financial, 7 job boards (incl. HN), Dev.to, SO
├── prompts/              # System + analysis + CV prompt templates
├── services/             # GathererService, AnalysisService, KnowledgeService
└── utils/                # PII anonymization, data loading, logging
```
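As a rough illustration of what the `memory/` chunker does — splitting gathered documents into overlapping windows before embedding — here is a minimal sketch. The function name, window sizes, and word-based splitting are assumptions, not the project's actual implementation:

```python
def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping word windows for embedding/RAG indexing.

    Overlap keeps context that straddles a chunk boundary retrievable
    from both neighboring chunks.
    """
    words = text.split()
    if len(words) <= size:
        return [text] if text.strip() else []
    step = size - overlap
    return [
        " ".join(words[i:i + size])
        for i in range(0, len(words) - overlap, step)
    ]

# 500 words, 200-word windows, 50-word overlap → chunks at offsets 0, 150, 300.
chunks = chunk_text("word " * 500, size=200, overlap=50)
print(len(chunks))  # → 3
```

Real pipelines often chunk on sentence or section boundaries instead of raw word counts, but the overlap-window shape is the same.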
## Development
```bash
git clone https://github.com/juanmanueldaza/fu7ur3pr00f.git
cd fu7ur3pr00f
pip install -e .
pip install pyright pytest ruff   # dev tools

pytest tests/ -q                  # unit tests
pyright src/fu7ur3pr00f           # type checking
ruff check .                      # lint
```
### Fresh Install Connectivity Check
Use this to validate a clean pipx install plus MCP/LLM connectivity from a temporary HOME.
```bash
scripts/fresh_install_check.sh --source local --config-from .env
scripts/fresh_install_check.sh --source pypi --config-from .env
```
## Tech Stack
Python 3.13 · LangChain + LangGraph · ChromaDB · Multi-provider LLM (OpenAI, Anthropic, Google, Azure, Ollama) · Typer + Rich · WeasyPrint · httpx
Licensed under GPL-2.0.
## File details
### fu7ur3pr00f-0.1.8.tar.gz (source distribution)

- Size: 190.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `4634cf9448ddf4e024e0b796a4547c92cbf96c6a9b0252fe3506d3b18b20e4a9` |
| MD5 | `992addcdb97d282fa15b551628f782e6` |
| BLAKE2b-256 | `7241ed76822e7238198d3c2b7d336fabde1235b84043a480260a5767e2ff161e` |
### fu7ur3pr00f-0.1.8-py3-none-any.whl (built distribution)

- Size: 200.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `b65531dcc1d35578e66f7b3d2a900e82c5d44c4753f6cc84abefb97feb5bc670` |
| MD5 | `eb85a8543357a66d93966a059cb8d217` |
| BLAKE2b-256 | `c198a8de543e4ddcfd0f549973337cc2669f93d9afb4a8b8c45fa6cc24fdd8ba` |