
FutureProof


Career intelligence agent that gathers professional data, searches job boards, analyzes career trajectories, and generates ATS-optimized CVs — all through conversational chat. Built with LangChain, LangGraph, and ChromaDB. Supports OpenAI, Anthropic, Google, Azure, and Ollama.

What It Does

You:   Gather all my career data
Agent: [gathers LinkedIn, portfolio, CliftonStrengths → indexes to ChromaDB]

You:   Analyze my skill gaps for Staff Engineer
Agent: [runs skill gap analysis using your data + market trends]

You:   Search for remote Python developer jobs in Europe
Agent: [queries 7 job boards + Hacker News hiring threads]

You:   Generate my CV targeting that Staff Engineer role
Agent: [generates ATS-optimized CV in Markdown + PDF]

One agent, 40 tools, 12 MCP clients. Data sources: LinkedIn CSV export, GitHub (live MCP), GitLab (glab CLI), portfolio websites, CliftonStrengths PDF, 7 job boards, Hacker News, Dev.to, Stack Overflow, Tavily search.

Architecture

graph LR
    User <-->|Rich UI, HITL| Chat[Chat Client]
    Chat <--> Agent[Single Agent<br/>40 tools]

    Agent --> Gather[Gatherers]
    Agent --> MCP[12 MCP Clients]
    Agent --> Analysis[Career Analysis]
    Agent --> Gen[CV Generator]

    Gather -->|LinkedIn CSV, Portfolio,<br/>CliftonStrengths| ChromaDB[(ChromaDB)]
    MCP -->|GitHub, 7 job boards,<br/>HN, Tavily, Dev.to, SO| Agent
    Analysis --> LLM[Multi-Provider LLM<br/>Fallback Chain]
    Gen -->|Markdown + PDF| Output[CV Output]

    ChromaDB -->|RAG search| Agent
    ChromaDB -->|Episodic memory| Agent
    LLM -->|Purpose-based routing| Agent

Key design decisions:

  • Single agent — multi-agent handoffs failed with GPT-4.1 (over-delegation, lost context). One agent with all tools is simpler and more reliable.
  • Database-first pipeline — gatherers return Section NamedTuples and index directly to ChromaDB. No intermediate files, no markdown header roundtrip.
  • Two-pass synthesis — GPT-4o genericizes analysis responses regardless of prompt engineering. AnalysisSynthesisMiddleware lets the agent do tool calling, then replaces its generic response with focused synthesis from a reasoning model.
  • Multi-provider fallback — supports OpenAI, Anthropic, Google, Azure, Ollama, and FutureProof proxy. Provider-specific fallback chains with automatic rate-limit recovery and purpose-based routing (agent/analysis/summary/synthesis).
  • HITL confirmation — destructive or expensive operations (CV generation, full data gathering, knowledge clearing) require user approval via LangGraph's interrupt().
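The multi-provider fallback idea can be sketched in plain Python. The provider names and the purpose-to-provider mapping below are illustrative stand-ins, not FutureProof's actual FallbackLLMManager API:

```python
from collections.abc import Callable

# Hypothetical purpose → ordered provider preference (illustrative only).
ROUTING: dict[str, list[str]] = {
    "agent": ["openai", "anthropic", "ollama"],
    "analysis": ["anthropic", "openai", "ollama"],
    "summary": ["ollama", "openai"],
    "synthesis": ["anthropic", "openai"],
}

def invoke_with_fallback(purpose: str, call: Callable[[str], str]) -> str:
    """Try each provider for a purpose in order; move on when one fails."""
    errors = []
    for provider in ROUTING[purpose]:
        try:
            return call(provider)
        except Exception as exc:  # e.g. rate limit, auth failure
            errors.append(f"{provider}: {exc}")
    raise RuntimeError("all providers failed: " + "; ".join(errors))

# Simulated backends: the first provider is rate-limited, the second answers.
def fake_call(provider: str) -> str:
    if provider == "anthropic":
        raise TimeoutError("rate limited")
    return f"response from {provider}"

print(invoke_with_fallback("analysis", fake_call))  # → response from openai
```

The real manager additionally recovers from rate limits automatically; the sketch only shows the routing-plus-fallback shape.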
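The database-first pipeline can be approximated like this. The exact `Section` fields and the in-memory store are assumptions standing in for the real gatherer output and ChromaDB collection:

```python
from typing import NamedTuple

class Section(NamedTuple):
    """Assumed shape of a gatherer's output unit."""
    source: str   # e.g. "linkedin", "portfolio"
    heading: str
    text: str

# In-memory stand-in for a ChromaDB collection: id → (document, metadata).
collection: dict[str, tuple[str, dict[str, str]]] = {}

def index_sections(sections: list[Section]) -> int:
    """Index gatherer output directly — no intermediate markdown files."""
    for i, sec in enumerate(sections):
        doc_id = f"{sec.source}-{i}"
        collection[doc_id] = (sec.text, {"source": sec.source, "heading": sec.heading})
    return len(collection)

count = index_sections([
    Section("linkedin", "Experience", "Staff Engineer, 2020–2024"),
    Section("portfolio", "Projects", "Built a career intelligence agent."),
])
print(count)  # → 2
```

Because sections carry their source and heading as metadata, there is no need to round-trip through markdown headers to reconstruct structure later.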

Quick Start

pipx install fu7ur3pr00f
fu7ur3pr00f

If fu7ur3pr00f is not found, run pipx ensurepath and restart your shell.

On first launch, the /setup wizard prompts you to configure an LLM provider. Supports OpenAI, Anthropic, Google, Azure, Ollama, or the FutureProof proxy. Settings are saved to ~/.fu7ur3pr00f/.env. Everything happens inside the chat — use /help to see all commands.

Install via apt (Debian/Ubuntu, amd64)

curl -fsSL https://juanmanueldaza.github.io/fu7ur3pr00f/fu7ur3pr00f-archive-keyring.gpg | \
  sudo tee /usr/share/keyrings/fu7ur3pr00f-archive-keyring.gpg >/dev/null

echo "deb [signed-by=/usr/share/keyrings/fu7ur3pr00f-archive-keyring.gpg] \
https://juanmanueldaza.github.io/fu7ur3pr00f stable main" | \
  sudo tee /etc/apt/sources.list.d/fu7ur3pr00f.list >/dev/null

sudo apt update
sudo apt install fu7ur3pr00f

The apt package bundles github-mcp-server and installs glab, poppler-utils, and WeasyPrint system libraries as dependencies.

PDF generation (CVs) requires system libraries for text rendering. Without them the app works fine — you just get Markdown output instead of PDF.

Ubuntu/Debian: sudo apt-get install libpango-1.0-0 libpangoft2-1.0-0 libcairo2 libfontconfig1 libgdk-pixbuf-2.0-0 poppler-utils

macOS: brew install pango cairo gdk-pixbuf poppler
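The graceful degradation described above might look roughly like this; the function name is illustrative (the real generator lives in generators/), but the import check mirrors how WeasyPrint fails when pango/cairo are missing:

```python
def render_cv(markdown_text: str, out_stem: str) -> str:
    """Write the CV as a PDF when WeasyPrint's system libraries are
    available; otherwise fall back to plain Markdown output."""
    try:
        from weasyprint import HTML  # raises if pango/cairo are absent
    except Exception:
        path = f"{out_stem}.md"
        with open(path, "w", encoding="utf-8") as f:
            f.write(markdown_text)
        return path
    # Minimal HTML wrapper; a real generator would convert Markdown properly.
    HTML(string=f"<pre>{markdown_text}</pre>").write_pdf(f"{out_stem}.pdf")
    return f"{out_stem}.pdf"

print(render_cv("# Jane Doe\nStaff Engineer", "cv"))
```

Either way the caller gets a file path back, which is why missing system libraries degrade the output format rather than breaking the app.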

Project Structure

src/fu7ur3pr00f/
├── agents/
│   ├── career_agent.py     # Single agent: create_agent(), 4 middlewares, singleton cache
│   ├── middleware.py       # Dynamic prompt, synthesis, tool repair, summarization
│   ├── orchestrator.py     # LangGraph Functional API for analysis workflows
│   ├── helpers/            # Orchestrator support (data pipeline, LLM invoker)
│   └── tools/              # 40 tools by domain (profile, gathering, analysis, market, settings)
├── chat/                   # Streaming client, HITL interrupt loop, Rich UI, /setup wizard
├── gatherers/              # LinkedIn CSV, CliftonStrengths PDF, portfolio scraper, market data
├── generators/             # CV generation (Markdown + PDF via WeasyPrint)
├── llm/                    # FallbackLLMManager: multi-provider fallback, purpose-based routing
├── memory/                 # ChromaDB (knowledge RAG + episodic), chunker, profile, embeddings
├── mcp/                    # 12 MCP clients: GitHub, Tavily, financial, 7 job boards (incl. HN), Dev.to, SO
├── prompts/                # System + analysis + CV prompt templates
├── services/               # GathererService, AnalysisService, KnowledgeService
└── utils/                  # PII anonymization, data loading, logging

Development

git clone https://github.com/juanmanueldaza/fu7ur3pr00f.git
cd fu7ur3pr00f
pip install -e .
pip install pyright pytest ruff    # dev tools

pytest tests/ -q              # Unit tests
pyright src/fu7ur3pr00f       # Type checking
ruff check .                  # Lint

Fresh Install Connectivity Check

Use this to validate a clean pipx install plus MCP/LLM connectivity from a temporary HOME.

scripts/fresh_install_check.sh --source local --config-from .env
scripts/fresh_install_check.sh --source pypi --config-from .env

Tech Stack

Python 3.13 · LangChain + LangGraph · ChromaDB · Multi-provider LLM (OpenAI, Anthropic, Google, Azure, Ollama) · Typer + Rich · WeasyPrint · httpx


Licensed under GPL-2.0.
