
An integration package connecting GitHub Copilot Chat and LangChain

Project description

LangChain GitHub Copilot Chat

This package provides a LangChain integration for GitHub Copilot, allowing you to use Copilot's models (including GPT-4o, Claude 3.5 Sonnet, etc.) as standard LangChain BaseChatModel components.

Unlike other integrations, this package mimics the behavior of the official VS Code Copilot Chat extension, providing access to the full suite of models available to Copilot subscribers.

🚀 Features

  • Real Copilot API: Connects to api.githubcopilot.com using official VS Code headers.
  • Easy Auth: Built-in GitHub Device Flow for acquiring a valid Copilot Token.
  • Model Discovery: Dynamic fetching of all models authorized for your account.
  • LangChain Native: Full support for Streaming, Tool Calling, and Async operations.

📦 Installation

pip install -U langchain-githubcopilot-chat

🔐 Authentication

To use GitHub Copilot, you need a valid Copilot Token. You can obtain one interactively using the built-in helper:

from langchain_githubcopilot_chat import get_vscode_token

# This will prompt you to visit a GitHub URL and enter a code
token = get_vscode_token()
print(f"Your Token: {token}")

For custom output handling (e.g., in GUI applications), pass a callback:

from langchain_githubcopilot_chat import get_vscode_token

def on_message(msg):
    # Handle status messages (e.g., display in UI)
    print(f"[Copilot] {msg}")

token = get_vscode_token(callback=on_message)

Alternatively, set it as an environment variable:

export GITHUB_TOKEN="your_copilot_token_here"
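If you acquire the token programmatically, you can also export it for the current process before constructing the model. A minimal sketch (the placeholder value stands in for a token returned by `get_vscode_token()`):

```python
import os

# Placeholder standing in for the token returned by get_vscode_token()
token = "your_copilot_token_here"

# Export it for the current process; child processes inherit it as well
os.environ["GITHUB_TOKEN"] = token
```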

🛠 Usage

Chat Models

Access any model supported by Copilot (e.g., gpt-4o, gpt-4o-mini, claude-3.5-sonnet).

from langchain_githubcopilot_chat import ChatGithubCopilot

# Initialize with a specific model
llm = ChatGithubCopilot(
    model="gpt-4o", 
    temperature=0.7
)

# Simple invocation
response = llm.invoke("Explain Quantum Entanglement in one sentence.")
print(response.content)

# Streaming
for chunk in llm.stream("Write a short poem about coding."):
    print(chunk.content, end="", flush=True)

Discovering Available Models

GitHub Copilot periodically updates its available models. You can list what's currently available for your token:

from langchain_githubcopilot_chat import get_available_models

models = get_available_models()
for model in models:
    print(f"ID: {model['id']} - Name: {model.get('name')}")
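Each entry is a dictionary; beyond `id` and `name`, the exact fields depend on the API response. Filtering the list is plain Python, sketched here with illustrative entries in place of a live `get_available_models()` call:

```python
# Illustrative entries standing in for get_available_models() output;
# real responses may carry additional fields.
models = [
    {"id": "gpt-4o", "name": "GPT-4o"},
    {"id": "gpt-4o-mini", "name": "GPT-4o mini"},
    {"id": "claude-3.5-sonnet", "name": "Claude 3.5 Sonnet"},
]

# Keep only the Claude family
claude_ids = [m["id"] for m in models if m["id"].startswith("claude")]
print(claude_ids)  # ['claude-3.5-sonnet']
```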

Embeddings

Use Copilot's embedding models for RAG or semantic search:

from langchain_githubcopilot_chat import GithubcopilotChatEmbeddings

embeddings = GithubcopilotChatEmbeddings(model="text-embedding-3-small")
vector = embeddings.embed_query("GitHub Copilot is awesome!")
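In LangChain, `embed_query` returns a plain list of floats, so comparing vectors needs no extra dependencies. A minimal cosine-similarity helper, shown with toy vectors standing in for real embeddings:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for embed_query() output
v1 = [1.0, 0.0, 1.0]
v2 = [1.0, 0.0, 1.0]
v3 = [0.0, 1.0, 0.0]

print(cosine_similarity(v1, v2))  # 1.0 (same direction)
print(cosine_similarity(v1, v3))  # 0.0 (orthogonal)
```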

📖 Advanced: Tool Calling

from pydantic import BaseModel, Field
from langchain_githubcopilot_chat import ChatGithubCopilot

class GetWeather(BaseModel):
    """Get the current weather in a given location."""
    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")

llm = ChatGithubCopilot(model="gpt-4o")
llm_with_tools = llm.bind_tools([GetWeather])

ai_msg = llm_with_tools.invoke("What's the weather like in Tokyo?")
print(ai_msg.tool_calls)
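The returned `tool_calls` is a list of dictionaries with `name`, `args`, and `id` keys (LangChain's standard shape). One way to dispatch them to local Python functions, sketched with a hard-coded payload in place of a live model response and a hypothetical `get_weather` stub:

```python
# Hard-coded payload shaped like AIMessage.tool_calls
tool_calls = [
    {"name": "GetWeather", "args": {"location": "Tokyo"}, "id": "call_1"},
]

def get_weather(location: str) -> str:
    # Hypothetical stub; a real tool would query a weather service
    return f"Weather report for {location}"

# Map tool names to their implementations
handlers = {"GetWeather": get_weather}

results = [handlers[call["name"]](**call["args"]) for call in tool_calls]
print(results)  # ['Weather report for Tokyo']
```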

⚖️ Disclaimer

This project is an independent community integration and is not affiliated with, endorsed by, or supported by GitHub, Inc. Usage of this package must comply with GitHub's Terms of Service.

Project details


Download files

Download the file for your platform.

Source Distribution

langchain_githubcopilot_chat-0.6.1.tar.gz (13.5 kB)


Built Distribution


langchain_githubcopilot_chat-0.6.1-py3-none-any.whl (15.8 kB)


File details

Details for the file langchain_githubcopilot_chat-0.6.1.tar.gz.

File metadata

File hashes

Hashes for langchain_githubcopilot_chat-0.6.1.tar.gz:

  • SHA256: e83cfef4249749ff2384d2e819df97f84ad60b576108f16534439c6060bb4cb1
  • MD5: 8442c1054812a10d459ae1f0885d3a4e
  • BLAKE2b-256: fccc691b4770d6f0ed35c065832ce0fced8121417620feb4e41c6b291a27096c


Provenance

The following attestation bundles were made for langchain_githubcopilot_chat-0.6.1.tar.gz:

Publisher: publish.yml on BANG404/langchain-githubcopilot-chat

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file langchain_githubcopilot_chat-0.6.1-py3-none-any.whl.

File metadata

File hashes

Hashes for langchain_githubcopilot_chat-0.6.1-py3-none-any.whl:

  • SHA256: 23b9d041ec152d5aea0f8bc1c431ebb9e1813c0155d78e851fb51cbc5a75f485
  • MD5: 48c1370e989681f4c03a31c896427c31
  • BLAKE2b-256: a8b7b5635088c28740a706373286a2bb7df1ce00dc06d886fd39e92fe0fcb9d9


Provenance

The following attestation bundles were made for langchain_githubcopilot_chat-0.6.1-py3-none-any.whl:

Publisher: publish.yml on BANG404/langchain-githubcopilot-chat

