An integration package connecting GitHub Copilot Chat and LangChain

LangChain GitHub Copilot Chat

This package provides a LangChain integration for GitHub Copilot, allowing you to use Copilot's models (including GPT-4o, Claude 3.5 Sonnet, etc.) as standard LangChain BaseChatModel components.

Unlike other integrations, this package mimics the behavior of the official VS Code Copilot Chat extension, providing access to the full suite of models available to Copilot subscribers.

🚀 Features

  • Real Copilot API: Connects to api.githubcopilot.com using official VS Code headers.
  • Easy Auth: Built-in GitHub Device Flow for acquiring a valid Copilot Token.
  • Model Discovery: Dynamic fetching of all models authorized for your account.
  • LangChain Native: Full support for Streaming, Tool Calling, and Async operations.

📦 Installation

pip install -U langchain-githubcopilot-chat

🔐 Authentication

To use GitHub Copilot, you need a valid Copilot Token. You can obtain one interactively using the built-in helper:

from langchain_githubcopilot_chat import get_vscode_token

# This will prompt you to visit a GitHub URL and enter a code
token = get_vscode_token()
print(f"Your Token: {token}")

For custom output handling (e.g., in GUI applications), pass a callback:

from langchain_githubcopilot_chat import get_vscode_token

def on_message(msg):
    # Handle status messages (e.g., display in UI)
    print(f"[Copilot] {msg}")

token = get_vscode_token(callback=on_message)

Alternatively, set it as an environment variable:

export GITHUB_TOKEN="your_copilot_token_here"
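If you prefer to configure the token from Python rather than the shell, set it in-process before constructing the model. A minimal sketch; the variable name `GITHUB_TOKEN` is taken from the `export` line above, and whether the package honors any other variable names is not covered here:

```python
import os

# Equivalent to the shell `export` above; set only if not already present.
# Replace the placeholder with a real token from get_vscode_token().
if "GITHUB_TOKEN" not in os.environ:
    os.environ["GITHUB_TOKEN"] = "your_copilot_token_here"

# Any code constructing the chat model after this point can read it:
token = os.environ["GITHUB_TOKEN"]
print(f"Token configured ({len(token)} chars)")
```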

🛠 Usage

Chat Models

Access any model supported by Copilot (e.g., gpt-4o, gpt-4o-mini, claude-3.5-sonnet).

from langchain_githubcopilot_chat import ChatGithubCopilot

# Initialize with a specific model
llm = ChatGithubCopilot(
    model="gpt-4o", 
    temperature=0.7
)

# Simple invocation
response = llm.invoke("Explain Quantum Entanglement in one sentence.")
print(response.content)

# Streaming
for chunk in llm.stream("Write a short poem about coding."):
    print(chunk.content, end="", flush=True)
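Streamed chunks can also be accumulated into the complete reply as they arrive. A sketch of that pattern using stand-in chunk objects, so it runs without a live token; real chunks from `llm.stream(...)` expose `.content` the same way:

```python
from types import SimpleNamespace

def collect_stream(chunks):
    """Concatenate the .content of each streamed chunk into one string."""
    parts = []
    for chunk in chunks:
        print(chunk.content, end="", flush=True)  # live display, as above
        parts.append(chunk.content)
    return "".join(parts)

# Stand-in for llm.stream(...); real usage passes the generator directly.
fake_chunks = [SimpleNamespace(content=t) for t in ["Code ", "flows ", "free."]]
full_text = collect_stream(fake_chunks)
print()
print(full_text)  # → Code flows free.
```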

Discovering Available Models

GitHub Copilot periodically updates its available models. You can list what's currently available for your token:

from langchain_githubcopilot_chat import get_available_models

models = get_available_models()
for model in models:
    print(f"ID: {model['id']} - Name: {model.get('name')}")
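Because the model list changes over time, it can be useful to filter it before picking one. A sketch using hand-written sample entries shaped like the `id`/`name` dictionaries above; the live list from `get_available_models()` may differ:

```python
def pick_models(models, prefix):
    """Return the IDs of models whose id starts with the given prefix."""
    return [m["id"] for m in models if m["id"].startswith(prefix)]

# Sample data mirroring the id/name shape printed above.
sample = [
    {"id": "gpt-4o", "name": "GPT-4o"},
    {"id": "gpt-4o-mini", "name": "GPT-4o mini"},
    {"id": "claude-3.5-sonnet", "name": "Claude 3.5 Sonnet"},
]
print(pick_models(sample, "gpt-"))  # → ['gpt-4o', 'gpt-4o-mini']
```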

Embeddings

Use Copilot's embedding models for RAG or semantic search:

from langchain_githubcopilot_chat import GithubcopilotChatEmbeddings

embeddings = GithubcopilotChatEmbeddings(model="text-embedding-3-small")
vector = embeddings.embed_query("GitHub Copilot is awesome!")
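Semantic search ranks documents by comparing their vectors to the query vector, typically with cosine similarity. A self-contained sketch of that ranking step; the toy vectors below stand in for real `embed_query` / `embed_documents` output:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy vectors standing in for embed_documents(...) output.
docs = {"copilot": [0.9, 0.1, 0.0], "weather": [0.1, 0.8, 0.2]}
query_vec = [0.8, 0.2, 0.1]  # stand-in for embed_query("...")

best = max(docs, key=lambda name: cosine(query_vec, docs[name]))
print(best)  # → copilot
```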

📖 Advanced: Tool Calling

from pydantic import BaseModel, Field
from langchain_githubcopilot_chat import ChatGithubCopilot

class GetWeather(BaseModel):
    """Get the current weather in a given location."""
    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")

llm = ChatGithubCopilot(model="gpt-4o")
llm_with_tools = llm.bind_tools([GetWeather])

ai_msg = llm_with_tools.invoke("What's the weather like in Tokyo?")
print(ai_msg.tool_calls)
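Note that the model only *requests* the tool call; your code executes it and returns the result. A sketch of the dispatch step, assuming LangChain's standard tool-call dict shape (`name`/`args`/`id`) and a hypothetical local `get_weather` function backing the `GetWeather` schema:

```python
def get_weather(location: str) -> str:
    """Hypothetical local implementation backing the GetWeather schema."""
    return f"Sunny in {location}"

TOOLS = {"GetWeather": get_weather}

def run_tool_call(tool_call):
    """Execute one LangChain-style tool call dict: {'name', 'args', 'id'}."""
    fn = TOOLS[tool_call["name"]]
    return fn(**tool_call["args"])

# Shape mirrors the entries printed by ai_msg.tool_calls above.
call = {"name": "GetWeather", "args": {"location": "Tokyo"}, "id": "call_1"}
print(run_tool_call(call))  # → Sunny in Tokyo
```

In a full loop, the returned string would be wrapped in a `ToolMessage` and sent back to the model for a final answer.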

⚖️ Disclaimer

This project is an independent community integration and is not affiliated with, endorsed by, or supported by GitHub, Inc. Usage of this package must comply with GitHub's Terms of Service.

