
An integration package connecting GitHub Copilot Chat and LangChain

Project description

LangChain GitHub Copilot Chat

This package provides a LangChain integration for GitHub Copilot, allowing you to use Copilot's models (including GPT-4o, Claude 3.5 Sonnet, etc.) as standard LangChain BaseChatModel components.

Unlike other integrations, this package mimics the behavior of the official VS Code Copilot Chat extension, giving you access to the full suite of models available to Copilot subscribers.

🚀 Features

  • Real Copilot API: Connects to api.githubcopilot.com using official VS Code headers.
  • Easy Auth: Built-in GitHub Device Flow for acquiring a valid Copilot Token.
  • Model Discovery: Dynamic fetching of all models authorized for your account.
  • LangChain Native: Full support for Streaming, Tool Calling, and Async operations.

📦 Installation

pip install -U langchain-githubcopilot-chat

🔐 Authentication

To use GitHub Copilot, you need a valid Copilot Token. You can obtain one interactively using the built-in helper:

from langchain_githubcopilot_chat import get_vscode_token

# This will prompt you to visit a GitHub URL and enter a code
token = get_vscode_token()
print(f"Your Token: {token}")

For custom output handling (e.g., in GUI applications), pass a callback:

from langchain_githubcopilot_chat import get_vscode_token

def on_message(msg):
    # Handle status messages (e.g., display in UI)
    print(f"[Copilot] {msg}")

token = get_vscode_token(callback=on_message)

Alternatively, set it as an environment variable:

export GITHUB_TOKEN="your_copilot_token_here"
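For context, the Device Flow used by the helper follows GitHub's standard OAuth device-authorization endpoints. The sketch below is illustrative only, not the package's implementation; the `client_id` you pass would need to be a real OAuth app client ID, and error handling is minimal:

```python
import json
import time
import urllib.parse
import urllib.request

DEVICE_CODE_URL = "https://github.com/login/device/code"
TOKEN_URL = "https://github.com/login/oauth/access_token"
GRANT_TYPE = "urn:ietf:params:oauth:grant-type:device_code"

def build_device_code_request(client_id: str) -> tuple[str, bytes]:
    """Return the URL and form body that start a device-flow login."""
    body = urllib.parse.urlencode({"client_id": client_id}).encode()
    return DEVICE_CODE_URL, body

def build_token_poll_request(client_id: str, device_code: str) -> tuple[str, bytes]:
    """Return the URL and form body used to poll for the access token."""
    body = urllib.parse.urlencode({
        "client_id": client_id,
        "device_code": device_code,
        "grant_type": GRANT_TYPE,
    }).encode()
    return TOKEN_URL, body

def device_flow_login(client_id: str) -> str:
    """Interactive login: print the user code, then poll until a token arrives."""
    url, body = build_device_code_request(client_id)
    req = urllib.request.Request(url, data=body, headers={"Accept": "application/json"})
    with urllib.request.urlopen(req) as resp:
        info = json.load(resp)
    print(f"Visit {info['verification_uri']} and enter code {info['user_code']}")
    while True:
        time.sleep(info.get("interval", 5))
        url, body = build_token_poll_request(client_id, info["device_code"])
        req = urllib.request.Request(url, data=body, headers={"Accept": "application/json"})
        with urllib.request.urlopen(req) as resp:
            result = json.load(resp)
        if "access_token" in result:
            return result["access_token"]
        if result.get("error") != "authorization_pending":
            raise RuntimeError(result.get("error", "device flow failed"))
```

The key detail is the polling loop: GitHub returns `authorization_pending` until the user approves the device, at which point the response carries `access_token`.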

🛠 Usage

Chat Models

Access any model supported by Copilot (e.g., gpt-4o, gpt-4o-mini, claude-3.5-sonnet).

from langchain_githubcopilot_chat import ChatGithubCopilot

# Initialize with a specific model
llm = ChatGithubCopilot(
    model="gpt-4o", 
    temperature=0.7
)

# Simple invocation
response = llm.invoke("Explain Quantum Entanglement in one sentence.")
print(response.content)

# Streaming
for chunk in llm.stream("Write a short poem about coding."):
    print(chunk.content, end="", flush=True)

Discovering Available Models

GitHub Copilot periodically updates its available models. You can list what's currently available for your token:

from langchain_githubcopilot_chat import get_available_models

models = get_available_models()
for model in models:
    print(f"ID: {model['id']} - Name: {model.get('name')}")
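Since the model list changes over time, you may want to pick the best available model from a preference list rather than hard-coding one. A small helper along these lines (illustrative; it operates on the same list-of-dicts shape shown above, with a stand-in discovery result) might look like:

```python
def pick_model(models: list[dict], preferred: list[str]) -> str:
    """Return the first preferred model id that is actually available."""
    available = {m["id"] for m in models}
    for model_id in preferred:
        if model_id in available:
            return model_id
    raise ValueError(f"None of {preferred} are available; choose from {sorted(available)}")

# Stand-in for a real get_available_models() result:
sample = [{"id": "gpt-4o", "name": "GPT-4o"}, {"id": "gpt-4o-mini", "name": "GPT-4o mini"}]
print(pick_model(sample, ["claude-3.5-sonnet", "gpt-4o"]))  # → gpt-4o
```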

Embeddings

Use Copilot's embedding models for RAG or semantic search:

from langchain_githubcopilot_chat import GithubcopilotChatEmbeddings

embeddings = GithubcopilotChatEmbeddings(model="text-embedding-3-small")
vector = embeddings.embed_query("GitHub Copilot is awesome!")
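For RAG or semantic search, the vectors returned by `embed_query` are typically ranked by cosine similarity. A minimal, library-free sketch (the 3-d vectors below are toy stand-ins for real embedding output, which has far more dimensions):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def rank_by_similarity(query_vec: list[float], doc_vecs: list[list[float]]) -> list[int]:
    """Return document indices sorted from most to least similar to the query."""
    scores = [cosine_similarity(query_vec, d) for d in doc_vecs]
    return sorted(range(len(doc_vecs)), key=lambda i: scores[i], reverse=True)

# Toy vectors standing in for embed_query / embed_documents output:
query = [1.0, 0.0, 0.0]
docs = [[0.0, 1.0, 0.0], [0.9, 0.1, 0.0], [0.5, 0.5, 0.0]]
print(rank_by_similarity(query, docs))  # → [1, 2, 0]
```

In practice you would use a vector store rather than brute-force ranking, but the scoring math is the same.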

📖 Advanced: Tool Calling

from pydantic import BaseModel, Field
from langchain_githubcopilot_chat import ChatGithubCopilot

class GetWeather(BaseModel):
    """Get the current weather in a given location."""
    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")

llm = ChatGithubCopilot(model="gpt-4o")
llm_with_tools = llm.bind_tools([GetWeather])

ai_msg = llm_with_tools.invoke("What's the weather like in Tokyo?")
print(ai_msg.tool_calls)
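Each entry in `tool_calls` is a dict with `name`, `args`, and `id` keys, so executing the calls is a matter of dispatching to your own functions. A sketch (the `get_weather` implementation and registry are hypothetical stand-ins; the `fake_calls` dict mimics the shape of `ai_msg.tool_calls`):

```python
def get_weather(location: str) -> str:
    """Stand-in implementation for the GetWeather tool."""
    return f"Sunny in {location}"

# Map tool names (as declared to the model) to local callables:
TOOL_REGISTRY = {"GetWeather": get_weather}

def run_tool_calls(tool_calls: list[dict]) -> list[str]:
    """Execute each tool call returned by the model against local functions."""
    results = []
    for call in tool_calls:
        fn = TOOL_REGISTRY[call["name"]]
        results.append(fn(**call["args"]))
    return results

# Shaped like LangChain's ai_msg.tool_calls:
fake_calls = [{"name": "GetWeather", "args": {"location": "Tokyo"},
               "id": "call_1", "type": "tool_call"}]
print(run_tool_calls(fake_calls))  # → ['Sunny in Tokyo']
```

In a full agent loop you would wrap each result in a `ToolMessage` (with `tool_call_id` set to the call's `id`) and invoke the model again so it can produce a final answer.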

⚖️ Disclaimer

This project is an independent community integration and is not affiliated with, endorsed by, or supported by GitHub, Inc. Usage of this package must comply with GitHub's Terms of Service.

Download files

Download the file for your platform.

Source Distribution

langchain_githubcopilot_chat-0.3.0.tar.gz (15.5 kB)

Built Distribution


langchain_githubcopilot_chat-0.3.0-py3-none-any.whl (18.4 kB)

File details

Details for the file langchain_githubcopilot_chat-0.3.0.tar.gz.

File metadata

File hashes

Hashes for langchain_githubcopilot_chat-0.3.0.tar.gz:

SHA256: e9b897badc840de1cea9bdd270b60d122568393f895cb394001d8a4ffe2718a6
MD5: 21f16a105e16ff0cf8ee6c7c35bbb1db
BLAKE2b-256: 339b31fea102ea803576fb5b138ab084c6ddf58cbb68659de7fb59a952175277


Provenance

The following attestation bundles were made for langchain_githubcopilot_chat-0.3.0.tar.gz:

Publisher: publish.yml on BANG404/langchain-githubcopilot-chat

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file langchain_githubcopilot_chat-0.3.0-py3-none-any.whl.

File metadata

File hashes

Hashes for langchain_githubcopilot_chat-0.3.0-py3-none-any.whl:

SHA256: 269e9c569773f042e56a77f2cb2a1b51fbf9acb4ed96a9a895fdcf7fddba6332
MD5: 590e9becb7a3fdc8a6f7089e30598e2a
BLAKE2b-256: 7e28c91cd912b055089a216161d33df370efedb41e32cf087f75aaca717bb871


Provenance

The following attestation bundles were made for langchain_githubcopilot_chat-0.3.0-py3-none-any.whl:

Publisher: publish.yml on BANG404/langchain-githubcopilot-chat

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
