Dartmouth LangChain
LangChain components for Dartmouth-hosted models.
Getting started
- Install the package:
pip install langchain_dartmouth
- Obtain a Dartmouth API key from developer.dartmouth.edu
- Store the API key as an environment variable called
DARTMOUTH_API_KEY:
export DARTMOUTH_API_KEY=<your_key_here>
- Obtain a Dartmouth Chat API key
- Store the API key as an environment variable called
DARTMOUTH_CHAT_API_KEY:
export DARTMOUTH_CHAT_API_KEY=<your_key_here>
[!NOTE] You may want to make the environment variables permanent or use a
.env file.
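Once the variables are exported, it can help to sanity-check that they are visible to your Python process before constructing any models. This is a minimal, stdlib-only sketch; the helper name `require_env` is illustrative and not part of `langchain_dartmouth`:

```python
import os


def require_env(name: str) -> str:
    """Return the value of an environment variable, or fail with a clear hint."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"{name} is not set; see the export commands above.")
    return value


# Example: confirm both keys are available before building any models.
# require_env("DARTMOUTH_API_KEY")
# require_env("DARTMOUTH_CHAT_API_KEY")
```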
What is this?
This library provides an integration of Dartmouth-provided generative AI resources with the LangChain framework.
There are three main components currently implemented:
- Large Language Models
- Embedding models
- Reranking models
All of these components subclass the corresponding LangChain base classes, so they can be used as drop-in replacements wherever those LangChain objects are expected.
Using the library
Large Language Models
There are three kinds of Large Language Models (LLMs) provided by Dartmouth:
- On-premises:
- Base models without instruction tuning (require no special prompt format)
- Instruction-tuned models (also known as Chat models) requiring specific prompt formats
- Cloud:
- Third-party, pay-as-you-go chat models (e.g., OpenAI's GPT-4.1, Google Gemini)
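Which wrapper class fits which kind of model can be sketched as a small helper. This is a hypothetical convenience function, not part of `langchain_dartmouth`; the provider-prefix convention (e.g. `openai.`) is inferred from the example model names in this README, so check each class's `list()` method for the authoritative names:

```python
def wrapper_for(model_name: str) -> str:
    """Suggest the wrapper class for a model name (illustrative heuristic only)."""
    cloud_prefixes = ("openai.", "google.")  # assumed naming convention
    if model_name.startswith(cloud_prefixes):
        return "ChatDartmouth"  # third-party, pay-as-you-go chat models
    if "instruct" in model_name or "chat" in model_name:
        return "ChatDartmouth"  # on-premises instruction-tuned models
    return "DartmouthLLM"       # on-premises base models
```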
Using a Dartmouth-hosted base language model:
from langchain_dartmouth.llms import DartmouthLLM
llm = DartmouthLLM(model_name="codellama-13b-hf")
response = llm.invoke("Write a Python script to swap two variables.")
print(response)
Using a Dartmouth-hosted chat model:
from langchain_dartmouth.llms import ChatDartmouth
llm = ChatDartmouth(model_name="meta.llama-3-2-11b-vision-instruct")
response = llm.invoke("Hi there!")
print(response.content)
[!NOTE] The required prompt format is enforced automatically when you are using
ChatDartmouth.
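To see what "prompt format" means here: instruction-tuned models expect special role tokens wrapped around each message. The sketch below approximates the Llama 3 chat template purely for illustration; `ChatDartmouth` applies the real template for you, so you never build these strings by hand:

```python
def to_llama3_prompt(messages: list[tuple[str, str]]) -> str:
    """Roughly the Llama 3 chat format (illustrative; not the library's code)."""
    parts = ["<|begin_of_text|>"]
    for role, content in messages:
        parts.append(
            f"<|start_header_id|>{role}<|end_header_id|>\n\n{content}<|eot_id|>"
        )
    # Leave an open assistant header so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)
```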
Using a Dartmouth-provided third-party chat model:
from langchain_dartmouth.llms import ChatDartmouth
llm = ChatDartmouth(model_name="openai.gpt-4.1-mini-2025-04-14")
response = llm.invoke("Hi there!")
print(response.content)
Embeddings model
Using a Dartmouth-hosted embeddings model:
from langchain_dartmouth.embeddings import DartmouthEmbeddings
embeddings = DartmouthEmbeddings()
response = embeddings.embed_query("Hello? Is there anybody in there?")
print(response)
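`embed_query` returns a vector of floats, and a common next step is comparing two such vectors by cosine similarity. A stdlib-only sketch (this helper is not part of `langchain_dartmouth`):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```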
Reranking
Using a Dartmouth-hosted reranking model:
from langchain_dartmouth.retrievers.document_compressors import DartmouthReranker
from langchain.docstore.document import Document
docs = [
Document(page_content="Deep Learning is not..."),
Document(page_content="Deep learning is..."),
]
query = "What is Deep Learning?"
reranker = DartmouthReranker(model_name="bge-reranker-v2-m3")
ranked_docs = reranker.compress_documents(query=query, documents=docs)
print(ranked_docs)
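Conceptually, a reranker scores each document against the query and returns the documents in descending relevance order. The toy function below uses simple term overlap purely to illustrate that input/output shape; the real bge-reranker-v2-m3 is a cross-encoder that scores semantic relevance, not word matches:

```python
def rerank(query: str, docs: list[str]) -> list[str]:
    """Toy stand-in for a reranker: order documents by query-term overlap."""
    query_terms = set(query.lower().split())

    def score(doc: str) -> int:
        return len(query_terms & set(doc.lower().split()))

    # sorted() is stable, so equally scored documents keep their input order.
    return sorted(docs, key=score, reverse=True)
```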
Available models
For a list of available models, check the respective list() method of each class.
How to cite
If you are using langchain_dartmouth as part of a scientific publication, we would greatly appreciate a citation of the following paper:
@inproceedings{10.1145/3708035.3736076,
author = {Stone, Simon and Crossett, Jonathan and Luker, Tivon and Leligdon, Lora and Cowen, William and Darabos, Christian},
title = {Dartmouth Chat - Deploying an Open-Source LLM Stack at Scale},
year = {2025},
isbn = {9798400713989},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
doi = {10.1145/3708035.3736076},
booktitle = {Practice and Experience in Advanced Research Computing 2025: The Power of Collaboration},
articleno = {43},
numpages = {5}
}
License
Created by Simon Stone for Dartmouth College under Creative Commons CC BY-NC 4.0 License. For questions, comments, or improvements, email Research Computing.
Except where otherwise noted, the example programs are made available under the OSI-approved MIT license.