langchain-ollama

An integration package connecting Ollama and LangChain.

This package contains the LangChain integration with Ollama.
Installation
pip install -U langchain-ollama
You will also need to run the Ollama server locally. You can download it from the Ollama website (https://ollama.com).
Chat Models
The ChatOllama class exposes chat models from Ollama.
from langchain_ollama import ChatOllama
llm = ChatOllama(model="llama3-groq-tool-use")
llm.invoke("Sing a ballad of LangChain.")
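Under the hood, ChatOllama talks to the local Ollama server's REST API (the /api/chat endpoint, on port 11434 by default). As a rough sketch of the request it issues — the helper function below is illustrative, not part of langchain-ollama, though the payload shape follows Ollama's documented API:

```python
import json

def build_chat_request(model: str, messages: list[dict]) -> str:
    """Build the JSON body Ollama's /api/chat endpoint expects.

    Each message is a dict with "role" ("system", "user", or "assistant")
    and "content" keys; stream=False asks for one complete response
    instead of incremental chunks.
    """
    return json.dumps({"model": model, "messages": messages, "stream": False})

body = build_chat_request(
    "llama3-groq-tool-use",
    [{"role": "user", "content": "Sing a ballad of LangChain."}],
)
print(body)
```

ChatOllama handles this request/response cycle for you, including converting LangChain message objects to and from this wire format.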
Embeddings
The OllamaEmbeddings class exposes embeddings from Ollama.
from langchain_ollama import OllamaEmbeddings
embeddings = OllamaEmbeddings(model="llama3")
embeddings.embed_query("What is the meaning of life?")
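embed_query returns a plain list of floats, so you can compare texts by comparing their vectors directly. A minimal cosine-similarity helper (this function is our own sketch, not part of langchain-ollama):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# With a running Ollama server you would compare real embeddings, e.g.:
# v1 = embeddings.embed_query("What is the meaning of life?")
# v2 = embeddings.embed_query("What is the purpose of existence?")
# cosine_similarity(v1, v2)  # closer to 1.0 for semantically similar texts

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # identical vectors -> 1.0
```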
LLMs
The OllamaLLM class exposes LLMs from Ollama.
from langchain_ollama import OllamaLLM
llm = OllamaLLM(model="llama3")
llm.invoke("The meaning of life is")
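OllamaLLM is a plain text-completion interface: it takes a string prompt and returns a string, whereas ChatOllama takes role-tagged messages. If you have chat-style history but want the completion interface, one option is to flatten the messages into a single prompt string; a hedged sketch (the formatting convention below is our own choice, not something the library prescribes):

```python
def flatten_messages(messages: list[tuple[str, str]]) -> str:
    """Join (role, content) pairs into one completion-style prompt."""
    lines = [f"{role.capitalize()}: {content}" for role, content in messages]
    lines.append("Assistant:")  # cue the model to continue as the assistant
    return "\n".join(lines)

prompt = flatten_messages([
    ("system", "You are a concise assistant."),
    ("user", "The meaning of life is"),
])
print(prompt)
# The flattened prompt can then be passed to llm.invoke(prompt).
```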