# langchain-ollama

An integration package connecting Ollama and GigaChain.

This package contains the LangChain integration with Ollama.
## Installation

```shell
pip install -U langchain-ollama
```

You will also need to run the Ollama server locally. You can download it from the Ollama website.
## Chat Models

The `ChatOllama` class exposes chat models from Ollama.

```python
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3-groq-tool-use")
llm.invoke("Sing a ballad of LangChain.")
```
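Chat models also accept a list of `(role, content)` tuples for multi-turn input. A minimal sketch, assuming a running Ollama server with the model already pulled (the `invoke` call is commented out for that reason):

```python
# Build a multi-turn message list; each entry is a (role, content) tuple.
messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "I love programming."),
]

# With a running Ollama server (assumption), you could then do:
# from langchain_ollama import ChatOllama
# llm = ChatOllama(model="llama3-groq-tool-use")
# response = llm.invoke(messages)
# print(response.content)
```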
## Embeddings

The `OllamaEmbeddings` class exposes embeddings from Ollama.

```python
from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3")
embeddings.embed_query("What is the meaning of life?")
```
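Embedding vectors are typically compared with cosine similarity. A small helper sketch in pure Python; the `embed_documents` calls are commented out because they need a live Ollama server:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# With a running Ollama server (assumption):
# from langchain_ollama import OllamaEmbeddings
# embeddings = OllamaEmbeddings(model="llama3")
# v1, v2 = embeddings.embed_documents(["I like cats.", "I like dogs."])
# print(cosine_similarity(v1, v2))
```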
## LLMs

The `OllamaLLM` class exposes LLMs from Ollama.

```python
from langchain_ollama import OllamaLLM

llm = OllamaLLM(model="llama3")
llm.invoke("The meaning of life is")
```
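Prompts are usually templated before being sent to the model. A minimal sketch using a plain format string (`build_prompt` is a hypothetical helper; LangChain also provides `PromptTemplate` for this). The completion call is commented out because it needs a live server:

```python
def build_prompt(topic: str) -> str:
    # Hypothetical helper: fill a fixed template with the user's topic.
    return f"Write one sentence about {topic}."

prompt = build_prompt("the meaning of life")

# With a running Ollama server (assumption):
# from langchain_ollama import OllamaLLM
# llm = OllamaLLM(model="llama3")
# print(llm.invoke(prompt))
```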