VM-X SDK for Python Langchain
Description
VM-X AI SDK client for Python Langchain
Installation
pip install vm-x-ai-langchain
poetry add vm-x-ai-langchain
Usage
Non-Streaming
from vmxai_langchain import ChatVMX

llm = ChatVMX(
    resource="default",
)

messages = [
    (
        "system",
        "You are a helpful translator. Translate the user sentence to French.",
    ),
    ("human", "I love programming."),
]

result = llm.invoke(messages)
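The call above returns a chat message; assuming ChatVMX follows the standard LangChain chat-model interface (as these examples suggest), the translated text is available on the message's content attribute:

# `result` is expected to be a LangChain AIMessage; its text is in `.content`.
print(result.content)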
Streaming
from vmxai_langchain import ChatVMX

llm = ChatVMX(
    resource="default",
)

messages = [
    (
        "system",
        "You are a helpful translator. Translate the user sentence to French.",
    ),
    ("human", "I love programming."),
]

for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)
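If you also need the complete message after streaming, LangChain message chunks can be accumulated with +. This is a minimal sketch that assumes ChatVMX yields standard AIMessageChunk objects, as the example above implies:

full = None
for chunk in llm.stream(messages):
    # Message chunks support `+`, so they can be merged as they arrive.
    full = chunk if full is None else full + chunk

print(full.content)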
Function Calling
Decorator
from langchain_core.messages import HumanMessage, ToolMessage
from langchain_core.tools import tool

from vmxai_langchain import ChatVMX


@tool
def add(a: int, b: int) -> int:
    """Adds a and b.

    Args:
        a: first int
        b: second int
    """
    return a + b


@tool
def multiply(a: int, b: int) -> int:
    """Multiplies a and b.

    Args:
        a: first int
        b: second int
    """
    return a * b


tools = [add, multiply]

llm = ChatVMX(
    resource="default",
)

llm_with_tools = llm.bind_tools(tools)

query = "What is 3 * 12? Also, what is 11 + 49?"

messages = [HumanMessage(query)]

ai_msg = llm_with_tools.invoke(messages)
messages.append(ai_msg)

for tool_call in ai_msg.tool_calls:
    selected_tool = {"add": add, "multiply": multiply}[tool_call["name"].lower()]
    tool_output = selected_tool.invoke(tool_call["args"])
    messages.append(ToolMessage(tool_output, tool_call_id=tool_call["id"]))

print(llm_with_tools.invoke(messages))
Pydantic
from langchain_core.pydantic_v1 import BaseModel, Field

from vmxai_langchain import ChatVMX
from vmxai_langchain.output_parsers.tools import PydanticToolsParser


# Note that the docstrings here are crucial, as they will be passed along
# to the model along with the class name.
class add(BaseModel):
    """Add two integers together."""

    a: int = Field(..., description="First integer")
    b: int = Field(..., description="Second integer")


class multiply(BaseModel):
    """Multiply two integers together."""

    a: int = Field(..., description="First integer")
    b: int = Field(..., description="Second integer")


tools = [add, multiply]

llm = ChatVMX(
    resource="default",
)

llm_with_tools = llm.bind_tools(tools) | PydanticToolsParser(tools=[multiply, add])

query = "What is 3 * 12? Also, what is 11 + 49?"

print(llm_with_tools.invoke(query))
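Assuming PydanticToolsParser behaves like its standard LangChain counterpart, the chain returns a list of populated schema instances rather than raw messages, which you can inspect directly:

results = llm_with_tools.invoke(query)
for call in results:
    # Each item is a pydantic object such as multiply(a=3, b=12).
    print(type(call).__name__, call.dict())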
Function Calling Streaming
from langchain_core.pydantic_v1 import BaseModel, Field

from vmxai_langchain import ChatVMX
from vmxai_langchain.output_parsers.tools import PydanticToolsParser


# Note that the docstrings here are crucial, as they will be passed along
# to the model along with the class name.
class add(BaseModel):
    """Add two integers together."""

    a: int = Field(..., description="First integer")
    b: int = Field(..., description="Second integer")


class multiply(BaseModel):
    """Multiply two integers together."""

    a: int = Field(..., description="First integer")
    b: int = Field(..., description="Second integer")


tools = [add, multiply]

llm = ChatVMX(
    resource="default",
)

llm_with_tools = llm.bind_tools(tools) | PydanticToolsParser(tools=[multiply, add])

query = "What is 3 * 12? Also, what is 11 + 49?"

for chunk in llm_with_tools.stream(query):
    print(chunk)
Structured Output
from langchain_core.pydantic_v1 import BaseModel, Field

from vmxai_langchain import ChatVMX


class Joke(BaseModel):
    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")


llm = ChatVMX(resource="default")
structured_llm = llm.with_structured_output(Joke, strict=True)

print(structured_llm.invoke("Tell me a joke about cats"))
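Because structured_llm is a regular LangChain runnable, it composes with other LCEL components. The sketch below pairs it with a ChatPromptTemplate from langchain_core; the prompt text is illustrative and not part of the SDK:

from langchain_core.prompts import ChatPromptTemplate

# Hypothetical prompt; any ChatPromptTemplate composes the same way.
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a comedian who tells short, clean jokes."),
        ("human", "Tell me a joke about {topic}"),
    ]
)

chain = prompt | structured_llm
print(chain.invoke({"topic": "cats"}))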
Limitations
- Async client is not supported (a thread-based workaround is sketched below).
- json_mode and json_schema structured output are not supported.
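Until an async client is available, one workaround in async applications is to run the synchronous client in a worker thread. This is a sketch using only the standard library, not an SDK feature:

import asyncio

from vmxai_langchain import ChatVMX

llm = ChatVMX(resource="default")


async def translate() -> str:
    # Offload the blocking call so the event loop is not blocked.
    result = await asyncio.to_thread(llm.invoke, [("human", "I love programming.")])
    return result.content


print(asyncio.run(translate()))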
Change Log