
LlamaIndex Llms Integration: ZhipuAI

Installation

%pip install llama-index-llms-zhipuai
%pip install llama-index
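A ZhipuAI API key is required for every call. The snippets below hard-code it for brevity; a common pattern is to read it from an environment variable instead (the name `ZHIPUAI_API_KEY` here is an assumed convention for illustration, not a default the library looks up):

```python
import os

# Assumed convention: keep the key out of source code by reading it from
# an environment variable. ZHIPUAI_API_KEY is our own chosen name, not
# something the library requires; fall back to a placeholder otherwise.
api_key = os.environ.get("ZHIPUAI_API_KEY", "YOUR_API_KEY")
```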

Basic usage

# Import ZhipuAI
from llama_index.llms.zhipuai import ZhipuAI

# Set your API key
api_key = "Your API KEY"

# Call complete function
response = ZhipuAI(model="glm-4", api_key=api_key).complete("who are you")
print(response)

# Output
# I am an AI assistant named ZhiPuQingYan(智谱清言), you can call me Xiaozhi🤖, which is developed based on the language model jointly trained by Tsinghua University KEG Lab and Zhipu AI Company in 2023. My job is to provide appropriate answers and support to users' questions and requests.

# Call complete with stop
response = ZhipuAI(model="glm-4", api_key=api_key).complete(
    prompt="who are you", stop=["Zhipu"]
)
print(response)

# Output
# I am an AI assistant named ZhiPuQingYan(智谱清言), you can call me Xiaozhi🤖, which is developed based on the language model jointly trained by Tsinghua University KEG Lab and Zhipu

# Call chat with a list of messages
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(role="user", content="who are you"),
]

response = ZhipuAI(model="glm-4", api_key=api_key).chat(messages)
print(response)

# Output
# assistant: I am an AI assistant named ZhiPuQingYan(智谱清言), you can call me Xiaozhi🤖, which is developed based on the language model jointly trained by Tsinghua University KEG Lab and Zhipu AI Company in 2023. My job is to provide appropriate answers and support to users' questions and requests.

Streaming: using the stream endpoints

from llama_index.llms.zhipuai import ZhipuAI

llm = ZhipuAI(model="glm-4", api_key=api_key)

# Using stream_complete endpoint
response = llm.stream_complete("who are you")
for r in response:
    print(r.delta, end="")

# Using stream_chat endpoint
messages = [
    ChatMessage(role="user", content="who are you"),
]

response = llm.stream_chat(messages)
for r in response:
    print(r.delta, end="")
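Each chunk yielded by the stream endpoints carries a `delta` holding only the newly generated text, which is why the loops above print `r.delta` with `end=""`. As a minimal sketch (simulated chunks, no API call) of how the deltas recompose into the full response:

```python
# Minimal sketch: how streaming deltas compose into the full text.
# Chunk is a stand-in for the response chunks yielded by
# stream_complete / stream_chat; no API call is made here.
from dataclasses import dataclass


@dataclass
class Chunk:
    delta: str  # newly generated text in this chunk
    text: str   # full text accumulated so far


def fake_stream(full_text, size=4):
    """Yield chunks the way a streaming endpoint does."""
    acc = ""
    for i in range(0, len(full_text), size):
        delta = full_text[i : i + size]
        acc += delta
        yield Chunk(delta=delta, text=acc)


# Concatenating the deltas reproduces the complete response.
parts = []
for chunk in fake_stream("I am an AI assistant."):
    parts.append(chunk.delta)

assert "".join(parts) == "I am an AI assistant."
```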

Function Calling

from llama_index.llms.zhipuai import ZhipuAI

llm = ZhipuAI(model="glm-4", api_key="YOUR API KEY")
tools = [
    {
        "type": "function",
        "function": {
            "name": "query_weather",
            "description": "Query the weather of the city provided by user",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "City to query",
                    },
                },
                "required": ["city"],
            },
        },
    }
]
response = llm.complete(
    "help me to find the weather in Shanghai",
    tools=tools,
    tool_choice="auto",
)
print(llm.get_tool_calls_from_response(response))

# Output
# [ToolSelection(tool_id='call_9097928240216277928', tool_name='query_weather', tool_kwargs={'city': 'Shanghai'})]
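`get_tool_calls_from_response` only reports which tool the model selected and with which arguments; actually running the tool is left to the caller. A minimal sketch of dispatching such selections to local functions (`ToolCall` below mirrors the `ToolSelection` fields shown in the output above, and `query_weather` is a hypothetical local stub, not part of the library):

```python
# Hypothetical dispatcher: map tool names from the model's tool calls
# to local Python functions. ToolCall mirrors the fields of the
# ToolSelection objects shown above; no API call is made here.
from dataclasses import dataclass, field


@dataclass
class ToolCall:
    tool_id: str
    tool_name: str
    tool_kwargs: dict = field(default_factory=dict)


def query_weather(city: str) -> str:
    # Stub standing in for a real weather lookup.
    return f"Sunny in {city}"


TOOLS = {"query_weather": query_weather}


def dispatch(tool_calls):
    """Run each selected tool and collect results keyed by tool_id."""
    results = {}
    for call in tool_calls:
        fn = TOOLS[call.tool_name]
        results[call.tool_id] = fn(**call.tool_kwargs)
    return results


calls = [ToolCall("call_1", "query_weather", {"city": "Shanghai"})]
print(dispatch(calls))  # → {'call_1': 'Sunny in Shanghai'}
```

The results can then be sent back to the model as follow-up messages if a multi-turn tool loop is needed.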

ZhipuAI Documentation

Usage: https://bigmodel.cn/dev/howuse/introduction

API: https://bigmodel.cn/dev/api/normal-model/glm-4
