
LlamaIndex Llms Integration: NovitaAI

Installation

%pip install llama-index-llms-novita

Select Model

Large Language Models: https://novita.ai/llm-api?utm_source=github_llama_index&utm_medium=github_readme&utm_campaign=link

Basic usage

# Import NovitaAI
from llama_index.llms.novita import NovitaAI

# Set your API key
api_key = "Your API KEY"

# Call complete function
response = NovitaAI(
    model="meta-llama/llama-3.1-8b-instruct", api_key=api_key
).complete("who are you")
print(response)

# Call complete with stop
response = NovitaAI(
    model="meta-llama/llama-3.1-8b-instruct", api_key=api_key
).complete(prompt="who are you", stop=["AI"])
print(response)

# Call chat with a list of messages
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(role="user", content="who are you"),
]

response = NovitaAI(
    model="meta-llama/llama-3.1-8b-instruct", api_key=api_key
).chat(messages)
print(response)
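The snippets above hardcode the key for brevity. A safer pattern is to read it from an environment variable; the helper below is a minimal sketch, and `NOVITA_API_KEY` is just a name chosen for this example — the client itself only sees whatever string you pass as `api_key`:

```python
import os


# Sketch: read the key from an environment variable instead of hardcoding it.
# NOVITA_API_KEY is a name chosen for this example, not one the client reads
# automatically; pass the returned string as the api_key argument.
def get_novita_api_key() -> str:
    key = os.environ.get("NOVITA_API_KEY")
    if not key:
        raise RuntimeError("NOVITA_API_KEY is not set")
    return key
```

You would then construct the client as `NovitaAI(model="meta-llama/llama-3.1-8b-instruct", api_key=get_novita_api_key())`.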

Streaming

from llama_index.core.llms import ChatMessage
from llama_index.llms.novita import NovitaAI

# Set your API key
api_key = "Your API KEY"

llm = NovitaAI(model="meta-llama/llama-3.1-8b-instruct", api_key=api_key)

# Using stream_complete endpoint
response = llm.stream_complete("who are you")
for r in response:
    print(r.delta, end="")

# Using stream_chat endpoint
messages = [
    ChatMessage(role="user", content="who are you"),
]

response = llm.stream_chat(messages)
for r in response:
    print(r.delta, end="")
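In both streaming loops, each `r.delta` is an incremental chunk; concatenating the chunks yields the full response text. A minimal sketch with stand-in chunks (collecting real ones requires a live API key):

```python
def accumulate_deltas(deltas) -> str:
    """Join streamed delta chunks into the complete response text."""
    return "".join(deltas)


# Stand-in chunks; in practice, collect r.delta inside the streaming loop.
chunks = ["I am ", "a helpful ", "assistant."]
full_text = accumulate_deltas(chunks)
print(full_text)  # I am a helpful assistant.
```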

Function Calling

from datetime import datetime
from llama_index.core.tools import FunctionTool
from llama_index.llms.novita import NovitaAI

# Set your API key
api_key = "Your API KEY"


def get_current_time() -> dict:
    """Get the current time"""
    return {"time": datetime.now().strftime("%Y-%m-%d %H:%M:%S")}


llm = NovitaAI(model="deepseek/deepseek_v3", api_key=api_key)
tool = FunctionTool.from_defaults(fn=get_current_time)
response = llm.predict_and_call([tool], "What is the current time?")
print(response)
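Before wiring a tool to the LLM, it helps to exercise the function locally and confirm the payload shape. The check below assumes only the same `get_current_time` defined above:

```python
import re
from datetime import datetime


def get_current_time() -> dict:
    """Get the current time (same tool function as above)."""
    return {"time": datetime.now().strftime("%Y-%m-%d %H:%M:%S")}


# The tool returns a dict with a "time" key in YYYY-MM-DD HH:MM:SS form.
result = get_current_time()
assert re.fullmatch(r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}", result["time"])
```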

NovitaAI Documentation

API Documentation: https://novita.ai/docs/guides/llm-api

