LlamaIndex integration for DigitalOcean Gradient AI

Project description

llama-index-llms-digitalocean-gradientai

LlamaIndex integration for DigitalOcean Gradient AI.

Installation

pip install llama-index-llms-digitalocean-gradientai

This package uses the official gradient SDK (PyPI package: gradient) under the hood; it is installed automatically as a dependency.

Usage

Basic Usage

from llama_index.llms.digitalocean.gradientai import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="openai-gpt-oss-120b",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

response = llm.complete("What is DigitalOcean Gradient?")
print(response)
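Rather than hard-coding credentials, you can read them from environment variables. A minimal sketch (the variable names below are illustrative conventions, not names the package defines):

```python
import os

def gradientai_credentials() -> tuple[str, str]:
    """Read the API key and workspace ID from the environment.

    The variable names are illustrative -- use whatever fits your deployment.
    """
    return (
        os.environ.get("DO_GRADIENTAI_API_KEY", ""),
        os.environ.get("DO_GRADIENTAI_WORKSPACE_ID", ""),
    )
```

You can then pass the returned values as `api_key=` and `workspace_id=` when constructing `DigitalOceanGradientAILLM`.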

Chat Interface

from llama_index.core.llms import ChatMessage
from llama_index.llms.digitalocean.gradientai import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="openai-gpt-oss-120b",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

messages = [
    ChatMessage(role="system", content="You are a helpful assistant."),
    ChatMessage(role="user", content="What is the capital of France?")
]

response = llm.chat(messages)
print(response.message.content)

Streaming

from llama_index.llms.digitalocean.gradientai import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="meta-llama-3-70b-instruct",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

response_gen = llm.stream_complete("Tell me a story about AI:")
for delta in response_gen:
    print(delta.delta, end="", flush=True)

Async Usage

import asyncio
from llama_index.llms.digitalocean.gradientai import DigitalOceanGradientAILLM

async def main():
    llm = DigitalOceanGradientAILLM(
        model="meta-llama-3-70b-instruct",
        api_key="your-api-key",
        workspace_id="your-workspace-id"
    )
    response = await llm.acomplete("What is Gradient?")
    print(response)

asyncio.run(main())

With RAG Pipeline

from llama_index.core import VectorStoreIndex, Document
from llama_index.llms.digitalocean.gradientai import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="meta-llama-3-70b-instruct",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

documents = [Document(text="DigitalOcean Gradient is a managed LLM API service...")]
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine(llm=llm)
response = query_engine.query("What is Gradient?")
print(response)

Package Structure

llama-index-llms-digitalocean-gradientai/
├── llama_index/
│   └── llms/
│       └── digitalocean/
│           └── gradientai/
│               ├── __init__.py
│               └── base.py
├── setup.py
├── pyproject.toml
├── README.md
└── requirements.txt

License

MIT


File details

Details for the file llama_index_llms_digitalocean_gradientai-0.1.3.tar.gz (source distribution).

Hashes:

SHA256: 87542041c1bb4e159c16140d8973c0062a3997b2c6dd984b0acc183f6aab4503
MD5: 5c2194765e8316a51c95f8973d8e42c6
BLAKE2b-256: f5a3f3d7e5e85742c3dbdbfcd97ffc690f3dea42ccdf5d0abe0cb8d67b0217df

File details

Details for the file llama_index_llms_digitalocean_gradientai-0.1.3-py3-none-any.whl (built distribution).

Hashes:

SHA256: 3091f034550d7a2b7095cee59671b5108f3eea046968d76e7e11b938fc645bf6
MD5: ece6cd304a4d97493fd82e23a68d996c
BLAKE2b-256: a485b789551475606bb7dbff3757dfc0d4581b161ff1c4bae55ce625b84a2879
