LlamaIndex integration for DigitalOcean Gradient AI

Project description

llama-index-llms-digitalocean-gradientai

LlamaIndex integration for DigitalOcean Gradient AI.

Installation

pip install llama-index-llms-digitalocean-gradientai

This package uses the official gradient SDK (PyPI package: gradient) under the hood; it is installed automatically as a dependency.
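Rather than hard-coding credentials as in the snippets below, you can read them from environment variables. The variable names here are illustrative only — this package does not define them — so treat the following as a minimal sketch:

```python
import os

def load_gradient_credentials():
    """Read credentials from the environment. The variable names are
    illustrative, not defined by this package."""
    api_key = os.environ.get("DIGITALOCEAN_GRADIENT_API_KEY", "")
    workspace_id = os.environ.get("DIGITALOCEAN_GRADIENT_WORKSPACE_ID", "")
    if not api_key or not workspace_id:
        raise RuntimeError(
            "Set DIGITALOCEAN_GRADIENT_API_KEY and "
            "DIGITALOCEAN_GRADIENT_WORKSPACE_ID before constructing the LLM."
        )
    return api_key, workspace_id
```

The returned pair can then be passed to the `api_key` and `workspace_id` constructor arguments shown in the examples below.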

Usage

Basic Usage

from llama_index.llms.digitalocean.gradientai import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="openai-gpt-oss-120b",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

response = llm.complete("What is DigitalOcean Gradient?")
print(response)

Chat Interface

from llama_index.core.llms import ChatMessage
from llama_index.llms.digitalocean.gradientai import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="openai-gpt-oss-120b",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

messages = [
    ChatMessage(role="system", content="You are a helpful assistant."),
    ChatMessage(role="user", content="What is the capital of France?")
]

response = llm.chat(messages)
print(response.message.content)
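For multi-turn conversations, you append the model's reply to the message history before the next `chat()` call. The loop can be sketched dependency-free with plain role/content dicts and a canned reply function standing in for a real `llm.chat()`:

```python
def chat_turn(history, user_text, reply_fn):
    """Append a user turn, get a reply from reply_fn (a stand-in for
    llm.chat), and record the assistant turn in the running history."""
    history.append({"role": "user", "content": user_text})
    reply = reply_fn(history)
    history.append({"role": "assistant", "content": reply})
    return reply

# Canned reply function standing in for a live llm.chat() call.
echo = lambda history: f"You said: {history[-1]['content']}"

history = [{"role": "system", "content": "You are a helpful assistant."}]
chat_turn(history, "What is the capital of France?", echo)
```

With the real class, the history would hold `ChatMessage` objects and `reply_fn` would wrap `llm.chat(messages).message.content`.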

Streaming

from llama_index.llms.digitalocean.gradientai import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="meta-llama-3-70b-instruct",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

response_gen = llm.stream_complete("Tell me a story about AI:")
for delta in response_gen:
    print(delta.delta, end="", flush=True)
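Each chunk yielded by `stream_complete()` carries only the newly generated text in its `.delta` field, so joining the deltas reproduces the full completion. A sketch of that accumulation pattern, using a stand-in generator instead of a live model:

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Delta:
    """Stand-in for a streamed chunk; real chunks expose a .delta field."""
    delta: str

def collect_stream(response_gen: Iterable[Delta]) -> str:
    """Accumulate streamed deltas into the full completion text."""
    parts = []
    for chunk in response_gen:
        parts.append(chunk.delta)
    return "".join(parts)

# Fake stream mimicking the shape of stream_complete() output.
fake_stream = (Delta(d) for d in ["Once ", "upon ", "a ", "time."])
story = collect_stream(fake_stream)
```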

Async Usage

import asyncio
from llama_index.llms.digitalocean.gradientai import DigitalOceanGradientAILLM

async def main():
    llm = DigitalOceanGradientAILLM(
        model="meta-llama-3-70b-instruct",
        api_key="your-api-key",
        workspace_id="your-workspace-id"
    )
    response = await llm.acomplete("What is Gradient?")
    print(response)

asyncio.run(main())
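Because `acomplete()` is a coroutine, several prompts can be issued concurrently with `asyncio.gather`. The pattern is sketched below with a canned coroutine in place of the real `llm.acomplete()`:

```python
import asyncio

async def fake_acomplete(prompt: str) -> str:
    """Stand-in for llm.acomplete(); returns canned text after a pause."""
    await asyncio.sleep(0.01)
    return f"answer to: {prompt}"

async def main() -> list:
    # gather() runs the coroutines concurrently, the same way you would
    # fan out several llm.acomplete() calls.
    prompts = ["What is Gradient?", "What is LlamaIndex?"]
    return await asyncio.gather(*(fake_acomplete(p) for p in prompts))

results = asyncio.run(main())
```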

With RAG Pipeline

from llama_index.core import VectorStoreIndex, Document
from llama_index.llms.digitalocean.gradientai import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="meta-llama-3-70b-instruct",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

documents = [Document(text="DigitalOcean Gradient is a managed LLM API service...")]
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine(llm=llm)
response = query_engine.query("What is Gradient?")
print(response)
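Under the hood, the query engine first retrieves the documents most relevant to the query, then passes them to the LLM as context. `VectorStoreIndex` ranks by embedding similarity; the toy sketch below uses plain word overlap, purely to illustrate the retrieve-then-prompt flow:

```python
import re

def retrieve(documents, query, top_k=1):
    """Toy lexical retriever: rank documents by words shared with the
    query. Real indexes use embedding similarity instead."""
    words = lambda text: set(re.findall(r"\w+", text.lower()))
    q = words(query)
    ranked = sorted(documents, key=lambda d: len(q & words(d)), reverse=True)
    return ranked[:top_k]

docs = [
    "DigitalOcean Gradient is a managed LLM API service.",
    "Paris is the capital of France.",
]
context = retrieve(docs, "What is Gradient?")[0]
prompt = f"Context: {context}\n\nQuestion: What is Gradient?"
```

The assembled `prompt` is roughly what the query engine sends to the configured LLM.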

Package Structure

llama-index-llms-digitalocean-gradientai/
├── llama_index/
│   └── llms/
│       └── digitalocean/
│           └── gradientai/
│               ├── __init__.py
│               └── base.py
├── setup.py
├── pyproject.toml
├── README.md
└── requirements.txt

License

MIT

File details

Details for the file llama_index_llms_digitalocean_gradientai-0.1.4.tar.gz.

File metadata

File hashes

Hashes for llama_index_llms_digitalocean_gradientai-0.1.4.tar.gz
Algorithm Hash digest
SHA256 0eaf9d5c3ebd3245ecfb20faadb47a70dd6c4e40e53e9ec0f5e1bd3d4ad5aa02
MD5 c519013f6b73018f0093c5869f7b8f06
BLAKE2b-256 3115e627db6e3eac647d254d3d42c8855748d7b45a448c984a728b4ccfa818a5

File details

Details for the file llama_index_llms_digitalocean_gradientai-0.1.4-py3-none-any.whl.

File metadata

File hashes

Hashes for llama_index_llms_digitalocean_gradientai-0.1.4-py3-none-any.whl
Algorithm Hash digest
SHA256 bf36e18675e7ba74affeaa5de28d999e42e8110047562e6ac4fc98fd0117a2d4
MD5 f6eba3942dee3bad49f59f06ba23cb76
BLAKE2b-256 9d610e6dc01591c68a3533a0bb6aa1ab192052142645a2874e3367f6996914eb
