
LlamaIndex integration for DigitalOcean Gradient AI


llama-index-llms-digitalocean-gradientai

LlamaIndex integration for DigitalOcean Gradient AI LLM.

Installation

pip install llama-index-llms-digitalocean-gradientai

This package uses the official gradient SDK (PyPI package: gradient) under the hood; it is installed automatically as a dependency.

Or install from source:

git clone https://github.com/yourusername/llama-index-llms-digitalocean-gradientai
cd llama-index-llms-digitalocean-gradientai
pip install -e .

Usage

Basic Usage

from llama_index.llms.gradient import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="meta-llama-3-70b-instruct",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

response = llm.complete("What is DigitalOcean Gradient?")
print(response)

Using Environment Variables

import os
from llama_index.llms.gradient import DigitalOceanGradientAILLM

os.environ["GRADIENT_API_KEY"] = "your-api-key"
os.environ["GRADIENT_WORKSPACE_ID"] = "your-workspace-id"

llm = DigitalOceanGradientAILLM(model="meta-llama-3-70b-instruct")

You can also use GRADIENT_MODEL_ACCESS_KEY (recommended) in place of GRADIENT_API_KEY.
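The precedence between these variables and an explicitly passed `api_key` is not spelled out here; a plausible reading is sketched below with a hypothetical helper (`resolve_api_key` is illustration only, not part of this package — check the source for the exact behavior):

```python
import os

def resolve_api_key(explicit=None):
    # Hypothetical illustration of the likely credential precedence:
    # explicit argument > GRADIENT_MODEL_ACCESS_KEY > GRADIENT_API_KEY.
    if explicit:
        return explicit
    key = os.environ.get("GRADIENT_MODEL_ACCESS_KEY") or os.environ.get("GRADIENT_API_KEY")
    if key is None:
        raise ValueError("no Gradient credentials found")
    return key

os.environ["GRADIENT_API_KEY"] = "legacy-key"
os.environ["GRADIENT_MODEL_ACCESS_KEY"] = "access-key"
print(resolve_api_key())  # access-key: the access key takes precedence
```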

Chat Interface

from llama_index.core.llms import ChatMessage
from llama_index.llms.gradient import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="meta-llama-3-70b-instruct",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

messages = [
    ChatMessage(role="system", content="You are a helpful assistant."),
    ChatMessage(role="user", content="What is Gradient?")
]

response = llm.chat(messages)
print(response.message.content)

Streaming

from llama_index.llms.gradient import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="meta-llama-3-70b-instruct",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

response_gen = llm.stream_complete("Tell me a story about AI:")
for chunk in response_gen:
    print(chunk.delta, end="", flush=True)
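Each streamed chunk carries only the newly generated text in its `.delta` attribute, so the full completion has to be accumulated by the caller. A minimal sketch of that accumulation, using a stub stream in place of a live call (`Chunk` and `stub_stream` are stand-ins, not package APIs):

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    # Minimal stand-in for a streamed response object,
    # which exposes the incremental text as .delta.
    delta: str

def stub_stream():
    # Stands in for llm.stream_complete(...) so the sketch runs offline.
    for piece in ("Once upon ", "a time, ", "an AI told stories."):
        yield Chunk(delta=piece)

full_text = "".join(chunk.delta for chunk in stub_stream())
print(full_text)  # Once upon a time, an AI told stories.
```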

Async Usage

import asyncio
from llama_index.llms.gradient import DigitalOceanGradientAILLM

async def main():
    llm = DigitalOceanGradientAILLM(
        model="meta-llama-3-70b-instruct",
        api_key="your-api-key",
        workspace_id="your-workspace-id"
    )
    response = await llm.acomplete("What is Gradient?")
    print(response)

asyncio.run(main())
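Because acomplete is a coroutine, it composes with asyncio.gather to issue several completions concurrently. A sketch of that pattern, with a stub client (`StubLLM`) standing in for `DigitalOceanGradientAILLM` so it runs without credentials:

```python
import asyncio

class StubLLM:
    # Stand-in for DigitalOceanGradientAILLM; returns canned text offline.
    async def acomplete(self, prompt: str) -> str:
        await asyncio.sleep(0)  # simulate network latency
        return f"answer to: {prompt}"

async def main():
    llm = StubLLM()
    prompts = ["What is Gradient?", "What is DigitalOcean?"]
    # Issue both completions concurrently; results come back in input order.
    return await asyncio.gather(*(llm.acomplete(p) for p in prompts))

results = asyncio.run(main())
print(results)
```

With the real client, only the constructor call changes; the gather pattern is the same.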

With RAG Pipeline

from llama_index.core import VectorStoreIndex, Document
from llama_index.llms.gradient import DigitalOceanGradientAILLM

llm = DigitalOceanGradientAILLM(
    model="meta-llama-3-70b-instruct",
    api_key="your-api-key",
    workspace_id="your-workspace-id"
)

documents = [Document(text="DigitalOcean Gradient is a managed LLM API service...")]
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine(llm=llm)
response = query_engine.query("What is Gradient?")
print(response)

Package Structure

llama-index-llms-digitalocean-gradientai/
├── llama_index/
│   └── llms/
│       └── gradient/
│           ├── __init__.py
│           └── base.py
├── setup.py
├── pyproject.toml
├── README.md
└── requirements.txt

License

MIT


