LlamaIndex LLMs Integration: MyMagic
Installation
To install the required package, run:

%pip install llama-index-llms-mymagic

If the core LlamaIndex package is not already installed, install it as well:

!pip install llama-index
Setup
Before you begin, set up your cloud storage bucket and grant the MyMagic API secure access to it. For detailed instructions, visit the MyMagic documentation.
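For example, here is a minimal sketch of staging a batch-input file in an S3 bucket with boto3, assuming your inputs are uploaded under the session prefix; the bucket name, prefix, and file name below are placeholders, and access for the MyMagic API itself is configured as described in its documentation:

import boto3

# Placeholder values: replace with your own bucket and session (prefix) names.
bucket_name = "your-bucket-name"
session = "your-session-name"  # the directory (prefix) used for batch inference

s3 = boto3.client("s3")

# Upload a local input file so it is available under the session prefix.
s3.upload_file(
    Filename="inputs.json",
    Bucket=bucket_name,
    Key=f"{session}/inputs.json",
)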
Initialize MyMagicAI
Create an instance of MyMagicAI by providing your API key and storage configuration:
from llama_index.llms.mymagic import MyMagicAI

llm = MyMagicAI(
    api_key="your-api-key",
    storage_provider="s3",  # Options: 's3' or 'gcs'
    bucket_name="your-bucket-name",
    session="your-session-name",  # Directory for batch inference
    role_arn="your-role-arn",
    system_prompt="your-system-prompt",
    region="your-bucket-region",
    return_output=False,  # Set to True to return output JSON
    input_json_file=None,  # Input file stored on the bucket
    list_inputs=None,  # List of inputs for small batch
    structured_output=None,  # JSON schema of the output
)
Note: If return_output is set to True, max_tokens should be at least 100.
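As an illustration, here is a hedged sketch of configuring a small in-memory batch with a structured-output schema; the inputs, system prompt, and schema below are hypothetical placeholders, not values required by the API:

# Hypothetical illustration: all values below are placeholders.
llm_batch = MyMagicAI(
    api_key="your-api-key",
    storage_provider="s3",
    bucket_name="your-bucket-name",
    session="your-session-name",
    role_arn="your-role-arn",
    system_prompt="Answer concisely.",
    region="your-bucket-region",
    return_output=True,  # Return the output JSON directly (remember: max_tokens >= 100)
    list_inputs=["What is 2 + 2?", "Name the capital of France."],  # Small in-memory batch
    structured_output={  # JSON schema describing the desired output shape
        "type": "object",
        "properties": {"answer": {"type": "string"}},
        "required": ["answer"],
    },
)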
Generate Completions
To generate a text completion for a question, use the complete method:

resp = llm.complete(
    question="your-question",
    model="choose-model",  # Supported models: mistral7b, llama7b, mixtral8x7b, codellama70b, llama70b, etc.
    max_tokens=5,  # Number of tokens to generate (default is 10)
)
print(resp)  # The response indicates whether the final output is stored in your bucket; an exception is raised if the job failed
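When return_output is False, the generated output is written to your bucket rather than returned. Here is a hedged sketch of locating it with boto3, assuming the results land under the session prefix; the exact key layout depends on your configuration:

import boto3

s3 = boto3.client("s3")

# List objects under the session prefix to find the stored output.
listing = s3.list_objects_v2(
    Bucket="your-bucket-name",
    Prefix="your-session-name/",
)
for obj in listing.get("Contents", []):
    print(obj["Key"], obj["Size"])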
Asynchronous Requests
For asynchronous operations, use the acomplete endpoint:

import asyncio

async def main():
    response = await llm.acomplete(
        question="your-question",
        model="choose-model",  # Supported models listed in the documentation
        max_tokens=5,  # Number of tokens to generate (default is 10)
    )
    print("Async completion response:", response)

asyncio.run(main())  # In a notebook, you can `await main()` directly instead
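Because acomplete is awaitable, multiple questions can be submitted concurrently, for example with asyncio.gather; the questions and model name below are placeholders:

import asyncio

async def run_batch(questions):
    # Issue all requests concurrently and wait for every response.
    tasks = [
        llm.acomplete(question=q, model="choose-model", max_tokens=10)
        for q in questions
    ]
    return await asyncio.gather(*tasks)

responses = asyncio.run(run_batch(["question-1", "question-2", "question-3"]))
for response in responses:
    print(response)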
LLM Implementation example
Download files
File details
Details for the file llama_index_llms_mymagic-0.3.0.tar.gz.
File metadata
- Download URL: llama_index_llms_mymagic-0.3.0.tar.gz
- Upload date:
- Size: 4.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.11.10 Darwin/22.3.0
File hashes
Algorithm | Hash digest
--- | ---
SHA256 | 4daff3b54514bd7ef82bfa4ca4687ea53dec6088f109697dd81af5a27239a3c8
MD5 | 50f567c8ec50273d23318032de4827b2
BLAKE2b-256 | 7dc5ba5578960bde5a6f3871f85f535af5ab09d6a070044b587f1eb49348bc0a
File details
Details for the file llama_index_llms_mymagic-0.3.0-py3-none-any.whl.
File metadata
- Download URL: llama_index_llms_mymagic-0.3.0-py3-none-any.whl
- Upload date:
- Size: 4.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.11.10 Darwin/22.3.0
File hashes
Algorithm | Hash digest
--- | ---
SHA256 | 4d0b57eb2d2d18d900bf15fc9e68bae6c7a69591c769e5227f81b9c723d3f3cf
MD5 | 3e9a13e4c386851334128b96a60c30d2
BLAKE2b-256 | c313b5749d4750c5fd4c95a5c317ce4d2574554764d29180a224329fb0892da7