
LlamaIndex LLMs Integration: DashScope

Installation

  1. Install the required Python package:

    pip install llama-index-llms-dashscope
    
  2. Set the DashScope API key as an environment variable:

    export DASHSCOPE_API_KEY=YOUR_DASHSCOPE_API_KEY
    

    Alternatively, you can set it in your Python script:

    import os
    
    os.environ["DASHSCOPE_API_KEY"] = "YOUR_DASHSCOPE_API_KEY"
    

Usage

Basic Recipe Generation

To generate a basic cake recipe with a single completion call:

from llama_index.llms.dashscope import DashScope, DashScopeGenerationModels

# Initialize DashScope object
dashscope_llm = DashScope(model_name=DashScopeGenerationModels.QWEN_MAX)

# Generate a cake recipe
resp = dashscope_llm.complete("How to make cake?")
print(resp)
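
The constructor also accepts optional configuration. The exact keyword arguments can vary by version, so the sketch below assumes the commonly available api_key, temperature, and max_tokens parameters; treat them as assumptions rather than a guaranteed API:

# A minimal configuration sketch; the api_key argument is an alternative
# to setting the DASHSCOPE_API_KEY environment variable.
dashscope_llm = DashScope(
    model_name=DashScopeGenerationModels.QWEN_MAX,
    api_key="YOUR_DASHSCOPE_API_KEY",  # assumed keyword argument
    temperature=0.7,  # assumed keyword argument
    max_tokens=1024,  # assumed keyword argument
)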

Streaming Recipe Responses

To stream the response in real time as it is generated:

responses = dashscope_llm.stream_complete("How to make cake?")
for response in responses:
    print(response.delta, end="")
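
Each streamed response carries only the newly generated fragment in delta, so if you also want the complete text at the end, you can accumulate the chunks yourself; a minimal sketch using only the calls shown above:

# Accumulate streamed fragments into the full response text.
full_text = ""
for response in dashscope_llm.stream_complete("How to make cake?"):
    full_text += response.delta
print(full_text)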

Multi-Round Conversation

To hold a multi-round conversation with the assistant, first ask for a cake recipe and then follow up to request a sugar-free version:

from llama_index.core.base.llms.types import MessageRole, ChatMessage

messages = [
    ChatMessage(
        role=MessageRole.SYSTEM, content="You are a helpful assistant."
    ),
    ChatMessage(role=MessageRole.USER, content="How to make cake?"),
]

# Get first round response
resp = dashscope_llm.chat(messages)
print(resp)

# Continue conversation
messages.append(
    ChatMessage(role=MessageRole.ASSISTANT, content=resp.message.content)
)
messages.append(
    ChatMessage(role=MessageRole.USER, content="How to make it without sugar?")
)

# Get second round response
resp = dashscope_llm.chat(messages)
print(resp)
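
Chat responses can be streamed as well. The sketch below assumes stream_chat is available; it is part of the standard LlamaIndex LLM interface, and each chunk exposes the newly generated text in delta:

# Stream the assistant's reply for the current message history.
responses = dashscope_llm.stream_chat(messages)
for response in responses:
    print(response.delta, end="")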

Handling Sugar-Free Recipes

To request a sugar-free cake recipe directly in a single completion:

resp = dashscope_llm.complete("How to make cake without sugar?")
print(resp)
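
Asynchronous variants such as acomplete are defined on the standard LlamaIndex LLM interface and may be available here as well, depending on the installed version; a hedged sketch:

import asyncio

async def main():
    # acomplete mirrors complete() asynchronously; whether this integration
    # implements it is an assumption that depends on the installed version.
    resp = await dashscope_llm.acomplete("How to make cake without sugar?")
    print(resp)

asyncio.run(main())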

LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/dashscope/
