
llama-index-llms-konko integration

Project description

LlamaIndex LLMs Integration: Konko

Installation

  1. Install the required Python packages:

    pip install llama-index-llms-konko
    pip install llama-index
    
  2. Set the API keys as environment variables:

    export KONKO_API_KEY=<your-api-key>
    export OPENAI_API_KEY=<your-api-key>
    

Usage

Import Required Libraries

import os
from llama_index.llms.konko import Konko
from llama_index.core.llms import ChatMessage

Chat with Konko Model

To chat with a Konko model:

os.environ["KONKO_API_KEY"] = "<your-api-key>"
llm = Konko(model="meta-llama/llama-2-13b-chat")
message = ChatMessage(role="user", content="Explain Big Bang Theory briefly")

resp = llm.chat([message])
print(resp)
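
The call returns a ChatResponse; print(resp) shows the full role-prefixed reply, while the text alone is available on the wrapped message. A minimal sketch, assuming the standard LlamaIndex ChatResponse interface:

# The reply text lives on the ChatMessage inside the ChatResponse
# (standard llama_index interface, shown here as an assumption rather than anything Konko-specific).
reply_text = resp.message.content
print(reply_text)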

Chat with OpenAI Model

To chat with an OpenAI model served through Konko:

os.environ["OPENAI_API_KEY"] = "<your-api-key>"
llm = Konko(model="gpt-3.5-turbo")
message = ChatMessage(role="user", content="Explain Big Bang Theory briefly")

resp = llm.chat([message])
print(resp)
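
If you are calling the model from async code, LlamaIndex LLMs also define async counterparts such as achat. A short sketch, assuming the Konko integration supports them (they may simply wrap the synchronous call):

import asyncio

async def ask():
    # achat mirrors chat but is awaitable; the message object is reused from above.
    resp = await llm.achat([message])
    print(resp.message.content)

asyncio.run(ask())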

Streaming Responses

To stream a response for longer messages:

message = ChatMessage(role="user", content="Tell me a story in 250 words")
resp = llm.stream_chat([message], max_tokens=1000)

for r in resp:
    print(r.delta, end="")
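
Each chunk carries only the newly generated text in r.delta. If you also want the full reply once streaming finishes, accumulate the deltas as you iterate; a small sketch of that pattern:

# Stream the reply and keep the complete text for later use.
full_text = ""
for r in llm.stream_chat([message], max_tokens=1000):
    print(r.delta, end="")
    full_text += r.delta
print()  # trailing newline after the stream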

Complete with Prompt

To stream a completion for a prompt that includes a system section:

llm = Konko(model="phind/phind-codellama-34b-v2", max_tokens=100)
text = """### System Prompt
You are an intelligent programming assistant.

### User Message
Implement a linked list in C++

### Assistant
..."""

resp = llm.stream_complete(text, max_tokens=1000)
for r in resp:
    print(r.delta, end="")
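
If you do not need token-by-token output, the non-streaming complete call returns the whole completion at once. A minimal sketch, assuming the standard LlamaIndex CompletionResponse with a .text attribute:

# Non-streaming completion: blocks until the model finishes, then returns everything.
resp = llm.complete(text, max_tokens=1000)
print(resp.text)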

LLM Implementation example

https://docs.llamaindex.ai/en/stable/examples/llm/konko/
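
The notebook linked above walks through the integration in more depth. As a quick illustration of using Konko inside the wider framework, here is a hedged sketch that registers it as the default LLM via llama_index.core.Settings and queries a small local index; the ./data path and the query text are placeholders, and the embedding model still defaults to OpenAI (hence the OPENAI_API_KEY set earlier):

from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.konko import Konko

# Register Konko as the default LLM used for response synthesis.
Settings.llm = Konko(model="meta-llama/llama-2-13b-chat")

# './data' is a placeholder directory of documents for this sketch.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

response = index.as_query_engine().query("Summarize these documents briefly.")
print(response)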

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llama_index_llms_konko-0.3.0.tar.gz (8.1 kB, Source)

Built Distribution

llama_index_llms_konko-0.3.0-py3-none-any.whl (8.4 kB, Python 3)

File details

Details for the file llama_index_llms_konko-0.3.0.tar.gz.

File metadata

  • Download URL: llama_index_llms_konko-0.3.0.tar.gz
  • Upload date:
  • Size: 8.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.11.10 Darwin/22.3.0

File hashes

Hashes for llama_index_llms_konko-0.3.0.tar.gz
  • SHA256: b2c3bb38ae8b8edf0b94bbbbbe8200c0feaa8fe9e988384e653e38fdb722e2f7
  • MD5: 78589f369bc243445dfca5696937c087
  • BLAKE2b-256: b4f0e0be17f4bee9abced9a8cd3ee234f391a75810891e35025c1f32cf904da1


File details

Details for the file llama_index_llms_konko-0.3.0-py3-none-any.whl.

File metadata

File hashes

Hashes for llama_index_llms_konko-0.3.0-py3-none-any.whl
  • SHA256: b4bb57ebcb9c96223acb2cc7d57d75b80800564a6678db274d83b82ddb67730d
  • MD5: fcabc36cdb72d73ed86eee11e2869c37
  • BLAKE2b-256: 0636afa4bbb0d1791acb2083db320b944e3fe60155567e3e1b67359cf5fada2a

