
LlamaIndex LLM Integration: Aleph Alpha

This README describes how to integrate Aleph Alpha's Large Language Models (LLMs) with LlamaIndex. Using Aleph Alpha's API, you can generate completions, answer questions, and perform a variety of other natural language processing tasks directly within the LlamaIndex framework.

Features

  • Text Completion: Use Aleph Alpha LLMs to generate text completions for prompts.
  • Model Selection: Access the latest Aleph Alpha models, including the Luminous model family, to generate responses.
  • Advanced Sampling Controls: Customize the response generation with parameters like temperature, top_k, top_p, presence_penalty, and more, to fine-tune the creativity and relevance of the generated text.
  • Control Parameters: Apply attention control parameters for advanced use cases, affecting how the model focuses on different parts of the input.

Installation

pip install llama-index-llms-alephalpha

Usage

from llama_index.llms.alephalpha import AlephAlpha

  1. Request Parameters:

    • model: The model name (e.g., luminous-base-control). A model name always resolves to the latest version of that model.
    • prompt: The text prompt for the model to complete.
    • maximum_tokens: The maximum number of tokens to generate.
    • temperature: Adjusts the randomness of the completions.
    • top_k: Limits the sampled tokens to the top k probabilities.
    • top_p: Limits the sampled tokens to the cumulative probability of the top tokens.
    • log_probs: Set to true to return the log probabilities of the tokens.
    • echo: Set to true to return the input prompt along with the completion.
    • penalty_exceptions: A list of tokens that should not be penalized.
    • n: Number of completions to generate.
  2. Advanced Sampling Parameters (optional):

    • presence_penalty & frequency_penalty: Adjust to discourage repetition.
    • sequence_penalty: Reduces likelihood of repeating token sequences.
    • hosting: Option to process the request in Aleph Alpha's own datacenters for enhanced data privacy.
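Putting the parameters above together, a minimal sketch of a completion request might look like the following. The constructor keyword names (`token`, `max_tokens`) and the `AA_TOKEN` environment variable are assumptions and may differ across versions of the integration; check the class signature in your installed release.

```python
import os

# Sampling settings drawn from the request-parameter list above.
# Exact keyword names are an assumption about the wrapper's signature.
params = {
    "model": "luminous-base-control",
    "temperature": 0.3,
    "top_k": 0,  # 0 disables top-k filtering
    "top_p": 0.9,
    "max_tokens": 64,
}

try:
    from llama_index.llms.alephalpha import AlephAlpha

    # AA_TOKEN is a hypothetical environment variable holding your API token.
    llm = AlephAlpha(token=os.environ["AA_TOKEN"], **params)
    print(llm.complete("Summarize photosynthesis in one sentence:").text)
except (ImportError, KeyError):
    # Package not installed or AA_TOKEN unset; skip the live call.
    pass
```

The `complete` call is LlamaIndex's standard LLM interface; the integration translates these settings into an Aleph Alpha API request behind the scenes.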

Response Structure

* `model_version`: The name and version of the model used.
* `completions`: A list containing the generated text completion(s) and optional metadata:
    * `completion`: The generated text completion.
    * `log_probs`: Log probabilities of the tokens in the completion.
    * `raw_completion`: The raw completion without any post-processing.
    * `completion_tokens`: Completion split into tokens.
    * `finish_reason`: Reason for completion termination.
* `num_tokens_prompt_total`: Total number of tokens in the input prompt.
* `num_tokens_generated`: Number of tokens generated in the completion.
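To make the layout above concrete, here is an illustrative dict shaped like the documented response, together with code that walks it. The field names come from the list above; the values are invented for illustration only.

```python
# Hypothetical response payload mirroring the documented structure.
response = {
    "model_version": "luminous-base-control (example version)",
    "completions": [
        {
            "completion": " Photosynthesis converts light into chemical energy.",
            "finish_reason": "maximum_tokens",
        },
    ],
    "num_tokens_prompt_total": 9,
    "num_tokens_generated": 10,
}

# Walk the completions and report token usage.
for c in response["completions"]:
    print(c["completion"].strip(), "| finished because:", c["finish_reason"])

total_tokens = response["num_tokens_prompt_total"] + response["num_tokens_generated"]
print("tokens used:", total_tokens)
```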

Example

Refer to the example notebook for a comprehensive guide on generating text completions with Aleph Alpha models in LlamaIndex.

API Documentation

For further details on the API and available models, please consult Aleph Alpha's API Documentation.
