
SDK for using LLMs

Project description

SFN_LLM_Client

This is an enhanced version of the original SDK, updated with the latest LLM provider chat-completion features. The sfn_llm_client package now includes:

  • Updated to the latest version of the OpenAI library.
  • Integrated Cortex LLM provider support.
  • General codebase improvements for better performance and compatibility.

Features

  • Supports multiple LLM providers, including OpenAI and Cortex.
  • Easily extensible to include new LLM providers by implementing base client classes.
  • Well-documented and tested.
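The multi-provider support rests on all clients sharing one interface. The sketch below illustrates that idea only; it is not the actual sfn_llm_client API. ChatClient, EchoOpenAIClient, and EchoCortexClient are hypothetical stand-ins showing how two providers can be swapped behind a single chat_completion method.

```python
from abc import ABC, abstractmethod


class ChatClient(ABC):
    """Minimal stand-in for a shared LLM client interface."""

    @abstractmethod
    def chat_completion(self, messages: list) -> str:
        ...


class EchoOpenAIClient(ChatClient):
    """Toy 'OpenAI' client: echoes the last user message."""

    def chat_completion(self, messages: list) -> str:
        return "openai: " + messages[-1]["content"]


class EchoCortexClient(ChatClient):
    """Toy 'Cortex' client: echoes the last user message."""

    def chat_completion(self, messages: list) -> str:
        return "cortex: " + messages[-1]["content"]


def ask(client: ChatClient, prompt: str) -> str:
    # Caller code depends only on the interface, not on the provider.
    return client.chat_completion([{"role": "user", "content": prompt}])
```

Because `ask` depends only on the interface, switching providers is a one-line change at the call site.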

Adding a New LLM Client

To add a new LLM client, follow these steps:

  1. Implement BaseLLMClient or BaseLLMAPIClient:
    If you're adding a new LLM provider, you'll need to implement either the BaseLLMClient or the BaseLLMAPIClient interface.

  2. Register in LLMAPIClientFactory:
    If you're adding a client based on BaseLLMAPIClient, don't forget to register it in the LLMAPIClientFactory so that it's available for use.
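The two steps above can be sketched as follows. This is a simplified, self-contained illustration under assumed names: the real BaseLLMAPIClient and LLMAPIClientFactory in sfn_llm_client will differ in method signatures and registration details, and MyProviderClient is a hypothetical example provider.

```python
from abc import ABC, abstractmethod
from typing import Callable, Dict


class BaseLLMAPIClient(ABC):
    """Simplified stand-in for the SDK's API-client base class."""

    @abstractmethod
    def chat_completion(self, prompt: str) -> str:
        ...


class LLMAPIClientFactory:
    """Toy factory that hands out clients by provider name."""

    _registry: Dict[str, Callable[[], BaseLLMAPIClient]] = {}

    @classmethod
    def register(cls, name: str, builder: Callable[[], BaseLLMAPIClient]) -> None:
        cls._registry[name] = builder

    @classmethod
    def get_llm_api_client(cls, name: str) -> BaseLLMAPIClient:
        return cls._registry[name]()


# Step 1: implement the base class for the new provider.
class MyProviderClient(BaseLLMAPIClient):
    def chat_completion(self, prompt: str) -> str:
        return f"my_provider says: {prompt}"


# Step 2: register it so the factory can construct it on request.
LLMAPIClientFactory.register("my_provider", MyProviderClient)
```

Once registered, callers can obtain the client from the factory by name without importing the provider class directly.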

Adding Dependencies

If your LLM client requires additional dependencies, you can add them to the pyproject.toml file under the appropriate section.
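For example, a new provider's SDK could be declared like this. `some-provider-sdk` is a placeholder, the version pins are illustrative, and the exact table name depends on whether the project uses PEP 621 metadata or Poetry's own section:

```toml
[project]
# ...existing metadata...
dependencies = [
    "openai>=1.0",             # illustrative existing dependency
    "some-provider-sdk>=0.1",  # placeholder for your new client's dependency
]
```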

Contributing

Contributions are welcome! If you'd like to help improve this SDK, please check out the todos or open an issue or pull request.

Credits

The core functionality is forked from llm-client-sdk, created by uripeled2.

Contact

For any queries or issues, please contact the maintainer at: rajesh@stepfunction.ai

