
A small example package

Project description

Lite LLM Client

This project provides a very lightweight LLM client. The main idea is to avoid depending on any vendor LLM client library.

Setup

How to pass an API key

  1. Use the api_key parameter of LLMConfig (e.g. OpenAIConfig)
from lite_llm_client import LiteLLMClient, OpenAIConfig, LLMMessage, LLMMessageRole

# Pass the API key directly through the config object.
client = LiteLLMClient(OpenAIConfig(api_key="YOUR API KEY"))
answer = client.chat_completions(messages=[LLMMessage(role=LLMMessageRole.USER, content="hello ai?")])

print(answer)
  2. Use a .env file (a minimal sketch follows the variable list below)
    • rename .env_example to .env
    • replace YOUR KEY with your real API key
OPENAI_API_KEY=YOUR KEY
ANTHROPIC_API_KEY=YOUR KEY
GEMINI_API_KEY=YOUR KEY
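
The following is a minimal sketch of the .env approach. It reads the key from the environment and passes it through the api_key parameter shown above; loading .env with python-dotenv is an assumption for illustration and is not part of lite_llm_client itself.

import os

from dotenv import load_dotenv  # assumption: python-dotenv is installed separately
from lite_llm_client import LiteLLMClient, OpenAIConfig, LLMMessage, LLMMessageRole

load_dotenv()  # copies the values from .env into the process environment

# Read the key set in .env and hand it to the config explicitly.
client = LiteLLMClient(OpenAIConfig(api_key=os.environ["OPENAI_API_KEY"]))
answer = client.chat_completions(messages=[LLMMessage(role=LLMMessageRole.USER, content="hello ai?")])
print(answer)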

Known issue

  • The Gemini path may not be stable. The guide code uses /v1beta/...; Gemini sometimes returns an HTTP 500 error (a retry sketch follows below).
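
Until the path issue is resolved, a simple retry can smooth over intermittent HTTP 500 responses. This is an illustrative helper only, not part of the package; it assumes chat_completions raises an exception on a failed request, which is not confirmed here.

import time

def chat_with_retry(client, messages, attempts=3, delay=2.0):
    # Retry the call a few times with a linear backoff between attempts.
    for attempt in range(1, attempts + 1):
        try:
            return client.chat_completions(messages=messages)
        except Exception:  # narrow this to the library's real error type if known
            if attempt == attempts:
                raise
            time.sleep(delay * attempt)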

Roadmap

Future version

  • support multimodal (image and text)

0.1.0

  • 2024-07-21 support OpenAI
  • 2024-07-25 support Anthropic
  • 2024-07-27 add options for inference
  • 2024-07-28 support Gemini
  • 2024-07-30 support streaming (OpenAI). simple SSE implementation (see the sketch after this list).
  • 2024-07-31 support streaming (Anthropic).
  • 2024-08-01 support streaming (Gemini). Google Gemini remains unstable.
  • 2024-08-13 support inference results (token count, stop reason)
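
The streaming entries above refer to parsing the providers' server-sent events (SSE) directly over HTTP instead of pulling in a vendor SDK. As an illustration of that idea only (not this package's actual code), an OpenAI chat-completions SSE stream can be consumed with plain requests like this; the model name is an arbitrary example:

import json
import os
import requests

resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-4o-mini",  # example model name, not prescribed by lite_llm_client
        "messages": [{"role": "user", "content": "hello ai?"}],
        "stream": True,  # ask the server for an SSE stream
    },
    stream=True,  # let requests yield the body incrementally
)

for line in resp.iter_lines():
    if not line or not line.startswith(b"data: "):
        continue  # skip keep-alive blanks and non-data lines
    payload = line[len(b"data: "):]
    if payload == b"[DONE]":
        break  # OpenAI's end-of-stream sentinel
    chunk = json.loads(payload)
    delta = chunk["choices"][0]["delta"].get("content", "")
    print(delta, end="", flush=True)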

Reference

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lite_llm_client-0.1.2.tar.gz (8.8 kB)

Uploaded Source

Built Distribution

lite_llm_client-0.1.2-py3-none-any.whl (10.7 kB)

Uploaded Python 3

File details

Details for the file lite_llm_client-0.1.2.tar.gz.

File metadata

  • Download URL: lite_llm_client-0.1.2.tar.gz
  • Upload date:
  • Size: 8.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.12

File hashes

Hashes for lite_llm_client-0.1.2.tar.gz
Algorithm Hash digest
SHA256 43d4d91495f9ad19cbc201fad1f33176695c62f3576e43ba06ec16ed13954875
MD5 4729c49c3a39ccba192c83444c404836
BLAKE2b-256 1fcc101d3ab4cd16c2a567ae4b4f9fc4280c6c6947013144323ceeb1bc9a5b77

See more details on using hashes here.

File details

Details for the file lite_llm_client-0.1.2-py3-none-any.whl.

File metadata

File hashes

Hashes for lite_llm_client-0.1.2-py3-none-any.whl
Algorithm Hash digest
SHA256 134f15e1c7f121600f6c58c7a0732c746031dc1fb3c69eff3f403fe0de8b3d27
MD5 44a7dcb6b473dce6e99429123cf6319e
BLAKE2b-256 ed142c14fb9468911bc514443a2f24677e822ffe9335136c60c095308f9ec92b

See more details on using hashes here.
