unofficial_ffm_openai_client

An unofficial Formosa Foundation Model client implementation based on OpenAI and LangChain

Introduction

This is an unofficial Python client for the Formosa Foundation Model public endpoint, compatible with the OpenAI Python client and LangChain. It currently implements only the Conversation API against the public endpoint, and the synchronous API is not yet fully implemented.

Changelog

  • 0.2.1 - [fix] Remove an unnecessary import from chat_completion.py
  • 0.2.0
    • Add a callback for counting token consumption in streaming and non-streaming modes.
    • Add JSON error handling for sync streaming mode.
    • Include token-consumption info in the results of streaming and non-streaming modes.
  • 0.1.3 - Support function calls.
  • 0.1.2 - Support embeddings.
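
The 0.2.0 token-consumption accounting has to handle two cases differently: a non-streaming response carries a usage block with the final counts, while in streaming mode the counts typically arrive only on the final chunk. A minimal illustrative sketch of that bookkeeping (the class and method names here are hypothetical, not the package's internals):

```python
# Hypothetical sketch of token-consumption counting for streaming and
# non-streaming modes; illustrative only, not the package's actual code.
from dataclasses import dataclass


@dataclass
class TokenUsageCounter:
    prompt_tokens: int = 0
    completion_tokens: int = 0
    successful_requests: int = 0

    @property
    def total_tokens(self) -> int:
        return self.prompt_tokens + self.completion_tokens

    def on_response(self, usage: dict) -> None:
        # Non-streaming: the response includes a usage block with final counts.
        self.prompt_tokens += usage.get("prompt_tokens", 0)
        self.completion_tokens += usage.get("completion_tokens", 0)
        self.successful_requests += 1

    def on_stream_chunk(self, chunk: dict) -> None:
        # Streaming: usage typically appears only on the final chunk.
        if "usage" in chunk:
            self.on_response(chunk["usage"])


counter = TokenUsageCounter()
counter.on_response({"prompt_tokens": 12, "completion_tokens": 30})
counter.on_stream_chunk({"delta": "hi"})  # intermediate chunk, no usage yet
counter.on_stream_chunk({"usage": {"prompt_tokens": 5, "completion_tokens": 7}})
print(counter.total_tokens)  # 54
```

In actual use this accounting is done for you by the callback shown in the Callbacks section.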

Usage

Install from PyPI:

pip install unofficial-ffm-openai

You can use it much like the original ChatOpenAI, with a few different parameters:

from ffm.langchain.language_models.ffm import FfmChatOpenAI

chat_ffm = FfmChatOpenAI(
    ffm_endpoint="https://api-ams.twcc.ai/api",
    max_tokens=1000,
    temperature=0.5,
    top_k=50,
    top_p=1.0,
    frequency_penalty=1.0,
    ffm_api_key="your key",
    ffm_deployment="ffm-mistral-7b-32k-instruct",  # or other model name
    streaming=True,
    callbacks=callbacks  # your LangChain callback handlers
)
from ffm.embeddings import FFMEmbeddings

embedding = FFMEmbeddings(
    base_url="",  # your embeddings endpoint base URL
    api_key="your key")
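
A typical downstream use of the embeddings is similarity search. The sketch below compares two embedding vectors with cosine similarity; the embed_query calls (assuming FFMEmbeddings follows LangChain's standard Embeddings interface) are commented out because they need a live endpoint and key, so placeholder vectors stand in:

```python
# Cosine similarity over embedding vectors; cosine_similarity and the
# placeholder vectors are illustrative, not part of the package.
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm


# With a live endpoint you would obtain real vectors like this:
# vec_q = embedding.embed_query("What is the Formosa Foundation Model?")
# vec_d = embedding.embed_query("FFM is a family of large language models.")
vec_q = [0.1, 0.3, 0.5]  # placeholder stand-in vectors
vec_d = [0.2, 0.1, 0.6]
print(round(cosine_similarity(vec_q, vec_d), 3))  # 0.924
```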

Callbacks

You can use the FFM callback the same way as other callbacks such as the OpenAI callback. Here is an example:

from ffm.langchain.callbacks import get_ffm_callback

with get_ffm_callback() as cb:
    # ... run your chain or LLM calls here ...

    total_tokens = cb.total_tokens
    prompt_tokens = cb.prompt_tokens
    completion_tokens = cb.completion_tokens
    successful_requests = cb.successful_requests
  • Note: Cost calculation has not been implemented yet but is in progress.
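
Conceptually, a context-manager callback like get_ffm_callback can be built as a small usage tracker yielded from a contextmanager, mirroring the pattern of LangChain's get_openai_callback. A hypothetical sketch (UsageTracker and get_usage_callback are illustrative names, not the package's actual code):

```python
# Sketch of the context-manager callback pattern; illustrative only.
from contextlib import contextmanager


class UsageTracker:
    def __init__(self) -> None:
        self.total_tokens = 0
        self.prompt_tokens = 0
        self.completion_tokens = 0
        self.successful_requests = 0

    def record(self, prompt_tokens: int, completion_tokens: int) -> None:
        # Called once per completed LLM request with its token counts.
        self.prompt_tokens += prompt_tokens
        self.completion_tokens += completion_tokens
        self.total_tokens += prompt_tokens + completion_tokens
        self.successful_requests += 1


@contextmanager
def get_usage_callback():
    cb = UsageTracker()
    try:
        yield cb
    finally:
        pass  # a real handler would deregister itself from the LLM here


with get_usage_callback() as cb:
    cb.record(10, 25)  # in real use the LLM run feeds these counts
    print(cb.total_tokens, cb.successful_requests)  # 35 1
```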

Limitation

Currently, it has only been tested with the following dependencies:

langchain                         0.1.20
langchain-community               0.0.38
langchain-core                    0.1.52
langchain-openai                  0.1.7
langchain-text-splitters          0.0.2
langchainhub                      0.1.15

and the OpenAI client:

openai                            1.30.1

TODO

  • Full implementation for the synchronous API.

Download files

Download the file for your platform.

Source distribution: unofficial_ffm_openai-0.2.1.tar.gz (16.4 kB)
Built distribution: unofficial_ffm_openai-0.2.1-py3-none-any.whl (21.5 kB)
File details

Details for the file unofficial_ffm_openai-0.2.1.tar.gz.

File metadata

  • Size: 16.4 kB
  • Type: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.10.11

File hashes

Algorithm   Hash digest
SHA256      a58d28460a3711ed583f945610444bd466458cd8ecf65ed5e5f937ddaec4d9fc
MD5         4eaebc50c8b936381785747e901182dc
BLAKE2b-256 73b6f2de27603a80187f0f0ea358b129b66b13ced5a11a9967442d712bc65634

File details

Details for the file unofficial_ffm_openai-0.2.1-py3-none-any.whl.

File hashes

Algorithm   Hash digest
SHA256      7b35d126e17b2f8ce3e496308d8313bd396a2f985edb8f934adfe20ba2424bea
MD5         e0c0a6537a0550e4ac053096b3e01ee3
BLAKE2b-256 717e34847b4504deb0e0e244a8d1f02f59855d5978dcba1cb0b74e0790ec03c5
