
unofficial_ffm_openai_client

An unofficial Formosa Foundation Model client implementation based on OpenAI and LangChain

Introduction

This is an unofficial Python client implementation for the Formosa Foundation Model public endpoint, compatible with the OpenAI Python client and LangChain. Currently, it only implements the Conversation API and supports the public endpoint. Note that the synchronous API is not yet implemented.

Changelog

  • 0.2.0
    • Added a callback that counts token consumption in streaming and non-streaming modes.
    • Added JSON error handling for synchronous streaming mode.
    • Added token-consumption info to the results of streaming and non-streaming modes.
  • 0.1.3 - Added support for function calls.
  • 0.1.2 - Added support for embeddings.

Usage

Install from PyPI:

pip install unofficial-ffm-openai

You can use it similarly to the original OpenAIChat, with a few different parameters:

from ffm.langchain.language_models.ffm import FfmChatOpenAI

chat_ffm = FfmChatOpenAI(
    ffm_endpoint="https://api-ams.twcc.ai/api",
    max_tokens=1000,
    temperature=0.5,
    top_k=50,
    top_p=1.0,
    frequency_penalty=1.0,
    ffm_api_key="your key",
    ffm_deployment="ffm-mistral-7b-32k-instruct",  # or other model name
    streaming=True,
    callbacks=callbacks  # a list of LangChain callback handlers, defined elsewhere
)
For embeddings:

from ffm.embeddings import FFMEmbeddings

embedding = FFMEmbeddings(
    base_url="",  # base URL of your embeddings endpoint
    api_key="your key")
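Embedding vectors are typically consumed by comparing them, for example with cosine similarity. Below is a minimal, self-contained sketch of that comparison; the placeholder vectors stand in for real output of `embedding.embed_documents`, which requires a valid API key and network access:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# In practice these would come from the embeddings client, e.g.:
#   vecs = embedding.embed_documents(["text a", "text b"])
vecs = [[1.0, 0.0], [1.0, 1.0]]  # illustrative placeholder vectors
score = cosine_similarity(vecs[0], vecs[1])
print(round(score, 4))  # → 0.7071
```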

Callbacks

You can use the FFM callback in the same way as other callbacks, such as the OpenAI callback. Here is an example:

from ffm.langchain.callbacks import get_ffm_callback

with get_ffm_callback() as cb:
    # ... run your LLM calls here ...

    total_tokens = cb.total_tokens
    prompt_tokens = cb.prompt_tokens
    completion_tokens = cb.completion_tokens
    successful_requests = cb.successful_requests
  • Note: Cost calculation is not implemented yet but is in progress.
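Conceptually, such a callback accumulates usage counts across successful requests. The following is a minimal, self-contained sketch of that accounting (the class and method names here are hypothetical illustrations, not the library's actual internals):

```python
class TokenCountingCallback:
    """Illustrative accumulator mirroring the counters exposed by get_ffm_callback."""

    def __init__(self):
        self.prompt_tokens = 0
        self.completion_tokens = 0
        self.total_tokens = 0
        self.successful_requests = 0

    def on_llm_end(self, usage: dict) -> None:
        # `usage` is assumed to be the token-usage dict returned with each response.
        self.prompt_tokens += usage.get("prompt_tokens", 0)
        self.completion_tokens += usage.get("completion_tokens", 0)
        self.total_tokens = self.prompt_tokens + self.completion_tokens
        self.successful_requests += 1

cb = TokenCountingCallback()
cb.on_llm_end({"prompt_tokens": 12, "completion_tokens": 30})
cb.on_llm_end({"prompt_tokens": 8, "completion_tokens": 20})
print(cb.total_tokens, cb.successful_requests)  # → 70 2
```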

Limitations

Currently, it has only been tested with the following dependencies:

langchain                         0.1.20
langchain-community               0.0.38
langchain-core                    0.1.52
langchain-openai                  0.1.7
langchain-text-splitters          0.0.2
langchainhub                      0.1.15

and the OpenAI client:

openai                            1.30.1

TODO

  • Full implementation for the synchronous API.
