OpenAI API Wrapper for Advanced ChatGPT Development

This Python package provides a comprehensive and efficient wrapper for the OpenAI API, designed to support advanced application development based on ChatGPT. The package simplifies the integration process and offers additional functionalities such as conversation management, model fine-tuning, retrieval of embeddings, automatic question generation, and offline model execution.

Features

  • Easy interaction with the OpenAI API for ChatGPT applications
  • Conversation management for multiple chat sessions
  • Support for fine-tuning the ChatGPT model
  • Retrieval of embeddings for specific text passages
  • Automatic question generation for given paragraphs
  • Offline model execution with compatible models

Installation

You can install the package using pip:

pip install openai-api-wrapper
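
The wrapper reads its credentials from environment variables (see the docstring in the usage example below). In a POSIX shell, the setup might look like this; the key and proxy values here are placeholders, and the proxy is only needed if you route traffic through one:

```shell
# Placeholder values -- substitute your own key and (optional) proxy.
export OPENAI_API_KEY="sk-your-api-key"
export OPENAI_PROXY="socks5://127.0.0.1:1080"  # optional
```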

Usage

Here is a basic example of how to use the OpenAI API Wrapper:

```python
"""
Set the environment variable OPENAI_API_KEY to your OpenAI API key.
Optionally, set OPENAI_PROXY to your proxy setting,
such as "http://<proxy_server>:<proxy_port>" or "socks5://<proxy_server>:<proxy_port>".
"""
from openai_api_wrapper import ChatBot

# Initialize the ChatBot instance
chatbot = ChatBot()

def show_result(conversation_id, reply_content):
    print(f"[{conversation_id}] AI Response: {reply_content}")
    conversation_turns = chatbot.get_conversation_turns(conversation_id=conversation_id)
    print(f"[{conversation_id}] Conversation turns: {conversation_turns}")

# ----- The default conversation -----

# Generate a reply to the given prompt.
prompt = "What is the capital of France?"
print(f"[default] User message: {prompt}")
reply_content = chatbot.ask(prompt)
show_result("default", reply_content)

# Generate a reply to the given prompt using the streaming API.
prompt = "Tell me a joke."
print(f"[default] User message: {prompt}")
reply_content = chatbot.ask(prompt, stream=True)
show_result("default", reply_content)

# ----- Two separate conversations -----

# Generate a reply to the given prompt.
conversation_id = "conversation_1"
prompt = "What is the capital of France?"
print(f"[{conversation_id}] User message: {prompt}")
reply_content = chatbot.ask(prompt, conversation_id=conversation_id)
show_result(conversation_id, reply_content)

# Generate a reply to the given prompt using the streaming API.
conversation_id = "conversation_2"
prompt = "Tell me a joke."
print(f"[{conversation_id}] User message: {prompt}")
reply_content = chatbot.ask(prompt, conversation_id=conversation_id, stream=True)
# Alternatively, consume the stream chunk by chunk:
# unfinished_reply_content = ""
# for chunked_reply_content in chatbot.ask_stream_iterator(prompt, conversation_id=conversation_id):
#     unfinished_reply_content += chunked_reply_content
#     print(unfinished_reply_content)
# reply_content = chatbot.get_last_reply_content(conversation_id)
show_result(conversation_id, reply_content)
```

For a more detailed example with all the available features, check out the example.py file in the repository.

Documentation

You can find the complete documentation for this package in the docs folder, or check out the source code for more details on the implementation.

Roadmap

We plan to continually improve and expand the functionality of this package. Some of the upcoming features include:

  • Integration with various machine learning frameworks
  • Support for multi-modal inputs (e.g., text, images, audio)
  • Expansion of available pre-trained models
  • Simplified deployment options for various platforms

Contributing

We welcome contributions from the community! If you'd like to contribute, please follow these steps:

  1. Fork the repository
  2. Create a new branch for your changes (git checkout -b my-feature)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push the branch (git push origin my-feature)
  5. Create a new pull request

Please make sure to add tests and documentation for any new features or changes.

License

This project is licensed under the MIT License. See the LICENSE file for more details.
