This Python package helps you interact with Generative AI large language models. It wraps the AIaaS LLM, AIaaS Embedding, and AIaaS Audio API sets to serve requests.

Project description

AIaaS Falcon

Description

AIaaS Falcon is a Generative AI (LLM) client library that interacts with the AIaaS API, supporting operations such as listing models, creating embeddings, and generating text with configurable parameters.

Installation

Install the package with pip; it depends on the requests and google-api-core libraries:

pip install aiaas-falcon

Usage

  1. Initialization:

    from aiaas_falcon import Falcon
    
    falcon = Falcon(api_key="<Your_API_Key>", host_name_port="<Your_Host_Name_Port>")
    
  2. Listing Models:

    models = falcon.list_models()
    print(models)
    
  3. Creating an Embedding:

    response = falcon.create_embedding(file_path="<Your_File_Path>")
    print(response)
    
  4. Generating Text:

    response = falcon.generate_text(chat_history=[], query="<Your_Query>")
    print(response)
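The four steps above can be followed end-to-end without a live endpoint. The sketch below uses a hypothetical in-memory stub in place of the real Falcon client (the class name FakeFalcon and its canned responses are invented for illustration), so the call order — list models, create an embedding, then generate text — can be traced offline:

```python
# Illustrative only: FakeFalcon mimics the client's call sequence with canned
# data; it is NOT part of the aiaas-falcon package.
class FakeFalcon:
    def __init__(self, api_key, host_name_port):
        self.api_key = api_key
        self.host_name_port = host_name_port

    def list_models(self):
        # A real client would query the API for available models.
        return {"models": ["falcon-demo"]}

    def create_embedding(self, file_path):
        # A real client would upload the file(s) and build embeddings.
        return {"status": "ok", "files": file_path}

    def generate_text(self, chat_history=None, query=""):
        # A real client would return the model's completion.
        return {"query": query, "answer": "stub answer"}


falcon = FakeFalcon(api_key="<Your_API_Key>", host_name_port="<Your_Host_Name_Port>")
print(falcon.list_models())
print(falcon.create_embedding(["report.csv"]))
print(falcon.generate_text(chat_history=[], query="What is Account status key?"))
```

Swapping FakeFalcon for the real Falcon class keeps the same call order against a live server.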
    

Methods

  • list_models(self) - Retrieves available models.
  • create_embedding(self, file_path) - Creates embeddings from a provided file.
  • generate_text(self, chat_history=[], query="", use_default=1, conversation_config={}, config={}) - Generates text based on provided parameters.
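The config dict passed to generate_text can be assembled with a small helper. build_generation_config below is a hypothetical convenience (not part of the package) that fills in the defaults used in the example further down and does basic range checks before the request is sent:

```python
# Hypothetical helper (not part of aiaas-falcon): builds a generate_text
# config dict with the example's defaults and simple sanity checks.
def build_generation_config(max_new_tokens=1200, temperature=0.4,
                            top_k=40, top_p=0.95, batch_size=256):
    if not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature should be between 0.0 and 2.0")
    if not 0.0 < top_p <= 1.0:
        raise ValueError("top_p should be in (0.0, 1.0]")
    return {
        "max_new_tokens": max_new_tokens,
        "temperature": temperature,
        "top_k": top_k,
        "top_p": top_p,
        "batch_size": batch_size,
    }

# Override only what differs from the defaults.
config = build_generation_config(temperature=0.7)
```

Catching bad sampling parameters locally avoids a round trip to the API for a request that would be rejected anyway.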

Example usage:

# Example usage

from aiaas_falcon import Falcon  # Make sure the Falcon class is imported

# Initialize the Falcon object with the API key, host name and port
falcon = Falcon(api_key='_____API_KEY_____', host_name_port='34.16.138.59:8888', transport="rest")

# List available models
models = falcon.list_models()['models']

# Check if any model is available
if models:
    # Create an embedding
    response = falcon.create_embedding(['/content/01Aug2023.csv'])
    print(response)
    print('Embedding Success')

    # Define a prompt
    prompt = 'What is Account status key?'

    # Generate text based on the prompt and other parameters
    completion = falcon.generate_text(
         query=prompt,
         chat_history=[],
         use_default=1,
         conversation_config={
            "k": 5,
            "fetch_k": 50000,
            "bot_context_setting": "Do note that Your are a data dictionary bot. Your task is to fully answer the user's query based on the information provided to you."
         },
         config={"max_new_tokens": 1200, "temperature": 0.4, "top_k": 40, "top_p": 0.95, "batch_size": 256}
    )

    print(completion)
    print("Generate Success")

else:
    print("No suitable model found")

Conclusion

The Falcon API Client simplifies interactions with the specified API, providing a straightforward way to perform various operations such as listing models, creating embeddings, and generating text.

License

MIT License

Download files

  • Source distribution: aiaas_falcon-0.1.2.tar.gz (3.3 kB)
  • Built distribution: aiaas_falcon-0.1.2-py3-none-any.whl (3.9 kB, Python 3)
