
An avahiai library that makes your Gen-AI tasks effortless

Project description

avahiplatform


Quickstart

Installation

You can install avahiplatform by running:

pip install avahiplatform

Welcome to AvahiPlatform! 🚀

Hey there, AI enthusiast! 👋 Are you ready to supercharge your Gen-AI projects? Look no further than AvahiPlatform - your new best friend in the world of Large Language Models (LLMs)!

With AvahiPlatform, you can create and deploy GenAI applications on Bedrock in just 60 seconds. It's that fast and easy!

What's AvahiPlatform all about?

AvahiPlatform is not just a library; it's your ticket to effortless AI-powered applications. We've taken the complexity out of working with LLMs on AWS Bedrock, so you can focus on what really matters - bringing your brilliant ideas to life!

Here's what makes AvahiPlatform special:

  • Simplicity at its core: With just a few lines of Python code, you'll be up and running. No PhD in AI required! 😉
  • AWS Bedrock integration: We've done the heavy lifting to seamlessly connect you with the power of AWS Bedrock. It's like having a direct line to AI goodness!
  • Enterprise-ready: Whether you're a solo developer or part of a large team, AvahiPlatform scales with your needs. From proof-of-concept to production, we've got you covered.
  • Python-friendly: If you can Python, you can AvahiPlatform. It's that simple!
  • Global Gradio URL: Quickly generate and share a URL to allow others to experience your functionality directly from your running environment.
  • Observability with metrics tracking and optional Prometheus integration 📊

🧱 What can you build with avahiplatform?

  • Text summarization (plain text, local files, S3 files) 📝
  • Structured information extraction 🏗️
  • Data masking 🕵️‍♀️
  • Natural Language to SQL conversion 🗣️➡️💾
  • PDF summarization 📄
  • Grammar correction ✍️
  • Product description generation 🛍️
  • Image generation 🎨
  • Image similarity 🔍🖼️
  • Medical scribing 👩‍⚕️
  • ICD-10 code generation 🏥
  • CSV querying 📊
  • Retrieval-Augmented Generation (RAG) with Sources 🔍📚
  • Semantic Search 🔎💡
  • Chatbot 🤖
  • Global Gradio URL for Any Functionality/Features 🌐
  • Support for custom prompts and different Anthropic Claude model versions 🧠
  • Error handling with user-friendly messages 🛠️

Basic Usage

Open In Colab

With the provided Google Colab notebook, you can easily test and explore the features of this project. Simply click the "Open In Colab" badge above to get started!

import avahiplatform

# Initialize observability - metrics are exposed on the specified prometheus_port (8000 in this case)
avahiplatform.initialize_observability(metrics_file='./metrics.jsonl', start_prometheus=True, prometheus_port=8000)
# If you don't want Prometheus metrics: avahiplatform.initialize_observability(metrics_file='./metrics.jsonl', start_prometheus=False)

# Summarization - Text summarization (plain text, local files, S3 files) 📝
summary, input_tokens, output_tokens, cost = avahiplatform.summarize("This is a test string to summarize.")
print("Summary:", summary)

# Structured Extraction - Structured information extraction 🏗️
extraction, input_tokens, output_tokens, cost = avahiplatform.structredExtraction("This is a test string for extraction.")
print("Extraction:", extraction)

# Data Masking - Data masking 🕵️‍♀️
masked_data, input_tokens, output_tokens, cost = avahiplatform.DataMasking("This is a test string for Data Masking.")
print("Masked Data:", masked_data)

# PDF Summarization - PDF summarization 📄
summary, _, _, _ = avahiplatform.summarize("path/to/pdf/file.pdf")
print("PDF Summary:", summary)

# Grammar Correction - Grammar correction ✍️
corrected_text, _, _, _ = avahiplatform.grammarAssistant("Text with grammatical errors")
print("Corrected Text:", corrected_text)

# Product Description Generation - Product description generation 🛍️
description, _, _, _ = avahiplatform.productDescriptionAssistant("SKU123", "Summer Sale", "Young Adults")
print("Product Description:", description)

# Image Generation - Image generation 🎨
image, seed, cost = avahiplatform.imageGeneration("A beautiful sunset over mountains")
print("Generated Image:", image)

# Image Similarity - Image similarity 🔍🖼️
# The first argument accepts a PIL image, a local path, or an S3 path
# The second argument accepts a PIL image, a local path, an S3 path, a folder path, or a list of PIL images
similarity_score, cost = avahiplatform.imageSimilarity("ford_endeavour.jpeg", "ford_interior.jpeg")
print("similarity_score:", similarity_score)

# Medical Scribing - Medical scribing 👩‍⚕️
medical_summary, _ = avahiplatform.medicalscribing("path/to/audio.mp3", "input-bucket", "iam-arn")
print("Medical Summary:", medical_summary)

# ICD-10 Code Generation 🏥
codes = avahiplatform.icdcoding("Any prescription text or path/to/prescription.txt")
print("ICD-10 codes:", codes)

# CSV querying 📊
csv_files = {
    "df1": "path/to/1st_csv.csv",
    "df2": "path/to/2nd_csv.csv"
}
# The `csv_files` dictionary can contain one or more CSV files; each key is a dataframe name and each value is a local or S3 path.
csv_query_answer = avahiplatform.query_csv("How many active locations are there in each region?",
                                           csv_file_paths=csv_files)
print(f"csv query answer: {csv_query_answer}")

# RAG with Sources 🔍📚
answer, sources = avahiplatform.perform_rag_with_sources("What is kafka?", s3_path="s3://your-bucket-path-where-doc-is-present/")
print(f"Generated answer: {answer}")
print(f"Retrieved sources: {sources}")

# Semantic Search 🔎💡
similar_docs = avahiplatform.perform_semantic_search("What is kafka?", s3_path="s3://your-bucket-path-where-doc-is-present/")
print(f"similar docs: {similar_docs}")

# Chatbot 🤖
chatbot = avahiplatform.chatbot()
chatbot.initialize_instance(system_prompt="You are a Python developer. You only answer queries related to Python; if you get any other query, please say I don't know.")
chatbot_response = chatbot.chat(user_input="Create me a function to add 2 numbers")
print(f"chatbot_response: {chatbot_response}")

chatbot_response = chatbot.chat(user_input="What is avahi?")
print(f"chatbot_response: {chatbot_response}")

# Get chat history
chatbot_history = chatbot.get_history()
print(chatbot_history)

# Clear chat history
chatbot.clear_history()

# A few examples of getting a global Gradio URL for any functionality/feature 🌐

# For summarizer
avahiplatform.summarize.create_url()

# For medical-scribing
avahiplatform.medicalscribing.create_url()

# For csv querying
avahiplatform.query_csv.create_url()

# For RAG with sources
avahiplatform.perform_rag_with_sources.create_url()

# For the chatbot, we first have to initialize it and then we can create the URL
chatbot = avahiplatform.chatbot()
chatbot.create_url()

# This will generate a global URL that you can share with anyone, allowing them to explore and use any of the features running in your environment via the avahiplatform SDK

Configuration

AWS Credentials Setup 🔐

AvahiPlatform requires AWS credentials to access AWS Bedrock and S3 services. You have two options for providing your AWS credentials:

Default AWS Credentials

  • Configure your AWS credentials in the ~/.aws/credentials file (a minimal example is shown below)
  • Or use the AWS CLI (aws configure) to set up your credentials
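
For reference, a minimal ~/.aws/credentials file looks like this (the [default] profile name is the standard one; the values below are placeholders, not real keys):

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY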

Explicit AWS Credentials

  • Pass the AWS Access Key ID and Secret Access Key directly when calling functions

💡 Tip: For detailed instructions on setting up AWS credentials, please refer to the AWS CLI Configuration Guide.

Ensuring your AWS credentials are correctly set up will allow you to seamlessly use all of AvahiPlatform's powerful features. If you encounter any issues with authentication, double-check your credential configuration or reach out to our support team for assistance.
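
Before calling any AvahiPlatform function, you can confirm that your credentials are being picked up by asking AWS who you are via boto3 (which this library already depends on). The snippet below is only a verification sketch using the default credential chain, not part of the avahiplatform API:

import boto3

# Resolve credentials from the default chain (~/.aws/credentials, environment variables, instance roles, ...)
sts = boto3.client("sts")
identity = sts.get_caller_identity()
print("Authenticated as:", identity["Arn"])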

Usage Examples

Summarization

# Summarize text
summary, _, _, _ = avahiplatform.summarize("Text to summarize")

# Summarize a local file
summary, _, _, _ = avahiplatform.summarize("path/to/local/file.txt")

# Summarize a file from S3
summary, _, _, _ = avahiplatform.summarize("s3://bucket-name/file.txt", 
                                            aws_access_key_id="your_access_key", 
                                            aws_secret_access_key="your_secret_key")

Structured Extraction

extraction, _, _, _ = avahiplatform.structredExtraction("Text for extraction")

Data Masking

masked_data, _, _, _ = avahiplatform.DataMasking("Text containing sensitive information")

Natural Language to SQL

result = avahiplatform.nl2sql("Your natural language query", 
                               db_type="postgresql", username="user", password="pass",
                               host="localhost", port=5432, dbname="mydb")

PDF Summarization

summary, _, _, _ = avahiplatform.pdfsummarizer("path/to/file.pdf")

Grammar Correction

corrected_text, _, _, _ = avahiplatform.grammarAssistant("Text with grammatical errors")

Product Description Generation

description, _, _, _ = avahiplatform.productDescriptionAssistant("SKU123", "Summer Sale", "Young Adults")

Image Generation

image, seed, cost = avahiplatform.imageGeneration("A beautiful sunset over mountains")

Medical Scribing

summary, transcript = avahiplatform.medicalscribing("path/to/audio.mp3", "input-bucket", "iam-arn")

# Note: for medical scribing, the role referenced by iam_arn should have an IAM PassRole inline policy that looks like this:
{
	"Version": "2012-10-17",
	"Statement": [
		{
			"Effect": "Allow",
			"Action": [
				"iam:GetRole",
				"iam:PassRole"
			],
			"Resource": [
				"arn:aws:iam::<account-id>:role/<role-name>"
			]
		}
	]
}

Along with this, the role/user should have full access to both Amazon Transcribe and Amazon Comprehend.
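
For illustration only, an inline policy granting that broad Transcribe and Comprehend access could look like the sketch below; in practice you may prefer the corresponding AWS managed policies or a tighter scope for your account:

{
	"Version": "2012-10-17",
	"Statement": [
		{
			"Effect": "Allow",
			"Action": [
				"transcribe:*",
				"comprehend:*"
			],
			"Resource": "*"
		}
	]
}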

ICD-10 Code Generation

icd_code = avahiplatform.icdcoding("local_file.txt")

CSV Querying

csv_files = {
    "df1": "path/to/1st_csv.csv",
    "df2": "path/to/2nd_csv.csv"
}
# The `csv_files` dictionary can contain one or more CSV files; each key is a dataframe name and each value is a local or S3 path.
csv_query_answer = avahiplatform.query_csv("<Your query goes here>",
                                           csv_file_paths=csv_files)
print(f"csv query answer: {csv_query_answer}")

Global Gradio URL for Any Functionality/Features 🌐

avahiplatform.feature_name.create_url()

For example:
- avahiplatform.summarize.create_url()
- avahiplatform.medicalscribing.create_url()
- avahiplatform.query_csv.create_url()

# For interactive chatbot
chatbot = avahiplatform.chatbot()
chatbot.create_url()

Error Handling 🛠️

AvahiPlatform provides user-friendly error messages for common issues, ensuring you can quickly identify and resolve any problems. Here are some examples:

  • โŒ Invalid AWS credentials
  • ๐Ÿ” File not found
  • ๐Ÿ”Œ Database connection errors
  • โš ๏ธ Unexpected errors

Our detailed error messages will guide you towards quick resolutions, keeping your development process smooth and efficient.
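
If these errors surface as ordinary Python exceptions in your environment (an assumption here; the exact behavior depends on the library version), a plain try/except around any call is enough to show the friendly message in your own application:

import avahiplatform

try:
    # A non-existent S3 object is used to provoke a file-not-found style error
    summary, _, _, _ = avahiplatform.summarize("s3://bucket-name/missing-file.txt")
    print("Summary:", summary)
except Exception as error:
    # The exception text carries the user-friendly description (invalid credentials, file not found, etc.)
    print("AvahiPlatform reported a problem:", error)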

Requirements 📋

To use AvahiPlatform, make sure you have the following:

  • Python 3.9 or higher

Required Libraries:

boto3==1.34.160
python-docx==1.1.2
PyMuPDF==1.24.9
loguru==0.7.2
setuptools==72.1.0
chromadb==0.5.3
sqlalchemy>=2.0.35
gradio>=4.44.0
tabulate==0.9.0
python-magic-bin>=0.4.14
pillow>=10.4.0
pandas>=2.2.3

You can install these dependencies using pip. We recommend using a virtual environment for your project.

Contributing 🤝

We welcome contributions from the community! Whether you've found a bug or have a feature in mind, we'd love to hear from you. Here's how you can contribute:

  1. Open an issue to discuss your ideas or report bugs
  2. Fork the repository and create a new branch for your feature
  3. Submit a pull request with your changes

Let's make AvahiPlatform even better together!

License 📄

This project is licensed under the MIT License. See the LICENSE file for details.

Contact Us 📬

We're here to help! If you have any questions, suggestions, or just want to say hi, feel free to reach out.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

avahiplatform-0.0.13.tar.gz (39.9 kB)

Uploaded Source

Built Distribution

avahiplatform-0.0.13-py3-none-any.whl (55.7 kB)

Uploaded Python 3

File details

Details for the file avahiplatform-0.0.13.tar.gz.

File metadata

  • Download URL: avahiplatform-0.0.13.tar.gz
  • Upload date:
  • Size: 39.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for avahiplatform-0.0.13.tar.gz

  • SHA256: b905a11933b90fce5d3e045d7f9ccf1e1242e6bf30eebc68aab02e29243188be
  • MD5: b5e4f9ef0ad94a80e2f8d97f51ed39ac
  • BLAKE2b-256: cb536ebe3c2c48d1e24dcb0c6276a9e32bf5e3828fdab7a9668269ff255412e4

See more details on using hashes here.

File details

Details for the file avahiplatform-0.0.13-py3-none-any.whl.

File metadata

File hashes

Hashes for avahiplatform-0.0.13-py3-none-any.whl

  • SHA256: 3b2fbfeb5a849cad7ff10eeed8f4701896a820d0a10e2c545831afc9b9bef3be
  • MD5: 3ea43ce22af4392e723a8c05bad8be76
  • BLAKE2b-256: 9895b78bc14924e78adddde6c782ca1b190189d854bc3c18d4299ae986a9d1a3

See more details on using hashes here.
