PyGPTs simplifies interacting with AI models like Hugging Face Transformers and Google Gemini. It streamlines model management, handles rate limits, and provides an easy-to-use API for text generation and chat sessions.
Project description
PyGPTs simplifies interaction with various AI models, including Hugging Face Transformers and pre-trained models available through APIs like Google Gemini. It provides a streamlined interface for managing models, pipelines, and tokenizers, handling rate limits, and accessing different model configurations.
Key Features
Hugging Face Integration: Easily load and utilize pre-trained models from Hugging Face’s transformers library. Configure models, tokenizers, and pipelines with flexible settings.
Gemini API Support: Interact with Google’s Gemini models through a dedicated wrapper. Manage API keys, track usage limits, and handle different model versions.
Rate Limiting: Built-in rate limiting for Gemini API calls to avoid exceeding quotas and ensure continuous operation.
Multiple Model Management: The GeminiManager allows using multiple Gemini models with different API keys, automatically switching between them based on availability and usage limits.
Simplified Interface: PyGPTs provides a clean and easy-to-use API for generating text, managing chat sessions, and accessing model information.
Extensible Design: Built with modularity in mind, PyGPTs can be extended to support other AI APIs and model providers.
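To make the key-switching idea above concrete, here is a minimal sliding-window sketch of how a manager might pick an available API key. This is not PyGPTs' actual implementation; the names `RateLimitedKey` and `pick_key` and the per-minute window are assumptions for illustration only.

```python
import time
from collections import deque
from typing import List, Optional

class RateLimitedKey:
    """Tracks request timestamps for one API key within a sliding window."""

    def __init__(self, key: str, max_requests: int, window_s: float = 60.0):
        self.key = key
        self.max_requests = max_requests
        self.window_s = window_s
        self._times = deque()  # timestamps of recent requests

    def available(self, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        # Drop timestamps that have fallen out of the sliding window.
        while self._times and now - self._times[0] >= self.window_s:
            self._times.popleft()
        return len(self._times) < self.max_requests

    def record(self, now: Optional[float] = None) -> None:
        self._times.append(time.monotonic() if now is None else now)

def pick_key(keys: List[RateLimitedKey],
             now: Optional[float] = None) -> Optional[RateLimitedKey]:
    """Return the first key with quota remaining, or None if all are exhausted."""
    for k in keys:
        if k.available(now):
            return k
    return None
```

A manager built on this pattern would call `pick_key` before each request, `record` the request against the chosen key, and raise or wait only when every key is exhausted.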
Installation
```
pip install PyGPTs
```
Modules:
- `PyGPTs.Gemini`: Provides classes for interacting with Google Gemini (e.g. `GeminiSettings`, `Gemini`, and the `GeminiManager`).
- `PyGPTs.HuggingFace`: Provides classes for seamless integration with Hugging Face (e.g. `HuggingFaceTransformerSettings` and `HuggingFaceTransformer`).
Usage Examples
Gemini:

```python
from PyGPTs.Gemini import GeminiSettings, Gemini

# Configure and create a Gemini client
settings = GeminiSettings(api_key="YOUR_API_KEY")
gemini = Gemini(settings)

# Start a chat session and send it a message
gemini.start_chat()
response = gemini.send_message("Hello, Gemini!", chat_index=0)
print(response.text)
```
Hugging Face:

```python
from PyGPTs.HuggingFace.Transformers import HuggingFaceTransformerSettings, HuggingFaceTransformer
from transformers import AutoModelForCausalLM

# Configure a model, tokenizer, and pipeline in one settings object
settings = HuggingFaceTransformerSettings(
    pretrained_model_name_or_path="gpt2",
    model_class=AutoModelForCausalLM,
    task="text-generation",
)
transformer = HuggingFaceTransformer(settings)

generated_text = transformer.generate_content("Once upon a time")
print(generated_text)
```
PyGPTs offers a convenient way to integrate multiple AI model providers into your projects. Its modular design and built-in handling of rate limits and model configuration make it a practical tool for developers working with large language models.
Future Notes
PyGPTs is an actively developing project. We are continually working on expanding its capabilities, including adding support for new AI models and APIs, improving performance, and enhancing the user experience. Contributions, feature requests, and bug reports are welcome! We encourage you to get involved and help shape the future of PyGPTs.
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file pygpts-1.0.1.tar.gz.
File metadata
- Download URL: pygpts-1.0.1.tar.gz
- Upload date:
- Size: 22.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.9.20
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2163581d51de1887e5ec0af2c8ee806394ed6fcae19d644662d846b64ac65417
MD5 | 634132ab6046799a1d0c7d6a7c7182e2
BLAKE2b-256 | a5097f54e0111a18ce7dd36bed366ab68da40696da621e624ffe505c0f751be6
File details
Details for the file PyGPTs-1.0.1-py3-none-any.whl.
File metadata
- Download URL: PyGPTs-1.0.1-py3-none-any.whl
- Upload date:
- Size: 28.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.9.20
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0b14132197fcb24e7a4a11c927d82d390a73b93596939e11348f022a5e702bd2
MD5 | 23075c61d6e964607e585519622612ea
BLAKE2b-256 | 85adde28a4bf613555edb4e9ac169efefb8f04e44c1f18faccbf8919594d0a83