# OllamaFreeAPI

> A lightweight client for interacting with LLMs served via Ollama

**Unlock AI Innovation for Free**

Access the world's best open language models in one place!
OllamaFreeAPI provides free access to leading open-source LLMs including:
- 🦙 LLaMA 3 (Meta)
- 🌪️ Mistral (Mistral AI)
- 🔍 DeepSeek (DeepSeek)
- 🦄 Qwen (Alibaba Cloud)
No payments. No credit cards. Just pure AI power at your fingertips.
## 📦 Installation

```bash
pip install ollamafreeapi
```
## 📚 Documentation
- API Reference - Complete API documentation
- Usage Examples - Practical code examples
- Model Catalog - Available models and their capabilities
## Why Choose OllamaFreeAPI?
| Feature | Others | OllamaFreeAPI |
|---|---|---|
| Free Access | ❌ Limited trials | ✅ Always free |
| Model Variety | 3-5 models | 50+ models |
| Global Infrastructure | Single region | 5 continents |
| Ease of Use | Complex setup | Zero-config |
| Community Support | Paid only | Free & active |
## 📊 Project Statistics
Here are some key statistics about the current state of OllamaFreeAPI:
- Active Models: 651 (ready to use right now)
- Model Families: 6
- Quantization Methods: 8 (lower-precision formats that cut memory use and speed up inference)
- Average Model Size: 5.3 GB
## 🚀 Quick Start
### Streaming Example
```python
from ollamafreeapi import OllamaFreeAPI

client = OllamaFreeAPI()

# Stream responses in real-time
for chunk in client.stream_chat('llama3.3:70b', 'Tell me a story:'):
    print(chunk, end='', flush=True)
```
### Non-Streaming Example
```python
from ollamafreeapi import OllamaFreeAPI

client = OllamaFreeAPI()

# Get a complete response in a single call
response = client.chat(
    model_name="llama3.3:70b",
    prompt="Explain neural networks like I'm five",
    temperature=0.7,
)
print(response)
```
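Calls to a shared free endpoint can fail transiently. A small retry helper with exponential backoff is a common pattern; the sketch below is not part of the library (`chat_with_retry` is a hypothetical name) and simply wraps any chat callable such as `client.chat`:

```python
import time

def chat_with_retry(chat_fn, *, attempts=3, base_delay=1.0, **kwargs):
    """Call chat_fn(**kwargs), retrying on failure with exponential backoff."""
    for attempt in range(attempts):
        try:
            return chat_fn(**kwargs)
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the original error
            # Wait 1s, 2s, 4s, ... (scaled by base_delay) before retrying
            time.sleep(base_delay * (2 ** attempt))

# Usage with the client from the Quick Start (interface as shown above):
# reply = chat_with_retry(client.chat,
#                         model_name="llama3.3:70b",
#                         prompt="Explain neural networks like I'm five")
```

The backoff schedule is a sketch; tune `attempts` and `base_delay` to your tolerance for latency versus failure.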
## 🌟 Featured Models
### Popular Foundation Models
- `llama3:8b-instruct` - Meta's latest 8B parameter model
- `mistral:7b-v0.2` - High-performance 7B model
- `deepseek-r1:7b` - Strong reasoning capabilities
- `qwen:7b-chat` - Alibaba's versatile model
### Specialized Models
- `llama3:code` - Optimized for programming
- `mistral:storyteller` - Creative writing specialist
- `deepseek-coder` - STEM and math expert
## 🌍 Global Infrastructure
Our free API is powered by:
- 25+ dedicated GPU servers
- 5 global regions across North America, Europe, and Asia
- Automatic load balancing
- 99.5% uptime SLA
## 📄 API Reference
### Core Methods
```python
# List available models
api.list_models()

# Get model details
api.get_model_info("mistral:7b")

# Generate text
api.chat(model_name="llama3:latest", prompt="Your message")

# Stream responses
for chunk in api.stream_chat(...):
    print(chunk, end='')
```
### Advanced Features
```python
# Check server locations
api.get_model_servers("deepseek-r1:7b")

# Generate raw API request
api.generate_api_request(...)

# Get performance metrics
api.get_server_status()
```
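Server information like this is typically used to route requests to the nearest or fastest region. The helper below is a sketch against a *hypothetical* payload shape (a list of dicts with `region` and `latency_ms` keys); the real return value of `get_model_servers()` may differ, so adapt the key names accordingly:

```python
def pick_fastest(servers):
    """Return the server entry with the lowest reported latency.

    Assumes each entry looks like {"region": ..., "latency_ms": ...};
    this shape is an illustration, not the library's documented schema.
    """
    return min(servers, key=lambda s: s["latency_ms"])

# Example data shaped like an assumed get_model_servers() response:
servers = [
    {"region": "eu-west", "latency_ms": 120},
    {"region": "us-east", "latency_ms": 85},
    {"region": "asia-se", "latency_ms": 210},
]
print(pick_fastest(servers)["region"])  # us-east
```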
## 💎 Free Tier Limits
| Resource | Free Tier | Pro Tier |
|---|---|---|
| Requests | 100/hr | 10,000/hr |
| Tokens | 16k | 128k |
| Speed | 50 tokens/s | 150 tokens/s |
| Models | 7B only | All sizes |
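A request cap like 100/hr can be respected client-side with a small sliding-window limiter. This is a local courtesy guard (a sketch, not the server's enforcement mechanism); call `acquire()` before each `client.chat(...)` request:

```python
import time
from collections import deque

class RateLimiter:
    """Sliding-window rate limiter, e.g. 100 requests per hour."""

    def __init__(self, max_calls, window_seconds):
        self.max_calls = max_calls
        self.window = window_seconds
        self.calls = deque()  # timestamps of recent calls

    def acquire(self):
        now = time.monotonic()
        # Drop timestamps that have fallen out of the window
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) >= self.max_calls:
            # Block until the oldest call expires, then re-check
            time.sleep(self.window - (now - self.calls[0]))
            return self.acquire()
        self.calls.append(time.monotonic())

limiter = RateLimiter(max_calls=100, window_seconds=3600)
# limiter.acquire()  # before each request on the free tier
```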
## 🤝 Contributing
We welcome contributions! Please see our Contributing Guide for details.
## 📄 License

Released under the open-source MIT License.