OllamaFreeAPI

A lightweight client for interacting with LLMs served via Ollama.

Unlock AI Innovation for Free

Access the world's best open language models in one place!

OllamaFreeAPI provides free access to leading open-source LLMs including:

  • 🦙 LLaMA (Meta)
  • 🌪️ Mistral (Mistral AI)
  • 🔍 DeepSeek (DeepSeek)
  • 🦄 Qwen (Alibaba Cloud)

No payments. No credit cards. Just pure AI power at your fingertips.

```bash
pip install ollamafreeapi
```

📚 Documentation

Why Choose OllamaFreeAPI?

| Feature | Others | OllamaFreeAPI |
| --- | --- | --- |
| Free Access | ❌ Limited trials | ✅ Always free |
| Model Variety | 3–5 models | Verified endpoints only |
| Reliability | Highly variable | Validated active models |
| Ease of Use | Complex setup | Zero-config |
| Community Support | Paid only | Free & active |

📊 Project Statistics

Here are some key statistics about the current state of OllamaFreeAPI:

  • Active Models: 16 (Ready to use and tested)
  • Model Families: 3 (gemma, llama, qwen)
  • Endpoints: 6 highly reliable server nodes
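The family grouping above can be derived directly from model names. Below is a minimal sketch, assuming Ollama's usual `family:tag` naming convention; the model list is illustrative, not the live catalogue:

```python
from collections import defaultdict

def group_by_family(model_names):
    """Map each base family name to the model tags published under it."""
    families = defaultdict(list)
    for name in model_names:
        family, _, _tag = name.partition(':')
        # Strip version/size suffixes like "3.2" from "llama3.2"
        base = family.rstrip('0123456789.')
        families[base].append(name)
    return dict(families)

models = ['llama3.2:3b', 'llama3:latest', 'qwen2.5:7b', 'gemma2:9b']
print(group_by_family(models))
# {'llama': ['llama3.2:3b', 'llama3:latest'], 'qwen': ['qwen2.5:7b'], 'gemma': ['gemma2:9b']}
```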

🚀 Quick Start

Streaming Example

```python
from ollamafreeapi import OllamaFreeAPI

client = OllamaFreeAPI()

# Stream the response token by token as it arrives
for chunk in client.stream_chat('What is quantum computing?', model='llama3.2:3b'):
    print(chunk, end='', flush=True)
```

Non-Streaming Example

```python
from ollamafreeapi import OllamaFreeAPI

client = OllamaFreeAPI()

# Get the full response in one call
response = client.chat(
    model="gpt-oss:20b",
    prompt="Explain neural networks like I'm five",
    temperature=0.7
)
print(response)
```
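Community-hosted endpoints can be intermittently unavailable, so it is worth guarding calls against transient failures. The helper below is a sketch, not part of the library; it wraps any zero-argument callable with exponential backoff:

```python
import time

def with_retries(fn, attempts=3, backoff=1.0):
    """Call fn(), retrying with exponential backoff on failure."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries; surface the last error
            time.sleep(backoff * (2 ** attempt))

# Hypothetical usage with the client above:
# response = with_retries(lambda: client.chat(model="mistral:latest", prompt="Hi"))
```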

🌟 Featured Models

Popular Foundation Models

  • llama3.2:3b - Meta's efficient 3B-parameter Llama 3.2 model
  • deepseek-r1:latest - Strong reasoning capabilities, built on Qwen
  • gpt-oss:20b - OpenAI's open-weight 20B model
  • mistral:latest - High-performance baseline Mistral model

Specialized Models

  • mistral-nemo:custom - 12.2B open weights language model
  • bakllava:latest - Vision and language model
  • smollm2:135m - Extremely lightweight assistant

🌍 Global Infrastructure

Our free API is powered by distributed community nodes:

  • Fast response times
  • Automatic load balancing and server selection
  • Real-time availability checks
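The library performs server selection automatically; for illustration, a client-side version of latency-based selection could look like the sketch below. The server list and the `probe` callable are assumptions, not part of the package's API:

```python
def pick_fastest(servers, probe):
    """Return the server with the lowest measured latency.

    `probe(server)` should return a latency in seconds, or raise if the
    server is unreachable; unreachable servers are skipped entirely.
    """
    best, best_latency = None, float('inf')
    for server in servers:
        try:
            latency = probe(server)
        except Exception:
            continue  # treat probe errors as "server unavailable"
        if latency < best_latency:
            best, best_latency = server, latency
    return best
```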

📄 API Reference

Core Methods

```python
from ollamafreeapi import OllamaFreeAPI

api = OllamaFreeAPI()

# List available models
api.list_models()

# Get model details
api.get_model_info("mistral:latest")

# Generate text
api.chat(model="llama3.2:3b", prompt="Your message")

# Stream responses
for chunk in api.stream_chat(prompt="Hello!", model="llama3:latest"):
    print(chunk, end='')
```

Advanced Features

```python
# Check server locations
api.get_model_servers("deepseek-r1:latest")

# Generate a raw API request
api.generate_api_request(model="llama3.2:3b", prompt="Hello")

# Get random model parameters (useful for LangChain integration)
api.get_llm_params()
```
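For context on what a "raw API request" contains: Ollama's documented `/api/generate` endpoint accepts a JSON body with the model name, prompt, streaming flag, and generation options. The standalone builder below illustrates that shape; it is not the library's internal implementation:

```python
import json

def build_generate_request(model, prompt, stream=False, **options):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": stream}
    if options:
        payload["options"] = options  # e.g. temperature, top_p
    return payload

body = build_generate_request("llama3.2:3b", "Hello", temperature=0.7)
print(json.dumps(body, indent=2))
```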

🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

📄 License

Released under the open-source MIT license.
