Mojentic is an agentic framework that aims to provide a simple and flexible way to assemble teams of agents to solve complex problems.
Mojentic
Mojentic is a framework that provides a simple and flexible way to interact with Large Language Models (LLMs). It offers integration with various LLM providers and includes tools for structured output generation, task automation, and more. With comprehensive support for all OpenAI models including GPT-5 and automatic parameter adaptation, Mojentic handles the complexities of different model types seamlessly. The future direction is to facilitate a team of agents, but the current focus is on robust LLM interaction capabilities.
🚀 Features
- LLM Integration: Support for multiple LLM providers (OpenAI, Ollama)
- Latest OpenAI Models: Full support for GPT-5, GPT-4.1, and all reasoning models (o1, o3, o4 series)
- Automatic Model Adaptation: Seamless parameter handling across different OpenAI model types
- Structured Output: Generate structured data from LLM responses using Pydantic models
- Tools Integration: Utilities for date resolution, image analysis, and more
- Multi-modal Capabilities: Process and analyze images alongside text
- Simple API: Easy-to-use interface for LLM interactions
- Future Development: Working towards an agent framework with team coordination capabilities
📋 Requirements
- Python 3.11+
- Ollama (for local LLM support)
- Required models: mxbai-embed-large (for embeddings)
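Ollama is only needed for running models locally. A minimal sketch of using a locally served model through the broker (this assumes a local Ollama instance on its default endpoint with a chat model such as qwen3:32b already pulled, and that OllamaGateway's no-argument constructor targets that default endpoint):

from mojentic.llm import LLMBroker
from mojentic.llm.gateways import OllamaGateway
from mojentic.llm.gateways.models import LLMMessage

# Point the broker at a local Ollama model; passing the gateway explicitly is
# optional, since the broker defaults to Ollama when only a model name is given.
local_llm = LLMBroker(model="qwen3:32b", gateway=OllamaGateway())
result = local_llm.generate(messages=[LLMMessage(content="Hello!")])
print(result)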
🔧 Installation
# Install from PyPI
pip install mojentic
# Or install from source
git clone https://github.com/svetzal/mojentic.git
cd mojentic
pip install -e .
🚦 Quick Start
from mojentic.llm import LLMBroker
from mojentic.llm.gateways import OpenAIGateway, OllamaGateway
from mojentic.llm.gateways.models import LLMMessage
from mojentic.llm.tools.date_resolver import ResolveDateTool
from pydantic import BaseModel, Field
# Initialize with OpenAI (supports all models including GPT-5, GPT-4.1, reasoning models)
openai_llm = LLMBroker(model="gpt-5", gateway=OpenAIGateway(api_key="your_api_key"))
# Or use other models: "gpt-4o", "gpt-4.1", "o1-mini", "o3-mini", etc.
# Or use Ollama for local LLMs
ollama_llm = LLMBroker(model="qwen3:32b")
# Simple text generation
result = openai_llm.generate(messages=[LLMMessage(content='Hello, how are you?')])
print(result)
# Generate structured output
class Sentiment(BaseModel):
    label: str = Field(..., description="Label for the sentiment")

sentiment = openai_llm.generate_object(
    messages=[LLMMessage(content="Hello, how are you?")],
    object_model=Sentiment
)
print(sentiment.label)
# Use tools with the LLM
result = openai_llm.generate(
    messages=[LLMMessage(content='What is the date on Friday?')],
    tools=[ResolveDateTool()]
)
print(result)
# Image analysis
result = openai_llm.generate(messages=[
    LLMMessage(content='What is in this image?', image_paths=['path/to/image.jpg'])
])
print(result)
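Structured output is not limited to single-field models. A sketch of a richer schema, reusing the openai_llm broker and the generate_object call shown above (the NewsSummary model and its fields are illustrative, not part of the library):

from typing import List
from pydantic import BaseModel, Field
from mojentic.llm.gateways.models import LLMMessage

class NewsSummary(BaseModel):
    title: str = Field(..., description="A short headline for the text")
    key_points: List[str] = Field(..., description="The main points, one sentence each")
    sentiment: str = Field(..., description="Overall sentiment: positive, neutral, or negative")

summary = openai_llm.generate_object(
    messages=[LLMMessage(content="Summarize: the team shipped the release on time and users are happy.")],
    object_model=NewsSummary
)
print(summary.title)
print(summary.key_points)
print(summary.sentiment)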
🔑 OpenAI configuration
OpenAIGateway now supports environment-variable defaults so you can get started without hardcoding secrets:
- If you omit api_key, it will use the OPENAI_API_KEY environment variable.
- If you omit base_url, it will use the OPENAI_API_ENDPOINT environment variable (useful for custom endpoints like Azure or other OpenAI-compatible proxies).
- Precedence: values you pass explicitly to OpenAIGateway(api_key=..., base_url=...) always override environment variables.
Examples:
from mojentic.llm import LLMBroker
from mojentic.llm.gateways import OpenAIGateway
# 1) Easiest: rely on environment variables
# export OPENAI_API_KEY=sk-...
# export OPENAI_API_ENDPOINT=https://api.openai.com/v1 # optional
llm = LLMBroker(
    model="gpt-4o-mini",
    gateway=OpenAIGateway()  # picks up OPENAI_API_KEY/OPENAI_API_ENDPOINT automatically
)
# 2) Explicitly override one or both values
llm = LLMBroker(
    model="gpt-4o-mini",
    gateway=OpenAIGateway(api_key="your_key", base_url="https://api.openai.com/v1")
)
🤖 OpenAI Model Support
The framework automatically handles parameter differences between model types, so you can switch between any models without code changes.
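As a minimal sketch, the same calling code can be reused across model families; only the model name changes (this assumes OPENAI_API_KEY is set in the environment, and the model names are examples drawn from the list above):

from mojentic.llm import LLMBroker
from mojentic.llm.gateways import OpenAIGateway
from mojentic.llm.gateways.models import LLMMessage

gateway = OpenAIGateway()  # reads OPENAI_API_KEY from the environment

# Conventional and reasoning models are called identically; the framework
# adapts model-specific parameters behind the scenes.
for model in ["gpt-4o-mini", "o3-mini", "gpt-5"]:
    llm = LLMBroker(model=model, gateway=gateway)
    print(model, llm.generate(messages=[LLMMessage(content="Reply with one word: ready?")]))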
Model-Specific Limitations
Some models have specific parameter restrictions that are automatically handled:
- GPT-5 Series: Only supports temperature=1.0 (the default). Other temperature values are automatically adjusted with a warning.
- o1 & o4 Series: Only support temperature=1.0 (the default). Other temperature values are automatically adjusted with a warning.
- o3 Series: Does not support the temperature parameter at all. The parameter is automatically removed with a warning.
- All Reasoning Models (o1, o3, o4, GPT-5): Use max_completion_tokens instead of max_tokens, and have limited tool support.
The framework will automatically adapt parameters and log warnings when unsupported values are provided.
🏗️ Project Structure
src/
├── mojentic/              # Main package
│   ├── llm/               # LLM integration (primary focus)
│   │   ├── gateways/      # LLM provider adapters (OpenAI, Ollama)
│   │   ├── registry/      # Model registration
│   │   └── tools/         # Utility tools for LLMs
│   ├── agents/            # Agent implementations (under development)
│   └── context/           # Shared memory and context (under development)
└── _examples/             # Usage examples
The primary focus is currently on the llm module, which provides robust capabilities for interacting with various LLM providers.
📚 Documentation
Visit the documentation for comprehensive guides, API reference, and examples.
🧪 Development
# Clone the repository
git clone https://github.com/svetzal/mojentic.git
cd mojentic
# Install dependencies
pip install -e ".[dev]"
# Run tests
pytest
✅ Project Status
The agentic aspects of this framework are in the highest state of flux. The first layer has stabilized, as have the simpler parts of the second layer, and we're working on the stability of the asynchronous pubsub architecture. We expect Python 3.14 will be the real enabler for the async aspects of the second layer.
📄 License
This code is Copyright 2025 Mojility, Inc. and is freely provided under the terms of the MIT license.
Download files
File details
Details for the file mojentic-0.9.0.tar.gz.
File metadata
- Download URL: mojentic-0.9.0.tar.gz
- Upload date:
- Size: 105.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ea0706a8a965919c2f9cb53e817973d6aaeb8df3888a02003d6d149a12e3ac65 |
| MD5 | 184bb5c3916aa104e055a62d7c64476d |
| BLAKE2b-256 | 37a04849a3adf65b44ab0221130f38b6b59b83c2d9b9cc4ee1f7556e5ea54a13 |
Provenance
The following attestation bundles were made for mojentic-0.9.0.tar.gz:
Publisher: build.yml on svetzal/mojentic
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mojentic-0.9.0.tar.gz
- Subject digest: ea0706a8a965919c2f9cb53e817973d6aaeb8df3888a02003d6d149a12e3ac65
- Sigstore transparency entry: 698793581
- Sigstore integration time:
- Permalink: svetzal/mojentic@459ab07257e14414e819bbf33c49b9c2ae5beac6
- Branch / Tag: refs/tags/v0.9.0
- Owner: https://github.com/svetzal
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: build.yml@459ab07257e14414e819bbf33c49b9c2ae5beac6
- Trigger Event: release
File details
Details for the file mojentic-0.9.0-py3-none-any.whl.
File metadata
- Download URL: mojentic-0.9.0-py3-none-any.whl
- Upload date:
- Size: 160.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 96e7458dc9b764d35268c2b8581202ca2d82d48170170796f43281190721a981 |
| MD5 | a13e682f220e474398b0adee1c469e1d |
| BLAKE2b-256 | 1ef43692520cd026dd9fdae231302307513e33390def7702aceb9d154cdeb98e |
Provenance
The following attestation bundles were made for mojentic-0.9.0-py3-none-any.whl:
Publisher: build.yml on svetzal/mojentic
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mojentic-0.9.0-py3-none-any.whl
- Subject digest: 96e7458dc9b764d35268c2b8581202ca2d82d48170170796f43281190721a981
- Sigstore transparency entry: 698793586
- Sigstore integration time:
- Permalink: svetzal/mojentic@459ab07257e14414e819bbf33c49b9c2ae5beac6
- Branch / Tag: refs/tags/v0.9.0
- Owner: https://github.com/svetzal
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: build.yml@459ab07257e14414e819bbf33c49b9c2ae5beac6
- Trigger Event: release