Wagtail Localize AI Translator
A machine translator for Wagtail Localize that uses Any-LLM for translation
Prerequisites
- Python 3.11+
- a Wagtail project with Wagtail Localize correctly configured
Installation
Install the package using pip:

```shell
pip install wagtail-localize-ai
```

In your `settings.py` file:

- Add `wagtail_localize_ai` to your `INSTALLED_APPS`
- Add `wagtail.contrib.settings` to your `INSTALLED_APPS` (used to set up the model and prompt)
- Set up `WAGTAILLOCALIZE_MACHINE_TRANSLATOR` like this:

```python
WAGTAILLOCALIZE_MACHINE_TRANSLATOR = {
    "CLASS": "wagtail_localize_ai.translator.AITranslator",
}
```

Then, run `python manage.py migrate wagtail_localize_ai` to create the required database tables.
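Taken together, the steps above amount to the following additions to `settings.py` (the surrounding `INSTALLED_APPS` entries are placeholders for your existing apps):

```python
# settings.py -- additions for wagtail-localize-ai; keep your existing entries as-is
INSTALLED_APPS = [
    # ... your existing apps, including wagtail_localize ...
    "wagtail.contrib.settings",  # used by the AI Translator settings page
    "wagtail_localize_ai",
]

WAGTAILLOCALIZE_MACHINE_TRANSLATOR = {
    "CLASS": "wagtail_localize_ai.translator.AITranslator",
}
```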
Setting up providers
To set up providers, add a dict called `AI_PROVIDERS` to your `settings.py` file.
Each entry is an arbitrary key identifying the provider instance, mapped to a dict of configuration.

Reserved keys (all start with `_`):

- `_name`: display name shown in the Wagtail admin
- `_provider`: any-llm provider name (e.g. `openai`, `anthropic`). Defaults to the entry key, so you can omit it when the key matches the provider name.

All other keys are passed as kwargs to `AnyLLM.create(provider, **kwargs)`.
You can find supported providers and their kwargs in the any-llm documentation.
Having multiple instances of the same provider (e.g. different API keys or endpoints) is supported:
```python
AI_PROVIDERS = {
    "openai_main": {
        "_name": "OpenAI (main)",
        "_provider": "openai",
        "api_key": "sk-...",
    },
    "openai_secondary": {
        "_name": "OpenAI (secondary)",
        "_provider": "openai",
        "api_key": "sk-other-...",
        "api_base": "https://my-proxy.example.com/v1",
    },
}
```
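As a rough sketch of how an `AI_PROVIDERS` entry is interpreted (the `resolve_provider` helper below is hypothetical, written only to illustrate the key/fallback/kwargs rules described above, and is not part of the package):

```python
# Hypothetical sketch: split an AI_PROVIDERS entry into the provider name
# and the kwargs that would be forwarded to AnyLLM.create(provider, **kwargs).
AI_PROVIDERS = {
    "openai_main": {
        "_name": "OpenAI (main)",
        "_provider": "openai",
        "api_key": "sk-...",
    },
    "mistral": {  # no "_provider": the entry key doubles as the provider name
        "_name": "Mistral",
        "api_key": "...",
    },
}

def resolve_provider(entry_key: str) -> tuple[str, dict]:
    """Return (provider name, kwargs) for one AI_PROVIDERS entry."""
    config = AI_PROVIDERS[entry_key]
    provider = config.get("_provider", entry_key)  # fall back to the entry key
    kwargs = {k: v for k, v in config.items() if not k.startswith("_")}
    return provider, kwargs

provider, kwargs = resolve_provider("openai_main")
# provider == "openai", kwargs == {"api_key": "sk-..."}
```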
Here are some examples for the most popular providers:
OpenAI
```python
AI_PROVIDERS = {
    "openai": {
        "_name": "OpenAI",
        "api_key": "sk-...",
        "organization": "org-...",  # Optional
        "api_base": "https://api.openai.com/v1",  # Optional
    }
}
```
Anthropic
```python
AI_PROVIDERS = {
    "anthropic": {
        "_name": "Anthropic",
        "api_key": "sk-ant-...",
    }
}
```
Azure OpenAI
Provider key: `azureopenai` (OpenAI-compatible Azure endpoint).

```python
AI_PROVIDERS = {
    "azureopenai": {
        "_name": "Azure OpenAI",
        "api_key": "...",
        "api_base": "https://<resource>.openai.azure.com",
        "api_version": "preview",  # Optional, defaults to "preview"
    }
}
```
Azure AI
Provider key: `azure` (Azure AI Inference SDK, for models deployed on Azure AI Foundry).

```python
AI_PROVIDERS = {
    "azure": {
        "_name": "Azure AI",
        "api_key": "...",
        "api_base": "https://<model-deployment-name>.<region>.models.ai.azure.com",
    }
}
```
Gemini
```python
AI_PROVIDERS = {
    "gemini": {
        "_name": "Gemini",
        "api_key": "...",
    }
}
```
Vertex AI
Provider key: `vertexai`. `project`, `location` and `credentials` are passed directly to the Google GenAI client.

Using Application Default Credentials (e.g. `gcloud auth application-default login`):

```python
AI_PROVIDERS = {
    "vertexai": {
        "_name": "Vertex AI",
        "project": "my-gcp-project",
        "location": "europe-west8",
    }
}
```
Using a service account JSON file:
```python
from google.oauth2 import service_account

AI_PROVIDERS = {
    "vertexai": {
        "_name": "Vertex AI",
        "project": "my-gcp-project",
        "location": "europe-west8",
        "credentials": service_account.Credentials.from_service_account_file(
            "/path/to/service_account.json",
            scopes=["https://www.googleapis.com/auth/cloud-platform"],
        ),
    }
}
```
Usage
You can set the provider, model and prompt from the AI Translator page, reachable from the Settings menu entry.
You can also see the token usage on the Logs page in the Reports menu entry.
License
This project is released under the BSD license.
File details

Details for the file `wagtail_localize_ai-0.3.0.tar.gz`.

File metadata

- Download URL: wagtail_localize_ai-0.3.0.tar.gz
- Size: 11.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `ff54cf06787230ef6dd441627783e12cd45c6b566d8565ced104d8a74583858f` |
| MD5 | `4054955323980e2c2faddf56243d3539` |
| BLAKE2b-256 | `713725fcfb605bbe660fdd163d6445ba3d2d002957f43d8f9332337e70de4da1` |

Provenance

The following attestation bundles were made for `wagtail_localize_ai-0.3.0.tar.gz`:

Publisher: `python-publish.yml` on infofactory/wagtail-localize-ai

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: wagtail_localize_ai-0.3.0.tar.gz
- Subject digest: `ff54cf06787230ef6dd441627783e12cd45c6b566d8565ced104d8a74583858f`
- Sigstore transparency entry: 1361627176
- Permalink: infofactory/wagtail-localize-ai@742cc25fdebfdf9c9d7be5f6300f2161e3fa5a09
- Branch / Tag: refs/tags/0.3.0
- Owner: https://github.com/infofactory
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@742cc25fdebfdf9c9d7be5f6300f2161e3fa5a09
- Trigger Event: release
File details

Details for the file `wagtail_localize_ai-0.3.0-py3-none-any.whl`.

File metadata

- Download URL: wagtail_localize_ai-0.3.0-py3-none-any.whl
- Size: 14.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `845d08cb2cb16ab85b6d4c0a99b3e9255b5bcb047f47c160de4f51ed5d4aaf66` |
| MD5 | `b583d4016c3965ead41b78f91f8d69e8` |
| BLAKE2b-256 | `b432e4df3a1e1a543f32a6c8774810395903feae33a8a53d1363f9124c896d04` |

Provenance

The following attestation bundles were made for `wagtail_localize_ai-0.3.0-py3-none-any.whl`:

Publisher: `python-publish.yml` on infofactory/wagtail-localize-ai

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: wagtail_localize_ai-0.3.0-py3-none-any.whl
- Subject digest: `845d08cb2cb16ab85b6d4c0a99b3e9255b5bcb047f47c160de4f51ed5d4aaf66`
- Sigstore transparency entry: 1361627185
- Permalink: infofactory/wagtail-localize-ai@742cc25fdebfdf9c9d7be5f6300f2161e3fa5a09
- Branch / Tag: refs/tags/0.3.0
- Owner: https://github.com/infofactory
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@742cc25fdebfdf9c9d7be5f6300f2161e3fa5a09
- Trigger Event: release