
Wagtail Localize AI Translator

A machine translator for Wagtail Localize that uses Any-LLM for translation.

Prerequisites

  • Python 3.11+
  • a Wagtail project with Wagtail Localize correctly configured

Dependencies

Installation

Install the package using pip:

pip install wagtail-localize-ai

In your settings.py file,

  • Add wagtail_localize_ai to your INSTALLED_APPS
  • Add wagtail.contrib.settings to your INSTALLED_APPS (used to configure the model and prompt)
  • Set up WAGTAILLOCALIZE_MACHINE_TRANSLATOR like this:
    WAGTAILLOCALIZE_MACHINE_TRANSLATOR = {
        "CLASS": "wagtail_localize_ai.translator.AITranslator",
    }
    

Then, run python manage.py migrate wagtail_localize_ai to create the required database tables.
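Putting the steps above together, a minimal settings.py fragment might look like this (a sketch; the rest of INSTALLED_APPS is elided):

```python
# settings.py — minimal sketch of the configuration described above.
INSTALLED_APPS = [
    # ... your existing apps, including wagtail_localize ...
    "wagtail.contrib.settings",  # required for the model/prompt settings UI
    "wagtail_localize_ai",
]

# Point wagtail-localize at the AI-backed translator class.
WAGTAILLOCALIZE_MACHINE_TRANSLATOR = {
    "CLASS": "wagtail_localize_ai.translator.AITranslator",
}
```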

Setting up providers

To set up providers, add a dict called AI_PROVIDERS to your settings.py file.
Each entry maps an arbitrary key identifying the provider instance to a dict of configuration.

Reserved keys (all start with _):

  • _name: display name shown in the Wagtail admin
  • _provider: any-llm provider name (e.g. openai, anthropic). Defaults to the entry key, so you can omit it when the key matches the provider name.

All other keys are passed as kwargs to AnyLLM.create(provider, **kwargs).

You can find supported providers and their kwargs in the any-llm documentation.

Having multiple instances of the same provider (e.g. different API keys or endpoints) is supported:

AI_PROVIDERS = {
    "openai_main": {
        "_name": "OpenAI (main)",
        "_provider": "openai",
        "api_key": "sk-...",
    },
    "openai_secondary": {
        "_name": "OpenAI (secondary)",
        "_provider": "openai",
        "api_key": "sk-other-...",
        "api_base": "https://my-proxy.example.com/v1",
    },
}
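Hardcoding API keys in settings.py is easy to leak; a sketch that reads them from environment variables instead (the OPENAI_API_KEY variable name is my own convention, not part of the package):

```python
import os

AI_PROVIDERS = {
    "openai_main": {
        "_name": "OpenAI (main)",
        "_provider": "openai",
        # Read the key from the environment rather than committing it to settings.
        "api_key": os.environ.get("OPENAI_API_KEY", ""),
    },
}
```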

Here are some examples for the most popular providers:

OpenAI

AI_PROVIDERS = {
    "openai": {
        "_name": "OpenAI",
        "api_key": "sk-...",
        "organization": "org-...",  # Optional
        "api_base": "https://api.openai.com/v1",  # Optional
    }
}

Anthropic

AI_PROVIDERS = {
    "anthropic": {
        "_name": "Anthropic",
        "api_key": "sk-ant-...",
    }
}

Azure OpenAI

Provider key: azureopenai (OpenAI-compatible Azure endpoint).

AI_PROVIDERS = {
    "azureopenai": {
        "_name": "Azure OpenAI",
        "api_key": "...",
        "api_base": "https://<resource>.openai.azure.com",
        "api_version": "preview",  # Optional, defaults to "preview"
    }
}

Azure AI

Provider key: azure (Azure AI Inference SDK, for models deployed on Azure AI Foundry).

AI_PROVIDERS = {
    "azure": {
        "_name": "Azure AI",
        "api_key": "...",
        "api_base": "https://<model-deployment-name>.<region>.models.ai.azure.com",
    }
}

Gemini

AI_PROVIDERS = {
    "gemini": {
        "_name": "Gemini",
        "api_key": "...",
    }
}

Vertex AI

Provider key: vertexai. project, location and credentials are passed directly to the Google GenAI client.

Using Application Default Credentials (e.g. gcloud auth application-default login):

AI_PROVIDERS = {
    "vertexai": {
        "_name": "Vertex AI",
        "project": "my-gcp-project",
        "location": "europe-west8",
    }
}

Using a service account JSON file:

from google.oauth2 import service_account

AI_PROVIDERS = {
    "vertexai": {
        "_name": "Vertex AI",
        "project": "my-gcp-project",
        "location": "europe-west8",
        "credentials": service_account.Credentials.from_service_account_file(
            "/path/to/service_account.json",
            scopes=["https://www.googleapis.com/auth/cloud-platform"],
        ),
    }
}

Usage

You can set the provider, model and prompt from the AI Translator page, reachable from the Settings menu.
You can also see token usage from the Logs page in the Reports menu.

License

This project is released under the BSD license.

Contributors
