An Open edX plugin to translate courses
OL Open edX Course Translations
An Open edX plugin to manage course translations.
Purpose
Translate course content into multiple languages to enhance accessibility for a global audience.
Setup
For detailed installation instructions, please refer to the plugin installation guide.
Installation required in:
Studio (CMS)
LMS (for auto language selection feature)
Configuration
Add the following configuration values to the config file in Open edX. For any release after Juniper, those files are /edx/etc/lms.yml and /edx/etc/cms.yml. If you're using private.py, add these values to lms/envs/private.py and cms/envs/private.py. In either case, the values go at the top level of the settings. Ask a fellow developer for the actual values.
```yaml
# Enable auto language selection
ENABLE_AUTO_LANGUAGE_SELECTION: true

# Translation providers configuration
TRANSLATIONS_PROVIDERS:
  default_provider: "mistral"  # Default provider to use
  deepl:
    api_key: "<YOUR_DEEPL_API_KEY>"
  openai:
    api_key: "<YOUR_OPENAI_API_KEY>"
    default_model: "gpt-5.2"
  gemini:
    api_key: "<YOUR_GEMINI_API_KEY>"
    default_model: "gemini-3-pro-preview"
  mistral:
    api_key: "<YOUR_MISTRAL_API_KEY>"
    default_model: "mistral-large-latest"

TRANSLATIONS_GITHUB_TOKEN: "<YOUR_GITHUB_TOKEN>"
TRANSLATIONS_REPO_PATH: ""
TRANSLATIONS_REPO_URL: "https://github.com/mitodl/mitxonline-translations.git"

# Timeout for LLM API requests, in seconds
LITE_LLM_REQUEST_TIMEOUT: 120
```
For Tutor installations, these values can also be managed through a custom Tutor plugin.
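As a sketch of such a Tutor plugin: the module below uses Tutor's `hooks.Filters.ENV_PATCHES` filter with the standard `openedx-common-settings` patch to inject a minimal subset of these settings into both LMS and CMS. It is illustrative only (placeholder key, Mistral as the sole provider); adapt it to your deployment.

```python
# course_translations_plugin.py -- illustrative Tutor plugin (not shipped
# with this package) that injects the settings above into LMS and CMS.
from tutor import hooks

hooks.Filters.ENV_PATCHES.add_item(
    (
        "openedx-common-settings",  # patch applied to LMS and CMS alike
        """
ENABLE_AUTO_LANGUAGE_SELECTION = True
TRANSLATIONS_PROVIDERS = {
    "default_provider": "mistral",
    "mistral": {
        "api_key": "<YOUR_MISTRAL_API_KEY>",
        "default_model": "mistral-large-latest",
    },
}
TRANSLATIONS_REPO_URL = "https://github.com/mitodl/mitxonline-translations.git"
LITE_LLM_REQUEST_TIMEOUT = 120
""",
    )
)
```

Tutor discovers plugin modules that are registered via an entry point or placed in the Tutor plugins root; see the Tutor plugin documentation for the mechanism your deployment uses.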
Translation Providers
The plugin supports multiple translation providers:
DeepL
OpenAI (GPT models)
Gemini (Google)
Mistral
Configuration
All providers are configured through the TRANSLATIONS_PROVIDERS dictionary in your settings:
```python
TRANSLATIONS_PROVIDERS = {
    "default_provider": "mistral",  # Optional: default provider for commands
    "deepl": {
        "api_key": "<YOUR_DEEPL_API_KEY>",
    },
    "openai": {
        "api_key": "<YOUR_OPENAI_API_KEY>",
        "default_model": "gpt-5.2",  # Optional: used when model not specified
    },
    "gemini": {
        "api_key": "<YOUR_GEMINI_API_KEY>",
        "default_model": "gemini-3-pro-preview",
    },
    "mistral": {
        "api_key": "<YOUR_MISTRAL_API_KEY>",
        "default_model": "mistral-large-latest",
    },
}
```
Important Notes:
DeepL Configuration: DeepL must be configured in TRANSLATIONS_PROVIDERS['deepl']['api_key'].
DeepL for Subtitle Repair: DeepL is used as a fallback repair mechanism for subtitle translations when LLM providers fail validation. Even if you use LLM providers for primary translation, you should configure DeepL to enable automatic repair.
Default Models: The default_model in each provider’s configuration is used when you specify a provider without a model (e.g., openai instead of openai/gpt-5.2).
Provider Selection
You can specify providers in three ways:
Provider only (uses default model from settings):
```shell
./manage.py cms translate_course \
    --target-language AR \
    --course-dir /path/to/course.tar.gz \
    --content-translation-provider openai \
    --srt-translation-provider gemini
```
Provider with specific model:
```shell
./manage.py cms translate_course \
    --target-language AR \
    --course-dir /path/to/course.tar.gz \
    --content-translation-provider openai/gpt-5.2 \
    --srt-translation-provider gemini/gemini-3-pro-preview
```
DeepL (no model needed):
```shell
./manage.py cms translate_course \
    --target-language AR \
    --course-dir /path/to/course.tar.gz \
    --content-translation-provider deepl \
    --srt-translation-provider deepl
```
Note: If you specify a provider without a model (e.g., openai instead of openai/gpt-5.2), the system will use the default_model configured in TRANSLATIONS_PROVIDERS for that provider.
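That lookup can be sketched in a few lines of Python. `resolve_provider` below is a hypothetical helper, not the plugin's actual code; it only illustrates the rule just described:

```python
# Hypothetical helper showing how a "provider" or "provider/model" argument
# resolves against TRANSLATIONS_PROVIDERS (trimmed to two providers here).
TRANSLATIONS_PROVIDERS = {
    "default_provider": "mistral",
    "deepl": {"api_key": "<KEY>"},  # DeepL takes no model
    "openai": {"api_key": "<KEY>", "default_model": "gpt-5.2"},
}

def resolve_provider(spec):
    """Split 'provider/model'; fall back to the provider's default_model."""
    provider, _, model = spec.partition("/")
    if not model:
        # No explicit model: use default_model if the provider defines one.
        model = TRANSLATIONS_PROVIDERS.get(provider, {}).get("default_model")
    return provider, model

print(resolve_provider("openai"))          # ('openai', 'gpt-5.2')
print(resolve_provider("openai/gpt-5.2"))  # ('openai', 'gpt-5.2')
print(resolve_provider("deepl"))           # ('deepl', None)
```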
Translating a Course
Open the course in Studio.
Go to Tools -> Export Course.
Export the course as a .tar.gz file.
Go to the CMS shell.
Run the management command to translate the course:
```shell
./manage.py cms translate_course \
    --source-language EN \
    --target-language AR \
    --course-dir /path/to/course.tar.gz \
    --content-translation-provider openai \
    --srt-translation-provider gemini \
    --glossary-dir /path/to/glossary
```
Command Options:
--source-language: Source language code (default: EN)
--target-language: Target language code (required)
--course-dir: Path to exported course tar.gz file (required)
--content-translation-provider: Translation provider for content (XML/HTML and text) (required).
Format:
deepl - uses DeepL (no model needed)
PROVIDER - uses provider with default model from settings (e.g., openai, gemini, mistral)
PROVIDER/MODEL - uses provider with specific model (e.g., openai/gpt-5.2, gemini/gemini-3-pro-preview, mistral/mistral-large-latest)
--srt-translation-provider: Translation provider for SRT subtitles (required). Same format as --content-translation-provider.
--glossary-dir: Path to glossary directory (optional)
Examples:
```shell
# Use DeepL for both content and subtitles
./manage.py cms translate_course \
    --target-language AR \
    --course-dir /path/to/course.tar.gz \
    --content-translation-provider deepl \
    --srt-translation-provider deepl

# Use OpenAI and Gemini with default models from settings
./manage.py cms translate_course \
    --target-language FR \
    --course-dir /path/to/course.tar.gz \
    --content-translation-provider openai \
    --srt-translation-provider gemini

# Use OpenAI with specific model for content, Gemini with default for subtitles
./manage.py cms translate_course \
    --target-language FR \
    --course-dir /path/to/course.tar.gz \
    --content-translation-provider openai/gpt-5.2 \
    --srt-translation-provider gemini

# Use Mistral with specific model and glossary
./manage.py cms translate_course \
    --target-language ES \
    --course-dir /path/to/course.tar.gz \
    --content-translation-provider mistral/mistral-large-latest \
    --srt-translation-provider mistral/mistral-large-latest \
    --glossary-dir /path/to/glossary
```
Glossary Support:
Create language-specific glossary files in the glossary directory:
```
glossaries/machine_learning/
├── ar.txt   # Arabic glossary
├── fr.txt   # French glossary
└── es.txt   # Spanish glossary
```
Format: one term per line, as "source_term : translated_term". Example (es.txt):
```
# ES HINTS
## TERM MAPPINGS
These are preferred terminology choices for this language. Use them whenever they sound natural; adapt freely if context requires.
- 'accuracy' : 'exactitud'
- 'activation function' : 'función de activación'
- 'artificial intelligence' : 'inteligencia artificial'
- 'AUC' : 'AUC'
```
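A parser for this format can be sketched as follows. `parse_glossary` is a hypothetical helper for illustration, not the plugin's implementation; it skips `#` header lines and tolerates the `- 'term' : 'translation'` bullet style shown above:

```python
def parse_glossary(text):
    """Parse "source : target" lines into a dict; header and prose lines are skipped."""
    terms = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # blank lines and '# ...' section headers
        line = line.lstrip("- ")  # tolerate the "- " bullet prefix
        if ":" not in line:
            continue  # free-text hint lines carry no term pair
        source, _, target = line.partition(":")
        terms[source.strip().strip("'\"")] = target.strip().strip("'\"")
    return terms

sample = """# ES HINTS
## TERM MAPPINGS
- 'accuracy' : 'exactitud'
- 'AUC' : 'AUC'
"""
print(parse_glossary(sample))  # {'accuracy': 'exactitud', 'AUC': 'AUC'}
```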
Subtitle Translation and Validation
The course translation system includes robust subtitle (SRT) translation with automatic validation and repair mechanisms to ensure high-quality translations with preserved timing information.
Translation Process
The subtitle translation follows a multi-stage process with built-in quality checks:
Initial Translation: Subtitles are translated using your configured provider (DeepL or LLM)
Validation: Timestamps, subtitle count, and content are validated to ensure integrity
Automatic Retry: If validation fails, the system automatically retries translation (up to 1 additional attempt)
DeepL Repair Fallback: If retries fail, the system automatically falls back to DeepL for repair
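The four stages above can be sketched as a small control flow. `translate_srt` is a hypothetical function, and `translate` and `validate` are stand-in callables, not the plugin's API:

```python
def translate_srt(srt_text, provider, translate, validate, max_retries=1):
    """Translate subtitles, retrying once and falling back to DeepL for repair."""
    # Stages 1-3: initial translation, validation, and automatic retry.
    for _ in range(1 + max_retries):
        candidate = translate(srt_text, provider)
        if validate(srt_text, candidate):
            return candidate
    # Stage 4: fall back to DeepL for repair, regardless of the initial provider.
    repaired = translate(srt_text, "deepl")
    if validate(srt_text, repaired):
        return repaired
    raise ValueError("Subtitle translation failed validation even after DeepL repair")
```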
Why DeepL for Repair?
When subtitle translations fail validation (mismatched timestamps, incorrect subtitle counts, or blank translations), the system automatically uses DeepL as a repair mechanism, regardless of which provider was initially used. This design choice is based on extensive testing and production experience:
Higher Reliability: LLMs frequently fail to preserve subtitle structure and timestamps correctly, even with detailed prompting
Consistent Formatting: DeepL’s specialized subtitle translation API maintains timing precision through XML tag handling
Lower Failure Rate: DeepL demonstrates significantly better success rates for subtitle translation compared to LLMs
Timestamp Preservation: DeepL’s built-in XML tag handling ensures start and end times remain intact during translation
Validation Rules
The system validates subtitle translations against these criteria:
Subtitle Count: Translated file must have the same number of subtitle blocks as the original
Index Matching: Each subtitle block index must match the original (e.g., if original has blocks 1-100, translation must have blocks 1-100 in the same order)
Timestamp Preservation: Start and end times for each subtitle block must remain unchanged
Content Validation: Non-empty original subtitles must have non-empty translations (blank translations are flagged as errors)
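A validator enforcing these four rules might look like the sketch below. It is a simplified stand-in, not the plugin's internal code, and `parse_srt` is a deliberately minimal SRT reader:

```python
import re

def parse_srt(text):
    """Minimal SRT parser: return (index, timestamp_line, body) per block."""
    blocks = []
    for raw in re.split(r"\n\s*\n", text.strip()):
        lines = raw.splitlines()
        blocks.append((lines[0].strip(), lines[1].strip(), "\n".join(lines[2:]).strip()))
    return blocks

def validate_translation(original, translated):
    """Return a list of error strings; an empty list means validation passed."""
    orig, trans = parse_srt(original), parse_srt(translated)
    if len(orig) != len(trans):
        return [f"block count mismatch: {len(orig)} vs {len(trans)}"]
    errors = []
    for (oi, otime, obody), (ti, ttime, tbody) in zip(orig, trans):
        if oi != ti:
            errors.append(f"index mismatch: {oi} vs {ti}")
        if otime != ttime:
            errors.append(f"timestamp changed in block {oi}")
        if obody and not tbody:
            errors.append(f"blank translation in block {oi}")
    return errors

original = """1
00:00:01,000 --> 00:00:02,500
Hello everyone.

2
00:00:03,000 --> 00:00:05,000
Welcome to the course."""

translated = """1
00:00:01,000 --> 00:00:02,500
Bonjour à tous.

2
00:00:03,000 --> 00:00:05,000
Bienvenue au cours."""

print(validate_translation(original, translated))  # []
```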
Example Validation Process:
```
1. Initial Translation (using OpenAI):
   ✓ 150 subtitle blocks translated
   ✗ Validation failed: 3 blocks have mismatched timestamps

2. Retry Attempt:
   ✓ 150 subtitle blocks translated
   ✗ Validation failed: 2 blocks still have issues

3. DeepL Repair:
   ✓ 150 subtitle blocks retranslated using DeepL
   ✓ Validation passed: All timestamps and content validated

✅ Translation completed successfully
```
Failure Handling
If subtitle repair fails after all attempts (including DeepL fallback):
The translation task will fail with a ValueError
The entire course translation will be aborted to prevent incomplete translations
The translated course directory will be automatically cleaned up
An error message will indicate which subtitle file caused the failure
No partial or corrupted translation files will be left behind
Auto Language Selection
The plugin includes an auto language selection feature that automatically sets the user’s language preference based on the course language. When enabled, users will see the static site content in the course’s configured language.
To enable auto language selection:
Set ENABLE_AUTO_LANGUAGE_SELECTION to true in your settings.
Set SHARED_COOKIE_DOMAIN to your domain (e.g., .local.openedx.io for local tutor setup) to allow cookies to be shared between LMS and CMS.
How it works:
LMS: The CourseLanguageCookieMiddleware automatically detects course URLs and sets the language preference based on the course’s configured language.
CMS: The CourseLanguageCookieResetMiddleware ensures Studio always uses English for the authoring interface.
Admin areas: Admin URLs (/admin, /sysadmin, instructor dashboards) are forced to use English regardless of course language.
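The decision this middleware makes can be illustrated framework-free. `choose_language` below is a hypothetical distillation, not the plugin's actual middleware, and the URL pattern and prefix list are assumptions for the sketch:

```python
import re

# Assumed course-URL shape and admin prefixes for illustration only.
COURSE_URL_RE = re.compile(r"/courses/(course-v1:[^/]+)/")
ADMIN_PREFIXES = ("/admin", "/sysadmin")

def choose_language(path, course_languages):
    """Return the language to activate for this request path, or None."""
    if path.startswith(ADMIN_PREFIXES):
        return "en"  # admin areas always render in English
    match = COURSE_URL_RE.search(path)
    if match:
        # Inside a course: use the course's configured language.
        return course_languages.get(match.group(1))
    return None  # outside a course: keep the user's own preference

course_languages = {"course-v1:MITx+6.00x+2024": "ar"}
print(choose_language("/courses/course-v1:MITx+6.00x+2024/courseware", course_languages))  # ar
print(choose_language("/admin/users/", course_languages))  # en
print(choose_language("/dashboard", course_languages))  # None
```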
MFE Integration
To make auto language selection work with Micro-Frontends (MFEs), you need to use a custom Footer component that handles language detection and switching.
Setup:
Use the Footer component from src/bridge/settings/openedx/mfe/slot_config/Footer.jsx in the ol-infrastructure repository.
Enable auto language selection in each MFE by adding the following to their .env.development file:
```
ENABLE_AUTO_LANGUAGE_SELECTION="true"
```
This custom Footer component:
- Detects the current course context in MFEs
- Automatically switches the MFE language based on the course's configured language
- Ensures a consistent language experience across the platform
Configure your MFE slot overrides to use this custom Footer component instead of the default one.
Note: The custom Footer is required because MFEs run as separate applications and need their own mechanism to detect and respond to course language settings. The environment variable must be set in each MFE’s configuration for the feature to work properly.
Generating static content translations
This command synchronizes translation keys from edx-platform and MFEs, translates empty keys using an LLM, and automatically creates a pull request in the translations repository.
What it does:
Syncs translation keys from edx-platform and MFEs to the translations repository
Extracts empty translation keys that need translation
Translates empty keys using the specified LLM provider and model
Applies translations to JSON and PO files
Commits changes to a new branch
Creates a pull request with translation statistics
Usage:
Go to the CMS shell.
Run the management command:
```shell
./manage.py cms sync_and_translate_language <LANGUAGE_CODE> [OPTIONS]
```
Required arguments:
LANGUAGE_CODE: Language code (e.g., el, fr, es_ES)
Optional arguments:
--iso-code: ISO code for JSON files (default: same as language code)
--provider: Translation provider (openai, gemini, mistral). Default is taken from TRANSLATIONS_PROVIDERS['default_provider'] setting
--model: LLM model name. If not specified, uses the default_model for the selected provider from TRANSLATIONS_PROVIDERS. Examples: gpt-5.2, gemini-3-pro-preview, mistral-large-latest
--repo-path: Path to mitxonline-translations repository (can also be set via TRANSLATIONS_REPO_PATH setting or environment variable)
--repo-url: GitHub repository URL (default: https://github.com/mitodl/mitxonline-translations.git, can also be set via TRANSLATIONS_REPO_URL setting or environment variable)
--glossary: Use glossary from plugin glossaries folder (looks for {plugin_dir}/glossaries/machine_learning/{lang_code}.txt)
--batch-size: Number of keys to translate per API request (default: 200, recommended: 200-300 for most models)
--mfe: Filter by specific MFE(s). Use edx-platform for backend translations
--dry-run: Run without committing or creating PR
Examples:
```shell
# Use default provider (from TRANSLATIONS_PROVIDERS['default_provider']) with its default model
./manage.py cms sync_and_translate_language el

# Use OpenAI provider with its default model (gpt-5.2)
./manage.py cms sync_and_translate_language el --provider openai

# Use OpenAI provider with a specific model
./manage.py cms sync_and_translate_language el --provider openai --model gpt-5.2

# Use Mistral provider with a specific model and glossary
./manage.py cms sync_and_translate_language el --provider mistral --model mistral-large-latest --glossary --batch-size 250
```
License
The code in this repository is licensed under the AGPL 3.0 unless otherwise noted.
Please see LICENSE.txt for details.
Download files
File details
Details for the file ol_openedx_course_translations-0.3.10.tar.gz.
File metadata
- Download URL: ol_openedx_course_translations-0.3.10.tar.gz
- Upload date:
- Size: 91.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `9b6538fd4b454d08adc7c45f203b1dada93191d006afa8bad476b5eb661b2073` |
| MD5 | `3ec24078752ba1f6093fa66c14f402a7` |
| BLAKE2b-256 | `bde68b6716daa913c5986401f466aa459651480a071fe843494ffe8222e6fa35` |
File details
Details for the file ol_openedx_course_translations-0.3.10-py3-none-any.whl.
File metadata
- Download URL: ol_openedx_course_translations-0.3.10-py3-none-any.whl
- Upload date:
- Size: 108.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `69340869699a043d2038aa741f40e68b668a34507ecdd244330171dcdb4889bd` |
| MD5 | `8a622df8a0aa63f6e5c28ce5011ee65c` |
| BLAKE2b-256 | `ecbc7c8096ab8eb71510f2ebaf5adb2706bccdd80af6b376e0780e1d5ff33583` |