
Translator AI
AI-powered JSON i18n translator supporting multiple providers with caching and deduplication
Fast and efficient JSON i18n translator supporting multiple AI providers (Google Gemini, OpenAI & Ollama/DeepSeek) with intelligent caching, multi-file deduplication, and MCP integration.
npm install -g translator-ai
npm install translator-ai
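If you install translator-ai locally (the second command) rather than globally, you can run it through npx or an npm script; a minimal sketch, assuming the local install exposes the same translator-ai binary used throughout this README:
# Run the locally installed CLI without a global install
npx translator-ai source.json -l es -o spanish.json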
Create a .env file in your project root or set the environment variable:
GEMINI_API_KEY=your_gemini_api_key_here
Get your API key from Google AI Studio.
Create a .env file in your project root or set the environment variable:
OPENAI_API_KEY=your_openai_api_key_here
Get your API key from OpenAI Platform.
For completely local translation without API costs:
ollama pull deepseek-r1:latest
Then run with the --provider ollama flag:
translator-ai source.json -l es -o spanish.json --provider ollama
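Before translating, it can help to confirm that the Ollama server is reachable and the model has been pulled; a quick check, assuming the default local endpoint of Ollama's standard REST API:
# List locally available models; deepseek-r1 should appear in the output
curl http://localhost:11434/api/tags
# Then translate against the local provider
translator-ai source.json -l es -o spanish.json --provider ollama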
# Translate a single file
translator-ai source.json -l es -o spanish.json

# Translate multiple files with deduplication
translator-ai src/locales/en/*.json -l es -o "{dir}/{name}.{lang}.json"

# Use glob patterns
translator-ai "src/**/*.en.json" -l fr -o "{dir}/{name}.fr.json"
translator-ai <inputFiles...> [options]
Arguments:
inputFiles Path(s) to source JSON file(s) or glob patterns
Options:
-l, --lang <langCodes> Target language code(s), comma-separated for multiple
-o, --output <pattern> Output file path or pattern
--stdout Output to stdout instead of file
--stats Show detailed performance statistics
--no-cache Disable incremental translation cache
--cache-file <path> Custom cache file path
--provider <type> Translation provider: gemini, openai, or ollama (default: gemini)
--ollama-url <url> Ollama API URL (default: http://localhost:11434)
--ollama-model <model> Ollama model name (default: deepseek-r1:latest)
--gemini-model <model> Gemini model name (default: gemini-2.0-flash-lite)
--openai-model <model> OpenAI model name (default: gpt-4o-mini)
--list-providers List available translation providers
--verbose Enable verbose output for debugging
--detect-source Auto-detect source language instead of assuming English
--dry-run Preview what would be translated without making API calls
--preserve-formats Preserve URLs, emails, numbers, dates, and other formats
--metadata Add translation metadata to output files (may break some i18n parsers)
--sort-keys Sort output JSON keys alphabetically
--check-keys Verify all source keys exist in output (exit with error if keys are missing)
-h, --help Display help
-V, --version Display version
Output Pattern Variables (for multiple files):
{dir} - Original directory path
{name} - Original filename without extension
{lang} - Target language code
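For example, assuming an input file at src/locales/en/common.json, the variables above would expand roughly as follows:
translator-ai src/locales/en/common.json -l es -o "{dir}/{name}.{lang}.json"
# {dir}  -> src/locales/en
# {name} -> common
# {lang} -> es
# Expected output file: src/locales/en/common.es.json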
translator-ai en.json -l es -o es.json
# All JSON files in a directory
translator-ai locales/en/*.json -l es -o "locales/es/{name}.json"

# Recursive glob pattern
translator-ai "src/**/en.json" -l fr -o "{dir}/fr.json"

# Multiple specific files
translator-ai file1.json file2.json file3.json -l de -o "{name}.de.json"
# Shows statistics including how many API calls were saved
translator-ai src/i18n/*.json -l ja -o "{dir}/{name}.{lang}.json" --stats
translator-ai en.json -l de --stdout > de.json
translator-ai en.json -l de --stdout | jq
translator-ai en.json -l ja -o ja.json --no-cache
translator-ai en.json -l ko -o ko.json --cache-file /path/to/cache.json
# Basic usage with Ollama
translator-ai en.json -l es -o es.json --provider ollama

# Use a different Ollama model
translator-ai en.json -l fr -o fr.json --provider ollama --ollama-model llama2:latest

# Connect to remote Ollama instance
translator-ai en.json -l de -o de.json --provider ollama --ollama-url http://192.168.1.100:11434

# Check available providers
translator-ai --list-providers
# Detect source language automatically
translator-ai content.json -l es -o spanish.json --detect-source

# Translate to multiple languages at once
translator-ai en.json -l es,fr,de,ja -o translations/{lang}.json

# Dry run - see what would be translated without making API calls
translator-ai en.json -l es -o es.json --dry-run

# Preserve formats (URLs, emails, dates, numbers, template variables)
translator-ai app.json -l fr -o app-fr.json --preserve-formats

# Include translation metadata (disabled by default to ensure compatibility)
translator-ai en.json -l fr -o fr.json --metadata

# Sort keys alphabetically for consistent output
translator-ai en.json -l fr -o fr.json --sort-keys

# Verify all keys are present in the translation
translator-ai en.json -l fr -o fr.json --check-keys

# Use a different Gemini model
translator-ai en.json -l es -o es.json --gemini-model gemini-2.5-flash

# Combine features
translator-ai src/**/*.json -l es,fr,de -o "{dir}/{name}.{lang}.json" \
  --detect-source --preserve-formats --stats --check-keys
The --gemini-model option allows you to choose from various Gemini models. Popular options include:
gemini-2.0-flash-lite (default) - Fast and efficient for most translations
gemini-2.5-flash - Enhanced performance with newer capabilities
gemini-pro - More sophisticated understanding for complex translations
gemini-1.5-pro - Previous generation pro model
gemini-1.5-flash - Previous generation fast model
Example usage:
# Use the latest flash model
translator-ai en.json -l es -o es.json --gemini-model gemini-2.5-flash

# Use the default lightweight model
translator-ai en.json -l fr -o fr.json --gemini-model gemini-2.0-flash-lite
The --openai-model option allows you to choose from various OpenAI models. Popular options include:
gpt-4o-mini (default) - Cost-effective and fast for most translations
gpt-4o - Most capable model with advanced understanding
gpt-4-turbo - Previous generation flagship model
gpt-3.5-turbo - Fast and efficient for simpler translations
Example usage:
# Use OpenAI with the default model
translator-ai en.json -l es -o es.json --provider openai

# Use GPT-4o for complex translations
translator-ai en.json -l ja -o ja.json --provider openai --openai-model gpt-4o

# Use GPT-3.5-turbo for faster, simpler translations
translator-ai en.json -l fr -o fr.json --provider openai --openai-model gpt-3.5-turbo
When enabled with the --metadata flag, translator-ai adds metadata to help track translations:
{ "_translator_metadata": { "tool": "translator-ai v1.1.0", "repository": "https://github.com/DatanoiseTV/translator-ai", "provider": "Google Gemini", "source_language": "English", "target_language": "fr", "timestamp": "2025-06-20T12:34:56.789Z", "total_strings": 42, "source_file": "en.json" }, "greeting": "Bonjour", "farewell": "Au revoir" }
Metadata is disabled by default to ensure compatibility with i18n parsers. Use --metadata to enable it.
Use the --sort-keys flag to sort all JSON keys alphabetically in the output:
translator-ai en.json -l es -o es.json --sort-keys
This ensures consistent key ordering across translations and makes diffs cleaner.
Use the --check-keys flag to ensure translation completeness:
translator-ai en.json -l es -o es.json --check-keys
This verifies that every key in the source file is present in the translated output and exits with a non-zero status if any keys are missing.
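Because missing keys produce a non-zero exit code, the flag fits naturally into CI; a minimal sketch using placeholder file names:
# Fail the build if the translation is incomplete
if ! translator-ai en.json -l es -o es.json --check-keys; then
  echo "Translation is missing keys" >&2
  exit 1
fi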
Any standardized language code should be supported as a target language.
When translating multiple files, translator-ai automatically deduplicates strings shared across files, so each unique string is translated only once.
Example: If 10 files share 50% of their strings, you save ~50% on API calls!
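To preview deduplication without spending API calls, you can create two small files that share a string and combine the --dry-run and --stats flags documented above; the file names here are hypothetical:
mkdir -p locales/en
cat > locales/en/home.json <<'EOF'
{ "save": "Save", "welcome": "Welcome" }
EOF
cat > locales/en/settings.json <<'EOF'
{ "save": "Save", "language": "Language" }
EOF
# "Save" appears in both files, so it should only need one translation
translator-ai locales/en/*.json -l es -o "{dir}/{name}.{lang}.json" --dry-run --stats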
The translation cache is stored in a platform-specific location:
Windows: %APPDATA%\translator-ai\translation-cache.json
macOS: ~/Library/Caches/translator-ai/translation-cache.json
Linux: ~/.cache/translator-ai/translation-cache.json
The cache file stores previously completed translations so that unchanged strings are not re-translated: on subsequent runs, only new or modified entries trigger API calls.
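A quick way to observe the incremental behavior, assuming caching works as described, is to run the same command twice and compare the statistics:
# First run translates everything and populates the cache
translator-ai en.json -l es -o es.json --stats
# A second, unchanged run should reuse cached translations instead of calling the API
translator-ai en.json -l es -o es.json --stats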
gemini-2.0-flash-lite (default) - Fastest, most cost-effective
gemini-pro - Balanced performance
gemini-1.5-pro - Advanced capabilities
gemini-1.5-flash - Fast with good quality
Use the --stats flag to monitor performance and optimization opportunities.
translator-ai can be used as an MCP server, allowing AI assistants like Claude Desktop to translate files directly.
Add to your Claude Desktop configuration:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{ "mcpServers": { "translator-ai": { "command": "npx", "args": [ "-y", "translator-ai-mcp" ], "env": { "GEMINI_API_KEY": "your-gemini-api-key-here" // Or for Ollama: // "TRANSLATOR_PROVIDER": "ollama" } } } }
Once configured, you can ask Claude to translate files:
Human: Can you translate my English locale file to Spanish?
Claude: I'll translate your English locale file to Spanish using translator-ai.
<use_tool name="translate_json">
{
"inputFile": "locales/en.json",
"targetLanguage": "es",
"outputFile": "locales/es.json"
}
</use_tool>
Successfully translated! The file has been saved to locales/es.json.
For multiple files with deduplication:
Human: Translate all my English JSON files in the locales folder to German.
Claude: I'll translate all your English JSON files to German with deduplication.
<use_tool name="translate_multiple">
{
"pattern": "locales/en/*.json",
"targetLanguage": "de",
"outputPattern": "locales/de/{name}.json",
"showStats": true
}
</use_tool>
Translation complete! Processed 5 files with 23% deduplication savings.
translate_json: Translate a single JSON file
  inputFile: Path to source file
  targetLanguage: Target language code
  outputFile: Output file path
translate_multiple: Translate multiple files with deduplication
  pattern: File pattern or paths
  targetLanguage: Target language code
  outputPattern: Output pattern with {dir}, {name}, {lang} variables
  showStats: Show deduplication statistics (optional)
Since translator-ai works with JSON files, you'll need to convert YAML to JSON and back. Here's a practical workflow:
# Install yaml conversion tools
npm install -g js-yaml
# or
pip install pyyaml
Create a translation script (translate-hugo.sh):
#!/bin/bash
# translate-hugo.sh - Translate Hugo YAML i18n files

# Function to translate YAML file
translate_yaml() {
  local input_file=$1
  local lang=$2
  local output_file=$3

  echo "Translating $input_file to $lang..."

  # Convert YAML to JSON
  npx js-yaml $input_file > temp_input.json

  # Translate JSON
  translator-ai temp_input.json -l $lang -o temp_output.json

  # Convert back to YAML
  npx js-yaml temp_output.json > $output_file

  # Cleanup
  rm temp_input.json temp_output.json
}

# Translate Hugo i18n files
translate_yaml themes/your-theme/i18n/en.yaml es themes/your-theme/i18n/es.yaml
translate_yaml themes/your-theme/i18n/en.yaml fr themes/your-theme/i18n/fr.yaml
translate_yaml themes/your-theme/i18n/en.yaml de themes/your-theme/i18n/de.yaml
#!/usr/bin/env python3
# hugo-translate.py
import yaml
import json
import subprocess
import sys
import os

def yaml_to_json(yaml_file):
    """Convert YAML to JSON"""
    with open(yaml_file, 'r', encoding='utf-8') as f:
        data = yaml.safe_load(f)
    return json.dumps(data, ensure_ascii=False, indent=2)

def json_to_yaml(json_str):
    """Convert JSON back to YAML"""
    data = json.loads(json_str)
    return yaml.dump(data, allow_unicode=True, default_flow_style=False)

def translate_yaml_file(input_yaml, target_lang, output_yaml):
    """Translate a YAML file using translator-ai"""
    # Create temp JSON file
    temp_json_in = 'temp_in.json'
    temp_json_out = f'temp_out_{target_lang}.json'

    try:
        # Convert YAML to JSON
        json_content = yaml_to_json(input_yaml)
        with open(temp_json_in, 'w', encoding='utf-8') as f:
            f.write(json_content)

        # Run translator-ai
        cmd = [
            'translator-ai', temp_json_in,
            '-l', target_lang,
            '-o', temp_json_out
        ]
        subprocess.run(cmd, check=True)

        # Read translated JSON and convert back to YAML
        with open(temp_json_out, 'r', encoding='utf-8') as f:
            translated_json = f.read()

        yaml_content = json_to_yaml(translated_json)

        # Write YAML output
        with open(output_yaml, 'w', encoding='utf-8') as f:
            f.write(yaml_content)

        print(f"✓ Translated {input_yaml} to {output_yaml}")

    finally:
        # Cleanup temp files
        for f in [temp_json_in, temp_json_out]:
            if os.path.exists(f):
                os.remove(f)

# Usage
if __name__ == "__main__":
    languages = ['es', 'fr', 'de', 'ja']
    for lang in languages:
        translate_yaml_file(
            'i18n/en.yaml',
            lang,
            f'i18n/{lang}.yaml'
        )
Create translate-yaml.js:
#!/usr/bin/env node
const fs = require('fs');
const yaml = require('js-yaml');
const { execSync } = require('child_process');
const path = require('path');

function translateYamlFile(inputPath, targetLang, outputPath) {
  console.log(`Translating ${inputPath} to ${targetLang}...`);

  // Read and parse YAML
  const yamlContent = fs.readFileSync(inputPath, 'utf8');
  const data = yaml.load(yamlContent);

  // Write temporary JSON
  const tempJsonIn = `temp_${path.basename(inputPath)}.json`;
  const tempJsonOut = `temp_${path.basename(inputPath)}_${targetLang}.json`;

  fs.writeFileSync(tempJsonIn, JSON.stringify(data, null, 2));

  try {
    // Translate using translator-ai
    execSync(`translator-ai ${tempJsonIn} -l ${targetLang} -o ${tempJsonOut}`);

    // Read translated JSON
    const translatedData = JSON.parse(fs.readFileSync(tempJsonOut, 'utf8'));

    // Convert back to YAML
    const translatedYaml = yaml.dump(translatedData, {
      indent: 2,
      lineWidth: -1,
      noRefs: true
    });

    // Write output YAML
    fs.writeFileSync(outputPath, translatedYaml);
    console.log(`✓ Created ${outputPath}`);
  } finally {
    // Cleanup
    [tempJsonIn, tempJsonOut].forEach(f => {
      if (fs.existsSync(f)) fs.unlinkSync(f);
    });
  }
}

// Example usage
const languages = ['es', 'fr', 'de'];
languages.forEach(lang => {
  translateYamlFile(
    'i18n/en.yaml',
    lang,
    `i18n/${lang}.yaml`
  );
});
Hugo supports two translation methods: by filename (about.en.md, about.fr.md) or by content directory (content/en/, content/fr/). Here's how to automate both:
Create hugo-translate-files.sh:
#!/bin/bash
# Translate Hugo content files using filename convention

SOURCE_LANG="en"
TARGET_LANGS=("es" "fr" "de" "ja")

# Find all English content files
find content -name "*.${SOURCE_LANG}.md" | while read -r file; do
  # Extract base filename without language suffix
  base_name="${file%.${SOURCE_LANG}.md}"

  for lang in "${TARGET_LANGS[@]}"; do
    output_file="${base_name}.${lang}.md"

    # Skip if translation already exists
    if [ -f "$output_file" ]; then
      echo "Skipping $output_file (already exists)"
      continue
    fi

    # Extract front matter
    awk '/^---$/{p=1; next} p&&/^---$/{exit} p' "$file" > temp_frontmatter.yaml

    # Convert front matter to JSON
    npx js-yaml temp_frontmatter.yaml > temp_frontmatter.json

    # Translate front matter
    translator-ai temp_frontmatter.json -l "$lang" -o "temp_translated.json"

    # Convert back to YAML
    echo "---" > "$output_file"
    npx js-yaml temp_translated.json >> "$output_file"
    echo "---" >> "$output_file"

    # Copy content (you might want to translate this too)
    awk '/^---$/{p++} p==2{print}' "$file" | tail -n +2 >> "$output_file"

    echo "Created $output_file"
  done

  # Cleanup
  rm -f temp_frontmatter.yaml temp_frontmatter.json temp_translated.json
done
Configure your languages in your Hugo config (config.yaml):
defaultContentLanguage: en
defaultContentLanguageInSubdir: false

languages:
  en:
    contentDir: content/en
    languageName: English
    weight: 1
  es:
    contentDir: content/es
    languageName: Español
    weight: 2
  fr:
    contentDir: content/fr
    languageName: Français
    weight: 3

# Rest of your config...
Create a translation script (hugo-translate-dirs.js):
#!/usr/bin/env node
const fs = require('fs-extra');
const path = require('path');
const yaml = require('js-yaml');
const { execSync } = require('child_process');
const glob = require('glob');

const SOURCE_LANG = 'en';
const TARGET_LANGS = ['es', 'fr', 'de'];

async function translateHugoContent() {
  // Ensure target directories exist
  for (const lang of TARGET_LANGS) {
    await fs.ensureDir(`content/${lang}`);
  }

  // Find all content files in source language
  const files = glob.sync(`content/${SOURCE_LANG}/**/*.md`);

  for (const file of files) {
    const relativePath = path.relative(`content/${SOURCE_LANG}`, file);

    for (const lang of TARGET_LANGS) {
      const targetFile = path.join(`content/${lang}`, relativePath);

      // Skip if already translated
      if (await fs.pathExists(targetFile)) {
        console.log(`Skipping ${targetFile} (exists)`);
        continue;
      }

      await translateFile(file, targetFile, lang);
    }
  }
}

async function translateFile(sourceFile, targetFile, targetLang) {
  console.log(`Translating ${sourceFile} to ${targetLang}...`);

  const content = await fs.readFile(sourceFile, 'utf8');
  const frontMatterMatch = content.match(/^---\n([\s\S]*?)\n---/);

  if (!frontMatterMatch) {
    // No front matter, just copy
    await fs.ensureDir(path.dirname(targetFile));
    await fs.copyFile(sourceFile, targetFile);
    return;
  }

  // Parse front matter
  const frontMatter = yaml.load(frontMatterMatch[1]);
  const body = content.substring(frontMatterMatch[0].length);

  // Extract translatable fields
  const translatable = {
    title: frontMatter.title || '',
    description: frontMatter.description || '',
    summary: frontMatter.summary || '',
    keywords: frontMatter.keywords || []
  };

  // Save for translation
  await fs.writeJson('temp_meta.json', translatable);

  // Translate
  execSync(`translator-ai temp_meta.json -l ${targetLang} -o temp_translated.json`);

  // Read translations
  const translated = await fs.readJson('temp_translated.json');

  // Update front matter
  Object.assign(frontMatter, translated);

  // Write translated file
  await fs.ensureDir(path.dirname(targetFile));
  const newContent = `---\n${yaml.dump(frontMatter)}---${body}`;
  await fs.writeFile(targetFile, newContent);

  // Cleanup
  await fs.remove('temp_meta.json');
  await fs.remove('temp_translated.json');

  console.log(`✓ Created ${targetFile}`);
}

// Run translation
translateHugoContent().catch(console.error);
npm install -g translator-ai js-yaml
# Makefile for Hugo translations
LANGUAGES := es fr de ja zh
SOURCE_YAML := i18n/en.yaml
THEME_DIR := themes/your-theme

.PHONY: translate
translate: $(foreach lang,$(LANGUAGES),translate-$(lang))

translate-%:
	@echo "Translating to $*..."
	@npx js-yaml $(SOURCE_YAML) > temp.json
	@translator-ai temp.json -l $* -o temp_$*.json
	@npx js-yaml temp_$*.json > i18n/$*.yaml
	@rm temp.json temp_$*.json
	@echo "✓ Created i18n/$*.yaml"

.PHONY: translate-theme
translate-theme:
	@for lang in $(LANGUAGES); do \
		make translate-theme-$$lang; \
	done

translate-theme-%:
	@echo "Translating theme to $*..."
	@npx js-yaml $(THEME_DIR)/i18n/en.yaml > temp_theme.json
	@translator-ai temp_theme.json -l $* -o temp_theme_$*.json
	@npx js-yaml temp_theme_$*.json > $(THEME_DIR)/i18n/$*.yaml
	@rm temp_theme.json temp_theme_$*.json

.PHONY: clean
clean:
	@rm -f temp*.json

# Translate everything
.PHONY: all
all: translate translate-theme
Usage:
# Translate to all languages
make all

# Translate to specific language
make translate-es

# Translate theme files
make translate-theme
Here's a comprehensive script that handles both content and i18n translations:
#!/usr/bin/env node
// hugo-complete-translator.js

const fs = require('fs-extra');
const path = require('path');
const yaml = require('js-yaml');
const { execSync } = require('child_process');
const glob = require('glob');

class HugoTranslator {
  constructor(targetLanguages = ['es', 'fr', 'de']) {
    this.targetLanguages = targetLanguages;
    this.tempFiles = [];
  }

  async translateSite() {
    console.log('Starting Hugo site translation...\n');

    // 1. Translate i18n files
    await this.translateI18nFiles();

    // 2. Translate content
    await this.translateContent();

    // 3. Update config
    await this.updateConfig();

    console.log('\nTranslation complete!');
  }

  async translateI18nFiles() {
    console.log('Translating i18n files...');
    const i18nFiles = glob.sync('i18n/en.{yaml,yml,toml}');

    for (const file of i18nFiles) {
      const ext = path.extname(file);

      for (const lang of this.targetLanguages) {
        const outputFile = `i18n/${lang}${ext}`;

        if (await fs.pathExists(outputFile)) {
          console.log(`  Skipping ${outputFile} (exists)`);
          continue;
        }

        // Convert to JSON
        const tempJson = `temp_i18n_${lang}.json`;
        await this.convertToJson(file, tempJson);

        // Translate
        const translatedJson = `temp_i18n_${lang}_translated.json`;
        execSync(`translator-ai ${tempJson} -l ${lang} -o ${translatedJson}`);

        // Convert back
        await this.convertFromJson(translatedJson, outputFile, ext);

        // Cleanup
        await fs.remove(tempJson);
        await fs.remove(translatedJson);

        console.log(`  ✓ Created ${outputFile}`);
      }
    }
  }

  async translateContent() {
    console.log('\nTranslating content...');

    // Detect translation method
    const useContentDirs = await fs.pathExists('content/en');

    if (useContentDirs) {
      await this.translateContentByDirectory();
    } else {
      await this.translateContentByFilename();
    }
  }

  async translateContentByDirectory() {
    const files = glob.sync('content/en/**/*.md');

    for (const file of files) {
      const relativePath = path.relative('content/en', file);

      for (const lang of this.targetLanguages) {
        const targetFile = path.join('content', lang, relativePath);

        if (await fs.pathExists(targetFile)) continue;

        await this.translateMarkdownFile(file, targetFile, lang);
      }
    }
  }

  async translateContentByFilename() {
    const files = glob.sync('content/**/*.en.md');

    for (const file of files) {
      const baseName = file.replace('.en.md', '');

      for (const lang of this.targetLanguages) {
        const targetFile = `${baseName}.${lang}.md`;

        if (await fs.pathExists(targetFile)) continue;

        await this.translateMarkdownFile(file, targetFile, lang);
      }
    }
  }

  async translateMarkdownFile(sourceFile, targetFile, targetLang) {
    const content = await fs.readFile(sourceFile, 'utf8');
    const frontMatterMatch = content.match(/^---\n([\s\S]*?)\n---/);

    if (!frontMatterMatch) {
      await fs.copy(sourceFile, targetFile);
      return;
    }

    const frontMatter = yaml.load(frontMatterMatch[1]);
    const body = content.substring(frontMatterMatch[0].length);

    // Translate front matter
    const translatable = this.extractTranslatableFields(frontMatter);
    const tempJson = `temp_content_${path.basename(sourceFile)}.json`;
    const translatedJson = `${tempJson}.translated`;

    await fs.writeJson(tempJson, translatable);
    execSync(`translator-ai ${tempJson} -l ${targetLang} -o ${translatedJson}`);
    const translated = await fs.readJson(translatedJson);

    Object.assign(frontMatter, translated);

    // Write translated file
    await fs.ensureDir(path.dirname(targetFile));
    const newContent = `---\n${yaml.dump(frontMatter)}---${body}`;
    await fs.writeFile(targetFile, newContent);

    // Cleanup
    await fs.remove(tempJson);
    await fs.remove(translatedJson);

    console.log(`  ✓ ${targetFile}`);
  }

  extractTranslatableFields(frontMatter) {
    const fields = ['title', 'description', 'summary', 'keywords', 'tags'];
    const translatable = {};

    fields.forEach(field => {
      if (frontMatter[field]) {
        translatable[field] = frontMatter[field];
      }
    });

    return translatable;
  }

  async convertToJson(inputFile, outputFile) {
    const ext = path.extname(inputFile);
    const content = await fs.readFile(inputFile, 'utf8');

    let data;
    if (ext === '.yaml' || ext === '.yml') {
      data = yaml.load(content);
    } else if (ext === '.toml') {
      // You'd need a TOML parser here
      throw new Error('TOML support not implemented in this example');
    }

    await fs.writeJson(outputFile, data, { spaces: 2 });
  }

  async convertFromJson(inputFile, outputFile, format) {
    const data = await fs.readJson(inputFile);

    let content;
    if (format === '.yaml' || format === '.yml') {
      content = yaml.dump(data, {
        indent: 2,
        lineWidth: -1,
        noRefs: true
      });
    } else if (format === '.toml') {
      throw new Error('TOML support not implemented in this example');
    }

    await fs.writeFile(outputFile, content);
  }

  async updateConfig() {
    console.log('\nUpdating Hugo config...');

    const configFile = glob.sync('config.{yaml,yml,toml,json}')[0];
    if (!configFile) return;

    // This is a simplified example - you'd need to properly parse and update
    console.log('  ! Remember to update your config.yaml with language settings');
  }
}

// Run the translator
if (require.main === module) {
  const translator = new HugoTranslator(['es', 'fr', 'de']);
  translator.translateSite().catch(console.error);
}

module.exports = HugoTranslator;
If you're using Hugo Modules, you can create a translation module:
// go.mod
module github.com/yourusername/hugo-translator

go 1.19

require (
    github.com/yourusername/your-theme v1.0.0
)
Then in your package.json:
{ "scripts": { "translate": "node hugo-complete-translator.js", "translate:content": "node hugo-complete-translator.js --content-only", "translate:i18n": "node hugo-complete-translator.js --i18n-only", "build": "npm run translate && hugo" } }
For Jekyll posts with YAML front matter:
#!/usr/bin/env python3
# translate-jekyll-posts.py
import os
import yaml
import json
import subprocess
import frontmatter

def translate_jekyll_post(post_path, target_lang, output_dir):
    """Translate Jekyll post including front matter"""
    # Load post with front matter
    post = frontmatter.load(post_path)

    # Extract translatable front matter fields
    translatable = {
        'title': post.metadata.get('title', ''),
        'description': post.metadata.get('description', ''),
        'excerpt': post.metadata.get('excerpt', '')
    }

    # Save as JSON for translation
    with open('temp_meta.json', 'w', encoding='utf-8') as f:
        json.dump(translatable, f, ensure_ascii=False, indent=2)

    # Translate
    subprocess.run([
        'translator-ai', 'temp_meta.json',
        '-l', target_lang,
        '-o', f'temp_meta_{target_lang}.json'
    ])

    # Load translations
    with open(f'temp_meta_{target_lang}.json', 'r', encoding='utf-8') as f:
        translations = json.load(f)

    # Update post metadata
    for key, value in translations.items():
        if value:  # Only update if translation exists
            post.metadata[key] = value

    # Add language to metadata
    post.metadata['lang'] = target_lang

    # Save translated post
    output_path = os.path.join(output_dir, os.path.basename(post_path))
    with open(output_path, 'w', encoding='utf-8') as f:
        f.write(frontmatter.dumps(post))

    # Cleanup
    os.remove('temp_meta.json')
    os.remove(f'temp_meta_{target_lang}.json')

# Translate all posts
for lang in ['es', 'fr', 'de']:
    os.makedirs(f'_posts/{lang}', exist_ok=True)

    for post in os.listdir('_posts/en'):
        if post.endswith('.md'):
            translate_jekyll_post(
                f'_posts/en/{post}',
                lang,
                f'_posts/{lang}'
            )
Use js-yaml with proper options to maintain YAML structure.
If you frequently work with YAML files, consider creating a wrapper script that handles conversion automatically, or request YAML support as a feature for translator-ai.
git clone https://github.com/DatanoiseTV/translator-ai.git
cd translator-ai
npm install
npm run build
npm start -- test.json -l es -o output.json
This project requires attribution for both commercial and non-commercial use. See LICENSE file for details.
Contributions are welcome! Please feel free to submit a Pull Request.
For issues, questions, or suggestions, please open an issue on GitHub.
If you find this tool useful, consider supporting the development: