diff --git a/.claude/skills/port-from-gt-js.md b/.claude/skills/port-from-gt-js.md new file mode 100644 index 0000000..f0646c0 --- /dev/null +++ b/.claude/skills/port-from-gt-js.md @@ -0,0 +1,204 @@ +--- +name: port-from-gt-js +description: Style guide for porting features from the gt (JavaScript) monorepo to this gt-python repo. Use whenever you touch Python packages that mirror a JS counterpart — the JS repo is the source of truth and Python trails it, so new work in Python should match existing JS semantics while conforming to Python idioms. +user-invocable: true +allowed-tools: Agent, Bash, Read, Write, Edit, Glob, Grep, TaskCreate, TaskUpdate +--- + +# Porting from `gt` (JS) → `gt-python` + +The JS monorepo at `~/Documents/dev/gt` is the **canonical implementation**. `gt-python` trails it. When you port a feature you are translating semantics faithfully; when you design a new feature here, you still match JS conventions unless there is a Python-specific reason not to. + +## Read this first + +Before touching code, pull up the corresponding JS package side-by-side. 
The package map is small and stable: + +| gt (JS) | gt-python | Notes | +| ----------------------------------------------- | ----------------------------------------------------------------- | ---------------------------------------------------------------------------- | +| `packages/core` (`generaltranslation`) | `packages/generaltranslation` | Core toolkit: `GT` class, locales, formatting, translate API client, hashing | +| `packages/i18n` (`gt-i18n`) | `packages/gt-i18n` | Runtime `I18nManager`, `t()` / `msg()`, storage adapters, loaders | +| `packages/supported-locales` | `packages/generaltranslation-supported-locales` | Static locale registry | +| — (bundled into `generaltranslation`) | `packages/generaltranslation-intl-messageformat` + `...-icu-messageformat-parser` | Python splits the ICU/formatter libs out; JS uses `intl-messageformat` + `@formatjs/icu-messageformat-parser` as deps inside core | +| `packages/next` / `packages/react` / `packages/node` | `packages/gt-fastapi` / `packages/gt-flask` | Different frameworks, same shape: provider + middleware + manager singleton | +| `packages/cli`, `packages/compiler`, `packages/python-extractor` | — (no counterpart) | Extraction + build-time rewriting lives in the JS CLI; don't reinvent it here | + +The JS side has source extraction (Babel/SWC/compiler) and the `gt` CLI; Python **intentionally** does not. Python code is discovered by the JS CLI's `@generaltranslation/python-extractor` (tree-sitter-python). Don't build a Python extractor or CLI unless the user explicitly asks. + +## Naming conventions — the translation table + +Apply these mechanically first, then read the code to catch exceptions. 
+ +| JS | Python | +| ----------------------------------- | ------------------------------------------------ | +| `camelCase` function | `snake_case` function | +| `PascalCase` class / type | `PascalCase` class / `TypedDict` / `dataclass` | +| `camelCase.ts` file | `_snake_case.py` file (leading underscore marks module-private) | +| `const libraryDefaultLocale` | `LIBRARY_DEFAULT_LOCALE` (module-level UPPER_SNAKE for public constants) | +| TS `interface` / `type` | `TypedDict` (for API shapes) or `@dataclass` (for value objects) | +| Interface in `types.ts` | `TypedDict` in `_types.py` | +| `./internal` subpath export | `.internal` submodule | +| `./types` subpath export | Re-export from `__init__.py`, or `.X` submodule | +| `./errors` subpath export | `.errors` submodule | + +### Concrete identifier map (spot-check) + +These are representative, not exhaustive — confirm when porting. + +| JS | Python | +| ------------------------- | ---------------------------- | +| `class GT` | `class GT` | +| `class I18nManager` | `class I18nManager` | +| `class ApiError` | `class ApiError` | +| `new GT({ sourceLocale, targetLocale })` | `GT(source_locale=..., target_locale=...)` | +| `gt.queryBranchData(q)` | `await gt.query_branch_data(q)` | +| `gt.translateMany(...)` | `await gt.translate_many(...)` | +| `hashSource` | `hash_source` | +| `standardizeLocale` | `standardize_locale` | +| `getLocaleProperties` | `get_locale_properties` | +| `isValidLocale` | `is_valid_locale` | +| `requiresTranslation` | `requires_translation` | +| `invalidLocaleError(...)` | `invalid_locale_error(...)` | +| `$context` / `$id` / `$maxChars` user option | `_context` / `_id` / `_max_chars` user kwarg (Python has no `$`) | + +### The `_prefix` convention — two different meanings + +- **JS `_foo`** (e.g. `_translateMany`, `_standardizeLocale`) marks an internal implementation re-imported into the public barrel. The barrel then exposes it without the underscore as a method on `GT`. 
+- **Python `_foo.py`** marks a module-private file inside a package. The function *inside* that file has no underscore. When the `GT` class imports it, it aliases with `as _translate_many`: + + ```python + # packages/generaltranslation/src/generaltranslation/_gt.py + from generaltranslation.translate import translate_many as _translate_many + ``` + + This mirrors the JS pattern and keeps class internals shadow-free. + +## Options, kwargs, and wire format + +This is the trickiest area — Python is Pythonic outside, JS-compatible on the wire. + +1. **Public Python APIs** take keyword-only arguments (`*, ...`) using `snake_case`. See `GT.__init__`. +2. **"Options dicts"** (second-class JS-style config bags) exist where they mirror a JS signature that takes `options`. Python accepts **both** `snake_case` and `camelCase` keys in those dicts for interop — see `GT.enqueue_files` and `GT.download_file` which check both `source_locale`/`sourceLocale` and `file_id`/`fileId`. +3. **HTTP request bodies** are always `camelCase` — the API is the JS API. See `translate_many()` in `translate/_translate.py` building `{ "targetLocale", "sourceLocale", "metadata" }`. +4. **HTTP response bodies** are `camelCase`. Don't rename fields into snake_case in the returned dict; keep wire shape intact so callers see the same shape as the JS SDK returns. The GT class re-maps `locale` fields through `resolve_alias_locale` but preserves the surrounding keys. + +When porting a new method: accept both cases in the options dict, build the body in camelCase, return whatever the API returns unchanged (plus whatever locale-alias remapping the JS side does). + +## Types: TypedDict vs dataclass vs Pydantic + +- **`TypedDict`** for anything that crosses the wire or mirrors a JS `interface` (e.g. `TranslationResult`, `JobStatusEntry`, `TranslateOptions`). Default location: `_types.py` inside the submodule. +- **`@dataclass`** for value objects with behavior/defaults (e.g. 
`LocaleProperties` — 19 fields defaulting to `""`). +- **`Literal[...]`** for enum-like unions (e.g. `PluralType`, `RetryPolicy`). +- **Type aliases** for wire shapes that are just dicts: `CustomMapping = dict[str, str | dict[str, str]]`. +- **Do NOT use Pydantic.** The core deliberately avoids it. Babel is the i18n backend; validation is manual. + +Annotate everything — `mypy` is configured with `disallow_untyped_defs = true`. + +## Errors + +Both repos use the same two-layer pattern: + +1. A **custom exception** for external failures: JS `ApiError` / Python `ApiError` (in `generaltranslation.errors._api_error`). +2. **Error message factories** — functions returning pre-formatted strings: JS `invalidLocaleError(locale)` → Python `invalid_locale_error(locale)`. Collected in `errors/_messages.py`. + +Validation failures in public methods use `ValueError(error_message_factory(...))`; HTTP failures use `ApiError`. Don't invent new exception types unless the JS side has a new one. + +## Async, HTTP, retries + +- **`httpx.AsyncClient`** is the HTTP client. Never switch to `requests`. +- Core methods are `async def`. The `GT` class exposes them as `async`. Formatting and locale utilities are sync (same as JS). +- Retry policy mirrors JS: `"exponential" | "linear" | "none"`, `MAX_RETRIES = 3`, `INITIAL_DELAY_MS = 500`. See `translate/_request.py` — `api_request()` is the one place retries happen; do not reimplement per-endpoint. +- Timeouts are **always in milliseconds** at the public API boundary (matches JS); convert to seconds right at the `httpx` call site (`timeout_s = timeout_ms / 1000.0`). +- `asyncio.sleep()` between retries. `asyncio.TimeoutError` / `httpx.TimeoutException` are distinct — wrap both. 
+ +## Framework integrations + +Don't invent a new shape — both `gt-fastapi` and `gt-flask` already exemplify it: + +- Single entry point `initialize_gt(app, *, default_locale=None, locales=None, project_id=None, ..., get_locale=None, load_translations=None, eager_loading=True, config_path=None, load_config=None) -> I18nManager`. +- Calls `set_i18n_manager(manager)` for singleton access. +- Registers middleware / `before_request` hook to detect locale per request (default: `Accept-Language`). +- Eager-loads translations on startup when `eager_loading=True`: + - FastAPI: async lifespan context manager. + - Flask: `asyncio.run(manager.load_all_translations())` as a sync-to-async bridge. +- Reads config from `gt.config.json` via `load_gt_config()` — CLI flags (JS side) have no Python equivalent, so file config + explicit kwargs are the only sources. + +If you add a new framework (e.g. Django, Starlette), copy the shape from `gt-flask/_setup.py` or `gt-fastapi/_setup.py` and vary only the locale-detection hook and lifespan mechanism. + +## Package layout (per-package) + +``` +packages// +├── pyproject.toml # project + uv build + workspace sources +├── README.md +├── src// # snake_case module name +│ ├── __init__.py # public API re-exports + __all__ +│ ├── py.typed # PEP 561 marker (MUST exist) +│ ├── _gt.py / _foo.py # private modules (leading underscore) +│ └── / +│ ├── __init__.py # submodule re-exports +│ ├── _types.py # TypedDicts +│ └── _impl.py +└── tests/ # pytest, NOT __tests__ + ├── / + │ └── test_foo.py + └── conftest.py +``` + +- **Module name** uses underscores: `gt-i18n` package → `gt_i18n` module (Python identifiers can't contain hyphens). +- **Workspace dep** between packages goes through `[tool.uv.sources]` with `{ workspace = true }`. Don't use relative paths. +- **`py.typed`** must exist in every shippable package. +- **`__init__.py`** declares `__all__` explicitly — treat it like the JS barrel file. 
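As a concrete illustration of the barrel role of `__init__.py` — the package and symbols below are hypothetical stand-ins, not a real export surface:

```python
# src/gt_example/__init__.py — hypothetical package, illustrative exports only.
"""Public API for gt-example."""

from gt_example._gt import GT
from gt_example.errors import ApiError
from gt_example.locales import is_valid_locale, standardize_locale

# Explicit __all__: this file is the Python equivalent of the JS barrel (index.ts).
# Anything not listed here is internal, even if importable.
__all__ = [
    "GT",
    "ApiError",
    "is_valid_locale",
    "standardize_locale",
]
```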
+ +## Testing + +- `pytest` + `pytest-asyncio` (already in dev-deps). `@pytest.mark.asyncio` on async tests. +- Test files: `test_*.py` in `tests/`. Test classes: `class TestFoo:`. Test functions: `def test_bar(...)`. +- Use `monkeypatch` for env vars (`GT_API_KEY`, `GT_PROJECT_ID`). +- Do **not** collocate tests next to source (that's the JS convention). Use the `tests/` folder per package. +- For HTTP, mock `httpx.AsyncClient` rather than patching `api_request` — tests should cover the retry/error paths. + +## Tooling & commands + +- `make check` — full gate (lint + format + typecheck + test). Run before declaring work done. +- `make lint` / `make format` — Ruff only; line length 120; rules `E,F,I,N,W,UP`. +- `make typecheck` — mypy strict (`disallow_untyped_defs`). +- `make test` — pytest across all packages. +- `uv run pytest packages//` — targeted test run. +- `make changeset` (`sampo add`) — for any user-visible change; pick the affected packages and bump type. + +## Porting workflow (when asked to port feature X from gt) + +1. **Locate the JS source.** Find the matching file(s) in `~/Documents/dev/gt/packages//src`. Read them fully. Note: public signature, internal helpers, types, error messages, tests. +2. **Locate the Python target.** Find the matching Python package under `packages/`. If it doesn't exist yet, stop and confirm with the user before creating a new package. +3. **Check for drift.** Is the existing Python already partially ported? Compare `__init__.py` exports on both sides. Note what's missing. +4. **Translate names** via the tables above. `camelCase` → `snake_case`, `_internal.ts` → `_internal.py`, interfaces → `TypedDict`. +5. **Translate logic.** Keep control flow identical where possible. Replace `fetch` → `httpx`, `Promise.all` → `asyncio.gather`, `Object.fromEntries` → dict comprehension, `?.` → `.get()` / explicit `if` checks, `...spread` → `{**a, **b}`. +6. **Wire format untouched.** Request/response bodies stay `camelCase`. 
Accept both cases in incoming options dicts. +7. **Mirror errors.** Same validation conditions, same messages — use the Python error-message factories, adding new ones in `errors/_messages.py` if the JS side has one that doesn't exist yet. +8. **Tests.** Port the JS tests to `tests/` with pytest-style assertions. Tests that rely on JSX / TS-only behaviors don't port — skip and note it. +9. **Re-export.** Add new public symbols to `__init__.py` and `__all__`. If the JS side exposes them via a subpath (`/internal`, `/types`), add them to the corresponding Python submodule's `__init__.py`. +10. **Changeset.** `sampo add`, pick affected packages, choose bump type. The `gt-python` release process depends on this. + +## Known deliberate divergences + +Not every difference is a bug. The following are intentional and should be preserved: + +- **No Python CLI / compiler / extractor-in-repo.** The JS CLI drives extraction across both ecosystems. +- **Python ships ICU/formatter as separate workspace packages** (`generaltranslation-intl-messageformat`, `...-icu-messageformat-parser`) because there's no pure-Python `intl-messageformat` on PyPI to depend on. +- **`_context` / `_id` / `_max_chars` (underscore prefix)** instead of JS's `$context` / `$id` / `$maxChars` — `$` is not a valid Python identifier character. +- **`StorageAdapter` / `ContextVarStorageAdapter`** — Python's equivalent of JS's `FallbackStorageAdapter` (which uses `AsyncLocalStorage`). Python uses `contextvars`. +- **Flask integration uses `asyncio.run()`** as a sync/async bridge during startup — required because Flask is sync. Don't "fix" this to async-everywhere. +- **`babel` (the Python lib)** is a core dependency for CLDR data / number formatting / plurals. Unrelated to the JS Babel compiler. 
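The dual-case options rule (workflow step 6, and the `_context`/`$context` divergence above) can be sketched as follows. `build_request_body` and its field list are hypothetical illustrations standing in for the inline handling in methods like `GT.enqueue_files`:

```python
from typing import Any


def _pick(options: dict[str, Any], snake: str, camel: str) -> Any:
    """Read an option accepting both snake_case (Pythonic) and camelCase (JS interop)."""
    if snake in options:
        return options[snake]
    return options.get(camel)


def build_request_body(options: dict[str, Any]) -> dict[str, Any]:
    """Options arrive in either case; the wire body is always camelCase."""
    body: dict[str, Any] = {}
    source_locale = _pick(options, "source_locale", "sourceLocale")
    if source_locale is not None:
        body["sourceLocale"] = source_locale
    target_locale = _pick(options, "target_locale", "targetLocale")
    if target_locale is not None:
        body["targetLocale"] = target_locale
    return body
```

Note the asymmetry: input is forgiving (both cases), output is strict (camelCase only) — the response dict from the API is then returned to the caller unchanged.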
+ +## Quick reference: where things live + +- Public API barrel: `packages//src//__init__.py` +- Error messages: `packages/generaltranslation/src/generaltranslation/errors/_messages.py` +- HTTP client + retries: `packages/generaltranslation/src/generaltranslation/translate/_request.py` +- `GT` class: `packages/generaltranslation/src/generaltranslation/_gt.py` +- `I18nManager`: `packages/gt-i18n/src/gt_i18n/i18n_manager/_i18n_manager.py` +- `t()` / `msg()`: `packages/gt-i18n/src/gt_i18n/translation_functions/_t.py` / `_msg.py` +- Framework setup reference: `packages/gt-fastapi/src/gt_fastapi/_setup.py`, `packages/gt-flask/src/gt_flask/_setup.py` +- Settings / defaults: `packages/generaltranslation/src/generaltranslation/_settings.py` + +When in doubt: read the JS file, then read an already-ported Python neighbour, then write code that matches both. diff --git a/.sampo/changesets/runtime-translation.md b/.sampo/changesets/runtime-translation.md new file mode 100644 index 0000000..5405bb7 --- /dev/null +++ b/.sampo/changesets/runtime-translation.md @@ -0,0 +1,12 @@ +--- +pypi/generaltranslation: patch +pypi/gt-i18n: minor +pypi/gt-fastapi: minor +pypi/gt-flask: minor +--- + +feat: add runtime translation system (ports gt PR #1207 / #1217) + +- **gt-i18n**: new public async `tx()` for runtime string translation via `GT.translate_many`. Replaces `TranslationsManager` with a two-level cache hierarchy (`LocalesCache` → `TranslationsCache`) with batching, in-flight dedup, and a configurable concurrency cap. Adds new `I18nManager` methods (`lookup_translation`, `lookup_translation_with_fallback`, `load_translations`, `get_lookup_translation`) plus `lifecycle`, `batch_size`, `batch_interval_ms`, `max_concurrent_requests`, and `translation_timeout_ms` constructor kwargs. `hash_message` now accepts `format="ICU" | "STRING" | "I18NEXT"` and only applies `index_vars()` for ICU — subtle breaking change for any pre-existing STRING/I18NEXT cache keys. 
Deprecates `get_translations`, `get_translation_resolver`, `resolve_translation_sync`, and `get_translation_loader` with `DeprecationWarning`. +- **gt-fastapi / gt-flask**: `initialize_gt()` gains `lifecycle`, `batch_size`, `batch_interval_ms`, `max_concurrent_requests`, and `translation_timeout_ms` kwargs (all forwarded to `I18nManager`). Both packages now re-export `tx`. +- **generaltranslation**: `hash_source` and `hash_template` now pass `ensure_ascii=False` to `json.dumps`, matching JS `JSON.stringify` semantics for non-ASCII content. Fixes a cross-SDK hash divergence for messages and contexts containing unicode. diff --git a/packages/generaltranslation/src/generaltranslation/_id/_hash.py b/packages/generaltranslation/src/generaltranslation/_id/_hash.py index 086d23d..4201c81 100644 --- a/packages/generaltranslation/src/generaltranslation/_id/_hash.py +++ b/packages/generaltranslation/src/generaltranslation/_id/_hash.py @@ -33,7 +33,9 @@ def hash_source( if max_chars is not None: sanitized_data["maxChars"] = abs(max_chars) - stringified = json.dumps(sanitized_data, sort_keys=True, separators=(",", ":")) + # ensure_ascii=False matches JS JSON.stringify, which preserves raw UTF-8. + # Without this, Python escapes non-ASCII to \uXXXX and diverges from JS hashes. 
+ stringified = json.dumps(sanitized_data, sort_keys=True, separators=(",", ":"), ensure_ascii=False) return hash_function(stringified) @@ -44,5 +46,5 @@ def hash_template( """Hash a template dict.""" if hash_function is None: hash_function = hash_string - stringified = json.dumps(template, sort_keys=True, separators=(",", ":")) + stringified = json.dumps(template, sort_keys=True, separators=(",", ":"), ensure_ascii=False) return hash_function(stringified) diff --git a/packages/gt-fastapi/src/gt_fastapi/__init__.py b/packages/gt-fastapi/src/gt_fastapi/__init__.py index ad6a4c1..f532964 100644 --- a/packages/gt-fastapi/src/gt_fastapi/__init__.py +++ b/packages/gt-fastapi/src/gt_fastapi/__init__.py @@ -10,6 +10,7 @@ get_locales, get_version_id, t, + tx, ) from gt_fastapi._setup import initialize_gt @@ -17,6 +18,7 @@ __all__ = [ "initialize_gt", "t", + "tx", "declare_var", "derive", "declare_static", diff --git a/packages/gt-fastapi/src/gt_fastapi/_setup.py b/packages/gt-fastapi/src/gt_fastapi/_setup.py index a95b5a1..8a51edf 100644 --- a/packages/gt-fastapi/src/gt_fastapi/_setup.py +++ b/packages/gt-fastapi/src/gt_fastapi/_setup.py @@ -9,6 +9,7 @@ from generaltranslation import CustomMapping from generaltranslation._settings import LIBRARY_DEFAULT_LOCALE from gt_i18n import I18nManager, set_i18n_manager +from gt_i18n.i18n_manager._lifecycle import LifecycleCallbacks from gt_i18n.internal import GTConfig, _detect_from_accept_language, load_gt_config @@ -26,6 +27,12 @@ def initialize_gt( eager_loading: bool = True, config_path: str | None = None, load_config: Callable[[str | None], GTConfig] | None = None, + lifecycle: LifecycleCallbacks | None = None, + batch_size: int = 25, + batch_interval_ms: int = 50, + max_concurrent_requests: int = 100, + translation_timeout_ms: int = 12_000, + cache_expiry_time: int = 60_000, ) -> I18nManager: """Initialize General Translation for a FastAPI app. 
@@ -68,6 +75,12 @@ def initialize_gt( cache_url=resolved_cache_url, load_translations=load_translations, version_id=resolved_version_id, + lifecycle=lifecycle, + batch_size=batch_size, + batch_interval_ms=batch_interval_ms, + max_concurrent_requests=max_concurrent_requests, + translation_timeout_ms=translation_timeout_ms, + cache_expiry_time=cache_expiry_time, ) set_i18n_manager(manager) diff --git a/packages/gt-fastapi/tests/test_fastapi_lifecycle_forwarding.py b/packages/gt-fastapi/tests/test_fastapi_lifecycle_forwarding.py new file mode 100644 index 0000000..847df70 --- /dev/null +++ b/packages/gt-fastapi/tests/test_fastapi_lifecycle_forwarding.py @@ -0,0 +1,97 @@ +"""Golden-standard test: ``initialize_gt()`` forwards the ``lifecycle`` kwarg. + +The gt-fastapi ``initialize_gt`` entry point is a thin pass-through to +``I18nManager(...)``. PR #1207 added lifecycle callbacks to the I18nManager; +this test pins that gt-fastapi accepts and forwards the ``lifecycle`` kwarg +so users can observe cache behavior without bypassing the framework helper. + +Contract: +- ``initialize_gt(app, ..., lifecycle={...})`` constructs the I18nManager with + those callbacks wired through. +- A runtime translate miss / hit triggers the corresponding callback. + +This test should FAIL until PR #1207 is ported — both the ``lifecycle`` kwarg +on ``initialize_gt`` and the underlying I18nManager plumbing need to exist. 
+""" + +from __future__ import annotations + +from collections.abc import Generator +from typing import Any + +import pytest +from fastapi import FastAPI +from gt_fastapi import initialize_gt +from gt_i18n.i18n_manager._i18n_manager import I18nManager + + +@pytest.fixture(autouse=True) +def _reset_singleton() -> Generator[None, None, None]: + import gt_i18n.i18n_manager._singleton as mod + + old = mod._manager + yield + mod._manager = old + + +class _FakeGT: + """Minimal GT test double returning an echoing translation.""" + + async def translate_many( + self, + sources: dict[str, Any], + options: dict[str, Any] | str, + timeout: int | None = None, + ) -> dict[str, Any]: + if isinstance(options, str): + options = {"target_locale": options} + locale = options.get("target_locale", options.get("targetLocale", "?")) + return { + h: {"success": True, "translation": f"[{locale}]{entry['source']}", "locale": locale} + for h, entry in sources.items() + } + + +# --------------------------------------------------------------------------- +# test_fastapi_initialize_gt_forwards_lifecycle_kwarg +# +# Example: +# events = [] +# manager = initialize_gt( +# app, +# default_locale="en", locales=["en", "es"], +# load_translations=lambda loc: {}, +# lifecycle={"on_translations_cache_miss": lambda **p: events.append(p)}, +# ) +# # After a runtime miss, `events` has one entry. 
+# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_fastapi_initialize_gt_forwards_lifecycle_kwarg(monkeypatch: pytest.MonkeyPatch) -> None: + """``initialize_gt`` accepts ``lifecycle`` and wires it through to the I18nManager.""" + gt = _FakeGT() + monkeypatch.setattr(I18nManager, "get_gt_instance", lambda self: gt) + + events: list[dict[str, Any]] = [] + + app = FastAPI() + manager = initialize_gt( + app, + default_locale="en", + locales=["en", "es"], + load_translations=lambda locale: {}, + eager_loading=False, + lifecycle={ + "on_translations_cache_miss": lambda *, locale, hash, value: events.append( + {"locale": locale, "hash": hash, "value": value} + ), + }, + ) + + manager.set_locale("es") + await manager.lookup_translation_with_fallback("Hello", _format="STRING") + + assert len(events) == 1, "on_translations_cache_miss should have fired once via the forwarded lifecycle dict" + assert events[0]["locale"] == "es" + assert events[0]["value"] == "[es]Hello" diff --git a/packages/gt-flask/src/gt_flask/__init__.py b/packages/gt-flask/src/gt_flask/__init__.py index f4d9202..d8641a2 100644 --- a/packages/gt-flask/src/gt_flask/__init__.py +++ b/packages/gt-flask/src/gt_flask/__init__.py @@ -10,6 +10,7 @@ get_locales, get_version_id, t, + tx, ) from gt_flask._setup import initialize_gt @@ -17,6 +18,7 @@ __all__ = [ "initialize_gt", "t", + "tx", "declare_var", "derive", "declare_static", diff --git a/packages/gt-flask/src/gt_flask/_setup.py b/packages/gt-flask/src/gt_flask/_setup.py index 004ee73..d29d5cb 100644 --- a/packages/gt-flask/src/gt_flask/_setup.py +++ b/packages/gt-flask/src/gt_flask/_setup.py @@ -9,6 +9,7 @@ from generaltranslation import CustomMapping from generaltranslation._settings import LIBRARY_DEFAULT_LOCALE from gt_i18n import I18nManager, set_i18n_manager +from gt_i18n.i18n_manager._lifecycle import LifecycleCallbacks from gt_i18n.internal import GTConfig, 
_detect_from_accept_language, load_gt_config @@ -26,6 +27,12 @@ def initialize_gt( eager_loading: bool = True, config_path: str | None = None, load_config: Callable[[str | None], GTConfig] | None = None, + lifecycle: LifecycleCallbacks | None = None, + batch_size: int = 25, + batch_interval_ms: int = 50, + max_concurrent_requests: int = 100, + translation_timeout_ms: int = 12_000, + cache_expiry_time: int = 60_000, ) -> I18nManager: """Initialize General Translation for a Flask app. @@ -68,6 +75,12 @@ def initialize_gt( cache_url=resolved_cache_url, load_translations=load_translations, version_id=resolved_version_id, + lifecycle=lifecycle, + batch_size=batch_size, + batch_interval_ms=batch_interval_ms, + max_concurrent_requests=max_concurrent_requests, + translation_timeout_ms=translation_timeout_ms, + cache_expiry_time=cache_expiry_time, ) set_i18n_manager(manager) diff --git a/packages/gt-flask/tests/test_flask_lifecycle_forwarding.py b/packages/gt-flask/tests/test_flask_lifecycle_forwarding.py new file mode 100644 index 0000000..7ebb316 --- /dev/null +++ b/packages/gt-flask/tests/test_flask_lifecycle_forwarding.py @@ -0,0 +1,91 @@ +"""Golden-standard test: ``initialize_gt()`` forwards the ``lifecycle`` kwarg. + +Sibling to the identically-named test in gt-fastapi. Both frameworks are thin +pass-throughs to ``I18nManager(...)`` and both must accept and forward the +``lifecycle`` kwarg so users can observe cache behavior without bypassing +``initialize_gt``. + +This test should FAIL until PR #1207 is ported. 
+""" + +from __future__ import annotations + +from collections.abc import Generator +from typing import Any + +import pytest +from flask import Flask +from gt_flask import initialize_gt +from gt_i18n.i18n_manager._i18n_manager import I18nManager + + +@pytest.fixture(autouse=True) +def _reset_singleton() -> Generator[None, None, None]: + import gt_i18n.i18n_manager._singleton as mod + + old = mod._manager + yield + mod._manager = old + + +class _FakeGT: + """Minimal GT test double returning an echoing translation.""" + + async def translate_many( + self, + sources: dict[str, Any], + options: dict[str, Any] | str, + timeout: int | None = None, + ) -> dict[str, Any]: + if isinstance(options, str): + options = {"target_locale": options} + locale = options.get("target_locale", options.get("targetLocale", "?")) + return { + h: {"success": True, "translation": f"[{locale}]{entry['source']}", "locale": locale} + for h, entry in sources.items() + } + + +# --------------------------------------------------------------------------- +# test_flask_initialize_gt_forwards_lifecycle_kwarg +# +# Example: +# events = [] +# manager = initialize_gt( +# app, +# default_locale="en", locales=["en", "es"], +# load_translations=lambda loc: {}, +# lifecycle={"on_translations_cache_miss": lambda **p: events.append(p)}, +# ) +# # After a runtime miss, `events` has one entry. 
+# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_flask_initialize_gt_forwards_lifecycle_kwarg(monkeypatch: pytest.MonkeyPatch) -> None: + """``initialize_gt`` accepts ``lifecycle`` and wires it through to the I18nManager.""" + gt = _FakeGT() + monkeypatch.setattr(I18nManager, "get_gt_instance", lambda self: gt) + + events: list[dict[str, Any]] = [] + + app = Flask(__name__) + manager = initialize_gt( + app, + default_locale="en", + locales=["en", "es"], + load_translations=lambda locale: {}, + eager_loading=False, + lifecycle={ + "on_translations_cache_miss": lambda *, locale, hash, value: events.append( + {"locale": locale, "hash": hash, "value": value} + ), + }, + ) + + manager.set_locale("es") + await manager.lookup_translation_with_fallback("Hello", _format="STRING") + + assert len(events) == 1, "on_translations_cache_miss should have fired once via the forwarded lifecycle dict" + assert events[0]["locale"] == "es" + assert events[0]["value"] == "[es]Hello" diff --git a/packages/gt-i18n/src/gt_i18n/__init__.py b/packages/gt-i18n/src/gt_i18n/__init__.py index de617ce..60c20d3 100644 --- a/packages/gt-i18n/src/gt_i18n/__init__.py +++ b/packages/gt-i18n/src/gt_i18n/__init__.py @@ -26,6 +26,7 @@ msg, t, t_fallback, + tx, ) __all__ = [ @@ -46,6 +47,7 @@ "msg", "t", "t_fallback", + "tx", # Locale helpers "get_locale", "get_locales", diff --git a/packages/gt-i18n/src/gt_i18n/i18n_manager/_cache.py b/packages/gt-i18n/src/gt_i18n/i18n_manager/_cache.py new file mode 100644 index 0000000..6fc915b --- /dev/null +++ b/packages/gt-i18n/src/gt_i18n/i18n_manager/_cache.py @@ -0,0 +1,72 @@ +"""Abstract two-phase cache with in-flight request deduplication. + +Port of the JS ``Cache`` base +introduced in gt PR #1207. 
Provides the shared dedup primitive used by +``LocalesCache`` and ``TranslationsCache``: if two callers miss() for the +same cache key while the first is still in flight, they share one +``_fallback()`` invocation rather than firing two concurrent requests. +""" + +from __future__ import annotations + +import asyncio +from abc import ABC, abstractmethod +from typing import Any, Generic, TypeVar + +_InputKey = TypeVar("_InputKey") +_CacheKey = TypeVar("_CacheKey") +_CacheValue = TypeVar("_CacheValue") +_OutputValue = TypeVar("_OutputValue") + + +class Cache(ABC, Generic[_InputKey, _CacheKey, _CacheValue, _OutputValue]): + """Abstract cache with synchronous ``get()`` and async dedup-fallback ``_miss_cache()``. + + Subclasses implement: + * ``_gen_key(input)`` — map the external key to the internal cache key. + * ``_fallback(input)`` — async work to populate the cache on a miss. + * ``get(input)`` — synchronous lookup (never blocks, never fires network). + * ``miss(input)`` — the public async entry point, orchestrating get/fallback. + + The base provides ``_miss_cache(input)``: a dedup wrapper around ``_fallback`` + keyed on ``CacheKey``. Concurrent callers with the same computed key share + one in-flight future. Exceptions propagate to all awaiters but the pending + entry is cleared so subsequent calls can retry cleanly. + """ + + def __init__(self) -> None: + self._cache: dict[Any, _CacheValue] = {} + self._pending: dict[Any, asyncio.Future[_CacheValue | None]] = {} + + @abstractmethod + def _gen_key(self, input_key: _InputKey, /) -> _CacheKey: + """Map the external input key to the internal cache key.""" + + @abstractmethod + async def _fallback(self, input_key: _InputKey, /) -> _CacheValue | None: + """Populate the cache on miss. Runs at most once per in-flight window per key.""" + + @abstractmethod + def get(self, input_key: _InputKey, /) -> _OutputValue | None: + """Synchronous accessor. 
Returns ``None`` on miss; never triggers work.""" + + async def _miss_cache(self, input_key: _InputKey, /) -> _CacheValue | None: + """Dedup concurrent misses by cache key, invoking ``_fallback`` exactly once.""" + key = self._gen_key(input_key) + pending = self._pending.get(key) + if pending is not None: + return await pending + + loop = asyncio.get_running_loop() + future: asyncio.Future[_CacheValue | None] = loop.create_future() + self._pending[key] = future + try: + value = await self._fallback(input_key) + future.set_result(value) + except BaseException as exc: # noqa: BLE001 — propagate to all awaiters via the future + future.set_exception(exc) + finally: + self._pending.pop(key, None) + + # Always consume the future — returns the result or re-raises its exception. + return await future diff --git a/packages/gt-i18n/src/gt_i18n/i18n_manager/_i18n_manager.py b/packages/gt-i18n/src/gt_i18n/i18n_manager/_i18n_manager.py index ea6a001..3fecddf 100644 --- a/packages/gt-i18n/src/gt_i18n/i18n_manager/_i18n_manager.py +++ b/packages/gt-i18n/src/gt_i18n/i18n_manager/_i18n_manager.py @@ -1,8 +1,18 @@ -"""I18nManager — central orchestrator for i18n operations.""" +"""I18nManager — central orchestrator for i18n operations. + +Post gt PR #1207: the manager owns a ``LocalesCache`` (outer) whose entries +each own a ``TranslationsCache`` (inner, hash-keyed, batched). All lookup +traffic flows through this two-level cache. Runtime translate calls go to +``GT.translate_many`` via a ``create_translate_many_factory`` that defers GT +construction to ``self.get_gt_instance()`` so tests can patch it cleanly. 
+""" from __future__ import annotations -from typing import TYPE_CHECKING +import asyncio +import warnings +from collections.abc import Awaitable, Callable +from typing import TYPE_CHECKING, Any from generaltranslation import CustomMapping from generaltranslation._gt import GT @@ -10,27 +20,20 @@ from generaltranslation.locales import requires_translation from gt_i18n.i18n_manager._context_var_adapter import ContextVarStorageAdapter +from gt_i18n.i18n_manager._lifecycle import LifecycleCallbacks +from gt_i18n.i18n_manager._locales_cache import LocalesCache from gt_i18n.i18n_manager._storage_adapter import StorageAdapter -from gt_i18n.i18n_manager._translations_manager import TranslationsManager +from gt_i18n.i18n_manager._translate_many_factory import ( + DEFAULT_TRANSLATION_TIMEOUT_MS, + create_translate_many_factory, +) if TYPE_CHECKING: from gt_i18n.i18n_manager._remote_loader import TranslationsLoader class I18nManager: - """Central orchestrator for i18n operations. - - Args: - default_locale: The source/default locale. - locales: List of target locales the application supports. - project_id: GT project ID (used for CDN loader). - cache_url: CDN base URL override. - store_adapter: Custom storage adapter. Defaults to - :class:`ContextVarStorageAdapter`. - load_translations: Custom translation loader. Overrides the - remote CDN loader when provided. - cache_expiry_time: Cache expiry in milliseconds. 
-    """
+    """Central orchestrator for i18n operations.
+
+    Args:
+        default_locale: The source/default locale.
+        locales: List of target locales the application supports.
+        project_id: GT project ID (used for the CDN loader).
+        cache_url: CDN base URL override.
+        store_adapter: Custom storage adapter. Defaults to
+            :class:`ContextVarStorageAdapter`.
+        load_translations: Custom translation loader. Overrides the
+            remote CDN loader when provided.
+        cache_expiry_time: Locale cache TTL in milliseconds.
+        version_id: Version ID for the current source, if any.
+        lifecycle: Optional cache hit/miss observability callbacks.
+        batch_size: Maximum entries per runtime ``translate_many`` call.
+        batch_interval_ms: Debounce interval (ms) before a batch is sent.
+        max_concurrent_requests: Cap on simultaneous runtime translate calls.
+        translation_timeout_ms: Per-request translate timeout in milliseconds.
+    """

     def __init__(
         self,
@@ -44,6 +47,11 @@ def __init__(
         load_translations: TranslationsLoader | None = None,
         cache_expiry_time: int = 60_000,
         version_id: str | None = None,
+        lifecycle: LifecycleCallbacks | None = None,
+        batch_size: int = 25,
+        batch_interval_ms: int = 50,
+        max_concurrent_requests: int = 100,
+        translation_timeout_ms: int = DEFAULT_TRANSLATION_TIMEOUT_MS,
     ) -> None:
         self._version_id = version_id
         self._default_locale = default_locale
@@ -56,9 +64,9 @@ def __init__(
         # Storage
         self._store: StorageAdapter = store_adapter or ContextVarStorageAdapter()

-        # Translation loading
+        # Translation loader — user-provided, CDN (if project_id), or no-op.
         if load_translations is not None:
-            loader = load_translations
+            loader: TranslationsLoader = load_translations
         elif project_id:
             from gt_i18n.i18n_manager._remote_loader import (
                 create_remote_translation_loader,
             )
@@ -67,8 +75,24 @@ def __init__(
             loader = create_remote_translation_loader(project_id, cache_url or "")
         else:
             loader = lambda locale: {}  # noqa: E731
+        self._load_translations: TranslationsLoader = loader
+
+        # Two-level cache hierarchy.
+ self._lifecycle: LifecycleCallbacks = lifecycle or {} + self._locales_cache = LocalesCache( + load_translations=loader, + create_translate_many=create_translate_many_factory( + self.get_gt_instance, + timeout_ms=translation_timeout_ms, + ), + ttl_ms=cache_expiry_time, + batch_size=batch_size, + batch_interval_ms=batch_interval_ms, + max_concurrent_requests=max_concurrent_requests, + lifecycle=self._lifecycle, + ) - self._translations = TranslationsManager(loader, cache_expiry_time=cache_expiry_time) + # -- basic properties -------------------------------------------------- @property def default_locale(self) -> str: @@ -85,23 +109,19 @@ def get_gt_instance(self) -> GT: ) def get_version_id(self) -> str | None: - """Get the version ID for the current source, if set.""" return self._version_id def get_locales(self) -> list[str]: return list(self._locales) def get_locale(self) -> str: - """Read the current request locale from the storage adapter.""" locale = self._store.get_item("locale") return locale or self._default_locale def set_locale(self, locale: str) -> None: - """Write the current request locale to the storage adapter.""" self._store.set_item("locale", locale) def requires_translation(self, locale: str | None = None) -> bool: - """Check if the given locale needs translation from the default.""" target = locale or self.get_locale() return requires_translation( self._default_locale, @@ -109,17 +129,108 @@ def requires_translation(self, locale: str | None = None) -> bool: approved_locales=self._locales or None, ) - async def get_translations(self, locale: str | None = None) -> dict[str, str]: - """Get translations for a locale (async, loads if needed).""" + # -- translation lookup (new API) ------------------------------------- + + def lookup_translation(self, message: str, **options: Any) -> str | None: + """Sync dict-cache lookup. Never fires a network call. 
Returns None on miss.""" + locale = options.pop("_locale", None) or self.get_locale() + if not self.requires_translation(locale): + return None + tc = self._locales_cache.get(locale) + if tc is None: + return None + return tc.get({"message": message, "options": options}) + + async def lookup_translation_with_fallback(self, message: str, **options: Any) -> str | None: + """Cache-hit sync; on miss, fetch via runtime translate_many.""" + locale = options.pop("_locale", None) or self.get_locale() + if not self.requires_translation(locale): + return None + tc = self._locales_cache.get(locale) + if tc is None: + tc = await self._locales_cache.miss(locale) + return await tc.miss({"message": message, "options": options}) + + async def load_translations(self, locale: str | None = None) -> dict[str, str]: + """Load (or return cached) translations dict for the locale.""" target = locale or self.get_locale() - return await self._translations.get_translations(target) + await self._locales_cache.miss(target) + entry = self._locales_cache._cache.get(target) + return dict(entry.translations) if entry else {} + + async def get_lookup_translation( + self, + locale: str | None = None, + prefetch: list[dict[str, Any]] | None = None, + ) -> Callable[[str, dict[str, Any]], str | None]: + """Prefetch entries for ``locale``, then return a sync lookup callable.""" + target = locale or self.get_locale() + tc = self._locales_cache.get(target) + if tc is None: + tc = await self._locales_cache.miss(target) + if prefetch: + await asyncio.gather(*(tc.miss(entry) for entry in prefetch)) + + def _lookup(message: str, options: dict[str, Any]) -> str | None: + return tc.get({"message": message, "options": options}) + + return _lookup + + # -- translation lookup (legacy, deprecated) -------------------------- + + async def get_translations(self, locale: str | None = None) -> dict[str, str]: + """DEPRECATED: use ``load_translations`` instead.""" + warnings.warn( + "I18nManager.get_translations is 
deprecated; use load_translations instead.", + DeprecationWarning, + stacklevel=2, + ) + return await self.load_translations(locale) def get_translations_sync(self, locale: str | None = None) -> dict[str, str]: - """Get cached translations (sync, returns empty if not loaded).""" + """Return cached translations dict (sync, empty if not yet loaded).""" target = locale or self.get_locale() - return self._translations.get_translations_sync(target) + entry = self._locales_cache._cache.get(target) + if entry is None: + return {} + # Respect TTL — expired entries shouldn't leak stale data. + import time as _time + + if entry.expires_at < _time.monotonic(): + return {} + return dict(entry.translations) async def load_all_translations(self) -> None: - """Eagerly load translations for all configured locales.""" + """Eagerly load translations for all configured locales concurrently.""" if self._locales: - await self._translations.load_all(self._locales) + await asyncio.gather(*(self._locales_cache.miss(loc) for loc in self._locales)) + + async def get_translation_resolver( + self, + locale: str | None = None, + ) -> Callable[[str, dict[str, Any]], str | None]: + """DEPRECATED: use ``get_lookup_translation`` instead.""" + warnings.warn( + "I18nManager.get_translation_resolver is deprecated; use get_lookup_translation instead.", + DeprecationWarning, + stacklevel=2, + ) + return await self.get_lookup_translation(locale) + + def resolve_translation_sync(self, message: str, options: dict[str, Any]) -> str | None: + """DEPRECATED: use ``lookup_translation`` instead.""" + warnings.warn( + "I18nManager.resolve_translation_sync is deprecated; use lookup_translation instead.", + DeprecationWarning, + stacklevel=2, + ) + return self.lookup_translation(message, **options) + + def get_translation_loader(self) -> Callable[[str], dict[str, str] | Awaitable[dict[str, str]]]: + """DEPRECATED: pass ``load_translations`` directly to the constructor instead.""" + warnings.warn( + 
"I18nManager.get_translation_loader is deprecated; pass load_translations directly.", + DeprecationWarning, + stacklevel=2, + ) + return self._load_translations diff --git a/packages/gt-i18n/src/gt_i18n/i18n_manager/_lifecycle.py b/packages/gt-i18n/src/gt_i18n/i18n_manager/_lifecycle.py new file mode 100644 index 0000000..6570faf --- /dev/null +++ b/packages/gt-i18n/src/gt_i18n/i18n_manager/_lifecycle.py @@ -0,0 +1,41 @@ +"""Lifecycle callback TypedDict for cache hits / misses. + +Mirrors the JS ``LifecycleCallbacks`` type introduced in gt PR #1207. All +callbacks are optional; consumers can observe any subset of events. + +Callbacks are invoked with keyword arguments (Pythonic) rather than a single +params object. Example: + + def on_miss(*, locale: str, hash: str, value: str) -> None: + log.info("runtime fetch for %s/%s", locale, hash) + + manager = I18nManager( + ..., + lifecycle={"on_translations_cache_miss": on_miss}, + ) +""" + +from __future__ import annotations + +from collections.abc import Callable +from typing import Any, TypedDict + + +class LifecycleCallbacks(TypedDict, total=False): + """Optional hooks for observing translation cache behavior. + + Keys: + on_locales_cache_hit: Fires when a locale's translations dict is read + from the outer LocalesCache. Kwargs: ``locale``, ``value``. + on_locales_cache_miss: Fires after the user loader completes for a + locale. Kwargs: ``locale``, ``value``. + on_translations_cache_hit: Fires when a translation is read from the + inner TranslationsCache. Kwargs: ``locale``, ``hash``, ``value``. + on_translations_cache_miss: Fires after a runtime translate fetch + populates the cache. Kwargs: ``locale``, ``hash``, ``value``. 
+ """ + + on_locales_cache_hit: Callable[..., Any] + on_locales_cache_miss: Callable[..., Any] + on_translations_cache_hit: Callable[..., Any] + on_translations_cache_miss: Callable[..., Any] diff --git a/packages/gt-i18n/src/gt_i18n/i18n_manager/_locales_cache.py b/packages/gt-i18n/src/gt_i18n/i18n_manager/_locales_cache.py new file mode 100644 index 0000000..6fb2f3a --- /dev/null +++ b/packages/gt-i18n/src/gt_i18n/i18n_manager/_locales_cache.py @@ -0,0 +1,113 @@ +"""Outer cache: locale → TranslationsCache with TTL. + +Port of the JS ``LocalesCache`` introduced in gt PR #1207. Wraps the user's +``load_translations`` loader and builds a per-locale ``TranslationsCache`` +on miss. Enforces a TTL on each locale entry; concurrent misses for the same +locale dedup via the ``Cache`` base. + +Loader failures propagate to the caller (and do NOT poison the cache — the +next miss retries). +""" + +from __future__ import annotations + +import inspect +import time +from collections.abc import Awaitable, Callable +from dataclasses import dataclass +from typing import Any, cast + +from gt_i18n.i18n_manager._cache import Cache +from gt_i18n.i18n_manager._lifecycle import LifecycleCallbacks +from gt_i18n.i18n_manager._translations_cache import TranslationsCache + + +@dataclass +class _LocaleEntry: + """One cached locale: its TranslationsCache, raw translations snapshot, and expiry.""" + + translations_cache: TranslationsCache + translations: dict[str, str] # snapshot for hit/miss callback payloads + expires_at: float # time.monotonic() seconds + + +class LocalesCache(Cache[str, str, _LocaleEntry, TranslationsCache]): + """Cache keyed on locale. 
``get()`` returns the locale's TranslationsCache if fresh.""" + + def __init__( + self, + *, + load_translations: Callable[[str], dict[str, str] | Awaitable[dict[str, str]]], + create_translate_many: Callable[[str], Callable[[dict[str, Any]], Awaitable[dict[str, Any]]]], + ttl_ms: int = 60_000, + batch_size: int = 25, + batch_interval_ms: int = 50, + max_concurrent_requests: int = 100, + lifecycle: LifecycleCallbacks | None = None, + ) -> None: + super().__init__() + self._load_translations = load_translations + self._create_translate_many = create_translate_many + self._ttl_s = ttl_ms / 1000 + self._batch_size = batch_size + self._batch_interval_ms = batch_interval_ms + self._max_concurrent_requests = max_concurrent_requests + self._lifecycle: LifecycleCallbacks = lifecycle or {} + + # -- Cache API -------------------------------------------------------- + + def _gen_key(self, locale: str) -> str: + return locale + + def get(self, locale: str) -> TranslationsCache | None: + """Return the cached TranslationsCache for ``locale`` if present and fresh.""" + entry = self._cache.get(locale) + if entry is None: + return None + if entry.expires_at < time.monotonic(): + # Expired — evict and treat as miss. 
+ del self._cache[locale] + return None + self._fire("on_locales_cache_hit", locale=locale, value=dict(entry.translations)) + return entry.translations_cache + + async def miss(self, locale: str) -> TranslationsCache: + """Fast-path to ``get()``; otherwise dedup-protected loader fallback.""" + cached = self.get(locale) + if cached is not None: + return cached + entry = await self._miss_cache(locale) + assert entry is not None, "_fallback returned None, which should not happen" + return entry.translations_cache + + async def _fallback(self, locale: str) -> _LocaleEntry: + """Invoke the user loader, build a new TranslationsCache, cache the entry.""" + maybe = self._load_translations(locale) + translations: dict[str, str] = ( + await maybe if inspect.isawaitable(maybe) else maybe # type: ignore[assignment] + ) + + tc = TranslationsCache( + locale=locale, + translate_many=self._create_translate_many(locale), + initial=translations, + batch_size=self._batch_size, + batch_interval_ms=self._batch_interval_ms, + max_concurrent_requests=self._max_concurrent_requests, + lifecycle=self._lifecycle, + ) + entry = _LocaleEntry( + translations_cache=tc, + translations=dict(translations), + expires_at=time.monotonic() + self._ttl_s, + ) + self._cache[locale] = entry + self._fire("on_locales_cache_miss", locale=locale, value=dict(translations)) + return entry + + # -- helpers ---------------------------------------------------------- + + def _fire(self, name: str, **kwargs: Any) -> None: + cb = cast(Callable[..., Any] | None, self._lifecycle.get(name)) # type: ignore[misc] + if cb is not None: + cb(**kwargs) diff --git a/packages/gt-i18n/src/gt_i18n/i18n_manager/_translate_many_factory.py b/packages/gt-i18n/src/gt_i18n/i18n_manager/_translate_many_factory.py new file mode 100644 index 0000000..2975eb9 --- /dev/null +++ b/packages/gt-i18n/src/gt_i18n/i18n_manager/_translate_many_factory.py @@ -0,0 +1,50 @@ +"""Factory that binds a ``GT`` instance to a per-locale ``translate_many`` 
callable.
+
+Mirrors the JS ``createTranslateMany(gtInstance, timeout)`` →
+``(locale) => (sources) => gt.translateMany(...)`` pattern from gt PR #1207.
+``gt_factory`` is re-invoked on every call rather than captured once; that is
+what lets tests monkey-patch ``I18nManager.get_gt_instance`` and still have
+the cache route through the fake GT.
+"""
+
+from __future__ import annotations
+
+from collections.abc import Awaitable, Callable
+from typing import TYPE_CHECKING, Any
+
+if TYPE_CHECKING:
+    from generaltranslation._gt import GT
+
+DEFAULT_TRANSLATION_TIMEOUT_MS = 12_000
+
+
+def create_translate_many_factory(
+    gt_factory: Callable[[], GT],
+    timeout_ms: int = DEFAULT_TRANSLATION_TIMEOUT_MS,
+) -> Callable[[str], Callable[[dict[str, Any]], Awaitable[dict[str, Any]]]]:
+    """Build a locale-binding factory for runtime ``translate_many`` calls.
+
+    Example (``"<hash>"`` stands in for a real source hash key):
+        translate_many = create_translate_many_factory(manager.get_gt_instance)
+        es_call = translate_many("es")
+        response = await es_call({"<hash>": {"source": "Hello", "metadata": {...}}})
+
+    Args:
+        gt_factory: Zero-arg callable returning a GT instance. Evaluated on each
+            call so patched GT instances (e.g. in tests) are respected.
+        timeout_ms: Per-request timeout in milliseconds. Defaults to 12s.
+
+    Returns:
+        A callable that, given a target locale, returns an async
+        ``(sources_dict) -> response_dict`` function.
+ """ + + def for_locale(locale: str) -> Callable[[dict[str, Any]], Awaitable[dict[str, Any]]]: + async def _call(sources: dict[str, Any]) -> dict[str, Any]: + gt = gt_factory() + result = await gt.translate_many(sources, {"target_locale": locale}, timeout_ms) + assert isinstance(result, dict), "translate_many must return a dict when given a dict of sources" + return result + + return _call + + return for_locale diff --git a/packages/gt-i18n/src/gt_i18n/i18n_manager/_translations_cache.py b/packages/gt-i18n/src/gt_i18n/i18n_manager/_translations_cache.py new file mode 100644 index 0000000..071aeb2 --- /dev/null +++ b/packages/gt-i18n/src/gt_i18n/i18n_manager/_translations_cache.py @@ -0,0 +1,202 @@ +"""Inner cache: hash → translation with batched runtime fetch. + +Port of the JS ``TranslationsCache`` introduced in gt PR #1207. One instance +per locale. Batches concurrent misses into ``translate_many`` calls, dedups +in-flight requests for the same hash, and caps concurrent HTTP calls. + +Usage (typically instantiated by ``LocalesCache``): + + cache = TranslationsCache( + locale="es", + translate_many=lambda sources: gt.translate_many(sources, {"target_locale": "es"}), + initial={"h1": "Hola"}, # translations pre-loaded for the locale + ) + cached = cache.get({"message": "Hello", "options": {"_format": "STRING"}}) # sync + translation = await cache.miss({"message": "Hi", "options": {"_format": "STRING"}}) # async +""" + +from __future__ import annotations + +import asyncio +from collections.abc import Awaitable, Callable +from typing import Any, cast + +from generaltranslation._id._hash import hash_source +from generaltranslation.static._index_vars import index_vars + +from gt_i18n.i18n_manager._lifecycle import LifecycleCallbacks + + +def _compute_hash(message: str, options: dict[str, Any]) -> str: + """Compute the cache hash for a (message, options) lookup key. 
+ + Mirrors ``hash_message`` semantics but inlined to avoid a circular import + through ``gt_i18n.translation_functions`` (whose ``__init__.py`` imports + ``_t`` which depends on the I18nManager this cache is owned by). + """ + fmt = options.get("_format", "ICU") + source = index_vars(message) if fmt == "ICU" else message + return hash_source( + source, + context=options.get("_context"), + id=options.get("_id"), + max_chars=options.get("_max_chars"), + data_format=fmt, + ) + + +def _build_metadata(hash_: str, options: dict[str, Any]) -> dict[str, Any]: + """Build the camelCase metadata block for the translate_many wire body.""" + metadata: dict[str, Any] = { + "hash": hash_, + "dataFormat": options.get("_format", "ICU"), + } + if options.get("_context") is not None: + metadata["context"] = options["_context"] + if options.get("_id") is not None: + metadata["id"] = options["_id"] + if options.get("_max_chars") is not None: + metadata["maxChars"] = options["_max_chars"] + return metadata + + +class TranslationsCache: + """Hash-keyed cache with batched runtime translate fetch. + + Constructor kwargs: + locale: The target locale this cache serves. + translate_many: Async callable ``(sources_dict) -> response_dict`` pre-bound + to this locale (see ``_translate_many_factory``). + initial: Optional seed translations loaded from the user's loader. + batch_size: Maximum entries per ``translate_many`` call. Default 25. + batch_interval_ms: Debounce interval before draining the queue. Default 50ms. + max_concurrent_requests: Cap on simultaneous in-flight ``translate_many`` + calls. Default 100. + lifecycle: Optional observability callbacks. 
+    """
+
+    def __init__(
+        self,
+        *,
+        locale: str,
+        translate_many: Callable[[dict[str, Any]], Awaitable[dict[str, Any]]],
+        initial: dict[str, str] | None = None,
+        batch_size: int = 25,
+        batch_interval_ms: int = 50,
+        max_concurrent_requests: int = 100,
+        lifecycle: LifecycleCallbacks | None = None,
+    ) -> None:
+        self._locale = locale
+        self._translate_many = translate_many
+        self._cache: dict[str, str] = dict(initial or {})
+        self._pending: dict[str, asyncio.Future[str | None]] = {}
+        self._queue: list[tuple[str, str, dict[str, Any]]] = []  # [(hash, message, options)]
+        self._batch_size = batch_size
+        self._batch_interval_ms = batch_interval_ms
+        self._max_concurrent_requests = max_concurrent_requests
+        self._lifecycle: LifecycleCallbacks = lifecycle or {}
+        self._batch_timer: asyncio.Task[None] | None = None
+        self._active_requests = 0
+        # Strong refs to in-flight send tasks: the event loop keeps only weak
+        # references to tasks, so unreferenced tasks can be GC'd mid-flight.
+        self._send_tasks: set[asyncio.Task[None]] = set()
+
+    # -- public ------------------------------------------------------------
+
+    def get(self, key: dict[str, Any]) -> str | None:
+        """Synchronous lookup. Fires ``on_translations_cache_hit`` if cached."""
+        message = key["message"]
+        options = key.get("options", {})
+        hash_ = _compute_hash(message, options)
+        value = self._cache.get(hash_)
+        if value is not None:
+            self._fire("on_translations_cache_hit", locale=self._locale, hash=hash_, value=value)
+            return value
+        return None
+
+    async def miss(self, key: dict[str, Any]) -> str | None:
+        """Return cached value, dedup in-flight, or enqueue a runtime fetch."""
+        message = key["message"]
+        options = key.get("options", {})
+        hash_ = _compute_hash(message, options)
+
+        # Cache hit
+        if hash_ in self._cache:
+            value = self._cache[hash_]
+            self._fire("on_translations_cache_hit", locale=self._locale, hash=hash_, value=value)
+            return value
+
+        # Dedup in-flight
+        if hash_ in self._pending:
+            return await self._pending[hash_]
+
+        # New miss — enqueue + await
+        loop = asyncio.get_running_loop()
+        future: asyncio.Future[str | None] = loop.create_future()
+        self._pending[hash_] = future
+        self._queue.append((hash_, message, options))
+
+        if len(self._queue) >= self._batch_size:
+            self._drain_queue()
+        else:
+            self._schedule_batch()
+
+        return await future
+
+    # -- batching ----------------------------------------------------------
+
+    def _schedule_batch(self) -> None:
+        """Start a batch timer if none is currently running."""
+        if self._batch_timer is None or self._batch_timer.done():
+            self._batch_timer = asyncio.create_task(self._timer_body())
+
+    async def _timer_body(self) -> None:
+        """Sleep for the batch interval, then drain whatever's queued."""
+        await asyncio.sleep(self._batch_interval_ms / 1000)
+        self._drain_queue()
+
+    def _drain_queue(self) -> None:
+        """Split the queue into batches and spawn send tasks (respecting concurrency cap)."""
+        while self._queue and self._active_requests < self._max_concurrent_requests:
+            batch = self._queue[: self._batch_size]
+            del self._queue[: self._batch_size]
+            self._active_requests += 1
+            task = asyncio.create_task(self._send_batch(batch))
+            # Hold a strong reference until the batch completes; otherwise the
+            # task may be garbage-collected and its awaiters hang forever.
+            self._send_tasks.add(task)
+            task.add_done_callback(self._send_tasks.discard)
+
+    async def _send_batch(self, batch: list[tuple[str, str, dict[str, Any]]]) -> None:
+        """Build the request, call translate_many, resolve per-entry futures."""
+        try:
+            sources = {
+                hash_: {"source": message, "metadata": _build_metadata(hash_, options)}
+                for hash_, message, options in batch
+            }
+            try:
+                response = await self._translate_many(sources)
+            except BaseException as exc:  # noqa: BLE001 — propagate to all awaiters
+                for hash_, _, _ in batch:
+                    fut = self._pending.pop(hash_, None)
+                    if fut is not None and not fut.done():
+                        fut.set_exception(exc)
+                return
+
+            for hash_, _message, _options in batch:
+                result: dict[str, Any] = response.get(hash_) or {"success": False}
+                fut = self._pending.pop(hash_, None)
+                if result.get("success"):
+                    value = cast(str, result["translation"])
+                    self._cache[hash_] = value
+                    self._fire("on_translations_cache_miss", locale=self._locale, hash=hash_, value=value)
+                    if fut is not None and not fut.done():
+                        fut.set_result(value)
+                else:
+                    # Failure: do NOT cache; resolve future with None so callers fall back.
+                    if fut is not None and not fut.done():
+                        fut.set_result(None)
+        finally:
+            self._active_requests -= 1
+            # Admit any waiting batches now that a slot has opened up.
+ self._drain_queue() + + # -- lifecycle helpers ------------------------------------------------- + + def _fire(self, name: str, **kwargs: Any) -> None: + cb = cast(Callable[..., Any] | None, self._lifecycle.get(name)) # type: ignore[misc] + if cb is not None: + cb(**kwargs) diff --git a/packages/gt-i18n/src/gt_i18n/i18n_manager/_translations_manager.py b/packages/gt-i18n/src/gt_i18n/i18n_manager/_translations_manager.py deleted file mode 100644 index 7b687cd..0000000 --- a/packages/gt-i18n/src/gt_i18n/i18n_manager/_translations_manager.py +++ /dev/null @@ -1,75 +0,0 @@ -"""Translation cache with configurable loader and expiry.""" - -from __future__ import annotations - -import asyncio -import inspect -import time -from collections.abc import Awaitable, Callable - -TranslationsLoader = Callable[[str], dict[str, str] | Awaitable[dict[str, str]]] - - -class _CacheEntry: - __slots__ = ("translations", "loaded_at") - - def __init__(self, translations: dict[str, str], loaded_at: float) -> None: - self.translations = translations - self.loaded_at = loaded_at - - -class TranslationsManager: - """Caches ``dict[str, str]`` (hash -> translated string) per locale.""" - - def __init__( - self, - loader: TranslationsLoader, - cache_expiry_time: int = 60_000, - ) -> None: - self._loader = loader - self._cache_expiry_ms = cache_expiry_time - self._cache: dict[str, _CacheEntry] = {} - self._loading: dict[str, asyncio.Task[dict[str, str]]] = {} - - def _is_expired(self, entry: _CacheEntry) -> bool: - elapsed_ms = (time.monotonic() - entry.loaded_at) * 1000 - return elapsed_ms > self._cache_expiry_ms - - async def _call_loader(self, locale: str) -> dict[str, str]: - result = self._loader(locale) - if inspect.isawaitable(result): - return await result - return result # type: ignore[return-value] - - async def get_translations(self, locale: str) -> dict[str, str]: - """Load translations for *locale*, using cache when valid.""" - entry = self._cache.get(locale) - if entry is not None 
and not self._is_expired(entry): - return entry.translations - - # Deduplicate concurrent loads - if locale in self._loading: - return await self._loading[locale] - - task = asyncio.ensure_future(self._call_loader(locale)) - self._loading[locale] = task - try: - translations = await task - except Exception: - translations = {} - finally: - self._loading.pop(locale, None) - - self._cache[locale] = _CacheEntry(translations, time.monotonic()) - return translations - - def get_translations_sync(self, locale: str) -> dict[str, str]: - """Return cached translations or empty dict (never blocks).""" - entry = self._cache.get(locale) - if entry is not None and not self._is_expired(entry): - return entry.translations - return {} - - async def load_all(self, locales: list[str]) -> None: - """Eagerly fetch translations for all *locales*.""" - await asyncio.gather(*(self.get_translations(loc) for loc in locales)) diff --git a/packages/gt-i18n/src/gt_i18n/translation_functions/__init__.py b/packages/gt-i18n/src/gt_i18n/translation_functions/__init__.py index 2708467..31ad036 100644 --- a/packages/gt-i18n/src/gt_i18n/translation_functions/__init__.py +++ b/packages/gt-i18n/src/gt_i18n/translation_functions/__init__.py @@ -7,6 +7,7 @@ from gt_i18n.translation_functions._interpolate import interpolate_message from gt_i18n.translation_functions._msg import msg from gt_i18n.translation_functions._t import t +from gt_i18n.translation_functions._tx import tx __all__ = [ "decode_msg", @@ -18,4 +19,5 @@ "msg", "t", "t_fallback", + "tx", ] diff --git a/packages/gt-i18n/src/gt_i18n/translation_functions/_hash_message.py b/packages/gt-i18n/src/gt_i18n/translation_functions/_hash_message.py index c53e1d0..dafe3dd 100644 --- a/packages/gt-i18n/src/gt_i18n/translation_functions/_hash_message.py +++ b/packages/gt-i18n/src/gt_i18n/translation_functions/_hash_message.py @@ -12,22 +12,30 @@ def hash_message( context: str | None = None, id: str | None = None, max_chars: int | None = None, + format: 
str = "ICU", ) -> str: - """Hash an ICU message after indexing its variables. + """Hash a message for translation lookup. + + For ICU-format messages, variables are first normalized via ``index_vars`` + so two templates differing only in variable names collide. For STRING and + I18NEXT formats, the raw source is hashed verbatim — the wire representation + is the user's literal template. Args: - message: The ICU MessageFormat source string. + message: The message source string. context: Optional context string for disambiguation. id: Optional explicit message ID. max_chars: Optional max character constraint. + format: Data format — ``"ICU"`` (default), ``"STRING"``, or ``"I18NEXT"``. Returns: A hex hash string. """ + source = index_vars(message) if format == "ICU" else message return hash_source( - index_vars(message), + source, context=context, id=id, max_chars=max_chars, - data_format="ICU", + data_format=format, ) diff --git a/packages/gt-i18n/src/gt_i18n/translation_functions/_t.py b/packages/gt-i18n/src/gt_i18n/translation_functions/_t.py index 405027a..154df74 100644 --- a/packages/gt-i18n/src/gt_i18n/translation_functions/_t.py +++ b/packages/gt-i18n/src/gt_i18n/translation_functions/_t.py @@ -3,7 +3,6 @@ from __future__ import annotations from gt_i18n.i18n_manager._singleton import get_i18n_manager -from gt_i18n.translation_functions._hash_message import hash_message from gt_i18n.translation_functions._interpolate import interpolate_message @@ -11,8 +10,8 @@ def t(message: str, **kwargs: object) -> str: """Translate and interpolate a message. Looks up the current locale from the I18nManager, finds a cached - translation by hash, and interpolates variables. Falls back to the - source message if no translation is available. + translation via ``lookup_translation``, and interpolates variables. + Falls back to the source message if no translation is available. Args: message: The ICU MessageFormat source string. 
@@ -28,14 +27,13 @@ def t(message: str, **kwargs: object) -> str: if not manager.requires_translation(locale): return interpolate_message(message, kwargs, locale) - translations = manager.get_translations_sync(locale) - h = hash_message( + translated = manager.lookup_translation( message, - context=kwargs.get("_context"), # type: ignore[arg-type] - id=kwargs.get("_id"), # type: ignore[arg-type] - max_chars=kwargs.get("_max_chars"), # type: ignore[arg-type] + _context=kwargs.get("_context"), + _id=kwargs.get("_id"), + _max_chars=kwargs.get("_max_chars"), + _format="ICU", ) - translated = translations.get(h) if translated: return interpolate_message(translated, {**kwargs, "__fallback": message}, locale) diff --git a/packages/gt-i18n/src/gt_i18n/translation_functions/_tx.py b/packages/gt-i18n/src/gt_i18n/translation_functions/_tx.py new file mode 100644 index 0000000..6dd7b78 --- /dev/null +++ b/packages/gt-i18n/src/gt_i18n/translation_functions/_tx.py @@ -0,0 +1,37 @@ +"""The ``tx()`` function — runtime translation via GT.translate_many. + +Mirror of the JS ``tx()`` added in gt PR #1207: an async string translator +that fetches a translation on cache miss, caches it, and falls back to the +source on any failure. Default data format is ``STRING``. +""" + +from __future__ import annotations + +from typing import Any + +from gt_i18n.i18n_manager._singleton import get_i18n_manager +from gt_i18n.translation_functions._interpolate import interpolate_message + + +async def tx(content: str, **kwargs: Any) -> str: + """Runtime translate a string, then interpolate variables. + + On cache miss, batches into a ``GT.translate_many`` call (via the active + I18nManager's TranslationsCache). Returns the interpolated source if the + target locale matches the source locale, or if runtime translation fails. + + Args: + content: The source message. + **kwargs: Interpolation variables and GT options + (``_locale``, ``_format``, ``_context``, ``_id``, ``_max_chars``). 
+
+    Returns:
+        The translated and interpolated string, or interpolated source on failure.
+    """
+    manager = get_i18n_manager()
+    kwargs.setdefault("_format", "STRING")
+
+    translation = await manager.lookup_translation_with_fallback(content, **kwargs)
+    target = translation if translation is not None else content
+    locale = kwargs.get("_locale") or manager.get_locale()
+    return interpolate_message(target, kwargs, locale)
diff --git a/packages/gt-i18n/tests/conftest.py b/packages/gt-i18n/tests/conftest.py
index ea2e575..82d4bf3 100644
--- a/packages/gt-i18n/tests/conftest.py
+++ b/packages/gt-i18n/tests/conftest.py
@@ -1,6 +1,169 @@
-"""Make the tests directory importable so helpers.py can be used."""
+"""Shared test infrastructure for gt-i18n.
+
+Makes the tests directory importable (so ``helpers.py`` can be used),
+provides the shared ``fake_gt`` fixture used by runtime-translation tests,
+and resets the I18nManager singleton between tests.
+"""
+
+from __future__ import annotations
+
+import asyncio
 import sys
+from collections.abc import Awaitable, Callable, Generator
 from pathlib import Path
+from typing import Any
+
+import pytest
 
 sys.path.insert(0, str(Path(__file__).parent))
+
+
+# ---------------------------------------------------------------------------
+# FakeGT — test double for the GT class used for runtime translation.
+# ---------------------------------------------------------------------------
+
+
+class FakeGT:
+    """Test double mirroring ``GT.translate_many`` for runtime-translation tests.
+
+    Records every call and returns a configurable response. Supports simulating
+    slow / pending requests for concurrency / batching tests.
+
+    Example:
+        gt = FakeGT()
+        # Default response: echo each source prefixed with [locale].
+        result = await gt.translate_many(
+            {"h1": {"source": "Hello", "metadata": {"dataFormat": "STRING"}}},
+            {"target_locale": "es"},
+        )
+        # → {"h1": {"success": True, "translation": "[es]Hello", "locale": "es"}}
+    """
+
+    def __init__(self) -> None:
+        # Each entry: {"sources": dict, "options": dict, "timeout": int | None}
+        self.calls: list[dict[str, Any]] = []
+
+        # Default: echo with [locale] prefix. Override to return custom payloads.
+        def _default_factory(sources: dict[str, Any], options: dict[str, Any]) -> dict[str, Any]:
+            locale = options.get("target_locale", options.get("targetLocale", "?"))
+            return {
+                h: {
+                    "success": True,
+                    "translation": f"[{locale}]{entry['source']}",
+                    "locale": locale,
+                }
+                for h, entry in sources.items()
+            }
+
+        self.response_factory: Callable[[dict[str, Any], dict[str, Any]], dict[str, Any]] = _default_factory
+
+        # Added sleep before returning a response. Used for concurrency tests.
+        self.delay_s: float = 0.0
+
+        # When set, the mock awaits this event before returning. Used to hold
+        # requests "in flight" so we can observe max-concurrency behavior.
+        self.inflight_event: asyncio.Event | None = None
+
+        # Running count of in-flight requests (increments on entry, decrements on exit).
+        self.inflight_count: int = 0
+        self.max_inflight: int = 0
+
+    async def translate_many(
+        self,
+        sources: dict[str, Any] | list[Any],
+        options: dict[str, Any] | str,
+        timeout: int | None = None,
+    ) -> dict[str, Any] | list[Any]:
+        """Mirror ``GT.translate_many(sources, options, timeout)`` signature."""
+        if isinstance(options, str):
+            options = {"target_locale": options}
+        if isinstance(sources, list):
+            raise AssertionError(
+                "FakeGT only supports dict `sources` — runtime translation batches are dict-keyed by hash."
+            )
+        self.calls.append({"sources": dict(sources), "options": dict(options), "timeout": timeout})
+
+        self.inflight_count += 1
+        self.max_inflight = max(self.max_inflight, self.inflight_count)
+        try:
+            if self.inflight_event is not None:
+                await self.inflight_event.wait()
+            if self.delay_s:
+                await asyncio.sleep(self.delay_s)
+            return self.response_factory(sources, options)
+        finally:
+            self.inflight_count -= 1
+
+
+@pytest.fixture
+def fake_gt(monkeypatch: pytest.MonkeyPatch) -> FakeGT:
+    """Install a ``FakeGT`` in place of any ``GT`` instance the I18nManager constructs.
+
+    Patches both ``I18nManager.get_gt_instance`` and the ``GT`` class import inside the
+    internal i18n-manager modules, so regardless of how the implementation wires up
+    runtime translation, tests still route through the fake.
+
+    Tests that drive the public API (``tx()``, ``I18nManager``) get the fake automatically.
+    Tests that instantiate ``TranslationsCache`` directly can bind ``fake_gt.translate_many``
+    as the ``translate_many`` callable.
+    """
+    gt = FakeGT()
+
+    try:
+        from gt_i18n import I18nManager
+
+        monkeypatch.setattr(I18nManager, "get_gt_instance", lambda self: gt)
+    except ImportError:
+        pass
+
+    for path in (
+        "gt_i18n.i18n_manager._i18n_manager.GT",
+        "gt_i18n.i18n_manager._translations_cache.GT",
+        "gt_i18n.i18n_manager._locales_cache.GT",
+    ):
+        try:
+            monkeypatch.setattr(path, lambda *a, **kw: gt, raising=False)
+        except (ModuleNotFoundError, ImportError):
+            pass
+
+    return gt
+
+
+@pytest.fixture
+def translate_many_for_locale(
+    fake_gt: FakeGT,
+) -> Callable[[str], Callable[[dict[str, Any]], Awaitable[dict[str, Any]]]]:
+    """Return a factory that binds ``fake_gt.translate_many`` to a specific target locale.
+
+    Mirrors the JS ``createTranslateMany(locale) => (sources) => gt.translateMany(...)``
+    factory pattern. Convenient for directly instantiating ``TranslationsCache`` in tests.
+
+    Example:
+        translate_many = translate_many_for_locale("es")
+        cache = TranslationsCache(locale="es", translate_many=translate_many, ...)
+    """
+
+    def make(locale: str) -> Callable[[dict[str, Any]], Awaitable[dict[str, Any]]]:
+        async def _translate_many(sources: dict[str, Any]) -> dict[str, Any]:
+            result = await fake_gt.translate_many(sources, {"target_locale": locale})
+            assert isinstance(result, dict)
+            return result
+
+        return _translate_many
+
+    return make
+
+
+# ---------------------------------------------------------------------------
+# Singleton reset — prevents cross-test state leakage.
+# ---------------------------------------------------------------------------
+
+
+@pytest.fixture(autouse=True)
+def _reset_i18n_manager_singleton() -> Generator[None, None, None]:
+    """Reset the module-level I18nManager singleton between tests."""
+    import gt_i18n.i18n_manager._singleton as mod
+
+    old = mod._manager
+    yield
+    mod._manager = old
diff --git a/packages/gt-i18n/tests/fixtures/generate_runtime_fixtures.mjs b/packages/gt-i18n/tests/fixtures/generate_runtime_fixtures.mjs
new file mode 100644
index 0000000..7dc7bd6
--- /dev/null
+++ b/packages/gt-i18n/tests/fixtures/generate_runtime_fixtures.mjs
@@ -0,0 +1,169 @@
+/**
+ * Generate bulk parity fixtures for the runtime-translation port.
+ *
+ * Exercises hashMessage (the JS ``hashSource`` with indexVars for ICU only)
+ * across a wide combinatorial space of (message, format, context, id, maxChars).
+ *
+ * Run:
+ *   cd ~/Documents/dev/gt && \
+ *   node ~/Documents/dev/gt-python-wt/e-i18n-add-runtime-translation/packages/gt-i18n/tests/fixtures/generate_runtime_fixtures.mjs \
+ *   > ~/Documents/dev/gt-python-wt/e-i18n-add-runtime-translation/packages/gt-i18n/tests/fixtures/runtime_translation_parity.json
+ */
+
+import { hashSource } from 'generaltranslation/id';
+import { indexVars } from 'generaltranslation/internal';
+
+// ---------------------------------------------------------------------------
+// Message corpora per format
+// ---------------------------------------------------------------------------
+
+const ICU_MESSAGES = [
+  "",
+  "Hello",
+  "Hello, world!",
+  "Hello, {name}!",
+  "{count, plural, one {# item} other {# items}}",
+  "{count, plural, one {# item} other {# items}} in cart",
+  "{gender, select, male {he} female {she} other {they}}",
+  "Hello {name} ({email})",
+  "Welcome to {app}, {user}!",
+  "{_gt_, select, other {John}}",
+  "Hello {_gt_, select, other {John}}!",
+  "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}",
+  "Hello {_gt_, select, other {John} _gt_var_name {user}}!",
+  "Save",
+  "Cancel",
+  "Delete {count, plural, one {# item} other {# items}}",
+  "Você tem {count, plural, one {# mensagem} other {# mensagens}}",
+  "こんにちは、{name}さん!",
+  "🎉 Welcome, {name}! 🎉",
+  "{a}{b}{c}{d}{e}",
+  " leading and trailing whitespace ",
+  "Line1\nLine2\nLine3",
+  "Tab\there",
+  "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.",
+];
+
+const STRING_MESSAGES = [
+  "",
+  "Hello",
+  "Hello, world!",
+  "Hello, {name}!",
+  "Welcome {user}!",
+  "Items: {count}",
+  " whitespace ",
+  "Line1\nLine2",
+  "emoji 🎉 test",
+  "こんにちは",
+  "Long" + " string".repeat(50),
+  "{}",
+  "{{escaped}}",
+  "{a}{b}{c}",
+  "single_word",
+  "hyphen-ated",
+  "under_scored",
+];
+
+const I18NEXT_MESSAGES = [
+  "",
+  "Hello",
+  "Hello, world!",
+  "Hello, {{name}}!",
+  "Welcome {{user}}!",
+  "You have {{count}} items",
+  "Nested {{outer}} with {{inner}}",
+  "Mixed {{name}} and {count}",
+  " whitespace ",
+  "emoji 🎉 {{name}}",
+  "{{a}}{{b}}{{c}}",
+];
+
+// ---------------------------------------------------------------------------
+// Option variants
+// ---------------------------------------------------------------------------
+
+const CONTEXTS = [null, "", "button", "menu", "a_very_long_context_value_for_testing", "こんにちは"];
+const IDS = [null, "", "greeting", "user.profile.button", "x/y/z", "a-b-c"];
+const MAX_CHARS_VALUES = [null, 0, 5, 50, 1000, -10, -500];
+
+// ---------------------------------------------------------------------------
+// Generate
+// ---------------------------------------------------------------------------
+
+function hashMessage(message, format, options) {
+  return hashSource({
+    source: format === "ICU" ? indexVars(message) : message,
+    ...(options.context && { context: options.context }),
+    ...(options.id && { id: options.id }),
+    ...(options.maxChars != null && { maxChars: Math.abs(options.maxChars) }),
+    dataFormat: format,
+  });
+}
+
+const cases = [];
+
+function addCases(format, messages) {
+  // Full matrix for each message: basic, each context, each id, each maxChars.
+  for (const message of messages) {
+    // Plain (no options)
+    cases.push({ message, format, context: null, id: null, maxChars: null });
+    // Each context
+    for (const context of CONTEXTS.filter((c) => c !== null)) {
+      cases.push({ message, format, context, id: null, maxChars: null });
+    }
+    // Each id
+    for (const id of IDS.filter((i) => i !== null)) {
+      cases.push({ message, format, context: null, id, maxChars: null });
+    }
+    // Each maxChars
+    for (const maxChars of MAX_CHARS_VALUES.filter((m) => m !== null)) {
+      cases.push({ message, format, context: null, id: null, maxChars });
+    }
+    // Context + id together (sample)
+    cases.push({ message, format, context: "button", id: "save_btn", maxChars: null });
+    // Context + maxChars together (sample)
+    cases.push({ message, format, context: "menu", id: null, maxChars: 20 });
+    // All three
+    cases.push({ message, format, context: "menu", id: "menu_save", maxChars: 50 });
+  }
+}
+
+addCases("ICU", ICU_MESSAGES);
+addCases("STRING", STRING_MESSAGES);
+addCases("I18NEXT", I18NEXT_MESSAGES);
+
+// Compute hashes
+for (const c of cases) {
+  c.hash = hashMessage(c.message, c.format, {
+    context: c.context,
+    id: c.id,
+    maxChars: c.maxChars,
+  });
+}
+
+// Also emit a few cross-format collision checks: same (message, context, id, maxChars)
+// across ICU vs STRING vs I18NEXT should produce DIFFERENT hashes.
+const collisionCases = [];
+for (const message of ["", "Hello", "Hello, {name}!", "simple"]) {
+  for (const context of [null, "button"]) {
+    const setOfHashes = {};
+    for (const format of ["ICU", "STRING", "I18NEXT"]) {
+      setOfHashes[format] = hashMessage(message, format, {
+        context,
+        id: null,
+        maxChars: null,
+      });
+    }
+    collisionCases.push({ message, context, hashes: setOfHashes });
+  }
+}
+
+const output = {
+  note: "Generated by generate_runtime_fixtures.mjs. Do not edit by hand.",
+  caseCount: cases.length,
+  cases,
+  collisionCases,
+};
+
+// Stdout — redirect to fixture JSON file.
+process.stdout.write(JSON.stringify(output, null, 2)); diff --git a/packages/gt-i18n/tests/fixtures/runtime_translation_parity.json b/packages/gt-i18n/tests/fixtures/runtime_translation_parity.json new file mode 100644 index 0000000..14b62dd --- /dev/null +++ b/packages/gt-i18n/tests/fixtures/runtime_translation_parity.json @@ -0,0 +1,8400 @@ +{ + "note": "Generated by generate_runtime_fixtures.mjs. Do not edit by hand.", + "caseCount": 1040, + "cases": [ + { + "message": "", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "e2b070737b68f01d" + }, + { + "message": "", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "e2b070737b68f01d" + }, + { + "message": "", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "b3be66567482f5b6" + }, + { + "message": "", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "701e922b40603a65" + }, + { + "message": "", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "5abda81f5115c0ad" + }, + { + "message": "", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "993fb4917a390875" + }, + { + "message": "", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "e2b070737b68f01d" + }, + { + "message": "", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "c54eedd1e42b21b0" + }, + { + "message": "", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "a895cbbbe3bc87b8" + }, + { + "message": "", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "144934faf8032414" + }, + { + "message": "", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "a2259d0f0a78e487" + }, + { + "message": "", + "format": "ICU", + "context": null, + "id": null, 
+ "maxChars": 0, + "hash": "5e6c17e9967b37a4" + }, + { + "message": "", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "6d26c53d29a330b8" + }, + { + "message": "", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "5a2e71105f01fcc0" + }, + { + "message": "", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "163595ded098b49c" + }, + { + "message": "", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "5001799737e9948b" + }, + { + "message": "", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "a28fcb37ca7c3c1c" + }, + { + "message": "", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "f33eb3251a21e4a8" + }, + { + "message": "", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "62a6f9ffc39ce630" + }, + { + "message": "", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "f9813d29eed4b727" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "3cdffbf6452b553c" + }, + { + "message": "Hello", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "3cdffbf6452b553c" + }, + { + "message": "Hello", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "7ff12157cc99c288" + }, + { + "message": "Hello", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "aef2fe9520fead9d" + }, + { + "message": "Hello", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "e8a1a338a5299346" + }, + { + "message": "Hello", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "08b3b4d789557906" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": "", + 
"maxChars": null, + "hash": "3cdffbf6452b553c" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "e0043c773a7eaf71" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "4f71e171b06dac63" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "2ac7be935378e1a0" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "314104d82d39f450" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "76ce9978e65ee6a7" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "1ca976faecf212f4" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "a60c9de939afa7ed" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "bc42551678526141" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "caba7f8dc8e5092a" + }, + { + "message": "Hello", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "47bd2f03f21bf6e4" + }, + { + "message": "Hello", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "5a284dcdf2f6afbf" + }, + { + "message": "Hello", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "ca7922d2491df149" + }, + { + "message": "Hello", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "ee6bd56ff923a531" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "8042e0a3d395c1fb" + }, + { + "message": "Hello, world!", + "format": "ICU", + 
"context": "", + "id": null, + "maxChars": null, + "hash": "8042e0a3d395c1fb" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "2ae2a1ab2eaa63b4" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "36fd1171047e884a" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "c3785cc2c2d67512" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "24a827c0e5fc2ce5" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "8042e0a3d395c1fb" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "390a63b9c7149f72" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "18213a39d4429202" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "60c7767ad4e92bbc" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "3bd87677c5c5615c" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "eecd3a2672301f67" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "a71785cff0e2a7ed" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "ea52f12c925de21e" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "fc5630ebd698a4c0" + }, + { + "message": "Hello, world!", 
+ "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "c99d67f664df4929" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "df1be11924223a70" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "1cf1699821315a37" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "706c56ecc1c9179f" + }, + { + "message": "Hello, world!", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "3ba644e1b1da9264" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "9b323e35e1a80c51" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "9b323e35e1a80c51" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "9f2b79f9cacb6ce9" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "40b55312146178b2" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "5ae454dac6a4cd8b" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "7b0d6d0c24902af6" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "9b323e35e1a80c51" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "9419278172018d76" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": 
"71cc0bf0b92600d7" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "7e3aacc263d5de5e" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "468c94a1ab307603" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "8bc86103cfb34c0e" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "caa2cbf7077a7641" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "d5e7d603ea9fbe2d" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "2c7379a25831aceb" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "9cf7d2b6cb3658b2" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "2b1239371a716112" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "42f5f60d92e81835" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "65f0bdf6d1f92a04" + }, + { + "message": "Hello, {name}!", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "b48bc8b209d17705" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "f5683e647a65e15c" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "f5683e647a65e15c" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + 
"format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "ba67cbd6e49d7b4e" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "9255c8e3213375b7" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "aaa216a2b3ef2d03" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "fc6581b5450d8e14" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "f5683e647a65e15c" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "d02f157865950074" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "6e4f536f9d96ff82" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "749b5a2043734d0c" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "b8e9393ba97c011f" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "1a82501c18dc7883" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "8adb715eab67229b" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + 
"id": null, + "maxChars": 50, + "hash": "08ed37a1ed326740" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "525170a9667e1375" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "269f0e92efefaf28" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "5d16630e57547d7d" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "7f181e2a36921c9e" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "df6d103dc026f84a" + }, + { + "message": "{count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "5f84dfeffed32631" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "7bc92bde532bab2e" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "7bc92bde532bab2e" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "f1566942be8c204d" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "0c0069d2f78d5eb3" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + 
"maxChars": null, + "hash": "6137fd0d73e3a6f6" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "a3ab534e14088ba2" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "7bc92bde532bab2e" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "d35ca2132664c7df" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "e72a08f2755ce29f" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "be1d321b1741c803" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "f311606efec8d07e" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "9015c8d942dc243a" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "d05dad5c7fd295c2" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "56e36de7b258e6e9" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "4db25f05270e00e0" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": null, 
+ "maxChars": -10, + "hash": "b10d6a847d248eb1" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "8c6bb907cb85e0eb" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "48fa021706192695" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "8b465ad13a266b4e" + }, + { + "message": "{count, plural, one {# item} other {# items}} in cart", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "6934930df220fc08" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "582d9b1bfa7f40ba" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "582d9b1bfa7f40ba" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "8fac6187ce9b3d40" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "6ae4b41c72f743a4" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "5f96ebe8d0975598" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "983af386f87fa35b" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": 
"ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "582d9b1bfa7f40ba" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "46635484eb0da563" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "2c9a8d123f368d47" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "65c499d052d9ae9c" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "c0d78ab03423e279" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "ca2173895a55ec4b" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "89286d8296639df8" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "37b87fdf1fef2b15" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "fac8a6e5d9a080e2" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "bc646768882fb6ed" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "f12496f4c053b232" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": 
"ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "b6955b70f8f3b5ed" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "0021569db289dbf1" + }, + { + "message": "{gender, select, male {he} female {she} other {they}}", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "1db0f48470075c39" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "a07401ee7472bf7e" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "a07401ee7472bf7e" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "8d094bb50de00675" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "6f48c283c607d670" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "055071ad984363b3" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "4c25448cfb3cec69" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "a07401ee7472bf7e" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "012028f0dc4374ad" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "513854b600287d53" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + 
"hash": "f9699f59a4a4fd8d" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "fb5f28d3745041b8" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "428e9edff9306b9a" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "0b93b70bec9a614f" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "f7f8386f00546905" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "42a9bbc6037ac39c" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "2cd52c01cf779aec" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "a7d6bc9bd6c70d58" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "a9255a5767b4a9e9" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "a55bb98e10918ba9" + }, + { + "message": "Hello {name} ({email})", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "f0d78d84cb27dcf3" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "7a35206ea867ddad" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "7a35206ea867ddad" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "08c5ee4279c23541" + }, + { + 
"message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "32dbded6bec25da6" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "63682de60dab6d26" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "68b0fc166dffcf37" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "7a35206ea867ddad" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "92195596a9ad3e2e" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "0ab241e50dbaa38e" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "cfaa492faebdc98f" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "db18f54ca623c033" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "7a30cd282adf9f38" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "cc3c6f4c229505a7" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "61bbdc759c6934c9" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "1cbe1e528f70c83f" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + 
"hash": "e60a6f6fced64fd3" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "86c7f51b08411d66" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "564cf4f01e2d08d7" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "b6b0e13c4ba90b9b" + }, + { + "message": "Welcome to {app}, {user}!", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "fa9c4d229ef38c65" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "5730e81638a4eefc" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "5730e81638a4eefc" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "8a2612c09763534f" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "79db31b3635fbf21" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "a1ebbdbb6c960a4d" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "50df7d191a49a9d5" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "5730e81638a4eefc" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "1f6a61b7aa853e53" + }, + { + "message": "{_gt_, select, other {John}}", + 
"format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "71002711d5d85059" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "3853c940aa4751f9" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "f170cf6fc3598653" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "4ea4f86ef535d3c4" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "71ca189c106c210d" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "759618e4480da67c" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "15e1f35469323335" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "8c61441fc4e98c06" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "dfea15518e521111" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "4165fbc0545567a1" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "6753c9db1a0e3b59" + }, + { + "message": "{_gt_, select, other {John}}", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "52255a0c238025af" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "6513dde2ae9e6c33" + 
}, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "6513dde2ae9e6c33" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "362ae282a10c6dd8" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "bc9ced263f6b86d8" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "bd74463005970273" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "8f2fc61dbba407e9" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "6513dde2ae9e6c33" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "c293dbcb54b7093d" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "ce5d7ba4321fa2e0" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "cacfb88a55e04997" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "3e5a49742b2f8210" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "14796141b6d47552" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": 
"a8dbbc9761d8a621" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "c22a11795911c731" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "7d1b5caf385d99e2" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "6e6bb292d85c6383" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "310f5f3f1c1ceacf" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "41a2725fdf2e36e2" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "11cd5b12dbff2bef" + }, + { + "message": "Hello {_gt_, select, other {John}}!", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "c6c2ef56df90b3de" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "8fd28997b591e4d1" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "8fd28997b591e4d1" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "c78b649d7f9cfc63" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "2517dbdc834a51f5" + }, + { + "message": "Hello {_gt_, select, other 
{John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "63387c19acf23268" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "c0e4262fff0d9551" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "8fd28997b591e4d1" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "ee3586c59aa291c9" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "07f13745657bac51" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "97a4960dc94a1388" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "854a2a0b0bf40df9" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "1a326a2a86f94cea" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "966462d4d3c52b39" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "23c5e6f04c2b7f82" + }, + { + "message": "Hello {_gt_, select, other 
{John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "9618d9c9eb93d95e" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "31adababa3f988f4" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "bcadc5409148940f" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "a46f1810d6fcca01" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "7c7c50879e93170a" + }, + { + "message": "Hello {_gt_, select, other {John}} and {_gt_, select, other {Jane}}", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "d0a3f702828cbe26" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "ddec6a441a4c30cb" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "ddec6a441a4c30cb" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "f3c1089c4b1fc666" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "bee0d670e034d097" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": 
"a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "ad4e2b50ea6f7f98" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "48905e6f5eb04b2a" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "ddec6a441a4c30cb" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "1105057f166e8d75" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "8434459adddae017" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "7533cac4d6bf8bc0" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "ed3baf8ce80242e3" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "27d4219fb2e7b383" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "d665a2dd0f63843e" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "33c65611d442c011" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "5d09c8d9a762d290" + }, + { + "message": "Hello {_gt_, select, other 
{John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "d5105b2a29de4606" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "4d2da82af6d494aa" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "fd646b9f10720531" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "34e02170be937b85" + }, + { + "message": "Hello {_gt_, select, other {John} _gt_var_name {user}}!", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "d32a233afc544ce7" + }, + { + "message": "Save", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "762fcce45b65eea9" + }, + { + "message": "Save", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "762fcce45b65eea9" + }, + { + "message": "Save", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "98c53370a6119642" + }, + { + "message": "Save", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "34e888326f1b51a4" + }, + { + "message": "Save", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "bc8829c76fe965c2" + }, + { + "message": "Save", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "4e436f1e17c5ff65" + }, + { + "message": "Save", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "762fcce45b65eea9" + }, + { + "message": "Save", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "8d8ae6ec19ceee7f" + }, + { + "message": "Save", 
+ "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "6fb8fab54a911538" + }, + { + "message": "Save", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "9c6ef363310f766b" + }, + { + "message": "Save", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "0ca804952f5451c2" + }, + { + "message": "Save", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "d73a60ce62ce559b" + }, + { + "message": "Save", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "9ad820ecf5294537" + }, + { + "message": "Save", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "4e2492f768f8e1db" + }, + { + "message": "Save", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "fa27a712816efddd" + }, + { + "message": "Save", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "e0354dd35d7d269d" + }, + { + "message": "Save", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "216ffae7d1edcd40" + }, + { + "message": "Save", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "f42de11c6be29974" + }, + { + "message": "Save", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "85bdea0e562f9222" + }, + { + "message": "Save", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "e608cccc8fdaafe7" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "00ad8aeed0a6c461" + }, + { + "message": "Cancel", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "00ad8aeed0a6c461" + }, + { + "message": "Cancel", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "a57df3595ab4f4fd" + }, + { + "message": 
"Cancel", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "c8767dc9b7795520" + }, + { + "message": "Cancel", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "d5939e9daea6691f" + }, + { + "message": "Cancel", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "9547d18ae5697f6e" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "00ad8aeed0a6c461" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "9f2cdc7e9134831b" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "9e64476cad827aa0" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "e48ea93cdaa83336" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "8abee773cce4b94b" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "e4599187031a16f2" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "ed4991ddf99ea583" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "52b4955cbb1dcf96" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "512f7819e8945e8f" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "4a50091812fb80eb" + }, + { + "message": "Cancel", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "53ba30cbd12aa031" + }, + { + "message": "Cancel", + "format": "ICU", + "context": "button", + "id": "save_btn", + 
"maxChars": null, + "hash": "24e3bd967c2a4127" + }, + { + "message": "Cancel", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "fada7a9e602d1f9b" + }, + { + "message": "Cancel", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "a4a71aa62e35d505" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "667c5e0a4216d646" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "667c5e0a4216d646" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "01cce3bd5173d373" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "2693b31c49698b37" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "7865c79e8d567c2b" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "484280f3a9603ff7" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "667c5e0a4216d646" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "e935f532c843c168" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "59f1d764cfa329ac" + }, 
+ { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "0eeb6af837ab30c4" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "1b8891b8d47c5710" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "6363fd03ce58cdf3" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "3732c32004f2f459" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "992d903f8b3937ea" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "a89a1e807a13cb7b" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "5b85fe0fbef1fef1" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "05f4283a37e88d1e" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "f231f51d64525943" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "2f20203883f6e30c" + }, + { + "message": "Delete {count, plural, one {# item} other {# items}}", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "bff8b012b5619967" + }, + { + "message": 
"Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "2fe58d274bc605e9" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "2fe58d274bc605e9" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "072cd14a15e9ae91" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "5b45715f57b379b4" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "a901a585adbfdb1a" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "806ff622a340a9d9" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "2fe58d274bc605e9" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "e542e68cebbcbcf2" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "fa45fd13c20212b7" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "3f938347a77d183e" + }, + { + "message": "Você tem {count, plural, one {# mensagem} 
other {# mensagens}}", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "33090046c0c2542b" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "8331635258ca8885" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "b9babcb75fd4094d" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "57a9f2c03ff5565b" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "5614b99f20a3302f" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "0ac4a3ded43bcb5c" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "063302cd4a9a9e64" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "615739b23cf1eab5" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "16f8755418bb3a68" + }, + { + "message": "Você tem {count, plural, one {# mensagem} other {# mensagens}}", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "6a558ff55e1ad2f6" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "6698f6ab875d10b6" + }, + { + "message": 
"こんにちは、{name}さん!", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "6698f6ab875d10b6" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "dc0c59774e58e83f" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "e0b9ba28aa38deb0" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "6fb7781fb0e2b456" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "636e8a163c79f686" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "6698f6ab875d10b6" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "d8ec5623241dc71b" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "97864c4a0f0d8361" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "c7351c5f8643d18a" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "83072c63dc627fe3" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "92960b0b1e5c9020" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "96a98b97ae7a1936" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "00a58817d852e747" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + 
"hash": "1ec9c73118e2ffe0" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "f3aea7bb52582bf5" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "54ec88668bf3e0cf" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "92eb9f2c11cb9e55" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "6c36b12c1e3e6b92" + }, + { + "message": "こんにちは、{name}さん!", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "05b1c069a180e104" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "3fdf0d51143a1c18" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "3fdf0d51143a1c18" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "2e613de428c0e443" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "267edddd02685e29" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "4b2c944c37df7849" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "6babc414ff3e8dd2" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "3fdf0d51143a1c18" + }, + { + "message": "🎉 Welcome, {name}! 
🎉", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "a9521df28d3bba2c" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "da2a8b974e429468" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "66ae36006dd4f1ae" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "f17a68c3945f16c1" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "304b334cafd5e467" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "44370f22bbf54a13" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "8395a9bb939fc0cb" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "d5d4b9018b9f9017" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "083b4cd80eb451e8" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "3f6fe9266524906f" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "74723586512c0240" + }, + { + "message": "🎉 Welcome, {name}! 🎉", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "c077f73bb41d08b5" + }, + { + "message": "🎉 Welcome, {name}! 
🎉", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "a95706c9afe3d31c" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "37d00b2fbf2deac2" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "37d00b2fbf2deac2" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "be7c418537cee084" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "6c5fbdfeb0a07feb" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "b4237f039ee92633" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "ceaff09295e43c03" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "37d00b2fbf2deac2" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "44cd0feff5094987" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "5f8a8503949a9d8b" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "0b967d1abe8845ca" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "a9852ef379034445" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "18d3574dd60c0396" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + 
"hash": "d92205d349568f56" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "450451ae758323f5" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "fc305049a79f73c2" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "f93eca966f18eca2" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "1d4b4d917df4a5fb" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "7014e0dac10087b8" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "1ec7cb8f691cb1b6" + }, + { + "message": "{a}{b}{c}{d}{e}", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "0ab97f8f799ee326" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "295f26e2412311d4" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "295f26e2412311d4" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "de81c0642477d4ed" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "85d20646c0172ef7" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "cf018b219512e84c" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": "こんにちは", + "id": null, + 
"maxChars": null, + "hash": "caa3c551f4cf493b" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "295f26e2412311d4" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "6732ed41b23a2f69" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "ee41cf12c8b122ea" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "0e41032ba04c9151" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "39d8d415e363b5eb" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "eddfa1c4204233b7" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "a4de872f61bf8a6d" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "71431d89637d03eb" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "c98e540cd6535d5f" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "06c0e19500acf48c" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "0ea7b837d6fce350" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "41efd4f7dd38ace5" + }, 
+ { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "2824e24f34281bbb" + }, + { + "message": " leading and trailing whitespace ", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "36a8686a945cdca9" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "e60b4e64c6ed1f24" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "e60b4e64c6ed1f24" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "d94263124ff8d939" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "33198e4e547c0dc9" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "9ae4378a2374ca67" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "f97340491d08e843" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "e60b4e64c6ed1f24" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "f1183e53f7b020d5" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "410463cb5303a5a7" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "6eb071e01aeb32cf" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "b7a325ce613d50eb" + 
}, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "f71f9dd22442882c" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "6c775a8d4343063c" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "27000dc5ac8d73bf" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "c9f691c0ac342b69" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "09eae255cf54c37f" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "a73a656164e679e6" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "7cbec41d17cf5133" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "7209729a126d4fac" + }, + { + "message": "Line1\nLine2\nLine3", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "67d2ed987b8c5b66" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "0e77777d1966e928" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "0e77777d1966e928" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "c0c9821516aab484" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "de5be85bcb7b90a6" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + 
"maxChars": null, + "hash": "a04ab20af9dda860" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "466a018a6efb3c8f" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "0e77777d1966e928" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "7aa220b385f42f10" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "6b42c61fee33a4d7" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "915107b1e89358f6" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "7c789e46f9b44879" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": "bf4af0a8f067a2d9" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "1e11915418c757c2" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "2099c788b25b8a62" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "b7e1d7a19d8fcb9f" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "2ac24f34ce8aa26a" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "39dec30a1588d4a9" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "a6e0c2c420d40ede" + }, + { + "message": "Tab\there", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "0e337129d8a400e6" + }, + { + 
"message": "Tab\there", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "c3fa116ace463f57" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": null, + "maxChars": null, + "hash": "23dda5dfd97c442b" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": "", + "id": null, + "maxChars": null, + "hash": "23dda5dfd97c442b" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": "button", + "id": null, + "maxChars": null, + "hash": "422c9842e3fe908f" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "eb4f40e4aee0b685" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "d9de6d7e70801fd8" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + 
"context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "f2f8e26e73a4a908" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": "", + "maxChars": null, + "hash": "23dda5dfd97c442b" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "b2f8ea248e65c758" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "fe372a26a14e3271" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "967289042f2649ae" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "9de3a32c4bccc93e" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 0, + "hash": 
"d715a59b6dbda4f5" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 5, + "hash": "9ef72f8eb4ae60fe" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 50, + "hash": "60548308d115fba6" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "a915523e8a3c0f88" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -10, + "hash": "195ba3bc5dca00c4" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": null, + "id": null, + "maxChars": -500, + "hash": "a96c33be25cdbd35" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "fcf51590d9a0300c" + }, + { + "message": "A very long message that goes on and on and 
on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "d6011ba49071cbde" + }, + { + "message": "A very long message that goes on and on and on and on and repeats many words many times in sequence for testing long-content handling behavior within the hash computation pipeline end-to-end.", + "format": "ICU", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "78eacb3158a2b6f6" + }, + { + "message": "", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "301efa5d6b271581" + }, + { + "message": "", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "301efa5d6b271581" + }, + { + "message": "", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "c1ba29f981c859e1" + }, + { + "message": "", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "a0333d3f42f8ce2d" + }, + { + "message": "", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "3550492723043673" + }, + { + "message": "", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "db32a239683c6b6c" + }, + { + "message": "", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "301efa5d6b271581" + }, + { + "message": "", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "277985b176c67d08" + }, + { + "message": "", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "09646f6aa623a89b" + }, + { + "message": "", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "127b37b089d99c6e" + }, + { + "message": "", + "format": "STRING", + 
"context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "086ec24cb05448f8" + }, + { + "message": "", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "3763a541fe99314d" + }, + { + "message": "", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "13bfaccf8b960fe5" + }, + { + "message": "", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "e8b516739fd0ba6a" + }, + { + "message": "", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "c4bff9850418dbca" + }, + { + "message": "", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "577b439cc069fc06" + }, + { + "message": "", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "d8af71381aaa5d8c" + }, + { + "message": "", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "e3d0dbf5a3a85907" + }, + { + "message": "", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "bf35cb4327b54f9c" + }, + { + "message": "", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "f311d455edbd2515" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "c6e2133cd6d89e5d" + }, + { + "message": "Hello", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "c6e2133cd6d89e5d" + }, + { + "message": "Hello", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "85e6a57c73fa392c" + }, + { + "message": "Hello", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "71c6f929957a5ded" + }, + { + "message": "Hello", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "0e21562893068df3" + }, + { + "message": 
"Hello", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "68ccbfc21eaf79ab" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "c6e2133cd6d89e5d" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "dc6091b5a1b8fe4b" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "5b5309eb45e83fd4" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "dff8a66f07908f2f" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "fdf22dbab8592664" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "3ae3b1de13ecd492" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "8bd20823b68e1a04" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "e0052591b119da9a" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "7e793a02008c7a9a" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "6220664c857bb9ca" + }, + { + "message": "Hello", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "7a92ac582f315955" + }, + { + "message": "Hello", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "8fa731f2552b952b" + }, + { + "message": "Hello", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "43bcbe03b1624fe7" + }, + { + "message": "Hello", + "format": "STRING", + "context": "menu", + "id": "menu_save", + 
"maxChars": 50, + "hash": "7dbad350b0e3ffe2" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "850b31724e3295a4" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "850b31724e3295a4" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "c772ef7b545e6d53" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "4c053b6515a7d4a9" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "9c6987d2ba03841e" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "f1fd90874129cb56" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "850b31724e3295a4" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "4360da348ede26a5" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "00872eaf95cc4518" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "e6caf5f61999826f" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "b7517ad45afd0d9c" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "f821e798a1f8514e" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "e9cfdbefd635c14a" + }, + { + "message": "Hello, 
world!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "3c1872e1f9fe784d" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "d3089398a3e86700" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "00dfd0644109bf47" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "b90527734f7f841e" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "628938bdeeed3a56" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "8b62eb8660d72edd" + }, + { + "message": "Hello, world!", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "d57e2ec6b4d8b364" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "71e647ee929eb212" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "71e647ee929eb212" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "63f9ca42b13c18a9" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "4ad0f1f2758dbbcf" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "e4b4fa90d977535a" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "0d40526922de06d6" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": "", + "maxChars": 
null, + "hash": "71e647ee929eb212" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "44e2385b23c703e0" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "12fde3e924a44a5d" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "318b5408ffc69c99" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "c1fa05e89cc8bff0" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "069bd3f8bb9afd99" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "d562cf520238add9" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "d4f94c57eecee206" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "947f592ea54ee7f0" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "7a9391fd07f6bb62" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "d6372f356d2621cd" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "5643e30a446486f4" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "b50152516dac191e" + }, + { + "message": "Hello, {name}!", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "a1fd777235fbeab5" + }, + { + "message": "Welcome {user}!", + "format": 
"STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "1f50611780bd9f80" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "1f50611780bd9f80" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "ed61a95106fdbf44" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "02faa4cff96c48d8" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "968de91dc035b709" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "f59f33c24777cfd1" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "1f50611780bd9f80" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "0f13a330181a7cbb" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "42489569e40ceb39" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "da54c366a0b2ca29" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "63e22feca17bf994" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "f9156db9b51082dd" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "20a1be88027600ef" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": null, + "id": null, + 
"maxChars": 50, + "hash": "5c6b18b0f8679557" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "647cedb7c30f0b32" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "6a01d582b5feda3f" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "210ac5ea7f43fca1" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "78ada122d2e948d2" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "65beb407c6f44460" + }, + { + "message": "Welcome {user}!", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "56913e5bc8a536b9" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "213d869d32dc4a56" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "213d869d32dc4a56" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "f173b2d6a50faddf" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "e0e99360474d03b0" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "87d9b9a8cfdd7ab8" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "71c3277f81822f35" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "213d869d32dc4a56" + }, + { + 
"message": "Items: {count}", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "b89d13ed6f223509" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "d68e35ada0998e7a" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "151e1081eb2eb83a" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "75ede38fa77b2ce9" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "c50d580eb2cb3ffb" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "b4e9c94aacfc6c87" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "fff212839fd96cbd" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "fab81dc2459bc70a" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "bdadbbabe6a7947d" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "7b49afe1c477265f" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "a104d3cfa22875b4" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "15d96e8615ddc518" + }, + { + "message": "Items: {count}", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "5c7ccd17d5fd7b61" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": null, + "id": null, + 
"maxChars": null, + "hash": "ef5e62362054a29e" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "ef5e62362054a29e" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "498658ec43e8e618" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "64503264424d3772" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "b386beb3e0783618" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "0529f549466b68e8" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "ef5e62362054a29e" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "5e04bcfe2cc574ef" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "7865864593003a3f" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "54cb2884672d3973" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "5ece7198494e1ee7" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "f0add460a6993fcd" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "3fe7a431188f3387" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "e87404c1697d8aa2" + }, + { + "message": " whitespace ", + 
"format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "fe92a6e589051190" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "a0402abd1d09e093" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "bcb669b2079687b1" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "d26737dbabffa148" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "6531655d869924fa" + }, + { + "message": " whitespace ", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "c9b1587042c6c914" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "c31302eeea26a930" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "c31302eeea26a930" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "95ace19bff6eb75c" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "3c36fc9d19fe4531" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "19eff115723301fc" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "d02ae0e64fcee1ab" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "c31302eeea26a930" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": 
"d8fdcbd7f779e6d7" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "7aef8874c9f4c751" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "04c1fc90e77ccdd0" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "367af0edda7a576c" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "a021cded8b23256a" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "3ba9d52537e7da17" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "3684b020604f0f92" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "4c5855456066b45d" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "f3f29a30e645969b" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "083d2e206f5a5063" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "3505450e66f7b7df" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "d1b013d81a246391" + }, + { + "message": "Line1\nLine2", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "8fd6aea14780de74" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "fcbd7197611b62a3" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": "", + "id": null, + 
"maxChars": null, + "hash": "fcbd7197611b62a3" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "f4436a9b86f57e13" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "54be415e74d53eff" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "c2fe8dfc43ec0101" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "360258a07fbdcc4a" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "fcbd7197611b62a3" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "ba9af554715d009c" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "c178c03204bd7914" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "a847dec114bd03e5" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "3339e1f32dec3520" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "57da0cb88d8b47ff" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "3e6275b85dd2e31c" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "ee9b91b44e60d6d5" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "f965d3c284aeed68" + }, + { + "message": "emoji 🎉 test", + 
"format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "a66a28b4eeeb5e7c" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "c78eab3aa56e3e8d" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "ef485060c1c892e8" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "21ee3810d750339a" + }, + { + "message": "emoji 🎉 test", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "10523af3ad0313b0" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "9dbcbaea1f620f2c" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "9dbcbaea1f620f2c" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "ab624e6ab53a609c" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "3d5dc0757eb9a236" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "1b9c0ac9c8cafa8c" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "d0139425008d3fd8" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "9dbcbaea1f620f2c" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "3bc9b66421ccc361" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "1bfda319431cbb00" + }, + { + "message": "こんにちは", + 
"format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "0136835cb66cda0a" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "60c595d9749cf78b" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "522ad60987701a8f" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "722ba864e54bd827" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "1ed9ea2999931e56" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "4b7dd635892786ca" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "c0bd3a558e15ea1a" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "f05abe3ad662f6a6" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "e02313abb4f930fe" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "6f564679be98be30" + }, + { + "message": "こんにちは", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "bcabb39a139a6791" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "14791dabad78b3b2" + }, + { + "message": "Long string string string 
string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "14791dabad78b3b2" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "3bbc5a968aebaed3" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "86c6f7ac2068cd38" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "348eac36d7409590" + }, + { + "message": "Long string string string string string string 
string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "31f66d0bcc6ef575" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "14791dabad78b3b2" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "8a6354e3a35c0d06" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "8d2f1b85b885c8d0" + }, + { + "message": "Long string string string string string string string string string string string 
string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "e926184c8cf70fa2" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "20baea636c84e303" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "bf7b77b795c699c4" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "b92f3c5dbc904a5f" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string 
string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "2a1f2a87869fcdc0" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "b67e4483b9c8347a" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "37f9b58374a5098c" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "06fbbb978bf0606f" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string 
string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "3cab2739792706e2" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "d1fd73a36fda5f12" + }, + { + "message": "Long string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string string", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "470e9fe05f4f32dc" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "d0c22a4821601a1b" + }, + { + "message": "{}", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "d0c22a4821601a1b" + }, + { + "message": "{}", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "bf4400e04a6a3c61" + }, + { + "message": "{}", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "e0e6a0edcfe146b5" + }, + { + "message": "{}", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "a98a6ccc397b199e" + }, + { + "message": "{}", + "format": "STRING", + "context": 
"こんにちは", + "id": null, + "maxChars": null, + "hash": "7832c3d116cb9867" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "d0c22a4821601a1b" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "1f5884d6e5bb713d" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "cd87ae9a9a2c6c43" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "8e0b7b599ec895c3" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "159f6bd0644a6913" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "3091ea7e53a2cdbd" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "16df42c38cb38cd1" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "1e7b77b96152b84a" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "2b968be81076506d" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "720dbc6cd827ba3e" + }, + { + "message": "{}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "392e37f2cd9917ea" + }, + { + "message": "{}", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "3a08aa4531a3798b" + }, + { + "message": "{}", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "75c9429e692b4650" + }, + { + "message": "{}", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "5e270682c9280379" + }, + { + "message": "{{escaped}}", + 
"format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "146424f94fdbd40b" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "146424f94fdbd40b" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "7fffeba514067b5c" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "3358eea57cf2d5ac" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "db6af67081df66af" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "0a121b820422b6d7" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "146424f94fdbd40b" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "e87dd5bd9ac440af" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "c82925829edac11f" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "8c2d64d780d984db" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "563b588661b9db26" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "a8d657af52727792" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "28801359bdeb9157" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "8123a452f906576a" + 
}, + { + "message": "{{escaped}}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "f7e5e0df335b3214" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "75bf524bef3bc40e" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "18597eabd113e239" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "76468b6d09790da4" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "fb2d8471fc407bc6" + }, + { + "message": "{{escaped}}", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "390c586bf0b1a1da" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "576b83eb6890c20d" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "576b83eb6890c20d" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "a067fdc034c780af" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "dcb0fe58e5425db0" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "4cfdcde6622fe04a" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "f190be1fcc3b8e75" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "576b83eb6890c20d" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": 
"2ef1379b53d55c04" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "bab17bdf8835f51e" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "4c672bcaf0993c67" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "b367e8e86eac0cd7" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "249a1ee60696e174" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "c0caf0f3f047b178" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "085cfc888f204879" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "a2f7fe50f304ee56" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "d172d163d95ad7ba" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "959bacaa0a828192" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "e1c4deea967766fc" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "faf8291acc3ec982" + }, + { + "message": "{a}{b}{c}", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "67b24ac56433460d" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "60e35181c69e0bf3" + }, + { + "message": "single_word", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "60e35181c69e0bf3" + 
}, + { + "message": "single_word", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "4f58ff86b48777f6" + }, + { + "message": "single_word", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "5d3f30b5da6d45c7" + }, + { + "message": "single_word", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "07a953e0200e8a1e" + }, + { + "message": "single_word", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "51b10b02285cfe95" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "60e35181c69e0bf3" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "d10253b89e73d659" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "a9ccce94b7c008ff" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "c7030311d6984171" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "45ca6e1dfe9d70a3" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "2c4cc51cfa90002d" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "141f201c0b1fcaf8" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "71f16f6e4aa58492" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "fe7702fbc0505103" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 
-10, + "hash": "17193c054850c8d8" + }, + { + "message": "single_word", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "a0002a12b4fc753b" + }, + { + "message": "single_word", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "19813f602d720644" + }, + { + "message": "single_word", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "fff35d47e9ea232c" + }, + { + "message": "single_word", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "840de0f2a3759da8" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "3b8d27340aca3ca8" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "3b8d27340aca3ca8" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "ec44dfa3a74ab41b" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "f95823b34d00360b" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "5af271275d9177d0" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "e4bf85797205f9a6" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "3b8d27340aca3ca8" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "b766a9389b0aa853" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "9cbafbef1e900bd8" + }, + { + "message": "hyphen-ated", + "format": 
"STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "dfd4f28430fd45b3" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "e8944169b6603d0e" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "3d0ad9fe7180c62c" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "a1bbde7730a06cc1" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "dcd807e061da6295" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "fc25ccb294cc22a9" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "bbfd4e95f3208de2" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "b9475db2ac1c5b61" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "1bd66b29359a5a6e" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "89c3dc5a5408b254" + }, + { + "message": "hyphen-ated", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "4a6d22a48e6b8a28" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": null, + "maxChars": null, + "hash": "273ccb8c19887857" + }, + { + "message": "under_scored", + "format": "STRING", + "context": "", + "id": null, + "maxChars": null, + "hash": "273ccb8c19887857" + }, + { + "message": "under_scored", + "format": "STRING", + "context": "button", + "id": null, + "maxChars": null, + "hash": "613144224578a3b8" + }, + { + "message": "under_scored", + "format": 
"STRING", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "ef4b2ab987641805" + }, + { + "message": "under_scored", + "format": "STRING", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "ddd65aac61ae5b33" + }, + { + "message": "under_scored", + "format": "STRING", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "8f9780318e6fdaa2" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": "", + "maxChars": null, + "hash": "273ccb8c19887857" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "821a6b4712a6f3b3" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "7ea8af6d035902d4" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "9f87bd2b84b55e1c" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "f76ce7c6cb8a636c" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 0, + "hash": "14816ce16f6b6f73" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 5, + "hash": "bc1c8364c732e9fa" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 50, + "hash": "e8784584e49afaab" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "f7e61f4838c29fa4" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -10, + "hash": "75daefb63f340f8d" + }, + { + "message": "under_scored", + "format": "STRING", + "context": null, + "id": null, + "maxChars": -500, + "hash": "ebf9c0956c64c925" + 
}, + { + "message": "under_scored", + "format": "STRING", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "0f9f5c82df0dbda6" + }, + { + "message": "under_scored", + "format": "STRING", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "3d3dd8f9506dffb1" + }, + { + "message": "under_scored", + "format": "STRING", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "47b765f6030ccaa4" + }, + { + "message": "", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": null, + "hash": "315afb75bf9bd6ff" + }, + { + "message": "", + "format": "I18NEXT", + "context": "", + "id": null, + "maxChars": null, + "hash": "315afb75bf9bd6ff" + }, + { + "message": "", + "format": "I18NEXT", + "context": "button", + "id": null, + "maxChars": null, + "hash": "3dcca7cb9efacea9" + }, + { + "message": "", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "1f5cddd18eca035c" + }, + { + "message": "", + "format": "I18NEXT", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "c99855c0e93d1f4c" + }, + { + "message": "", + "format": "I18NEXT", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "f4206f2900405d61" + }, + { + "message": "", + "format": "I18NEXT", + "context": null, + "id": "", + "maxChars": null, + "hash": "315afb75bf9bd6ff" + }, + { + "message": "", + "format": "I18NEXT", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "56ef2250a74df3ee" + }, + { + "message": "", + "format": "I18NEXT", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "2110331c124b8690" + }, + { + "message": "", + "format": "I18NEXT", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "4f4168248fc2fe7a" + }, + { + "message": "", + "format": "I18NEXT", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "474080c85f5cb22a" + }, + { + "message": "", + "format": "I18NEXT", 
+ "context": null, + "id": null, + "maxChars": 0, + "hash": "f6cd9cec84563c1d" + }, + { + "message": "", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 5, + "hash": "014a6dca945b6a87" + }, + { + "message": "", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 50, + "hash": "6ae174a68a143612" + }, + { + "message": "", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "b76e623fa706fc16" + }, + { + "message": "", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -10, + "hash": "0979f8f050cef676" + }, + { + "message": "", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -500, + "hash": "5a5da1a95220be74" + }, + { + "message": "", + "format": "I18NEXT", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "5cd6b0af5091e522" + }, + { + "message": "", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "531007d406e9c8c0" + }, + { + "message": "", + "format": "I18NEXT", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "554908db4faa724e" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": null, + "hash": "8a94530e065ca454" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": "", + "id": null, + "maxChars": null, + "hash": "8a94530e065ca454" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": "button", + "id": null, + "maxChars": null, + "hash": "6d10ea676d5213f1" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "8bd0e248f242184c" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "b3330b9f1ed7c45e" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "6cca344941d68d1a" + 
}, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": "", + "maxChars": null, + "hash": "8a94530e065ca454" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "2167fa2476ca61ce" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "1537d82daef9a43f" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "347a2f76a4159ea1" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "771dfe9039f93003" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 0, + "hash": "84f7098fba574254" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 5, + "hash": "415642d1d2bcefa1" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 50, + "hash": "86f094a1d006c37b" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "15518bd6875fc262" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -10, + "hash": "81b0ff5343a63de7" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -500, + "hash": "07e3018ba3824394" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "bbdb715a7065f09a" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "adef296255203f93" + }, + { + "message": "Hello", + "format": "I18NEXT", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "c77dce425441cbb9" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", 
+ "context": null, + "id": null, + "maxChars": null, + "hash": "97607418e88e200c" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": "", + "id": null, + "maxChars": null, + "hash": "97607418e88e200c" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": "button", + "id": null, + "maxChars": null, + "hash": "24313bc9879ef434" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "72f52be893374b70" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "702cb57eaa3ca87a" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "2621e80a2589da84" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": null, + "id": "", + "maxChars": null, + "hash": "97607418e88e200c" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "652d745c6dc8d007" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "6fa8675026d7b1c0" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "a0c9adfe5ff62eca" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "d09a4de576d951a0" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 0, + "hash": "b0fed9e112fd29de" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 5, + "hash": "7c3ec9eecdaca0be" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 50, + "hash": 
"f715ac098c821cd4" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "87c412a0d1771f67" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -10, + "hash": "96e2e6ede2246337" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -500, + "hash": "f21e905fd9bf39fd" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "325906b655a99c28" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "e78c4e9abf596a9d" + }, + { + "message": "Hello, world!", + "format": "I18NEXT", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "c7fc3430c18a1872" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": null, + "hash": "c5c58c5e7d6c5cc2" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": "", + "id": null, + "maxChars": null, + "hash": "c5c58c5e7d6c5cc2" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": "button", + "id": null, + "maxChars": null, + "hash": "af95e8c940669ff9" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "e260806c2fffc236" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "8363f86a05d10fb8" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "8debf1f6ed343457" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": null, + "id": "", + "maxChars": null, + "hash": "c5c58c5e7d6c5cc2" + }, + { + "message": "Hello, 
{{name}}!", + "format": "I18NEXT", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "a033c5f1f18224f2" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "8726fdc3465859c3" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "1441068df7c82ed5" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "e0fa8876878b09c9" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 0, + "hash": "ceba899385756b2b" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 5, + "hash": "9235ddca886a9a9f" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 50, + "hash": "9d321667973cb6a1" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "0dc102eeb8deedc1" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -10, + "hash": "709c15a058a60bce" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -500, + "hash": "6c64575d25b6019b" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "ea1e481c189107a2" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "78e09f54fce1b9ed" + }, + { + "message": "Hello, {{name}}!", + "format": "I18NEXT", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "913c547fad8f8443" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": 
null, + "id": null, + "maxChars": null, + "hash": "f520f0395b9b90ab" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": "", + "id": null, + "maxChars": null, + "hash": "f520f0395b9b90ab" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": "button", + "id": null, + "maxChars": null, + "hash": "a8836a699ab5addb" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "c08e82528167d38b" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "c57c121a85028534" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "fd21a00fb6ce846e" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": null, + "id": "", + "maxChars": null, + "hash": "f520f0395b9b90ab" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "dc8f6cd08c23ae20" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "e76db6ebbd067a0b" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "ccb9fd3aba7b3562" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "4b4d10ef41759173" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 0, + "hash": "7e2036dda32aa562" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 5, + "hash": "3870d2301c0e3cef" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": null, + 
"id": null, + "maxChars": 50, + "hash": "e77df492ab9bb809" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "ca773203fe51c23f" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -10, + "hash": "266b6aaa7d317b82" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -500, + "hash": "3c37c832125797ad" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "4028106ef709cbf6" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "7e8fde502af6d3c7" + }, + { + "message": "Welcome {{user}}!", + "format": "I18NEXT", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "14960b983501dccd" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": null, + "hash": "807727a66ae047ad" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": "", + "id": null, + "maxChars": null, + "hash": "807727a66ae047ad" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": "button", + "id": null, + "maxChars": null, + "hash": "b135f05ec641b984" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "8196eb9c1f3cc24e" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "97d1ff668c999a67" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "e37754183638fb51" + }, + { + "message": "You have {{count}} items", + "format": 
"I18NEXT", + "context": null, + "id": "", + "maxChars": null, + "hash": "807727a66ae047ad" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "5e9be87a98930b80" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "61c678c400d7de66" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "1b9ad216e2ad1733" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "2cb9adce05e76c50" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 0, + "hash": "9a6462325808d7b9" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 5, + "hash": "5796df0ffad95a48" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 50, + "hash": "45f002e880efffd2" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "4ffe6cc3e32cb318" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -10, + "hash": "ec7f5b99c14caeca" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -500, + "hash": "71d4a8153835d55e" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "1029ee5d3555d8a6" + }, + { + "message": "You have {{count}} items", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "b851517517a3fe02" + }, + { + 
"message": "You have {{count}} items", + "format": "I18NEXT", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "37a8c92e61366b0f" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": null, + "hash": "84fad1da4b4d8d28" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": "", + "id": null, + "maxChars": null, + "hash": "84fad1da4b4d8d28" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": "button", + "id": null, + "maxChars": null, + "hash": "9533a8d2d2259b44" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "39bbfdc63a3005b3" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "8b6a3697a3a1d683" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "d89816a75325fda1" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": "", + "maxChars": null, + "hash": "84fad1da4b4d8d28" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "952955b43cc094be" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "ede2836cee650150" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "242c63e0f9ba9865" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": 
"4ace64643cbaa3a9" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 0, + "hash": "53a078e9411f1842" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 5, + "hash": "9e81bab30e04e966" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 50, + "hash": "8a244ef4926dc89e" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "253df1745d0b788c" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -10, + "hash": "975eae2aa3ef2bf8" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -500, + "hash": "e69e200cf25be24c" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "9264f921b80503bf" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "d56dd0fc5d5f079d" + }, + { + "message": "Nested {{outer}} with {{inner}}", + "format": "I18NEXT", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "f8fdeb8787636849" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": null, + "hash": "76aeeb5024e5ffb4" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": "", + "id": null, + "maxChars": null, + "hash": "76aeeb5024e5ffb4" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": "button", + "id": null, + "maxChars": null, + "hash": "5cb77018fd39cd1f" + }, + { + "message": "Mixed {{name}} and 
{count}", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "9cdbaf31533aa757" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "6752418442104028" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "dcf46859953f77a5" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + "id": "", + "maxChars": null, + "hash": "76aeeb5024e5ffb4" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "a83192ee06db1316" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "99548407707bc2b2" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "e54c8b74c863c75b" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "04097ce0f0f6d6a6" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 0, + "hash": "a668889eef7894a0" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 5, + "hash": "25a845601600753c" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 50, + "hash": "453bffa7a766dac1" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "f236fc529f51c242" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + 
"id": null, + "maxChars": -10, + "hash": "91e752b1731264da" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -500, + "hash": "18da69d426f33b34" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "031d1837725364e3" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "c82d84081397e8bb" + }, + { + "message": "Mixed {{name}} and {count}", + "format": "I18NEXT", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "a849457bdce39152" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": null, + "hash": "d0e7f3803867895a" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": "", + "id": null, + "maxChars": null, + "hash": "d0e7f3803867895a" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": "button", + "id": null, + "maxChars": null, + "hash": "db6ee86696968eac" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "fbd471080a51fea1" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "6588caa565d14344" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "153abb17cfe3e907" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": "", + "maxChars": null, + "hash": "d0e7f3803867895a" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "dcd8d23f5fb2c464" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": 
"user.profile.button", + "maxChars": null, + "hash": "5dcfc2a8d04b3456" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "f843e418b2dc7a7c" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "7c206b0926c8403b" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 0, + "hash": "4dabdd7b6ed6efa9" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 5, + "hash": "22c244f06e3c17a2" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 50, + "hash": "58981442bef7abd3" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "d5cbd46c5577ce50" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -10, + "hash": "f8c5944c94572a07" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -500, + "hash": "092dfad1c52d0bb8" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "ea0d9da29853f102" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "2cbc57f928044281" + }, + { + "message": " whitespace ", + "format": "I18NEXT", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "68b4889fb91f41bb" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": null, + "hash": "4e8c64d9aad16ad0" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": "", + "id": null, + "maxChars": null, + "hash": "4e8c64d9aad16ad0" + }, + { + "message": "emoji 🎉 {{name}}", + 
"format": "I18NEXT", + "context": "button", + "id": null, + "maxChars": null, + "hash": "928aca849e4ded96" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "d5052e024b996e60" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "71666173f29f231b" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "280b8e578bec02cd" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": null, + "id": "", + "maxChars": null, + "hash": "4e8c64d9aad16ad0" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "35c84e4fb754c144" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": null, + "id": "user.profile.button", + "maxChars": null, + "hash": "d7e2a5a4c29866b8" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "ca85b4669c4a78eb" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "bed10f3d836ad045" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 0, + "hash": "6fe0cd3e1f773235" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 5, + "hash": "94f8f1fbbc16564e" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 50, + "hash": "7bdffb9a4124523a" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "ea4ad0fbd888ea21" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + 
"context": null, + "id": null, + "maxChars": -10, + "hash": "f3e614cb827cd6c6" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -500, + "hash": "e5039876e1293598" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "d41598c5efe634d8" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "30e4e35c2cc21e45" + }, + { + "message": "emoji 🎉 {{name}}", + "format": "I18NEXT", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "28e6a039c86df2cc" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": null, + "hash": "f8a230d7e7fd5845" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": "", + "id": null, + "maxChars": null, + "hash": "f8a230d7e7fd5845" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": "button", + "id": null, + "maxChars": null, + "hash": "42899e55d9498421" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": null, + "hash": "3ed1b0fe525e0253" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": "a_very_long_context_value_for_testing", + "id": null, + "maxChars": null, + "hash": "7a409a08dc64c576" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": "こんにちは", + "id": null, + "maxChars": null, + "hash": "8300a8beab9a9b73" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": "", + "maxChars": null, + "hash": "f8a230d7e7fd5845" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": "greeting", + "maxChars": null, + "hash": "12df23dd99e1956c" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": 
"user.profile.button", + "maxChars": null, + "hash": "3bfe4afe2b19f5e8" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": "x/y/z", + "maxChars": null, + "hash": "3ba092d7ee2a761c" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": "a-b-c", + "maxChars": null, + "hash": "5acaa88c8fa04691" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 0, + "hash": "d078c4aa2306dee4" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 5, + "hash": "72ca7b9c02c45394" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 50, + "hash": "ca3c460117fd8f3b" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": 1000, + "hash": "4ddfcf1fd4ac2abe" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -10, + "hash": "34f504eb82935787" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": null, + "id": null, + "maxChars": -500, + "hash": "ec32e142c2109850" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": "button", + "id": "save_btn", + "maxChars": null, + "hash": "59203d11a737f80e" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": "menu", + "id": null, + "maxChars": 20, + "hash": "de8290076381b5c2" + }, + { + "message": "{{a}}{{b}}{{c}}", + "format": "I18NEXT", + "context": "menu", + "id": "menu_save", + "maxChars": 50, + "hash": "6e712be705c1fa44" + } + ], + "collisionCases": [ + { + "message": "", + "context": null, + "hashes": { + "ICU": "e2b070737b68f01d", + "STRING": "301efa5d6b271581", + "I18NEXT": "315afb75bf9bd6ff" + } + }, + { + "message": "", + "context": "button", + "hashes": { + "ICU": "b3be66567482f5b6", + "STRING": "c1ba29f981c859e1", + 
"I18NEXT": "3dcca7cb9efacea9" + } + }, + { + "message": "Hello", + "context": null, + "hashes": { + "ICU": "3cdffbf6452b553c", + "STRING": "c6e2133cd6d89e5d", + "I18NEXT": "8a94530e065ca454" + } + }, + { + "message": "Hello", + "context": "button", + "hashes": { + "ICU": "7ff12157cc99c288", + "STRING": "85e6a57c73fa392c", + "I18NEXT": "6d10ea676d5213f1" + } + }, + { + "message": "Hello, {name}!", + "context": null, + "hashes": { + "ICU": "9b323e35e1a80c51", + "STRING": "71e647ee929eb212", + "I18NEXT": "da1a461df06cb6f0" + } + }, + { + "message": "Hello, {name}!", + "context": "button", + "hashes": { + "ICU": "9f2b79f9cacb6ce9", + "STRING": "63f9ca42b13c18a9", + "I18NEXT": "f61a161005f412a0" + } + }, + { + "message": "simple", + "context": null, + "hashes": { + "ICU": "193d57867ff62c8c", + "STRING": "8e9bc869411d7670", + "I18NEXT": "6baa6531de7d5372" + } + }, + { + "message": "simple", + "context": "button", + "hashes": { + "ICU": "9f1ca551758d174b", + "STRING": "7e2e008bc3d903d8", + "I18NEXT": "61c00fa9fd66061e" + } + } + ] +} \ No newline at end of file diff --git a/packages/gt-i18n/tests/i18n_manager/__init__.py b/packages/gt-i18n/tests/i18n_manager/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/packages/gt-i18n/tests/i18n_manager/test_deprecation.py b/packages/gt-i18n/tests/i18n_manager/test_deprecation.py new file mode 100644 index 0000000..99b9e70 --- /dev/null +++ b/packages/gt-i18n/tests/i18n_manager/test_deprecation.py @@ -0,0 +1,127 @@ +"""Golden-standard tests for the deprecation path on I18nManager. + +gt PR #1207 renamed several I18nManager methods but kept the old names as +backward-compat aliases. The Python port follows the same pattern but uses +``DeprecationWarning`` (Pythonic) rather than a JSDoc annotation. 
+ +Rename map: + +| Old (deprecated, still works) | New | +| ----------------------------- | --------------------------- | +| ``get_translations(locale)`` | ``load_translations(...)`` | +| ``get_translation_resolver`` | ``get_lookup_translation`` | +| ``resolve_translation_sync`` | ``lookup_translation`` | +| ``get_translation_loader`` | (pass ``load_translations`` directly) | + +Contract: +1. Calling an old method emits ``DeprecationWarning`` pointing at the new name. +2. The old method's behavior is IDENTICAL to the new method — users upgrading + later won't see a behavior change, just the warning disappears. + +All tests should FAIL until PR #1207 is ported — the new methods they delegate +to don't exist yet, and the old methods don't yet emit warnings. +""" + +from __future__ import annotations + +import pytest +from gt_i18n import I18nManager + +# --------------------------------------------------------------------------- +# test_get_translations_emits_deprecation_warning +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_get_translations_emits_deprecation_warning() -> None: + """``get_translations(locale)`` emits DeprecationWarning pointing at ``load_translations``.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda locale: {}) + + with pytest.warns(DeprecationWarning, match="load_translations"): + result = await manager.get_translations("es") + + # Behavior parity: same return value as the new name. 
+ assert result == await manager.load_translations("es") + + +# --------------------------------------------------------------------------- +# test_get_translation_resolver_emits_deprecation_warning +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_get_translation_resolver_emits_deprecation_warning() -> None: + """``get_translation_resolver(locale)`` emits DeprecationWarning → ``get_lookup_translation``.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda locale: {}) + + with pytest.warns(DeprecationWarning, match="get_lookup_translation"): + resolver = await manager.get_translation_resolver("es") + + # The returned resolver is a callable, same as get_lookup_translation's return. + assert callable(resolver) + + +# --------------------------------------------------------------------------- +# test_resolve_translation_sync_emits_deprecation_warning +# --------------------------------------------------------------------------- + + +def test_resolve_translation_sync_emits_deprecation_warning() -> None: + """``resolve_translation_sync(message, options)`` emits DeprecationWarning → ``lookup_translation``.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda locale: {}) + manager.set_locale("es") + + with pytest.warns(DeprecationWarning, match="lookup_translation"): + result = manager.resolve_translation_sync("Hello", {"_format": "ICU"}) + + # Behavior parity: same value as lookup_translation (both return None on cold cache). + assert result == manager.lookup_translation("Hello", _format="ICU") + + +# --------------------------------------------------------------------------- +# test_get_translation_loader_emits_deprecation_warning +# +# ``get_translation_loader()`` exposed the raw loader callable; the new path +# is to pass ``load_translations`` directly to the constructor (which you +# already do). 
The method is kept but warns. +# --------------------------------------------------------------------------- + + +def test_get_translation_loader_emits_deprecation_warning() -> None: + """``get_translation_loader()`` emits DeprecationWarning.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda locale: {}) + + with pytest.warns(DeprecationWarning, match="load_translations"): + loader = manager.get_translation_loader() + + # Still returns the loader callable. + assert callable(loader) + + +# --------------------------------------------------------------------------- +# test_deprecated_methods_preserve_semantics + +# +# Contract: a deprecated method must return the same value as its replacement; +# exercised here for ``get_translations`` vs ``load_translations``. This guards +# against silent semantic drift between the pair. +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_deprecated_methods_preserve_semantics() -> None: + """Each deprecated method must return the same thing as its replacement.""" + h_map = {"hash_x": "translated-x"} + manager = I18nManager( + default_locale="en", + locales=["en", "es"], + load_translations=lambda locale: h_map if locale == "es" else {}, + ) + manager.set_locale("es") + + # get_translations vs load_translations — same dict. + import warnings + + with warnings.catch_warnings(): + warnings.simplefilter("ignore", DeprecationWarning) + old = await manager.get_translations("es") + new = await manager.load_translations("es") + assert old == new == h_map diff --git a/packages/gt-i18n/tests/i18n_manager/test_i18n_manager_runtime.py b/packages/gt-i18n/tests/i18n_manager/test_i18n_manager_runtime.py new file mode 100644 index 0000000..ac71d32 --- /dev/null +++ b/packages/gt-i18n/tests/i18n_manager/test_i18n_manager_runtime.py @@ -0,0 +1,249 @@ +"""Golden-standard tests for the new ``I18nManager`` runtime-translation API.
+ +PR #1207 added four new methods on ``I18nManager``: + +- ``lookup_translation(message, **options)`` — SYNC dict-cache lookup; returns + cached translation or None. Never triggers a network call. +- ``lookup_translation_with_fallback(message, **options)`` — ASYNC; returns the + cached translation, or fetches via runtime translate on a miss. +- ``load_translations(locale=None)`` — ASYNC; returns dict[hash, translation] + for the locale (loading if needed). Replaces deprecated ``get_translations``. +- ``get_lookup_translation(locale=None, prefetch=None)`` — ASYNC; returns a + sync callable bound to a locale, optionally prefetching a list of entries. + +All four are tested here as the observable contract of the public +``I18nManager`` class. + +All tests in this file should FAIL until PR #1207 is ported. +""" + +from __future__ import annotations + +import asyncio +from typing import TYPE_CHECKING, Any + +import pytest +from gt_i18n import I18nManager +from gt_i18n.translation_functions._hash_message import hash_message + +if TYPE_CHECKING: + from conftest import FakeGT # noqa: F401 — only for type hints + + +# --------------------------------------------------------------------------- +# test_lookup_translation_sync_returns_cached +# +# Example: +# await manager.load_translations("es") # populates cache from loader +# result = manager.lookup_translation("Hello", _format="ICU") +# # → translation from the pre-loaded dict; NO API call fired +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_lookup_translation_sync_returns_cached(fake_gt: FakeGT) -> None: + """After ``load_translations``, ``lookup_translation`` returns cached values synchronously.""" + h = hash_message("Hello") + + def loader(locale: str) -> dict[str, str]: + return {h: "Hola"} if locale == "es" else {} + + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=loader) + manager.set_locale("es") + + await 
manager.load_translations("es") + assert manager.lookup_translation("Hello", _format="ICU") == "Hola" + + # No runtime API was fired — sync dict lookup only. + assert len(fake_gt.calls) == 0 + + +# --------------------------------------------------------------------------- +# test_lookup_translation_sync_returns_none_on_miss +# +# Sync-safe: a miss does NOT trigger a fetch. The sync ``t()`` path relies on this. +# --------------------------------------------------------------------------- + + +def test_lookup_translation_sync_returns_none_on_miss(fake_gt: FakeGT) -> None: + """``lookup_translation`` returns None on miss and does not trigger runtime translate.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda locale: {}) + manager.set_locale("es") + + assert manager.lookup_translation("Never-seen message", _format="ICU") is None + assert len(fake_gt.calls) == 0, "sync lookup_translation MUST NOT fire translate_many" + + +# --------------------------------------------------------------------------- +# test_lookup_translation_with_fallback_triggers_runtime_on_miss +# +# Example: +# result = await manager.lookup_translation_with_fallback( +# "Greeting", _format="STRING" +# ) +# # → translate_many called; result matches the mock response. 
+# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_lookup_translation_with_fallback_triggers_runtime_on_miss(fake_gt: FakeGT) -> None: + """``lookup_translation_with_fallback`` hits translate_many on miss and returns fetched translation.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda locale: {}) + manager.set_locale("es") + + result = await manager.lookup_translation_with_fallback("Greeting", _format="STRING") + + assert result == "[es]Greeting" + assert len(fake_gt.calls) == 1 + + +# --------------------------------------------------------------------------- +# test_lookup_translation_with_fallback_returns_cached_without_api_call +# +# Second call for the same message returns cached value — no second API call. +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_lookup_translation_with_fallback_returns_cached_without_api_call(fake_gt: FakeGT) -> None: + """Second call for the same message is a cache hit — no repeat API call.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda locale: {}) + manager.set_locale("es") + + first = await manager.lookup_translation_with_fallback("Hi", _format="STRING") + second = await manager.lookup_translation_with_fallback("Hi", _format="STRING") + + assert first == second == "[es]Hi" + assert len(fake_gt.calls) == 1, "second lookup must be served from cache" + + +# --------------------------------------------------------------------------- +# test_load_translations_returns_dict_for_locale +# +# JS rename: ``getTranslations(locale)`` → ``loadTranslations(locale)``. +# Python gets ``load_translations(locale)`` as the new canonical name. 
+ +# Example: +# result = await manager.load_translations("es") +# # → {"<hash>": "<translation>", ...} +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_load_translations_returns_dict_for_locale(fake_gt: FakeGT) -> None: + """``load_translations(locale)`` returns the dict of {hash: translation} for that locale.""" + loaded = {"hash_a": "valA", "hash_b": "valB"} + + def loader(locale: str) -> dict[str, str]: + return loaded if locale == "es" else {} + + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=loader) + + result = await manager.load_translations("es") + assert result == loaded + + +# --------------------------------------------------------------------------- +# test_get_lookup_translation_returns_callable_after_prefetch +# +# Mirrors the JS ``useGT`` / ``getGT`` pattern: caller awaits once to prefetch, +# then receives a SYNC lookup callable it can use in hot paths. +# +# Example: +# lookup = await manager.get_lookup_translation( +# "es", +# prefetch=[{"message": "Hi", "options": {"_format": "STRING"}}], +# ) +# # After the prefetch completes, lookup is sync and cache-hits. +# result = lookup("Hi", {"_format": "STRING"}) +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_get_lookup_translation_returns_callable_after_prefetch(fake_gt: FakeGT) -> None: + """``get_lookup_translation`` returns a sync callable that resolves from the prefetched cache.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda locale: {}) + manager.set_locale("es") + + lookup = await manager.get_lookup_translation( + "es", + prefetch=[{"message": "Hi", "options": {"_format": "STRING"}}], + ) + + # The prefetch fetched "Hi"; the returned callable is sync and cache-hits.
+ calls_before_lookup = len(fake_gt.calls) + result = lookup("Hi", {"_format": "STRING"}) + assert result == "[es]Hi" + assert len(fake_gt.calls) == calls_before_lookup, "sync lookup after prefetch must not fire API" + + +# --------------------------------------------------------------------------- +# test_lifecycle_callbacks_passed_in_constructor_fire_correctly +# +# The I18nManager accepts a `lifecycle` dict and wires it through to the +# internal caches. This test asserts miss → hit transitions on the +# translations-level callbacks. +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_lifecycle_callbacks_passed_in_constructor_fire_correctly(fake_gt: FakeGT) -> None: + """``lifecycle`` kwarg on I18nManager wires callbacks through to the caches.""" + hits: list[dict[str, Any]] = [] + misses: list[dict[str, Any]] = [] + + manager = I18nManager( + default_locale="en", + locales=["en", "es"], + load_translations=lambda locale: {}, + lifecycle={ + "on_translations_cache_hit": lambda *, locale, hash, value: hits.append( + {"locale": locale, "hash": hash, "value": value} + ), + "on_translations_cache_miss": lambda *, locale, hash, value: misses.append( + {"locale": locale, "hash": hash, "value": value} + ), + }, + ) + manager.set_locale("es") + + # First call: miss callback fires (runtime translate happens). + await manager.lookup_translation_with_fallback("Hello", _format="STRING") + assert len(misses) == 1 + assert len(hits) == 0 + + # Second call for the same message: hit callback fires; no new miss. 
+ await manager.lookup_translation_with_fallback("Hello", _format="STRING") + assert len(misses) == 1 + assert len(hits) == 1 + + +# --------------------------------------------------------------------------- +# test_batching_params_configurable_on_i18n_manager +# +# Pin: batch_size / batch_interval_ms / max_concurrent_requests can be set on +# I18nManager and flow through to the TranslationsCache it creates. +# Asserted behaviorally: batch_size=2 means 5 concurrent fetches produce at +# least 3 batches (none > 2 entries). +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_batching_params_configurable_on_i18n_manager(fake_gt: FakeGT) -> None: + """Batching constants set on I18nManager take effect in the underlying cache.""" + manager = I18nManager( + default_locale="en", + locales=["en", "es"], + load_translations=lambda locale: {}, + batch_size=2, + batch_interval_ms=5, + ) + manager.set_locale("es") + + messages = [f"msg-{i}" for i in range(5)] + await asyncio.gather(*(manager.lookup_translation_with_fallback(m, _format="STRING") for m in messages)) + + assert sum(len(c["sources"]) for c in fake_gt.calls) == 5 + assert all(len(c["sources"]) <= 2 for c in fake_gt.calls), ( + f"batch_size=2 was not honored: batches={[len(c['sources']) for c in fake_gt.calls]}" + ) diff --git a/packages/gt-i18n/tests/i18n_manager/test_i18n_manager_scenarios.py b/packages/gt-i18n/tests/i18n_manager/test_i18n_manager_scenarios.py new file mode 100644 index 0000000..163f8a5 --- /dev/null +++ b/packages/gt-i18n/tests/i18n_manager/test_i18n_manager_scenarios.py @@ -0,0 +1,371 @@ +"""Extended integration tests for ``I18nManager``. + +Dense coverage for locale switching, preloading, deprecation parity across +many inputs, and source-locale short-circuit semantics. 
+""" + +from __future__ import annotations + +import asyncio +import warnings +from typing import TYPE_CHECKING, Any + +import pytest +from gt_i18n import I18nManager + +if TYPE_CHECKING: + from conftest import FakeGT # noqa: F401 + + +# --------------------------------------------------------------------------- +# Locale switching +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_locale_switch_midflight_routes_to_correct_locale(fake_gt: FakeGT) -> None: + """Switching manager's locale between calls sends each to the correct target.""" + manager = I18nManager(default_locale="en", locales=["en", "es", "fr", "de"], load_translations=lambda loc: {}) + + manager.set_locale("es") + await manager.lookup_translation_with_fallback("Hello", _format="STRING") + manager.set_locale("fr") + await manager.lookup_translation_with_fallback("Hello", _format="STRING") + manager.set_locale("de") + await manager.lookup_translation_with_fallback("Hello", _format="STRING") + + targets = [c["options"].get("target_locale") or c["options"].get("targetLocale") for c in fake_gt.calls] + assert set(targets) == {"es", "fr", "de"} + + +@pytest.mark.asyncio +async def test_per_call_locale_override_does_not_change_manager_state(fake_gt: FakeGT) -> None: + """``_locale=...`` on a single call does NOT mutate manager.get_locale().""" + manager = I18nManager(default_locale="en", locales=["en", "es", "fr"], load_translations=lambda loc: {}) + manager.set_locale("es") + await manager.lookup_translation_with_fallback("Hi", _format="STRING", _locale="fr") + assert manager.get_locale() == "es" # unchanged + + +# --------------------------------------------------------------------------- +# Preload + sync lookup path +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_load_all_preloads_every_configured_locale(fake_gt: FakeGT) -> None: + """``load_all_translations`` 
calls the loader once per locale.""" + loaded: list[str] = [] + + def loader(locale: str) -> dict[str, str]: + loaded.append(locale) + return {} + + manager = I18nManager(default_locale="en", locales=["en", "es", "fr", "ja"], load_translations=loader) + await manager.load_all_translations() + assert set(loaded) == {"en", "es", "fr", "ja"} + + +@pytest.mark.asyncio +async def test_sync_lookup_after_preload_hits_cache_every_time(fake_gt: FakeGT) -> None: + """Preloaded translations are available to sync ``lookup_translation``.""" + from gt_i18n.translation_functions._hash_message import hash_message + + h = hash_message("Hello, {name}!", format="ICU") + + def loader(locale: str) -> dict[str, str]: + return {h: "Hola, {name}!"} if locale == "es" else {} + + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=loader) + await manager.load_translations("es") + manager.set_locale("es") + + for _ in range(10): + assert manager.lookup_translation("Hello, {name}!", _format="ICU") == "Hola, {name}!" 
+ assert len(fake_gt.calls) == 0 + + +# --------------------------------------------------------------------------- +# Source-locale short circuits +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_lookup_with_fallback_at_source_locale_returns_none_and_no_api(fake_gt: FakeGT) -> None: + """For the source locale, lookup_translation_with_fallback returns None and skips API.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda loc: {}) + manager.set_locale("en") + result = await manager.lookup_translation_with_fallback("Hello", _format="STRING") + assert result is None + assert len(fake_gt.calls) == 0 + + +def test_sync_lookup_at_source_locale_returns_none_without_touching_cache(fake_gt: FakeGT) -> None: + """Source-locale sync lookup returns None early, never reads the cache.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda loc: {}) + manager.set_locale("en") + assert manager.lookup_translation("Hello", _format="STRING") is None + + +# --------------------------------------------------------------------------- +# load_translations behavior +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_load_translations_twice_uses_cache(fake_gt: FakeGT) -> None: + """Second call to load_translations for the same locale doesn't re-invoke the loader.""" + calls = [0] + + def loader(locale: str) -> dict[str, str]: + calls[0] += 1 + return {"h": "v"} + + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=loader) + a = await manager.load_translations("es") + b = await manager.load_translations("es") + assert a == b == {"h": "v"} + assert calls[0] == 1 + + +@pytest.mark.asyncio +async def test_load_translations_uses_current_locale_when_none_given(fake_gt: FakeGT) -> None: + """Calling load_translations() with no arg uses the 
manager's current locale.""" + loaded: list[str] = [] + + def loader(locale: str) -> dict[str, str]: + loaded.append(locale) + return {} + + manager = I18nManager(default_locale="en", locales=["en", "fr"], load_translations=loader) + manager.set_locale("fr") + await manager.load_translations() + assert loaded == ["fr"] + + +# --------------------------------------------------------------------------- +# get_lookup_translation semantics +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_get_lookup_translation_without_prefetch_still_returns_callable( + fake_gt: FakeGT, +) -> None: + """``get_lookup_translation`` with no prefetch still returns a working sync callable.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda loc: {}) + lookup = await manager.get_lookup_translation("es") + # Nothing prefetched → cache cold → lookup misses. + result = lookup("Hello", {"_format": "STRING"}) + assert result is None + + +@pytest.mark.asyncio +async def test_get_lookup_translation_with_multi_prefetch_warms_all(fake_gt: FakeGT) -> None: + """Every entry in ``prefetch`` becomes a cache hit on the returned callable.""" + manager = I18nManager( + default_locale="en", locales=["en", "es"], load_translations=lambda loc: {}, batch_interval_ms=5 + ) + manager.set_locale("es") + prefetch = [ + {"message": "one", "options": {"_format": "STRING"}}, + {"message": "two", "options": {"_format": "STRING"}}, + {"message": "three", "options": {"_format": "STRING"}}, + ] + lookup = await manager.get_lookup_translation("es", prefetch=prefetch) + assert lookup("one", {"_format": "STRING"}) == "[es]one" + assert lookup("two", {"_format": "STRING"}) == "[es]two" + assert lookup("three", {"_format": "STRING"}) == "[es]three" + + +@pytest.mark.asyncio +async def test_get_lookup_translation_with_empty_prefetch_list(fake_gt: FakeGT) -> None: + """``prefetch=[]`` is valid and equivalent to no 
prefetch.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda loc: {}) + lookup = await manager.get_lookup_translation("es", prefetch=[]) + assert lookup("x", {"_format": "STRING"}) is None + + +# --------------------------------------------------------------------------- +# Deprecation parity (matrix) +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +@pytest.mark.parametrize( + "message,options", + [ + ("Hello", {"_format": "ICU"}), + ("Save", {"_format": "ICU", "_context": "button"}), + ("Long message with {vars}", {"_format": "ICU"}), + ("Plain STRING", {"_format": "STRING"}), + ("{{i18next}}", {"_format": "I18NEXT"}), + ], +) +async def test_resolve_translation_sync_parity_with_lookup_translation( + fake_gt: FakeGT, message: str, options: dict[str, Any] +) -> None: + """Deprecated ``resolve_translation_sync`` must return identical results to ``lookup_translation``.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda loc: {}) + manager.set_locale("es") + + with warnings.catch_warnings(): + warnings.simplefilter("ignore", DeprecationWarning) + a = manager.resolve_translation_sync(message, options) + b = manager.lookup_translation(message, **options) + assert a == b + + +@pytest.mark.asyncio +async def test_get_translations_parity_with_load_translations_for_multi_locale(fake_gt: FakeGT) -> None: + """Deprecated ``get_translations`` returns the same dict as ``load_translations`` for each locale.""" + data = {"es": {"h1": "es1"}, "fr": {"h2": "fr2"}, "de": {"h3": "de3"}} + + def loader(locale: str) -> dict[str, str]: + return data.get(locale, {}) + + manager = I18nManager(default_locale="en", locales=["en", *data.keys()], load_translations=loader) + + for loc in data: + with warnings.catch_warnings(): + warnings.simplefilter("ignore", DeprecationWarning) + old = await manager.get_translations(loc) + new = await 
manager.load_translations(loc) + assert old == new == data[loc] + + +def test_get_translation_loader_returns_same_callable_as_constructed() -> None: + """``get_translation_loader`` returns the loader that was passed in.""" + + def my_loader(locale: str) -> dict[str, str]: + return {} + + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=my_loader) + with warnings.catch_warnings(): + warnings.simplefilter("ignore", DeprecationWarning) + got = manager.get_translation_loader() + assert got is my_loader + + +# --------------------------------------------------------------------------- +# Deprecation warning contents +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_get_translations_warning_message_mentions_load_translations() -> None: + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda loc: {}) + with pytest.warns(DeprecationWarning, match="load_translations") as record: + await manager.get_translations("es") + assert len(record) >= 1 + + +@pytest.mark.asyncio +async def test_get_translation_resolver_warning_mentions_get_lookup_translation() -> None: + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda loc: {}) + with pytest.warns(DeprecationWarning, match="get_lookup_translation"): + await manager.get_translation_resolver("es") + + +def test_resolve_translation_sync_warning_mentions_lookup_translation() -> None: + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda loc: {}) + manager.set_locale("es") + with pytest.warns(DeprecationWarning, match="lookup_translation"): + manager.resolve_translation_sync("Hi", {"_format": "ICU"}) + + +def test_get_translation_loader_warning_mentions_load_translations() -> None: + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda loc: {}) + with pytest.warns(DeprecationWarning, 
match="load_translations"): + manager.get_translation_loader() + + +# --------------------------------------------------------------------------- +# Batching param defaults + overrides +# --------------------------------------------------------------------------- + + +def test_default_batching_params_match_js_defaults() -> None: + """Default batch_size/interval/concurrency values match JS constants.""" + manager = I18nManager(default_locale="en", locales=["en", "es"], load_translations=lambda loc: {}) + tc = asyncio.run(manager._locales_cache.miss("es")) + assert tc._batch_size == 25 + assert tc._batch_interval_ms == 50 + assert tc._max_concurrent_requests == 100 + + +@pytest.mark.asyncio +async def test_batching_kwargs_flow_through_to_translations_cache(fake_gt: FakeGT) -> None: + """Custom batching constants reach the TranslationsCache instances.""" + manager = I18nManager( + default_locale="en", + locales=["en", "es"], + load_translations=lambda loc: {}, + batch_size=7, + batch_interval_ms=13, + max_concurrent_requests=3, + ) + tc = await manager._locales_cache.miss("es") + assert tc._batch_size == 7 + assert tc._batch_interval_ms == 13 + assert tc._max_concurrent_requests == 3 + + +# --------------------------------------------------------------------------- +# Lifecycle callback fanout +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_lifecycle_is_shared_between_locales(fake_gt: FakeGT) -> None: + """The same lifecycle dict fires callbacks for every locale served.""" + misses: list[str] = [] + + manager = I18nManager( + default_locale="en", + locales=["en", "es", "fr"], + load_translations=lambda loc: {}, + lifecycle={ + "on_translations_cache_miss": lambda *, locale, hash, value: misses.append(locale), + }, + ) + manager.set_locale("es") + await manager.lookup_translation_with_fallback("A", _format="STRING") + manager.set_locale("fr") + await manager.lookup_translation_with_fallback("B", 
_format="STRING") + assert misses == ["es", "fr"] + + +@pytest.mark.asyncio +async def test_lifecycle_locales_miss_fires_for_each_locale_load(fake_gt: FakeGT) -> None: + """Each locale's initial load fires ``on_locales_cache_miss`` once.""" + observed: list[str] = [] + + manager = I18nManager( + default_locale="en", + locales=["en", "es", "fr"], + load_translations=lambda loc: {}, + lifecycle={"on_locales_cache_miss": lambda *, locale, value: observed.append(locale)}, + ) + await manager.load_translations("es") + await manager.load_translations("fr") + assert observed == ["es", "fr"] + + +# --------------------------------------------------------------------------- +# Translation timeout forwarding +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_translation_timeout_ms_reaches_translate_many(fake_gt: FakeGT) -> None: + """``translation_timeout_ms`` kwarg is passed through to GT.translate_many calls.""" + manager = I18nManager( + default_locale="en", + locales=["en", "es"], + load_translations=lambda loc: {}, + translation_timeout_ms=1234, + ) + manager.set_locale("es") + await manager.lookup_translation_with_fallback("Hi", _format="STRING") + assert fake_gt.calls[0]["timeout"] == 1234 diff --git a/packages/gt-i18n/tests/i18n_manager/test_locales_cache.py b/packages/gt-i18n/tests/i18n_manager/test_locales_cache.py new file mode 100644 index 0000000..2328523 --- /dev/null +++ b/packages/gt-i18n/tests/i18n_manager/test_locales_cache.py @@ -0,0 +1,262 @@ +"""Golden-standard tests for ``LocalesCache``. + +``LocalesCache`` is the outer cache in the two-level hierarchy introduced by +gt PR #1207. It maps ``locale → TranslationsCache``, enforces a TTL, and +invokes the user's ``load_translations`` loader exactly once per locale +per TTL window (with concurrent callers deduped via the same in-flight task). + +These tests lock the externally-observable behavior. 
They exercise the +private ``_locales_cache`` module directly because ``LocalesCache`` is the +contract that ``I18nManager`` depends on — if the shape changes, callers +upstream break. + +All tests in this file should FAIL until PR #1207 is ported (the module, +class, and constructor kwargs don't exist yet). +""" + +from __future__ import annotations + +import asyncio +from collections.abc import Awaitable, Callable +from typing import Any + +import pytest + +# Imports deliberately reference the not-yet-existent modules. These will +# ImportError until implementation lands — that's the point of the golden tests. +from gt_i18n.i18n_manager._locales_cache import LocalesCache # noqa: E402 + + +def _noop_translate_many_factory(locale: str) -> Callable[[dict[str, Any]], Awaitable[dict[str, Any]]]: + """Dummy ``create_translate_many`` factory — returns a callable that never runs. + + LocalesCache tests don't exercise runtime translation; they only care about the + locale-level loader behavior. Pass this when the test doesn't use translate_many. + """ + + async def _never(sources: dict[str, Any]) -> dict[str, Any]: + raise AssertionError("translate_many should not be called by LocalesCache tests") + + return _never + + +# --------------------------------------------------------------------------- +# test_miss_calls_loader_once +# +# Example: +# cache = LocalesCache(load_translations=loader, ...) 
+# await cache.miss("es") # loader invoked, result cached +# await cache.miss("es") # within TTL: cached, loader NOT invoked again +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_miss_calls_loader_once() -> None: + """Within the TTL window, a locale's loader is called once even across multiple misses.""" + call_count = 0 + + def loader(locale: str) -> dict[str, str]: + nonlocal call_count + call_count += 1 + return {"hash_greeting": "Hola"} + + cache = LocalesCache( + load_translations=loader, + create_translate_many=_noop_translate_many_factory, + ttl_ms=60_000, + ) + + first = await cache.miss("es") + second = await cache.miss("es") + + assert call_count == 1, "loader should have been invoked exactly once" + # Both calls return the same TranslationsCache instance — we have a single canonical cache per locale. + assert first is second + + +# --------------------------------------------------------------------------- +# test_expired_entry_triggers_reload +# +# Example: +# cache = LocalesCache(ttl_ms=10, ...) 
+# await cache.miss("es") +# await asyncio.sleep(0.02) # exceed the 10ms TTL +# await cache.miss("es") # loader invoked again +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_expired_entry_triggers_reload() -> None: + """After the TTL elapses, the next access re-invokes the loader.""" + call_count = 0 + + def loader(locale: str) -> dict[str, str]: + nonlocal call_count + call_count += 1 + return {} + + cache = LocalesCache( + load_translations=loader, + create_translate_many=_noop_translate_many_factory, + ttl_ms=10, # 10ms TTL for fast test + ) + + await cache.miss("es") + assert call_count == 1 + await asyncio.sleep(0.02) # 20ms > 10ms TTL + await cache.miss("es") + assert call_count == 2, "loader should have been re-invoked after TTL expiry" + + +# --------------------------------------------------------------------------- +# test_concurrent_miss_for_same_locale_dedups +# +# Mirrors the JS Cache.fallbackPromises pattern: if two callers miss() for the +# same locale while the loader is still in flight, they share one task and +# one loader invocation. 
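+#
+# A Python analogue of the fallbackPromises pattern (sketch only — internal
+# names like ``_in_flight`` and ``_load_locale`` are illustrative, not pinned
+# by these tests):
+#
+#     task = self._in_flight.get(locale)
+#     if task is None:
+#         task = asyncio.ensure_future(self._load_locale(locale))
+#         self._in_flight[locale] = task
+#     return await task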
+# +# Example: +# async def slow_loader(locale): +# await asyncio.sleep(0.02) +# return {} +# results = await asyncio.gather(cache.miss("es"), cache.miss("es")) +# # loader called ONCE +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_concurrent_miss_for_same_locale_dedups() -> None: + """Concurrent misses for the same locale share one in-flight loader task.""" + call_count = 0 + + async def slow_loader(locale: str) -> dict[str, str]: + nonlocal call_count + call_count += 1 + await asyncio.sleep(0.02) + return {"hash_greeting": "Bonjour"} + + cache = LocalesCache( + load_translations=slow_loader, + create_translate_many=_noop_translate_many_factory, + ttl_ms=60_000, + ) + + a, b, c = await asyncio.gather(cache.miss("fr"), cache.miss("fr"), cache.miss("fr")) + + assert call_count == 1, "concurrent misses should dedupe to a single loader invocation" + assert a is b is c, "all concurrent callers should receive the same TranslationsCache" + + +# --------------------------------------------------------------------------- +# test_loader_exception_does_not_poison_cache +# +# If the loader raises once, the next miss() should retry — a poisoned cache +# would leave the locale permanently broken. +# +# Example: +# counter = [0] +# def flaky_loader(locale): +# counter[0] += 1 +# if counter[0] == 1: +# raise RuntimeError("transient") +# return {"hash_x": "OK"} +# # first miss raises; second miss succeeds. 
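+#
+# (Implementation note, not pinned by this test: whatever in-flight task the
+# dedup logic holds must be dropped once it fails — if a failed task stayed
+# cached, every subsequent miss() would re-raise the same error until the TTL
+# expired.)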
+# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_loader_exception_does_not_poison_cache() -> None: + """A loader failure is transient — subsequent misses retry and can succeed.""" + attempts = 0 + + def flaky_loader(locale: str) -> dict[str, str]: + nonlocal attempts + attempts += 1 + if attempts == 1: + raise RuntimeError("transient network failure") + return {"hash_x": "OK"} + + cache = LocalesCache( + load_translations=flaky_loader, + create_translate_many=_noop_translate_many_factory, + ttl_ms=60_000, + ) + + with pytest.raises(RuntimeError, match="transient"): + await cache.miss("de") + + # Second attempt should retry the loader — NOT return cached failure. + result = await cache.miss("de") + assert attempts == 2 + assert result is not None + + +# --------------------------------------------------------------------------- +# test_on_locales_cache_miss_fires_on_loader_invocation +# +# The lifecycle callback receives the resolved ``value`` (dict of translations +# from the loader) along with the locale. 
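+#
+# Example (sketch — assumes the keyword-only callback shape used throughout;
+# the metrics sink is hypothetical):
+#
+#     def on_miss(*, locale: str, value: dict[str, str]) -> None:
+#         record_cache_miss(locale, len(value))  # hypothetical observability hook
+#
+#     LocalesCache(..., lifecycle={"on_locales_cache_miss": on_miss})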
+# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_on_locales_cache_miss_fires_on_loader_invocation() -> None: + """``on_locales_cache_miss`` fires with the loaded translations payload.""" + observed: list[dict[str, Any]] = [] + + def on_miss(*, locale: str, value: dict[str, str]) -> None: + observed.append({"locale": locale, "value": value}) + + def loader(locale: str) -> dict[str, str]: + return {"hash_a": "A", "hash_b": "B"} + + cache = LocalesCache( + load_translations=loader, + create_translate_many=_noop_translate_many_factory, + ttl_ms=60_000, + lifecycle={"on_locales_cache_miss": on_miss}, + ) + + await cache.miss("es") + + assert len(observed) == 1 + assert observed[0]["locale"] == "es" + assert observed[0]["value"] == {"hash_a": "A", "hash_b": "B"} + + +# --------------------------------------------------------------------------- +# test_on_locales_cache_hit_fires_on_cached_access +# +# When a cached locale is re-accessed within the TTL, ``on_locales_cache_hit`` +# fires (and ``on_locales_cache_miss`` does not fire on that access). +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_on_locales_cache_hit_fires_on_cached_access() -> None: + """``on_locales_cache_hit`` fires on repeated access within TTL; miss does not re-fire.""" + hits: list[dict[str, Any]] = [] + misses: list[dict[str, Any]] = [] + + cache = LocalesCache( + load_translations=lambda locale: {"hash_a": "A"}, + create_translate_many=_noop_translate_many_factory, + ttl_ms=60_000, + lifecycle={ + "on_locales_cache_hit": lambda *, locale, value: hits.append({"locale": locale, "value": value}), + "on_locales_cache_miss": lambda *, locale, value: misses.append({"locale": locale, "value": value}), + }, + ) + + # First access: miss fires. 
+ await cache.miss("es") + assert len(misses) == 1 + assert len(hits) == 0 + + # Second access: hit fires; miss does not fire again. + # Note: `get()` is the sync accessor — it returns the TranslationsCache if cached. + cached = cache.get("es") + assert cached is not None + assert len(hits) == 1 + assert hits[0] == {"locale": "es", "value": {"hash_a": "A"}} + assert len(misses) == 1 # unchanged diff --git a/packages/gt-i18n/tests/i18n_manager/test_locales_cache_scenarios.py b/packages/gt-i18n/tests/i18n_manager/test_locales_cache_scenarios.py new file mode 100644 index 0000000..d868e13 --- /dev/null +++ b/packages/gt-i18n/tests/i18n_manager/test_locales_cache_scenarios.py @@ -0,0 +1,402 @@ +"""Extended behavioral tests for ``LocalesCache``. + +Complements ``test_locales_cache.py`` (the golden-standard file). +""" + +from __future__ import annotations + +import asyncio +import time +from collections.abc import Awaitable, Callable +from typing import Any + +import pytest +from gt_i18n.i18n_manager._locales_cache import LocalesCache + + +def _noop_translate_many_factory(locale: str) -> Callable[[dict[str, Any]], Awaitable[dict[str, Any]]]: + async def _never(sources: dict[str, Any]) -> dict[str, Any]: + raise AssertionError("translate_many should not be invoked in these tests") + + return _never + + +# --------------------------------------------------------------------------- +# Multi-locale independence +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_different_locales_dedup_independently() -> None: + """Concurrent misses for different locales each call their loader exactly once.""" + counters: dict[str, int] = {} + + async def slow_loader(locale: str) -> dict[str, str]: + counters[locale] = counters.get(locale, 0) + 1 + await asyncio.sleep(0.01) + return {f"{locale}_hash": f"{locale}_value"} + + cache = LocalesCache( + load_translations=slow_loader, + 
create_translate_many=_noop_translate_many_factory, + ttl_ms=60_000, + ) + await asyncio.gather(*(cache.miss(loc) for loc in ["es", "fr", "de", "ja"] * 3)) + assert counters == {"es": 1, "fr": 1, "de": 1, "ja": 1} + + +@pytest.mark.asyncio +async def test_one_locale_expiring_does_not_affect_others() -> None: + """TTL expiry on locale A leaves locale B's entry intact.""" + loads: list[str] = [] + + def loader(locale: str) -> dict[str, str]: + loads.append(locale) + return {} + + cache = LocalesCache( + load_translations=loader, + create_translate_many=_noop_translate_many_factory, + ttl_ms=10, + ) + await cache.miss("es") + await cache.miss("fr") + assert loads == ["es", "fr"] + + await asyncio.sleep(0.02) # both expired + + # Only re-hit es; fr should still be expired but we don't touch it. + await cache.miss("es") + assert loads == ["es", "fr", "es"] + + +# --------------------------------------------------------------------------- +# TTL edge cases +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_very_large_ttl_keeps_entry_across_multiple_hits() -> None: + """With a multi-hour TTL, 100 sequential gets produce no reload.""" + call_count = [0] + + def loader(locale: str) -> dict[str, str]: + call_count[0] += 1 + return {} + + cache = LocalesCache( + load_translations=loader, + create_translate_many=_noop_translate_many_factory, + ttl_ms=60 * 60 * 1000, # 1 hour + ) + await cache.miss("es") + for _ in range(100): + cache.get("es") + assert call_count[0] == 1 + + +@pytest.mark.asyncio +async def test_expired_entry_is_evicted_from_internal_cache_on_get() -> None: + """After TTL, ``get()`` removes the stale entry in addition to returning None.""" + cache = LocalesCache( + load_translations=lambda locale: {}, + create_translate_many=_noop_translate_many_factory, + ttl_ms=5, + ) + await cache.miss("es") + assert "es" in cache._cache # populated + await asyncio.sleep(0.015) + assert cache.get("es") is 
None + assert "es" not in cache._cache # evicted + + +# --------------------------------------------------------------------------- +# Sync vs async loaders +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_sync_loader_is_supported() -> None: + """A plain sync loader works — LocalesCache awaits only if the result is awaitable.""" + + def sync_loader(locale: str) -> dict[str, str]: + return {"k": f"v_{locale}"} + + cache = LocalesCache( + load_translations=sync_loader, + create_translate_many=_noop_translate_many_factory, + ) + tc = await cache.miss("es") + assert tc is not None + + +@pytest.mark.asyncio +async def test_async_loader_is_supported() -> None: + """An async loader is awaited properly.""" + + async def async_loader(locale: str) -> dict[str, str]: + await asyncio.sleep(0) + return {"k": f"async_v_{locale}"} + + cache = LocalesCache( + load_translations=async_loader, + create_translate_many=_noop_translate_many_factory, + ) + tc = await cache.miss("es") + assert tc is not None + + +# --------------------------------------------------------------------------- +# Loader returns empty / unexpected +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_loader_returning_empty_dict_still_caches() -> None: + """A loader returning {} still produces a valid cached entry.""" + calls = [0] + + def loader(locale: str) -> dict[str, str]: + calls[0] += 1 + return {} + + cache = LocalesCache( + load_translations=loader, + create_translate_many=_noop_translate_many_factory, + ) + await cache.miss("es") + # Second miss returns cached (loader not called again). 
+ await cache.miss("es") + assert calls[0] == 1 + + +@pytest.mark.asyncio +async def test_loader_returning_many_entries_are_all_available() -> None: + """Large loader payload is fully preserved in the cache entry.""" + big = {f"hash_{i}": f"value_{i}" for i in range(500)} + + cache = LocalesCache( + load_translations=lambda locale: big, + create_translate_many=_noop_translate_many_factory, + ) + await cache.miss("es") + entry = cache._cache["es"] + assert entry.translations == big + assert len(entry.translations) == 500 + + +# --------------------------------------------------------------------------- +# Callback semantics +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_hit_callback_fires_each_time_get_succeeds() -> None: + """``on_locales_cache_hit`` fires on every successful get() — not just the first.""" + hits: list[str] = [] + + cache = LocalesCache( + load_translations=lambda locale: {}, + create_translate_many=_noop_translate_many_factory, + lifecycle={"on_locales_cache_hit": lambda *, locale, value: hits.append(locale)}, + ) + await cache.miss("es") + for _ in range(5): + cache.get("es") + assert hits == ["es"] * 5 + + +@pytest.mark.asyncio +async def test_miss_callback_payload_is_snapshot_not_live_ref() -> None: + """Mutating the user-returned dict after the fact doesn't affect the callback payload.""" + shared_dict = {"h1": "v1"} + received: list[dict[str, str]] = [] + + def loader(locale: str) -> dict[str, str]: + return shared_dict + + cache = LocalesCache( + load_translations=loader, + create_translate_many=_noop_translate_many_factory, + lifecycle={"on_locales_cache_miss": lambda *, locale, value: received.append(value)}, + ) + await cache.miss("es") + # Mutate the dict after miss. + shared_dict["h2"] = "v2" + # The callback payload is a snapshot — it should NOT reflect h2. 
+ assert received == [{"h1": "v1"}] + + +# --------------------------------------------------------------------------- +# get() vs miss() behavior on cold cache +# --------------------------------------------------------------------------- + + +def test_get_on_cold_cache_is_none_with_no_callback() -> None: + """Sync get() on a never-missed locale returns None; hit callback does NOT fire.""" + hits: list[str] = [] + cache = LocalesCache( + load_translations=lambda locale: {}, + create_translate_many=_noop_translate_many_factory, + lifecycle={"on_locales_cache_hit": lambda *, locale, value: hits.append(locale)}, + ) + assert cache.get("es") is None + assert hits == [] + + +@pytest.mark.asyncio +async def test_miss_then_get_returns_same_translations_cache() -> None: + """``miss()`` and ``get()`` return the SAME TranslationsCache instance (identity).""" + cache = LocalesCache( + load_translations=lambda locale: {}, + create_translate_many=_noop_translate_many_factory, + ) + a = await cache.miss("es") + b = cache.get("es") + assert a is b + + +# --------------------------------------------------------------------------- +# Batching params flow to TranslationsCache +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_batching_params_forwarded_to_inner_translations_cache() -> None: + """Batch kwargs on LocalesCache are passed through to each TranslationsCache it creates.""" + cache = LocalesCache( + load_translations=lambda locale: {}, + create_translate_many=_noop_translate_many_factory, + batch_size=7, + batch_interval_ms=13, + max_concurrent_requests=3, + ) + tc = await cache.miss("es") + assert tc._batch_size == 7 + assert tc._batch_interval_ms == 13 + assert tc._max_concurrent_requests == 3 + + +# --------------------------------------------------------------------------- +# Many locales at once (stress) +# --------------------------------------------------------------------------- + + 
+@pytest.mark.asyncio +async def test_50_locales_concurrent_miss() -> None: + """50 distinct concurrent misses each load exactly once.""" + counts: dict[str, int] = {} + + async def loader(locale: str) -> dict[str, str]: + counts[locale] = counts.get(locale, 0) + 1 + await asyncio.sleep(0) + return {} + + cache = LocalesCache( + load_translations=loader, + create_translate_many=_noop_translate_many_factory, + ) + locales = [f"loc-{i}" for i in range(50)] + await asyncio.gather(*(cache.miss(loc) for loc in locales)) + assert all(counts[loc] == 1 for loc in locales) + + +# --------------------------------------------------------------------------- +# Exception handling variations +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_loader_exception_type_is_preserved() -> None: + """The original exception type propagates to the caller.""" + + class CustomError(RuntimeError): + pass + + def loader(locale: str) -> dict[str, str]: + raise CustomError("custom failure") + + cache = LocalesCache( + load_translations=loader, + create_translate_many=_noop_translate_many_factory, + ) + with pytest.raises(CustomError, match="custom failure"): + await cache.miss("es") + + +@pytest.mark.asyncio +async def test_concurrent_misses_see_same_exception() -> None: + """All concurrent awaiters receive the SAME exception instance from one failing load.""" + + async def loader(locale: str) -> dict[str, str]: + await asyncio.sleep(0.01) + raise RuntimeError("fail-once") + + cache = LocalesCache( + load_translations=loader, + create_translate_many=_noop_translate_many_factory, + ) + results = await asyncio.gather( + cache.miss("es"), + cache.miss("es"), + cache.miss("es"), + return_exceptions=True, + ) + errors = [r for r in results if isinstance(r, BaseException)] + assert len(errors) == 3 + assert all(isinstance(e, RuntimeError) and "fail-once" in str(e) for e in errors) + + +@pytest.mark.asyncio +async def 
test_retry_after_exception_can_succeed() -> None: + """After a transient error, subsequent miss() retries and can succeed.""" + state = {"should_fail": True} + + def loader(locale: str) -> dict[str, str]: + if state["should_fail"]: + raise ConnectionError("transient") + return {"ok": "yes"} + + cache = LocalesCache( + load_translations=loader, + create_translate_many=_noop_translate_many_factory, + ) + with pytest.raises(ConnectionError): + await cache.miss("es") + state["should_fail"] = False + result = await cache.miss("es") + assert result is not None + + +# --------------------------------------------------------------------------- +# TTL boundary +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_entry_just_before_expiry_is_still_a_hit() -> None: + """An entry with 1000ms left on its TTL is treated as fresh.""" + cache = LocalesCache( + load_translations=lambda locale: {}, + create_translate_many=_noop_translate_many_factory, + ttl_ms=10_000, + ) + await cache.miss("es") + entry = cache._cache["es"] + now = time.monotonic() + assert entry.expires_at > now, "expires_at should be in the future" + assert cache.get("es") is not None + + +@pytest.mark.asyncio +async def test_reload_after_expiry_produces_new_translations_cache_instance() -> None: + """Post-expiry reload builds a FRESH TranslationsCache, not the same one.""" + cache = LocalesCache( + load_translations=lambda locale: {}, + create_translate_many=_noop_translate_many_factory, + ttl_ms=10, + ) + tc1 = await cache.miss("es") + await asyncio.sleep(0.02) + tc2 = await cache.miss("es") + assert tc1 is not tc2 diff --git a/packages/gt-i18n/tests/i18n_manager/test_translations_cache.py b/packages/gt-i18n/tests/i18n_manager/test_translations_cache.py new file mode 100644 index 0000000..ffae852 --- /dev/null +++ b/packages/gt-i18n/tests/i18n_manager/test_translations_cache.py @@ -0,0 +1,413 @@ +"""Golden-standard tests for 
``TranslationsCache``. + +``TranslationsCache`` is the inner cache in the two-level hierarchy introduced by +gt PR #1207. It is keyed by message hash, batches runtime translate requests, +enforces batch-size / batch-interval / concurrency limits, and dedups in-flight +requests (so concurrent misses for the same hash share one API call). + +These behaviors are load-bearing — they determine how many API calls the library +makes in hot code paths — so they are pinned here as the black-box contract. + +All tests should FAIL until PR #1207 is ported (module + class don't exist yet). +""" + +from __future__ import annotations + +import asyncio +from collections.abc import Callable +from typing import TYPE_CHECKING, Any + +import pytest +from gt_i18n.i18n_manager._translations_cache import TranslationsCache # noqa: E402 + +if TYPE_CHECKING: + from conftest import FakeGT # noqa: F401 — only for type hints + + +def _key(message: str, *, fmt: str = "STRING", context: str | None = None, id_: str | None = None) -> dict[str, Any]: + """Build a TranslationsCache lookup key (mirrors JS ``TranslationKey``). + + Example: + key = _key("Hello", fmt="ICU") + # → {"message": "Hello", "options": {"_format": "ICU"}} + """ + options: dict[str, Any] = {"_format": fmt} + if context is not None: + options["_context"] = context + if id_ is not None: + options["_id"] = id_ + return {"message": message, "options": options} + + +# --------------------------------------------------------------------------- +# test_sync_get_returns_none_on_miss +# +# ``cache.get(key)`` is sync and MUST NOT trigger a fetch. Safe to call from +# synchronous codepaths like ``t()``. 
+# --------------------------------------------------------------------------- + + +def test_sync_get_returns_none_on_miss(translate_many_for_locale: Callable[[str], Any]) -> None: + """Sync ``get()`` returns None for unknown keys; never triggers a network call.""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + ) + assert cache.get(_key("Hello")) is None + + +# --------------------------------------------------------------------------- +# test_async_miss_triggers_translate_many_and_caches +# +# Example: +# result = await cache.miss({"message": "Hello", "options": {"_format": "STRING"}}) +# # → fake_gt.translate_many called once with {hash: {source, metadata}} + target_locale="es" +# # Subsequent cache.get() returns the translation (hit). +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_async_miss_triggers_translate_many_and_caches( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """A miss fires translate_many; the result is cached and returned by later get().""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + ) + + key = _key("Hello, world!") + translation = await cache.miss(key) + + assert translation == "[es]Hello, world!" + assert len(fake_gt.calls) == 1 + + # Subsequent sync get() is a cache hit — no additional API call. + cached = cache.get(key) + assert cached == "[es]Hello, world!" + assert len(fake_gt.calls) == 1, "sync get() after miss must not trigger another API call" + + +# --------------------------------------------------------------------------- +# test_concurrent_miss_same_hash_dedups_request +# +# The critical dedup guarantee: if ``tx("Hello")`` is called from 3 places +# concurrently, only ONE API request goes out with ONE entry. 
+# +# Example: +# await asyncio.gather(cache.miss(k), cache.miss(k), cache.miss(k)) +# # → fake_gt.translate_many called 1x with 1 entry. +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_concurrent_miss_same_hash_dedups_request( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """Concurrent misses for the same key produce exactly one API call with one entry.""" + fake_gt.delay_s = 0.02 # hold requests in flight long enough for callers to queue up + + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + ) + + key = _key("Hello") + results = await asyncio.gather(cache.miss(key), cache.miss(key), cache.miss(key)) + + assert all(r == "[es]Hello" for r in results) + assert len(fake_gt.calls) == 1 + assert len(fake_gt.calls[0]["sources"]) == 1 + + +# --------------------------------------------------------------------------- +# test_batches_within_interval +# +# Multiple distinct misses within one batch interval coalesce into a single API +# call whose request body contains all of them. +# +# Example: +# fire 5 concurrent miss()es with different messages → +# after ~interval: 1 API call with 5 entries. +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_batches_within_interval( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """Misses fired within one interval window are coalesced into one API call.""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=30, + batch_size=25, + ) + + keys = [_key(f"message-{i}") for i in range(5)] + results = await asyncio.gather(*(cache.miss(k) for k in keys)) + + # All returned successfully. 
+    assert results == [f"[es]message-{i}" for i in range(5)]
+    # Exactly one API call, containing all 5 entries.
+    assert len(fake_gt.calls) == 1
+    assert len(fake_gt.calls[0]["sources"]) == 5
+
+
+# ---------------------------------------------------------------------------
+# test_respects_batch_size_limit
+#
+# If the queue exceeds batch_size, it's split into multiple sub-batches each
+# with size <= batch_size. No batch ever exceeds the limit.
+#
+# Example:
+#     batch_size=3; fire 7 concurrent misses → batches [3, 3, 1] (never > 3).
+# ---------------------------------------------------------------------------
+
+
+@pytest.mark.asyncio
+async def test_respects_batch_size_limit(
+    fake_gt: FakeGT,
+    translate_many_for_locale: Callable[[str], Any],
+) -> None:
+    """No single API call contains more entries than ``batch_size``."""
+    cache = TranslationsCache(
+        locale="es",
+        translate_many=translate_many_for_locale("es"),
+        batch_size=3,
+        batch_interval_ms=5,
+    )
+
+    keys = [_key(f"m-{i}") for i in range(7)]
+    await asyncio.gather(*(cache.miss(k) for k in keys))
+
+    total_entries = sum(len(c["sources"]) for c in fake_gt.calls)
+    assert total_entries == 7
+    assert all(len(c["sources"]) <= 3 for c in fake_gt.calls), f"batches: {[len(c['sources']) for c in fake_gt.calls]}"
+
+
+# ---------------------------------------------------------------------------
+# test_respects_max_concurrent_requests
+#
+# Pin: the cache does not fire more than ``max_concurrent_requests`` API calls
+# at once, even if the queue is longer. Remaining batches wait for a slot.
+#
+# Example:
+#     batch_size=1, max_concurrent_requests=2, fake_gt holds requests pending.
+#     Fire 5 misses → at most 2 in-flight API calls at any instant.
+# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_respects_max_concurrent_requests( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """The cache never has more than ``max_concurrent_requests`` translate_many calls in flight.""" + # Hold all requests pending until we release them. + fake_gt.inflight_event = asyncio.Event() + + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_size=1, # one entry per batch to maximize batch count + batch_interval_ms=1, + max_concurrent_requests=2, + ) + + keys = [_key(f"msg-{i}") for i in range(5)] + tasks = [asyncio.create_task(cache.miss(k)) for k in keys] + + # Let the batcher schedule requests; wait for it to reach its concurrency cap. + for _ in range(20): + await asyncio.sleep(0.01) + if fake_gt.max_inflight >= 2: + break + assert fake_gt.max_inflight == 2, f"expected max_inflight == 2, got {fake_gt.max_inflight}" + + # Release the held requests so the remaining queue drains. + fake_gt.inflight_event.set() + await asyncio.gather(*tasks) + + assert fake_gt.max_inflight == 2, "concurrency cap must not be exceeded across the full run" + + +# --------------------------------------------------------------------------- +# test_partial_failure_caches_successes_and_returns_none_for_failures +# +# When a batch response has mixed success/failure, successful entries are +# cached, failed entries return None from miss() AND are NOT cached (so the +# next call can retry them). 
+# +# Example: +# batch of 3 → response: {h1: success, h2: success, h3: {success: False}} +# cache.get(h1) / cache.get(h2) → hit +# cache.get(h3) → None (failures are retryable) +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_partial_failure_caches_successes_and_returns_none_for_failures( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """Per-entry failures don't poison the batch — successful entries still cache.""" + + # Custom response: first two succeed, third fails. + def mixed_response(sources: dict[str, Any], options: dict[str, Any]) -> dict[str, Any]: + hashes = list(sources.keys()) + result: dict[str, Any] = {} + for i, h in enumerate(hashes): + if i == 2: + result[h] = {"success": False, "error": "upstream failure", "code": 500} + else: + result[h] = {"success": True, "translation": f"ok:{sources[h]['source']}", "locale": "es"} + return result + + fake_gt.response_factory = mixed_response + + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + ) + + k0, k1, k2 = _key("a"), _key("b"), _key("c") + r0, r1, r2 = await asyncio.gather(cache.miss(k0), cache.miss(k1), cache.miss(k2)) + + assert r0 == "ok:a" + assert r1 == "ok:b" + assert r2 is None, "failed entries must return None from miss()" + + # Successes are cached. + assert cache.get(k0) == "ok:a" + assert cache.get(k1) == "ok:b" + # Failures are NOT cached — retryable. + assert cache.get(k2) is None + + +# --------------------------------------------------------------------------- +# test_request_body_shape_matches_js_wire_format +# +# The request body is camelCase on the wire (per port-from-gt-js.md §"Options, +# kwargs, and wire format"). metadata includes dataFormat, context, id, maxChars. 
+# +# Example JS equivalent: +# translateMany({ +# "": { +# source: "Hello", +# metadata: { hash: "", dataFormat: "ICU", context: "...", id: "...", maxChars: 50 } +# } +# }, { targetLocale: "es" }) +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_request_body_shape_matches_js_wire_format( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """Request body uses camelCase keys matching JS; metadata carries context/id/maxChars/dataFormat.""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + ) + + key = {"message": "Save", "options": {"_format": "ICU", "_context": "button", "_id": "save.btn", "_max_chars": 50}} + await cache.miss(key) + + assert len(fake_gt.calls) == 1 + call = fake_gt.calls[0] + # target_locale (or camelCase equivalent) in options. + target = call["options"].get("target_locale") or call["options"].get("targetLocale") + assert target == "es" + + # sources is {hash: {source, metadata}}. + (hash_key, entry) = next(iter(call["sources"].items())) + assert isinstance(hash_key, str) and len(hash_key) > 0 + assert entry["source"] == "Save" + meta = entry["metadata"] + assert meta["dataFormat"] == "ICU" + assert meta["context"] == "button" + assert meta["id"] == "save.btn" + assert meta["maxChars"] == 50 + + +# --------------------------------------------------------------------------- +# test_on_translations_cache_hit_fires +# +# Callback receives {locale, hash, value}. 
+# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_on_translations_cache_hit_fires( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """``on_translations_cache_hit`` fires on a cached read with {locale, hash, value}.""" + hits: list[dict[str, Any]] = [] + + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + lifecycle={ + "on_translations_cache_hit": lambda *, locale, hash, value: hits.append( + {"locale": locale, "hash": hash, "value": value} + ), + }, + ) + + key = _key("Cached") + # First access: miss; fires miss callback (not verified here). + await cache.miss(key) + # Second access: hit. + got = cache.get(key) + + assert got == "[es]Cached" + assert len(hits) == 1 + assert hits[0]["locale"] == "es" + assert isinstance(hits[0]["hash"], str) and len(hits[0]["hash"]) > 0 + assert hits[0]["value"] == "[es]Cached" + + +# --------------------------------------------------------------------------- +# test_on_translations_cache_miss_fires_with_fetched_value +# +# On a miss that triggers an API call, the callback fires AFTER the value is +# retrieved — with the fetched value. 
+# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_on_translations_cache_miss_fires_with_fetched_value( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """``on_translations_cache_miss`` fires post-fetch with the translation that was retrieved.""" + misses: list[dict[str, Any]] = [] + + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + lifecycle={ + "on_translations_cache_miss": lambda *, locale, hash, value: misses.append( + {"locale": locale, "hash": hash, "value": value} + ), + }, + ) + + key = _key("Fetched") + await cache.miss(key) + + assert len(misses) == 1 + assert misses[0]["locale"] == "es" + assert misses[0]["value"] == "[es]Fetched" + assert isinstance(misses[0]["hash"], str) and len(misses[0]["hash"]) > 0 diff --git a/packages/gt-i18n/tests/i18n_manager/test_translations_cache_scenarios.py b/packages/gt-i18n/tests/i18n_manager/test_translations_cache_scenarios.py new file mode 100644 index 0000000..059f770 --- /dev/null +++ b/packages/gt-i18n/tests/i18n_manager/test_translations_cache_scenarios.py @@ -0,0 +1,503 @@ +"""Extended behavioral tests for ``TranslationsCache`` — dense edge-case coverage. + +Complements ``test_translations_cache.py`` (the golden-standard file). These +exercise corners of the batching / dedup / lifecycle / partial-failure logic +that the golden file doesn't cover individually. 
+""" + +from __future__ import annotations + +import asyncio +from collections.abc import Callable +from typing import TYPE_CHECKING, Any + +import pytest +from gt_i18n.i18n_manager._translations_cache import TranslationsCache + +if TYPE_CHECKING: + from conftest import FakeGT # noqa: F401 — type hints only + + +def _key(message: str, *, fmt: str = "STRING", **opts: Any) -> dict[str, Any]: + options: dict[str, Any] = {"_format": fmt, **{f"_{k}": v for k, v in opts.items() if v is not None}} + return {"message": message, "options": options} + + +# --------------------------------------------------------------------------- +# Initial translations seed +# --------------------------------------------------------------------------- + + +def test_initial_translations_populate_cache_so_get_hits_without_network( + translate_many_for_locale: Callable[[str], Any], +) -> None: + """Passing ``initial={...}`` makes those hashes immediate cache hits.""" + from gt_i18n.i18n_manager._translations_cache import _compute_hash + + k = _key("Hello", fmt="ICU") + h = _compute_hash(k["message"], k["options"]) + + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + initial={h: "Hola"}, + ) + assert cache.get(k) == "Hola" + + +@pytest.mark.asyncio +async def test_initial_translations_are_used_even_on_async_miss_path( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """``miss()`` returns initial-seeded values without calling translate_many.""" + from gt_i18n.i18n_manager._translations_cache import _compute_hash + + k = _key("Cached source", fmt="STRING") + h = _compute_hash(k["message"], k["options"]) + + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + initial={h: "Valor en cache"}, + ) + got = await cache.miss(k) + assert got == "Valor en cache" + assert len(fake_gt.calls) == 0, "initial entries must not trigger translate_many" + + +@pytest.mark.asyncio +async def 
test_initial_empty_dict_is_equivalent_to_no_initial( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """``initial={}`` doesn't confuse the cache — all misses go through translate_many.""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + initial={}, + batch_interval_ms=5, + ) + await cache.miss(_key("x")) + assert len(fake_gt.calls) == 1 + + +# --------------------------------------------------------------------------- +# Batch-timer edge cases +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_timer_fires_on_empty_queue_is_a_noop( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """If the queue is emptied before the timer fires, the timer's drain is a no-op.""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_size=1, # immediate drain on enqueue + batch_interval_ms=15, + ) + # Fire one miss — immediate drain, queue empties before timer fires. + await cache.miss(_key("only")) + # Wait past the timer interval so it fires. + await asyncio.sleep(0.03) + # Timer fired but had nothing to do; exactly one API call total. 
+ assert len(fake_gt.calls) == 1 + + +@pytest.mark.asyncio +async def test_timer_restarts_after_firing( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """After the timer fires once, a subsequent miss starts a fresh timer.""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_size=25, + batch_interval_ms=10, + ) + await cache.miss(_key("a")) + await asyncio.sleep(0.03) # let timer fire + a second timer window close + await cache.miss(_key("b")) + assert len(fake_gt.calls) == 2, "second miss should fire a fresh batch after first timer completed" + + +@pytest.mark.asyncio +async def test_exactly_batch_size_triggers_immediate_drain( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """When queue reaches exactly batch_size, drain happens without waiting for timer.""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_size=3, + batch_interval_ms=1_000, # very long — if we had to wait, test would hang + ) + results = await asyncio.gather(cache.miss(_key("a")), cache.miss(_key("b")), cache.miss(_key("c"))) + assert all(r is not None for r in results) + assert len(fake_gt.calls) == 1 + + +# --------------------------------------------------------------------------- +# Cached + new interleaving +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_mixed_cached_and_new_in_same_gather( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """Cached keys short-circuit; new keys go through batching.""" + from gt_i18n.i18n_manager._translations_cache import _compute_hash + + k_cached = _key("already", fmt="STRING") + h_cached = _compute_hash(k_cached["message"], k_cached["options"]) + + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + initial={h_cached: "yacheado"}, + 
batch_interval_ms=5, + ) + a, b, c = await asyncio.gather( + cache.miss(k_cached), + cache.miss(_key("new1")), + cache.miss(_key("new2")), + ) + assert a == "yacheado" + assert b == "[es]new1" + assert c == "[es]new2" + # Only one API call, containing only the 2 new keys. + assert len(fake_gt.calls) == 1 + assert len(fake_gt.calls[0]["sources"]) == 2 + + +# --------------------------------------------------------------------------- +# Response-shape tolerance +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_response_missing_hash_treated_as_failure( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """If the server response omits a hash entirely, miss() returns None for it.""" + fake_gt.response_factory = lambda sources, options: {} # return nothing + + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + ) + got = await cache.miss(_key("abandoned")) + assert got is None + + +@pytest.mark.asyncio +async def test_response_with_unsolicited_extra_hash_is_ignored( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """Server returning a hash we didn't request doesn't crash or pollute cache.""" + + def factory(sources: dict[str, Any], options: dict[str, Any]) -> dict[str, Any]: + result: dict[str, Any] = {"unsolicited_hash_zzz": {"success": True, "translation": "intruder"}} + for h, entry in sources.items(): + result[h] = {"success": True, "translation": f"ok:{entry['source']}"} + return result + + fake_gt.response_factory = factory + + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + ) + got = await cache.miss(_key("legit")) + assert got == "ok:legit" + # Unsolicited entry should not be reachable via a subsequent cache check. 
+    # There's no message that would hash to unsolicited_hash_zzz, so just assert
+    # the cache doesn't explode and legit is hit-cacheable.
+    assert cache.get(_key("legit")) == "ok:legit"
+
+
+@pytest.mark.asyncio
+async def test_response_with_success_false_does_not_cache(
+    fake_gt: FakeGT,
+    translate_many_for_locale: Callable[[str], Any],
+) -> None:
+    """A failed entry is NOT cached — next miss() retries."""
+    attempts = [0]
+
+    def factory(sources: dict[str, Any], options: dict[str, Any]) -> dict[str, Any]:
+        attempts[0] += 1
+        if attempts[0] == 1:
+            return {h: {"success": False, "error": "nope", "code": 500} for h in sources}
+        return {h: {"success": True, "translation": f"ok:{e['source']}"} for h, e in sources.items()}
+
+    fake_gt.response_factory = factory
+
+    cache = TranslationsCache(
+        locale="es",
+        translate_many=translate_many_for_locale("es"),
+        batch_interval_ms=5,
+    )
+    first = await cache.miss(_key("retryable"))
+    second = await cache.miss(_key("retryable"))
+    assert first is None
+    assert second == "ok:retryable"
+    assert attempts[0] == 2
+
+
+# ---------------------------------------------------------------------------
+# Translate_many throws exception — all awaiters propagate
+# ---------------------------------------------------------------------------
+
+
+@pytest.mark.asyncio
+async def test_translate_many_exception_propagates_to_all_awaiters() -> None:
+    """A network error on translate_many raises in every waiting miss() call."""
+
+    async def broken_translate(sources: dict[str, Any], *args: Any, **kwargs: Any) -> dict[str, Any]:
+        # Accept whatever call shape the cache uses (sources plus options),
+        # matching the two-argument wire-format calls pinned above; this stub
+        # only ever raises.
+        raise ConnectionError("upstream offline")
+
+    cache = TranslationsCache(
+        locale="es",
+        translate_many=broken_translate,
+        batch_interval_ms=5,
+    )
+    with pytest.raises(ConnectionError, match="upstream offline"):
+        await cache.miss(_key("first"))
+
+
+# ---------------------------------------------------------------------------
+# Concurrency cap — stricter
+# ---------------------------------------------------------------------------
+
+
+@pytest.mark.asyncio +async def test_concurrency_cap_of_1( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """With max_concurrent=1, no two batch calls overlap.""" + fake_gt.inflight_event = asyncio.Event() + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_size=1, + batch_interval_ms=1, + max_concurrent_requests=1, + ) + tasks = [asyncio.create_task(cache.miss(_key(f"msg-{i}"))) for i in range(4)] + for _ in range(15): + await asyncio.sleep(0.01) + if fake_gt.max_inflight >= 1 and fake_gt.inflight_count == 1: + break + assert fake_gt.max_inflight == 1 + fake_gt.inflight_event.set() + await asyncio.gather(*tasks) + assert fake_gt.max_inflight == 1 + + +# --------------------------------------------------------------------------- +# Lifecycle callback ordering +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_miss_callback_fires_after_cache_write_so_hit_is_consistent( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """Inside on_translations_cache_miss, a subsequent get() must see the value.""" + observations: list[Any] = [] + + def on_miss(*, locale: str, hash: str, value: str) -> None: + # At this point the value must already be in the cache. + observations.append( + ("miss", locale, hash, value, cache.get({"message": "x", "options": {"_format": "STRING"}})) + ) + + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + lifecycle={"on_translations_cache_miss": on_miss}, + ) + await cache.miss(_key("x")) + # Check at-write-time observation: the sync get() inside the callback saw the value. + assert len(observations) == 1 + # Last element in tuple is the result of the sync get() inside the callback. 
+ assert observations[0][4] == "[es]x" + + +@pytest.mark.asyncio +async def test_hit_callback_never_fires_on_a_pure_miss( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """A brand-new miss() must not fire the hit callback.""" + hits: list[Any] = [] + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + lifecycle={"on_translations_cache_hit": lambda **p: hits.append(p)}, + ) + await cache.miss(_key("brand new")) + assert hits == [] + + +# --------------------------------------------------------------------------- +# Options combinations flow through to request metadata +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +@pytest.mark.parametrize( + "fmt,context,id_,max_chars", + [ + ("STRING", None, None, None), + ("STRING", "button", None, None), + ("STRING", None, "save.btn", None), + ("STRING", None, None, 50), + ("STRING", "button", "save.btn", 50), + ("ICU", "menu", "menu_save", 20), + ("I18NEXT", "modal", "modal_close", 10), + ], +) +async def test_metadata_wire_shape_exhaustive( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], + fmt: str, + context: str | None, + id_: str | None, + max_chars: int | None, +) -> None: + """Metadata includes camelCase keys only for non-None values, for all formats.""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + ) + options: dict[str, Any] = {"_format": fmt} + if context is not None: + options["_context"] = context + if id_ is not None: + options["_id"] = id_ + if max_chars is not None: + options["_max_chars"] = max_chars + key = {"message": "Hi", "options": options} + await cache.miss(key) + + meta = next(iter(fake_gt.calls[0]["sources"].values()))["metadata"] + assert meta["dataFormat"] == fmt + if context is None: + assert "context" not in meta + else: + assert meta["context"] == 
context + if id_ is None: + assert "id" not in meta + else: + assert meta["id"] == id_ + if max_chars is None: + assert "maxChars" not in meta + else: + assert meta["maxChars"] == max_chars + + +# --------------------------------------------------------------------------- +# Sequential (non-concurrent) misses +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_sequential_misses_produce_separate_batches( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """Awaiting each miss() before the next should produce one API call per miss.""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + ) + await cache.miss(_key("one")) + await cache.miss(_key("two")) + await cache.miss(_key("three")) + assert len(fake_gt.calls) == 3 + + +# --------------------------------------------------------------------------- +# Many different misses (stress) +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_fifty_distinct_concurrent_misses_all_resolve( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """50 distinct concurrent misses all resolve correctly under default batching.""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_size=25, + batch_interval_ms=5, + ) + results = await asyncio.gather(*(cache.miss(_key(f"k{i}")) for i in range(50))) + assert all(r == f"[es]k{i}" for i, r in enumerate(results)) + # Should be 2 batches of 25 (or equivalent groupings <= 25). 
+ assert sum(len(c["sources"]) for c in fake_gt.calls) == 50 + assert all(len(c["sources"]) <= 25 for c in fake_gt.calls) + + +@pytest.mark.asyncio +async def test_hundred_duplicate_concurrent_misses_dedup_to_one( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """100 concurrent misses for the SAME key → 1 API call with 1 entry.""" + fake_gt.delay_s = 0.02 + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + ) + key = _key("same") + results = await asyncio.gather(*(cache.miss(key) for _ in range(100))) + assert all(r == "[es]same" for r in results) + assert len(fake_gt.calls) == 1 + assert len(fake_gt.calls[0]["sources"]) == 1 + + +# --------------------------------------------------------------------------- +# Two distinct keys that produce the same message but different formats +# route through separate cache entries (no cross-format collision). +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_same_message_different_formats_are_separate_cache_entries( + fake_gt: FakeGT, + translate_many_for_locale: Callable[[str], Any], +) -> None: + """A STRING 'Hi' and ICU 'Hi' are different cache keys.""" + cache = TranslationsCache( + locale="es", + translate_many=translate_many_for_locale("es"), + batch_interval_ms=5, + ) + r1 = await cache.miss(_key("Hi", fmt="STRING")) + r2 = await cache.miss(_key("Hi", fmt="ICU")) + assert r1 == "[es]Hi" + assert r2 == "[es]Hi" + # Two API calls because they hashed differently. 
+ assert len(fake_gt.calls) == 2 diff --git a/packages/gt-i18n/tests/test_translations_manager.py b/packages/gt-i18n/tests/test_translations_manager.py deleted file mode 100644 index 36dff79..0000000 --- a/packages/gt-i18n/tests/test_translations_manager.py +++ /dev/null @@ -1,78 +0,0 @@ -"""Tests for TranslationsManager.""" - -import asyncio - -import pytest -from gt_i18n.i18n_manager._translations_manager import TranslationsManager - - -@pytest.fixture -def sample_translations() -> dict[str, str]: - return {"greeting_hash": "Hola, mundo!"} - - -def test_custom_loader(sample_translations: dict[str, str]) -> None: - def loader(locale: str) -> dict[str, str]: - if locale == "es": - return sample_translations - return {} - - mgr = TranslationsManager(loader) - result = asyncio.run(mgr.get_translations("es")) - assert result == sample_translations - - -def test_cache_hit(sample_translations: dict[str, str]) -> None: - call_count = [0] - - def loader(locale: str) -> dict[str, str]: - call_count[0] += 1 - return sample_translations - - mgr = TranslationsManager(loader, cache_expiry_time=60_000) - asyncio.run(mgr.get_translations("es")) - asyncio.run(mgr.get_translations("es")) - assert call_count[0] == 1 - - -def test_sync_returns_cached(sample_translations: dict[str, str]) -> None: - def loader(locale: str) -> dict[str, str]: - return sample_translations - - mgr = TranslationsManager(loader) - # Before loading, sync returns empty - assert mgr.get_translations_sync("es") == {} - # After loading - asyncio.run(mgr.get_translations("es")) - assert mgr.get_translations_sync("es") == sample_translations - - -def test_load_all() -> None: - loaded = [] - - def loader(locale: str) -> dict[str, str]: - loaded.append(locale) - return {f"{locale}_hash": f"{locale}_value"} - - mgr = TranslationsManager(loader) - asyncio.run(mgr.load_all(["en", "es", "fr"])) - assert set(loaded) == {"en", "es", "fr"} - assert mgr.get_translations_sync("es") == {"es_hash": "es_value"} - - -def 
test_loader_error_returns_empty() -> None: - def loader(locale: str) -> dict[str, str]: - raise RuntimeError("fail") - - mgr = TranslationsManager(loader) - result = asyncio.run(mgr.get_translations("es")) - assert result == {} - - -def test_async_loader() -> None: - async def loader(locale: str) -> dict[str, str]: - return {"async_hash": "async_value"} - - mgr = TranslationsManager(loader) - result = asyncio.run(mgr.get_translations("es")) - assert result == {"async_hash": "async_value"} diff --git a/packages/gt-i18n/tests/translation_functions/__init__.py b/packages/gt-i18n/tests/translation_functions/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/packages/gt-i18n/tests/translation_functions/test_hash_message_format.py b/packages/gt-i18n/tests/translation_functions/test_hash_message_format.py new file mode 100644 index 0000000..08a937d --- /dev/null +++ b/packages/gt-i18n/tests/translation_functions/test_hash_message_format.py @@ -0,0 +1,123 @@ +"""Golden-standard tests for the format-aware hash_message behavior. + +gt PR #1207 changed the hash algorithm: ``indexVars()`` is now applied ONLY for +ICU format, not for STRING or I18NEXT formats. This is a subtle but critical +on-wire compatibility pin — if the Python hash diverges from the JS hash, a +translation fetched by a JS client and a translation fetched by a Python +client will miss each other's cache. + +Before this PR, Python's ``hash_message`` indexed vars unconditionally (the +current behavior). After the port, the function must accept a ``format`` kwarg +and branch on it: + +- ``format="ICU"`` — index vars, then hash (UNCHANGED for ICU sources). +- ``format="STRING"`` — hash the raw message (NO indexing). +- ``format="I18NEXT"`` — hash the raw message (NO indexing). + +All tests should FAIL until PR #1207 is ported — the current ``hash_message`` +does not accept a ``format`` kwarg. 
+""" + +from __future__ import annotations + +from generaltranslation._id._hash import hash_source +from generaltranslation.static._index_vars import index_vars +from gt_i18n.translation_functions._hash_message import hash_message + +# --------------------------------------------------------------------------- +# test_icu_format_hashes_with_indexed_vars +# +# The pre-existing ICU behavior MUST NOT change — ICU messages have their vars +# normalized (e.g. ``{name}`` → ``{_gt_0}``) before hashing so that two templates +# differing only in variable names collide. +# +# Example: +# hash_message("Hello, {name}!", format="ICU") +# # == hash_source(index_vars("Hello, {name}!"), data_format="ICU") +# --------------------------------------------------------------------------- + + +def test_icu_format_hashes_with_indexed_vars() -> None: + """For ICU format, the hash is computed over index_vars(message) — UNCHANGED.""" + message = "Hello, {name}!" + expected = hash_source(index_vars(message), data_format="ICU") + assert hash_message(message, format="ICU") == expected + + +# --------------------------------------------------------------------------- +# test_string_format_hashes_raw_message +# +# NEW in PR #1207: for STRING format, the message is hashed raw — no var +# indexing. The rationale is that STRING templates pass the user's literal +# template verbatim to the translation service. +# +# Example: +# hash_message("Hello, {name}!", format="STRING") +# # == hash_source("Hello, {name}!", data_format="STRING") (raw, NOT indexed) +# --------------------------------------------------------------------------- + + +def test_string_format_hashes_raw_message() -> None: + """For STRING format, the hash is computed over the RAW message — no index_vars.""" + message = "Hello, {name}!" 
+ expected = hash_source(message, data_format="STRING") + assert hash_message(message, format="STRING") == expected + + +# --------------------------------------------------------------------------- +# test_i18next_format_hashes_raw_message +# +# NEW in PR #1207: same raw-hash treatment for I18NEXT format templates. +# +# Example: +# hash_message("Hello, {{name}}", format="I18NEXT") +# # == hash_source("Hello, {{name}}", data_format="I18NEXT") (raw) +# --------------------------------------------------------------------------- + + +def test_i18next_format_hashes_raw_message() -> None: + """For I18NEXT format, the hash is computed over the RAW message — no index_vars.""" + message = "Hello, {{name}}" + expected = hash_source(message, data_format="I18NEXT") + assert hash_message(message, format="I18NEXT") == expected + + +# --------------------------------------------------------------------------- +# test_icu_vs_string_produce_different_hashes_for_same_template +# +# Pin: a template that looks identical as a source string must hash +# differently depending on its format, because the translation service +# interprets it differently (ICU var is placeholder-indexed; STRING var is +# literal). Prevents cross-format cache collisions. +# --------------------------------------------------------------------------- + + +def test_icu_vs_string_produce_different_hashes_for_same_template() -> None: + """The same template hashed as ICU vs STRING must produce different keys.""" + message = "Hello, {name}!" 
+ + icu_hash = hash_message(message, format="ICU") + string_hash = hash_message(message, format="STRING") + + assert icu_hash != string_hash, ( + "ICU and STRING formats must produce different hashes — they are different wire representations of the template" + ) + + +# --------------------------------------------------------------------------- +# test_format_default_stays_icu_for_backward_compat +# +# Current Python API: ``hash_message(msg)`` has no ``format`` kwarg and always +# ICU-indexes. The port adds a ``format`` kwarg but the default stays "ICU" +# so existing callers don't break. +# +# Example: +# # old code still works: +# hash_message("Hello, {name}!") # == hash_message("Hello, {name}!", format="ICU") +# --------------------------------------------------------------------------- + + +def test_format_default_stays_icu_for_backward_compat() -> None: + """Calling hash_message without ``format`` defaults to ICU (preserves existing callers).""" + message = "Hello, {name}!" + assert hash_message(message) == hash_message(message, format="ICU") diff --git a/packages/gt-i18n/tests/translation_functions/test_hash_parity_bulk.py b/packages/gt-i18n/tests/translation_functions/test_hash_parity_bulk.py new file mode 100644 index 0000000..ccf4f02 --- /dev/null +++ b/packages/gt-i18n/tests/translation_functions/test_hash_parity_bulk.py @@ -0,0 +1,150 @@ +"""Bulk hash-parity tests: Python ``hash_message`` must match the JS ``hashMessage``. + +Fixtures are generated by ``generate_runtime_fixtures.mjs`` against the JS +monorepo at ``~/Documents/dev/gt`` and committed as +``fixtures/runtime_translation_parity.json`` (1040 (message, format, ...) cases). + +Each fixture case becomes one parametrized test. 
Parametrize IDs embed the +format and index so a failure points at a specific entry: + + test_hash_parity[ICU-13] → 14th ICU case + test_hash_parity[STRING-7] → 8th STRING case + test_hash_parity[I18NEXT-22] → 23rd I18NEXT case + +If a single case fails, the Python ``hash_message`` has drifted from the JS +reference for that combination of (message, format, context, id, maxChars). +""" + +from __future__ import annotations + +import json +from pathlib import Path +from typing import Any + +import pytest +from gt_i18n.translation_functions._hash_message import hash_message + +FIXTURES_PATH = Path(__file__).parent.parent / "fixtures" / "runtime_translation_parity.json" + + +def _load_fixtures() -> dict[str, Any]: + return json.loads(FIXTURES_PATH.read_text()) + + +def _indexed_cases() -> list[tuple[str, dict[str, Any]]]: + """Yield (test_id, case) pairs keyed by ``{format}-{index-within-format}``.""" + data = _load_fixtures() + counters: dict[str, int] = {} + out: list[tuple[str, dict[str, Any]]] = [] + for case in data["cases"]: + fmt = case["format"] + idx = counters.get(fmt, 0) + counters[fmt] = idx + 1 + out.append((f"{fmt}-{idx}", case)) + return out + + +INDEXED_CASES = _indexed_cases() +CASE_IDS = [cid for cid, _ in INDEXED_CASES] +CASES = [c for _, c in INDEXED_CASES] + + +# --------------------------------------------------------------------------- +# Bulk parity — one test per fixture case +# --------------------------------------------------------------------------- + + +@pytest.mark.parametrize("case", CASES, ids=CASE_IDS) +def test_hash_parity(case: dict[str, Any]) -> None: + """Python ``hash_message`` matches JS ``hashMessage`` for this specific case.""" + kwargs: dict[str, Any] = {"format": case["format"]} + if case["context"] is not None: + kwargs["context"] = case["context"] + if case["id"] is not None: + kwargs["id"] = case["id"] + if case["maxChars"] is not None: + kwargs["max_chars"] = case["maxChars"] + + got = hash_message(case["message"], 
**kwargs) + assert got == case["hash"], ( + f"Hash mismatch for format={case['format']} message={case['message']!r} " + f"context={case['context']!r} id={case['id']!r} maxChars={case['maxChars']!r}\n" + f" expected (JS): {case['hash']}\n got (Python): {got}" + ) + + +# --------------------------------------------------------------------------- +# Cross-format collision tests — same message hashed as ICU vs STRING vs I18NEXT +# must produce three DIFFERENT hashes (except empty message where ICU == STRING +# since indexVars of empty is empty). +# --------------------------------------------------------------------------- + + +def _collision_cases() -> list[dict[str, Any]]: + data = _load_fixtures() + return data["collisionCases"] + + +def _collision_ids() -> list[str]: + cases = _collision_cases() + out = [] + for i, c in enumerate(cases): + msg_repr = c["message"][:20].replace(" ", "_") or "empty" + ctx = c["context"] or "none" + out.append(f"{i}-{msg_repr}-ctx={ctx}") + return out + + +@pytest.mark.parametrize("case", _collision_cases(), ids=_collision_ids()) +def test_cross_format_collision(case: dict[str, Any]) -> None: + """All three format hashes for a given (message, context) are mutually distinct in Python. + + The only exception is very short messages that happen to produce the same + bytes after normalization — the JS fixture already reflects these collisions, + so we assert parity against the JS output rather than strict distinctness. 
+    """
+    message = case["message"]
+    context = case["context"]
+    kwargs: dict[str, Any] = {}
+    if context is not None:
+        kwargs["context"] = context
+
+    py_hashes = {fmt: hash_message(message, format=fmt, **kwargs) for fmt in ("ICU", "STRING", "I18NEXT")}
+    assert py_hashes == case["hashes"], (
+        f"Collision-set mismatch for message={message!r} context={context!r}\n"
+        f"  expected (JS): {case['hashes']}\n  got (Python): {py_hashes}"
+    )
+
+
+# ---------------------------------------------------------------------------
+# Sanity: every fixture case has a well-formed (16-char hex) hash.
+# Catches fixture corruption early.
+# ---------------------------------------------------------------------------
+
+
+def test_fixture_hashes_are_well_formed() -> None:
+    """Every fixture case's hash is a 16-character hex string."""
+    data = _load_fixtures()
+    hex_digits = set("0123456789abcdef")
+    # Check the characters too, not just the length, so a corrupted fixture
+    # entry of the right length still fails this sanity test.
+    bad = [
+        c
+        for c in data["cases"]
+        if not (isinstance(c["hash"], str) and len(c["hash"]) == 16 and set(c["hash"]) <= hex_digits)
+    ]
+    assert not bad, f"{len(bad)} malformed hashes in fixture; first: {bad[0] if bad else None}"
+
+
+def test_fixture_has_expected_volume() -> None:
+    """Fixture contains enough cases to actually stress the implementation."""
+    data = _load_fixtures()
+    assert data["caseCount"] >= 500, f"fixture should have >= 500 cases; got {data['caseCount']}"
+    formats = {c["format"] for c in data["cases"]}
+    assert formats == {"ICU", "STRING", "I18NEXT"}
+
+
+def test_fixture_covers_all_three_formats_with_all_option_variants() -> None:
+    """Each format has at least one case with each option combination present."""
+    data = _load_fixtures()
+    for fmt in ("ICU", "STRING", "I18NEXT"):
+        fmt_cases = [c for c in data["cases"] if c["format"] == fmt]
+        assert any(c["context"] is not None for c in fmt_cases), f"no {fmt} cases with context"
+        assert any(c["id"] is not None for c in fmt_cases), f"no {fmt} cases with id"
+        assert any(c["maxChars"] is not None for c in fmt_cases), f"no {fmt} cases with maxChars"
+        assert any(c["context"] is None and c["id"] is
None and c["maxChars"] is None for c in fmt_cases), ( + f"no {fmt} cases with all options None" + ) diff --git a/packages/gt-i18n/tests/translation_functions/test_tx.py b/packages/gt-i18n/tests/translation_functions/test_tx.py new file mode 100644 index 0000000..f9fef1c --- /dev/null +++ b/packages/gt-i18n/tests/translation_functions/test_tx.py @@ -0,0 +1,191 @@ +"""Golden-standard tests for the new public ``tx()`` function. + +``tx()`` is the Python mirror of the JS ``tx()`` added in gt PR #1207: +an async function that translates a string at runtime via +``GT.translate_many``, with batching, dedup, and source-fallback on error. + +Contract: +- Default format is STRING (matches JS `$format: 'STRING'`). +- Caches successful translations; concurrent calls dedup + batch. +- Returns the interpolated SOURCE on any failure (never raises or returns None). +- Skips the API entirely when target == source locale. + +All tests should FAIL until PR #1207 is ported (``tx`` is not yet exported). +""" + +from __future__ import annotations + +import asyncio +from collections.abc import Generator +from typing import TYPE_CHECKING, Any + +import pytest +from gt_i18n import I18nManager, set_i18n_manager, tx # `tx` does not exist yet + +if TYPE_CHECKING: + from conftest import FakeGT # noqa: F401 — only for type hints + + +@pytest.fixture +def _install_manager(fake_gt: FakeGT) -> Generator[I18nManager, None, None]: + """Install an I18nManager as the module singleton for ``tx()`` to consume.""" + manager = I18nManager( + default_locale="en", + locales=["en", "es", "fr"], + load_translations=lambda locale: {}, + batch_interval_ms=5, + ) + set_i18n_manager(manager) + yield manager + + +# --------------------------------------------------------------------------- +# test_tx_fetches_and_interpolates +# +# Example: +# result = await tx("Hello, {name}!", name="Alice", _locale="es", _format="ICU") +# # → Mock returns "[es]Hello, {name}!" → interpolation yields "[es]Hello, Alice!" 
+# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_tx_fetches_and_interpolates(_install_manager: I18nManager, fake_gt: FakeGT) -> None: + """``tx()`` fetches a runtime translation and interpolates variables into the result.""" + _install_manager.set_locale("es") + + result = await tx("Hello, {name}!", name="Alice", _locale="es", _format="ICU") + + assert result == "[es]Hello, Alice!" + assert len(fake_gt.calls) == 1 + + +# --------------------------------------------------------------------------- +# test_tx_falls_back_to_source_on_api_error +# +# When the batch response indicates failure, ``tx()`` returns the +# interpolated SOURCE — never raises, never returns None. +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_tx_falls_back_to_source_on_api_error(_install_manager: I18nManager, fake_gt: FakeGT) -> None: + """A per-entry failure in the batch response → ``tx()`` returns the interpolated source.""" + + def all_failed(sources: dict[str, Any], options: dict[str, Any]) -> dict[str, Any]: + return {h: {"success": False, "error": "upstream", "code": 500} for h in sources.keys()} + + fake_gt.response_factory = all_failed + _install_manager.set_locale("es") + + result = await tx("Hello", _locale="es") + + # Interpolated source is returned (for "Hello" with no vars, that's just "Hello"). + assert result == "Hello" + + +# --------------------------------------------------------------------------- +# test_tx_source_locale_skips_api +# +# When target locale == default/source locale, there's nothing to translate. +# ``tx()`` short-circuits: no API call, returns the interpolated source. 
+# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_tx_source_locale_skips_api(_install_manager: I18nManager, fake_gt: FakeGT) -> None: + """``tx()`` with target == source locale skips the API and returns the source.""" + _install_manager.set_locale("en") # default_locale is "en" + + result = await tx("Hello, {name}!", name="Alice", _format="ICU") + + assert result == "Hello, Alice!" + assert len(fake_gt.calls) == 0, "target == source locale must not fire translate_many" + + +# --------------------------------------------------------------------------- +# test_tx_passes_context_id_max_chars_to_request_metadata +# +# Per-call options flow through to the request body's `metadata`: +# metadata = {context, id, maxChars, dataFormat: "STRING", hash} +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_tx_passes_context_id_max_chars_to_request_metadata( + _install_manager: I18nManager, fake_gt: FakeGT +) -> None: + """``tx()`` propagates ``_context`` / ``_id`` / ``_max_chars`` into request metadata.""" + _install_manager.set_locale("es") + + await tx("Save", _context="button", _id="save.button", _max_chars=20, _locale="es") + + assert len(fake_gt.calls) == 1 + entry = next(iter(fake_gt.calls[0]["sources"].values())) + meta = entry["metadata"] + assert meta["context"] == "button" + assert meta["id"] == "save.button" + assert meta["maxChars"] == 20 + assert meta["dataFormat"] == "STRING", "tx() defaults $format=STRING, mirroring JS" + + +# --------------------------------------------------------------------------- +# test_tx_batches_concurrent_calls +# +# Multiple concurrent ``tx()`` calls coalesce into a single API request. +# This is the dedup/batching guarantee that makes ``tx()`` safe in hot paths. 
+# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_tx_batches_concurrent_calls(_install_manager: I18nManager, fake_gt: FakeGT) -> None: + """Concurrent distinct ``tx()`` calls are batched into one translate_many call.""" + _install_manager.set_locale("es") + + results = await asyncio.gather( + tx("alpha", _locale="es"), + tx("beta", _locale="es"), + tx("gamma", _locale="es"), + ) + + assert results == ["[es]alpha", "[es]beta", "[es]gamma"] + assert len(fake_gt.calls) == 1 + assert len(fake_gt.calls[0]["sources"]) == 3 + + +# --------------------------------------------------------------------------- +# test_tx_locale_override_per_call +# +# Passing ``_locale`` per call overrides the manager's current locale for +# THIS call only. Enables "translate THIS into French" on the fly. +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_tx_locale_override_per_call(_install_manager: I18nManager, fake_gt: FakeGT) -> None: + """``_locale`` per-call override targets that locale instead of the manager's current one.""" + _install_manager.set_locale("es") # current locale is Spanish + + # …but this call asks for French. + result = await tx("Hello", _locale="fr") + + assert result == "[fr]Hello" + assert len(fake_gt.calls) == 1 + target = fake_gt.calls[0]["options"].get("target_locale") or fake_gt.calls[0]["options"].get("targetLocale") + assert target == "fr" + + +# --------------------------------------------------------------------------- +# test_tx_public_export +# +# Smoke test: ``tx`` is importable from the top-level ``gt_i18n`` package +# and is an async callable. 
+# --------------------------------------------------------------------------- + + +def test_tx_public_export() -> None: + """``tx`` is a public export and is a coroutine function.""" + import gt_i18n + + assert hasattr(gt_i18n, "tx"), "gt_i18n.tx should be a public export" + assert asyncio.iscoroutinefunction(gt_i18n.tx) + assert "tx" in gt_i18n.__all__ diff --git a/packages/gt-i18n/tests/translation_functions/test_tx_scenarios.py b/packages/gt-i18n/tests/translation_functions/test_tx_scenarios.py new file mode 100644 index 0000000..68b0f3d --- /dev/null +++ b/packages/gt-i18n/tests/translation_functions/test_tx_scenarios.py @@ -0,0 +1,279 @@ +"""Extended scenarios for the public ``tx()`` async translator. + +Covers option permutations, concurrency, format variations, unicode, caching, +and source-locale bypass at scale. +""" + +from __future__ import annotations + +import asyncio +from collections.abc import Generator +from typing import TYPE_CHECKING + +import pytest +from gt_i18n import I18nManager, set_i18n_manager, tx + +if TYPE_CHECKING: + from conftest import FakeGT # noqa: F401 + + +@pytest.fixture +def _manager(fake_gt: FakeGT) -> Generator[I18nManager, None, None]: + m = I18nManager( + default_locale="en", + locales=["en", "es", "fr", "de", "ja"], + load_translations=lambda loc: {}, + batch_interval_ms=5, + ) + set_i18n_manager(m) + yield m + + +# --------------------------------------------------------------------------- +# Smoke variations +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_tx_returns_str(_manager: I18nManager) -> None: + _manager.set_locale("es") + result = await tx("Hello") + assert isinstance(result, str) + + +@pytest.mark.asyncio +async def test_tx_with_empty_content(_manager: I18nManager) -> None: + _manager.set_locale("es") + result = await tx("") + # Empty source + fake returning "[es]" → "[es]" + assert result == "[es]" + + +@pytest.mark.asyncio +async def 
test_tx_with_very_long_content(_manager: I18nManager) -> None: + _manager.set_locale("es") + long_msg = "x" * 5000 + result = await tx(long_msg) + assert result == f"[es]{long_msg}" + + +@pytest.mark.asyncio +async def test_tx_with_unicode_content(_manager: I18nManager) -> None: + _manager.set_locale("es") + result = await tx("こんにちは") + assert result == "[es]こんにちは" + + +@pytest.mark.asyncio +async def test_tx_with_emoji_content(_manager: I18nManager) -> None: + _manager.set_locale("es") + result = await tx("🎉 party!") + assert result == "[es]🎉 party!" + + +@pytest.mark.asyncio +async def test_tx_with_multiline_content(_manager: I18nManager) -> None: + _manager.set_locale("es") + result = await tx("line1\nline2\nline3") + assert result == "[es]line1\nline2\nline3" + + +# --------------------------------------------------------------------------- +# Format overrides +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +@pytest.mark.parametrize("fmt", ["STRING", "ICU", "I18NEXT"]) +async def test_tx_format_override_reaches_request_metadata(fake_gt: FakeGT, _manager: I18nManager, fmt: str) -> None: + _manager.set_locale("es") + await tx(f"content-{fmt}", _format=fmt) + meta = next(iter(fake_gt.calls[0]["sources"].values()))["metadata"] + assert meta["dataFormat"] == fmt + + +# --------------------------------------------------------------------------- +# Caching +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_tx_identical_calls_do_not_refire_api(fake_gt: FakeGT, _manager: I18nManager) -> None: + """Second tx() for same content returns cached result — zero extra API calls.""" + _manager.set_locale("es") + a = await tx("repeat") + b = await tx("repeat") + assert a == b == "[es]repeat" + assert len(fake_gt.calls) == 1 + + +@pytest.mark.asyncio +async def test_tx_different_contexts_produce_separate_cache_entries(fake_gt: FakeGT, _manager: 
I18nManager) -> None: + """Same message with different _context hits the API twice (different cache keys).""" + _manager.set_locale("es") + await tx("Save", _context="button") + await tx("Save", _context="menu") + assert len(fake_gt.calls) == 2 + + +@pytest.mark.asyncio +async def test_tx_different_ids_produce_separate_cache_entries(fake_gt: FakeGT, _manager: I18nManager) -> None: + _manager.set_locale("es") + await tx("Go", _id="btn1") + await tx("Go", _id="btn2") + assert len(fake_gt.calls) == 2 + + +@pytest.mark.asyncio +async def test_tx_max_chars_variation_produces_separate_cache_entries(fake_gt: FakeGT, _manager: I18nManager) -> None: + _manager.set_locale("es") + await tx("Hi", _max_chars=10) + await tx("Hi", _max_chars=20) + assert len(fake_gt.calls) == 2 + + +# --------------------------------------------------------------------------- +# Source-locale short circuit permutations +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +@pytest.mark.parametrize("fmt", ["STRING", "ICU", "I18NEXT"]) +async def test_tx_source_locale_skips_api_for_any_format(fake_gt: FakeGT, _manager: I18nManager, fmt: str) -> None: + _manager.set_locale("en") # source + result = await tx("Hello", _format=fmt) + assert result == "Hello" + assert len(fake_gt.calls) == 0 + + +@pytest.mark.asyncio +async def test_tx_per_call_locale_override_routes_even_when_manager_is_source( + fake_gt: FakeGT, _manager: I18nManager +) -> None: + """Per-call _locale override: even if manager is at source, a different _locale hits API.""" + _manager.set_locale("en") + await tx("Hello", _locale="fr") + assert len(fake_gt.calls) == 1 + target = fake_gt.calls[0]["options"].get("target_locale") or fake_gt.calls[0]["options"].get("targetLocale") + assert target == "fr" + + +# --------------------------------------------------------------------------- +# Concurrency + batching +# --------------------------------------------------------------------------- + 
+
+@pytest.mark.asyncio
+async def test_tx_ten_concurrent_distinct_calls_batch_into_one_request(fake_gt: FakeGT, _manager: I18nManager) -> None:
+    _manager.set_locale("es")
+    results = await asyncio.gather(*(tx(f"msg-{i}") for i in range(10)))
+    assert results == [f"[es]msg-{i}" for i in range(10)]
+    assert len(fake_gt.calls) == 1
+    assert len(fake_gt.calls[0]["sources"]) == 10
+
+
+@pytest.mark.asyncio
+async def test_tx_hundred_concurrent_duplicate_calls_dedup_to_one(fake_gt: FakeGT, _manager: I18nManager) -> None:
+    fake_gt.delay_s = 0.02
+    _manager.set_locale("es")
+    results = await asyncio.gather(*(tx("same") for _ in range(100)))
+    assert all(r == "[es]same" for r in results)
+    assert len(fake_gt.calls) == 1
+
+
+@pytest.mark.asyncio
+async def test_tx_concurrent_different_locales_each_get_their_own_call(fake_gt: FakeGT, _manager: I18nManager) -> None:
+    _manager.set_locale("es")
+    results = await asyncio.gather(
+        tx("Hi", _locale="es"),
+        tx("Hi", _locale="fr"),
+        tx("Hi", _locale="de"),
+        tx("Hi", _locale="ja"),
+    )
+    # Each distinct locale gets its own TranslationsCache → one request per locale.
+ assert results == ["[es]Hi", "[fr]Hi", "[de]Hi", "[ja]Hi"] + assert len(fake_gt.calls) == 4 + + +# --------------------------------------------------------------------------- +# Error paths +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_tx_returns_source_when_translation_explicitly_fails(fake_gt: FakeGT, _manager: I18nManager) -> None: + fake_gt.response_factory = lambda sources, options: { + h: {"success": False, "error": "upstream", "code": 500} for h in sources + } + _manager.set_locale("es") + assert await tx("Fallback me") == "Fallback me" + + +# --------------------------------------------------------------------------- +# Options propagation exhaustive matrix +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +@pytest.mark.parametrize( + "opts", + [ + {"_context": "button"}, + {"_id": "save_btn"}, + {"_max_chars": 15}, + {"_context": "menu", "_id": "menu_save"}, + {"_context": "menu", "_max_chars": 30}, + {"_id": "menu_save", "_max_chars": 30}, + {"_context": "modal", "_id": "modal_close", "_max_chars": 10}, + ], +) +async def test_tx_options_reach_metadata(fake_gt: FakeGT, _manager: I18nManager, opts: dict[str, object]) -> None: + _manager.set_locale("es") + await tx("Save", **opts) + meta = next(iter(fake_gt.calls[0]["sources"].values()))["metadata"] + if "_context" in opts: + assert meta["context"] == opts["_context"] + if "_id" in opts: + assert meta["id"] == opts["_id"] + if "_max_chars" in opts: + assert meta["maxChars"] == opts["_max_chars"] + + +# --------------------------------------------------------------------------- +# Cache hit path exercises hash computation including options +# --------------------------------------------------------------------------- + + +@pytest.mark.asyncio +async def test_tx_repeated_call_same_options_cache_hit(fake_gt: FakeGT, _manager: I18nManager) -> None: + _manager.set_locale("es") + 
await tx("Save", _context="button", _id="save", _max_chars=20) + calls_before = len(fake_gt.calls) + result = await tx("Save", _context="button", _id="save", _max_chars=20) + assert result == "[es]Save" + assert len(fake_gt.calls) == calls_before + + +# --------------------------------------------------------------------------- +# Public export shape +# --------------------------------------------------------------------------- + + +def test_tx_is_importable_from_multiple_paths() -> None: + """``tx`` is reachable from top-level and from the translation_functions module.""" + from gt_i18n import tx as a + from gt_i18n.translation_functions import tx as b + from gt_i18n.translation_functions._tx import tx as c + + assert a is b is c + + +def test_tx_coroutine_signature() -> None: + import inspect + + sig = inspect.signature(tx) + params = list(sig.parameters.keys()) + # tx(content, **kwargs) + assert params[0] == "content" + # kwargs via var keyword + assert any(p.kind == inspect.Parameter.VAR_KEYWORD for p in sig.parameters.values())
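The format-aware hashing rule that the golden tests above pin down (PR #1207: `index_vars` applies ONLY to ICU, while STRING and I18NEXT hash the raw message) can be illustrated with a self-contained toy sketch. Everything below is a simplified stand-in for illustration: the regex-based `index_vars` and the SHA-256 digest are NOT the real `generaltranslation` `index_vars` / `hash_source` implementations, and the real digest inputs (context, id, maxChars) are omitted.

```python
import hashlib
import re


def index_vars(message: str) -> str:
    """Toy normalizer: rename ICU placeholders positionally ({name} -> {_gt_0})."""
    counter = iter(range(1000))
    return re.sub(r"\{(\w+)\}", lambda m: f"{{_gt_{next(counter)}}}", message)


def hash_message(message: str, format: str = "ICU") -> str:
    """Sketch of the PR #1207 branch: index vars ONLY for ICU, hash raw otherwise."""
    normalized = index_vars(message) if format == "ICU" else message
    # Folding the format into the digest input keeps ICU/STRING/I18NEXT hashes
    # distinct for the same template (the cross-format collision pin above).
    digest = hashlib.sha256(f"{format}:{normalized}".encode()).hexdigest()
    return digest[:16]  # 16-char hex, matching the fixture sanity check


# Two ICU templates differing only in variable names normalize to the same key;
# the same two templates hashed as STRING stay distinct because they hash raw.
assert hash_message("Hi, {a}!") == hash_message("Hi, {b}!")
assert hash_message("Hi, {a}!", format="STRING") != hash_message("Hi, {b}!", format="STRING")
```

This is why cache entries keyed under different formats can never collide even when the source text is byte-identical, which is exactly what `test_icu_vs_string_produce_different_hashes_for_same_template` asserts against the real implementation.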