fix(workflows): preserve pipelines key when proxying resolve to cloud #1020
Merged
Conversation
Starlette caches the parsed request JSON on ``request._json`` and returns the same dict reference on every ``request.json()`` call. ``_merge_legacy_pipelines`` was popping ``pipelines`` out of that cached dict during Pydantic validation, so when ``cloud_proxy`` later read the body to forward it upstream, ``pipelines`` was already gone. Old cloud builds still require ``pipelines`` and rejected the resulting ``nodes``-only body with ``body.pipelines: Field required`` — affecting every workflow proxied to cloud, including bundled starter workflows.

Return a new dict from the validator instead. The original body is preserved verbatim, so whatever the frontend sent (legacy ``pipelines``, new ``nodes``, or both) is what reaches the cloud.

Adds a regression test asserting ``model_validate`` does not mutate its input.

Signed-off-by: Rafał Leszko <rafal@livepeer.org>
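A standalone sketch of the fixed behavior and of the regression check described above, with a plain function standing in for the real Pydantic validator (all names here are illustrative, not the actual project code):

```python
import copy

def merge_legacy_pipelines(data):
    """Fixed validator: builds a new dict and leaves its input alone."""
    merged = dict(data)                        # shallow copy of the body
    pipelines = merged.pop("pipelines", None)  # only the copy loses the key
    if pipelines is not None:
        # combine legacy "pipelines" entries with any new-style "nodes"
        merged["nodes"] = [*merged.get("nodes", []), *pipelines]
    return merged

body = {"pipelines": [{"id": "p1"}], "nodes": [{"id": "n1"}]}
snapshot = copy.deepcopy(body)
merged = merge_legacy_pipelines(body)

assert body == snapshot           # regression check: input not mutated
assert len(merged["nodes"]) == 2  # internal model still merges both keys
```

Because the validator copies before popping, the cached request body keeps its ``pipelines`` key no matter how many times it is re-read later in the request lifecycle.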
Summary

Selecting any featured workflow while connected to cloud failed with

`Cloud proxy request failed: status: 422 ... loc: ['body', 'pipelines'], msg: 'Field required'`

— affecting every workflow, including the bundled starters.

Root cause
`_merge_legacy_pipelines` was popping `pipelines` out of the request dict during Pydantic validation: Starlette caches the parsed request JSON on `request._json` and returns the same dict reference on every `await request.json()` call. FastAPI handed that cached dict to Pydantic for validation, then `cloud_proxy._proxy_to_cloud` read it again to forward the body upstream — by which point `pipelines` had been stripped and only `nodes` remained. Older cloud builds still require `pipelines` and rejected the resulting `nodes`-only payload, so every proxied workflow blew up.

Reproduced before the fix:
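A minimal stand-in for the failure mode, using plain dicts instead of the actual FastAPI/Pydantic stack (names are illustrative):

```python
# Starlette parses the body once and caches it, so the validator and the
# cloud proxy both see the *same* dict object.
cached_json = {"pipelines": [{"id": "p1"}]}  # stands in for request._json

def merge_legacy_pipelines(data):
    """Buggy validator: pops "pipelines" out of its input."""
    pipelines = data.pop("pipelines", None)  # mutates the caller's dict
    if pipelines is not None:
        data["nodes"] = pipelines
    return data

merge_legacy_pipelines(cached_json)  # runs during Pydantic validation
forwarded_body = cached_json         # cloud_proxy re-reads request.json()
print("pipelines" in forwarded_body) # False; old cloud rejects the body
```

The second read returns the already-stripped dict, so the upstream request arrives `nodes`-only and fails validation on older cloud builds.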
Fix

Return a new dict from the validator instead of mutating the input. The original `request._json` is left alone, so whatever the frontend sent (legacy `pipelines`, new `nodes`, or both) reaches the cloud verbatim. Internal model semantics are unchanged: `wf.nodes` still combines both keys for in-process consumers.

Compatibility matrix after this fix
`nodes` ✅ `pipelines`; out of our reach for `nodes`-only

Test plan
- `uv run pytest tests/test_workflow_resolve.py` — 54 passed (includes the new `test_validation_does_not_mutate_input_dict` regression).
- `uv run ruff check src/scope/core/workflows/resolve.py tests/test_workflow_resolve.py` — clean.
- `uv run ruff format --check` — clean.

🤖 Generated with Claude Code