fix: cast block_seed to int before manual_seed() #620
livepeer-tessa wants to merge 1 commit into main
Conversation
torch.Generator.manual_seed() requires a long/int, but base_seed can arrive as a float when deserialized from JSON. Cast block_seed to int in both PrepareVideoLatentsBlock and PrepareLatentsBlock. Fixes #618 Signed-off-by: livepeer-robot <robot@livepeer.org>
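The failure mode can be reproduced without the pipeline; a seed written as `42.0` in JSON comes back as a Python `float` (the payload here is illustrative, not taken from the PR):

```python
import json

# A seed serialized as "42.0" deserializes as a Python float, not an int.
payload = '{"base_seed": 42.0}'
base_seed = json.loads(payload)["base_seed"]
print(type(base_seed).__name__)  # float
print(base_seed == 42)           # True numerically, but manual_seed() still rejects floats
```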
📝 Walkthrough
Two pipeline blocks in the WAN2.1 pipeline are updated to explicitly cast the computed seed to `int` before passing it to `torch.Generator.manual_seed()`.

Changes
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~3 minutes
🚥 Pre-merge checks: ✅ 4 passed | ❌ 1 failed (1 warning)
🧹 Nitpick comments (1)
src/scope/core/pipelines/wan2_1/blocks/prepare_latents.py (1)
Line 93: Validate `base_seed` before truncating it.
`int(base_seed + block_state.current_start_frame)` fixes `42.0`, but it also silently turns values like `42.7` into `42`. Since `base_seed` is declared as an integer input, it would be safer to normalize or reject non-integral floats once when the state/config is built, then keep integer arithmetic here.
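The reviewer's suggestion could look like the following helper; the name `normalize_seed` and its placement in config/state construction are assumptions for illustration, not code from the PR:

```python
def normalize_seed(value):
    """Coerce a JSON-deserialized seed to int, rejecting non-integral floats.

    Hypothetical validation helper: accepts 42 and 42.0, rejects 42.7,
    so downstream code can use plain integer arithmetic without int() casts.
    """
    if isinstance(value, bool):  # bool is a subclass of int; reject it explicitly
        raise TypeError("seed must be an int, got bool")
    if isinstance(value, int):
        return value
    if isinstance(value, float) and value.is_integer():
        return int(value)
    raise ValueError(f"seed must be an integral number, got {value!r}")
```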
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 40d98101-86a0-416d-a32b-58b1d26b257f
📒 Files selected for processing (2)
src/scope/core/pipelines/wan2_1/blocks/prepare_latents.py
src/scope/core/pipelines/wan2_1/blocks/prepare_video_latents.py
🚀 fal.ai Preview Deployment
Testing: Connect to this preview deployment by running this on your branch. 🧪 E2E tests will run automatically against this deployment.
✅ E2E Tests passed
Test Artifacts: Check the workflow run for screenshots.
Summary
Cast `block_seed` to `int` before passing to `torch.Generator.manual_seed()` in both `PrepareVideoLatentsBlock` and `PrepareLatentsBlock`.

Problem
`manual_seed()` requires a long/int, but `base_seed` can arrive as a `float` when deserialized from JSON, causing a `TypeError`.

Fix
One-line change in each block:
`int(base_seed + block_state.current_start_frame)`

Fixes #618
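Sketched outside the pipeline (the names `base_seed` and `current_start_frame` come from the PR; the values are illustrative):

```python
base_seed = 42.0          # arrived as a float after JSON deserialization
current_start_frame = 3   # illustrative offset; the real value comes from block_state

# Without the cast the sum stays a float and manual_seed() rejects it;
# casting to int yields a valid per-block seed.
block_seed = int(base_seed + current_start_frame)
print(block_seed)  # 45
```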