
fix: cast block_seed to int before manual_seed() #620

Open

livepeer-tessa wants to merge 1 commit into main from fix/manual-seed-float-cast

Conversation


@livepeer-tessa livepeer-tessa commented Mar 8, 2026

Summary

Cast block_seed to int before passing it to torch.Generator.manual_seed() in both PrepareVideoLatentsBlock and PrepareLatentsBlock.

Problem

manual_seed() requires a long/int, but base_seed can arrive as a float when deserialized from JSON, causing:

RuntimeError: manual_seed expected a long, but got float

Fix

One-line change in each block: int(base_seed + block_state.current_start_frame)

Fixes #618
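The failure mode can be reproduced without torch at all: JSON deserialization hands back a float wherever one was serialized, and adding an integer frame offset keeps the result a float. A minimal sketch of the cast (the JSON payload shape and variable layout are illustrative, not the actual block code):

```python
import json

# A seed serialized as 42.0 (or produced by float arithmetic upstream)
# comes back from json.loads as a Python float, not an int.
state = json.loads('{"base_seed": 42.0, "current_start_frame": 8}')
base_seed = state["base_seed"]                      # 42.0 (float)
current_start_frame = state["current_start_frame"]  # 8 (int)

# float + int is still a float; torch.Generator.manual_seed() rejects it
# with "manual_seed expected a long, but got float". Casting fixes that.
block_seed = int(base_seed + current_start_frame)

print(type(base_seed).__name__, block_seed)  # float 50
```

The resulting block_seed is a plain int that manual_seed() accepts.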

Summary by CodeRabbit

  • Bug Fixes
    • Fixed seed calculation in video latent generation to ensure proper type handling, improving reliability and consistency in random number generation for video processing workflows.

torch.Generator.manual_seed() requires a long/int, but base_seed can
arrive as a float when deserialized from JSON. Cast block_seed to int
in both PrepareVideoLatentsBlock and PrepareLatentsBlock.

Fixes #618

Signed-off-by: livepeer-robot <robot@livepeer.org>

coderabbitai bot commented Mar 8, 2026

📝 Walkthrough

Two pipeline blocks in the WAN2.1 pipeline are updated to explicitly cast block_seed to an integer before passing it to torch.Generator.manual_seed(). The seed is computed by adding base_seed and block_state.current_start_frame, with the result now wrapped in int() to ensure type consistency.

Changes

Cohort: Block Seed Type Casting
File(s): src/scope/core/pipelines/wan2_1/blocks/prepare_latents.py, src/scope/core/pipelines/wan2_1/blocks/prepare_video_latents.py
Summary: Cast block_seed to int() when computed from base_seed and current_start_frame. Ensures torch.Generator.manual_seed() receives an integer instead of a float, resolving RuntimeError: manual_seed expected a long, but got float.

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~3 minutes

Poem

🐰 A seed cast true, no float in sight,
Where torch.Generator finds delight,
An int, not float, the fix takes wing,
Now latents prepare without a sting!

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

Docstring Coverage ⚠️ Warning: Docstring coverage is 0.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (4 passed)

Description Check ✅ Passed: Check skipped because CodeRabbit’s high-level summary is enabled.
Title Check ✅ Passed: The title accurately summarizes the main change: casting block_seed to int before calling manual_seed(), which directly addresses the core issue.
Linked Issues Check ✅ Passed: The PR fully addresses issue #618 by casting block_seed to int in both PrepareVideoLatentsBlock and PrepareLatentsBlock before calling manual_seed().
Out of Scope Changes Check ✅ Passed: All changes are directly related to fixing the block_seed float-casting issue in the two affected blocks; no out-of-scope modifications detected.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


@coderabbitai coderabbitai bot left a comment

🧹 Nitpick comments (1)
src/scope/core/pipelines/wan2_1/blocks/prepare_latents.py (1)

93-93: Validate base_seed before truncating it.

int(base_seed + block_state.current_start_frame) fixes 42.0, but it also silently turns values like 42.7 into 42. Since base_seed is declared as an integer input, it would be safer to normalize or reject non-integral floats once when the state/config is built, then keep integer arithmetic here.
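One way to implement that suggestion: validate the seed once where the config/state is constructed, accepting integral floats but rejecting lossy ones, so the blocks can assume an int. This is a hypothetical helper sketching the reviewer's idea; normalize_seed is not part of the codebase:

```python
def normalize_seed(value):
    # Hypothetical validation helper: run once when the config/state is
    # built, so downstream blocks can drop the int() cast entirely.
    if isinstance(value, bool):  # bool is an int subclass; reject explicitly
        raise TypeError("seed must be an int, got bool")
    if isinstance(value, int):
        return value
    if isinstance(value, float) and value.is_integer():
        return int(value)  # 42.0 -> 42: no information lost
    raise ValueError(f"seed must be an integral number, got {value!r}")
```

With this in place, normalize_seed(42.0) returns 42, while normalize_seed(42.7) raises instead of silently seeding with 42.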

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/scope/core/pipelines/wan2_1/blocks/prepare_latents.py` at line 93,
base_seed is being truncated silently by int(...) in the prepare_latents block
(block_seed = int(base_seed + block_state.current_start_frame)); validate and
normalize base_seed earlier (when the config/state is constructed) so it is
guaranteed to be an integer or explicitly reject non-integral floats, then use
integer arithmetic here without int() casting; update the config/state
validation routine that sets base_seed to either raise on non-integer floats or
coerce using a clear normalization (e.g., round or floor) and add a unit test
asserting prepare_latents sees only integer base_seed.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 40d98101-86a0-416d-a32b-58b1d26b257f

📥 Commits

Reviewing files that changed from the base of the PR and between 0a52f7c and 56cfb8a.

📒 Files selected for processing (2)
  • src/scope/core/pipelines/wan2_1/blocks/prepare_latents.py
  • src/scope/core/pipelines/wan2_1/blocks/prepare_video_latents.py


github-actions bot commented Mar 8, 2026

🚀 fal.ai Preview Deployment

App ID daydream/scope-pr-620--preview
WebSocket wss://fal.run/daydream/scope-pr-620--preview/ws
Commit 56cfb8a

Testing

Connect to this preview deployment by running this on your branch:

uv run build && SCOPE_CLOUD_APP_ID="daydream/scope-pr-620--preview/ws" uv run daydream-scope

🧪 E2E tests will run automatically against this deployment.


github-actions bot commented Mar 8, 2026

✅ E2E Tests passed

Status passed
fal App daydream/scope-pr-620--preview

Test Artifacts

Check the workflow run for screenshots.


Development

Successfully merging this pull request may close these issues.

PrepareVideoLatentsBlock: manual_seed fails when block_seed is a float
