docs: add blueprint for finetuning an LLM on a Slurm worker #519

Draft
dmingolla wants to merge 13 commits into master from ai-factory/finetuning-on-slurm-worker

Conversation

@dmingolla

Description

Add blueprint: How to finetune an LLM on a Slurm worker. Covers preparing the host folder, configuring the Slurm worker template (virtiofs, GPU, start script), and running the job from the Slurm controller.
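The workflow the description outlines — preparing a shared host folder, exposing it to the worker via virtiofs, and submitting the finetuning job from the Slurm controller — could be sketched as a batch script like the one below. This is a minimal illustration, not the blueprint's actual script; every path, partition name, and script name here is hypothetical:

```shell
#!/bin/bash
#SBATCH --job-name=llm-finetune        # hypothetical job name
#SBATCH --partition=gpu                # assumes a GPU partition is defined
#SBATCH --gres=gpu:1                   # request one passed-through GPU
#SBATCH --output=finetune-%j.log

# Hypothetical virtiofs mount point for the prepared host folder
MODEL_DIR=/mnt/models/base-model

# Launch a (hypothetical) finetuning script against the shared model
python3 finetune.py --model-path "$MODEL_DIR" --output-dir "${MODEL_DIR}-finetuned"
```

Submitted from the controller with `sbatch finetune.sbatch`; the `--gres=gpu:1` line presumes the worker template exposes a GPU to the VM, as the description's GPU configuration step suggests.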

Branches to which this PR applies

  • master

@dmingolla dmingolla changed the base branch from one-7.0 to master February 2, 2026 16:54
@dmingolla dmingolla changed the title from "Ai factory/finetuning on slurm worker" to "docs: add blueprint for finetuning an LLM on a Slurm worker" Feb 2, 2026
  • …00L GPU usage, update driver instructions, and add image references for clarity.
  • …th NVIDIA H100L and driver installation instructions, and improve alert formatting for better user guidance.
  • …ty and correct folder references for improved user understanding.
  • …e naming detail and emphasize model path configuration for clarity.
  • … name requirement into model path configuration for improved clarity.
@dmingolla dmingolla self-assigned this Feb 3, 2026
@dmingolla dmingolla requested a review from km4rcus February 3, 2026 08:29
  • …te model directory, download model, and install dependencies at boot, while clarifying GPU passthrough configuration.
  • …irements, streamline GPU passthrough and start script instructions, and enhance job submission details for improved user experience.
  • …h and start script sections, enhancing visual clarity and user guidance.
  • …dd important prerequisites for LLM Inference validation, and streamline start script instructions by removing unnecessary network configuration details
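The commit messages above describe a worker start script that creates the model directory, downloads the model, and installs dependencies at boot. A minimal sketch of such a script, assuming a virtiofs-shared folder — every path, package name, and model identifier below is hypothetical:

```shell
#!/bin/bash
set -euo pipefail

# Hypothetical mount point for the virtiofs-shared host folder
MODEL_DIR=/mnt/models/base-model

# Create the model directory in case the share starts out empty
mkdir -p "$MODEL_DIR"

# Install Python dependencies at boot (package list is illustrative)
pip3 install --quiet transformers datasets peft

# Download the model only if it is not already present in the shared folder
if [ ! -f "$MODEL_DIR/config.json" ]; then
    huggingface-cli download some-org/some-model --local-dir "$MODEL_DIR"
fi
```

Guarding the download with a file-existence check keeps reboots fast once the model is cached in the shared folder, which is one plausible reason to combine virtiofs sharing with a boot-time start script.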
