
Cosine LR Scheduler

Linear warmup + cosine annealing

Medium · Training

Problem Description

Implement a cosine learning rate schedule with linear warmup.

Signature

def cosine_lr_schedule(step, total_steps, warmup_steps, max_lr, min_lr=0.0) -> float:

Schedule

step < warmup_steps:  lr = max_lr * step / warmup_steps                        (linear ramp)
step >= warmup_steps: lr = min_lr + 0.5 * (max_lr - min_lr) * (1 + cos(π * progress))

where progress = (step - warmup_steps) / (total_steps - warmup_steps)
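The two branches can be sanity-checked numerically. The values below use hypothetical settings (max_lr = 1e-3, min_lr = 0.0, not part of the problem statement) and evaluate the cosine term at three key points of the decay:

```python
import math

max_lr, min_lr = 1e-3, 0.0  # hypothetical example values

# At the end of warmup, progress = 0, so cos(pi * 0) = 1 and lr = max_lr.
lr_start = min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * 0.0))

# Halfway through the decay, progress = 0.5, so cos(pi/2) ≈ 0 and lr is
# the midpoint between max_lr and min_lr.
lr_mid = min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * 0.5))

# At the final step, progress = 1, so cos(pi) = -1 and lr = min_lr.
lr_end = min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * 1.0))

print(lr_start, lr_mid, lr_end)  # 0.001, ~0.0005, ~0.0
```

Note the endpoints: the cosine branch starts exactly at max_lr (so it meets the warmup ramp) and ends exactly at min_lr.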

Template

Implement the function below. Use only basic PyTorch operations.

# ✏️ YOUR IMPLEMENTATION HERE
def cosine_lr_schedule(step, total_steps, warmup_steps, max_lr, min_lr=0.0):
    pass  # warmup then cosine decay

Test Your Implementation

Use this code to debug before submitting.

# 🧪 Debug
lrs = [cosine_lr_schedule(i, 100, 10, 0.001) for i in range(101)]
print(f'Start: {lrs[0]:.6f}')
print(f'Warmup end: {lrs[10]:.6f}')
print(f'Mid: {lrs[55]:.6f}')
print(f'End: {lrs[100]:.6f}')

Reference Solution

Try solving it yourself first! Click below to reveal the solution.

# ✅ SOLUTION
import math

def cosine_lr_schedule(step, total_steps, warmup_steps, max_lr, min_lr=0.0):
    if step < warmup_steps:
        return max_lr * step / warmup_steps
    if step >= total_steps:
        return min_lr
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return min_lr + 0.5 * (max_lr - min_lr) * (1.0 + math.cos(math.pi * progress))
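In a real training loop this schedule is typically applied by assigning the returned value to each entry of optimizer.param_groups before optimizer.step(). The sketch below uses a plain dict as a stand-in for a param group (so it runs without PyTorch) and checks the expected shape of the curve; the settings (total_steps=100, warmup_steps=10, max_lr=3e-4) are illustrative, not prescribed:

```python
import math

def cosine_lr_schedule(step, total_steps, warmup_steps, max_lr, min_lr=0.0):
    # Reference schedule from above, repeated so this snippet is self-contained.
    if step < warmup_steps:
        return max_lr * step / warmup_steps
    if step >= total_steps:
        return min_lr
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return min_lr + 0.5 * (max_lr - min_lr) * (1.0 + math.cos(math.pi * progress))

# Stand-in for optimizer.param_groups; in PyTorch you would set
# group["lr"] the same way before calling optimizer.step().
param_groups = [{"lr": 0.0}]

lrs = []
for step in range(100):
    lr = cosine_lr_schedule(step, total_steps=100, warmup_steps=10, max_lr=3e-4)
    for group in param_groups:
        group["lr"] = lr
    lrs.append(lr)
    # ... forward pass, backward pass, optimizer.step() would go here ...

# Linear ramp during warmup, smooth cosine decay afterwards.
assert all(a < b for a, b in zip(lrs[:10], lrs[1:11]))   # increasing through warmup
assert all(a >= b for a, b in zip(lrs[10:], lrs[11:]))   # non-increasing after warmup
```

The same per-step factor can also be handed to torch.optim.lr_scheduler.LambdaLR by dividing the returned lr by max_lr, since LambdaLR multiplies the base learning rate by the lambda's output.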

Tips

Run Locally

For interactive practice with auto-grading, run TorchCode locally:
Install with pip install torch-judge, then call check("cosine_lr")

Key Concepts

Linear warmup + cosine annealing
