Issue on page /第六章/6.2 动态调整学习率.html #93

Open

Colwzq opened this issue Jun 17, 2024 · 1 comment

Colwzq commented Jun 17, 2024

The content here seems to have a problem.

# Choose an optimizer
optimizer = torch.optim.Adam(...)
# Choose one or more of the dynamic learning-rate schedulers mentioned above
scheduler1 = torch.optim.lr_scheduler....
scheduler2 = torch.optim.lr_scheduler....
...
schedulern = torch.optim.lr_scheduler....
# Training
for epoch in range(100):
    train(...)
    validate(...)
    optimizer.step()
    # The learning rate should only be adjusted after the optimizer has updated the parameters
# The scheduler step is performed at the end of each epoch
scheduler1.step()
...
schedulern.step()

The example in the documentation trains for 100 epochs, but by the time scheduler1.step() runs, training has already finished, so the dynamic learning-rate adjustment never takes effect.
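(A minimal self-contained check of this behaviour; the toy model and learning-rate values below are illustrative, not from the tutorial. With scheduler.step() left outside the epoch loop, the learning rate printed inside the loop never changes.)

import torch

model = torch.nn.Linear(2, 1)                      # toy model, for illustration only
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)

for epoch in range(5):
    # train(...) / validate(...) would run here
    optimizer.step()
    print(epoch, optimizer.param_groups[0]['lr'])  # prints 0.1 on every epoch

scheduler.step()                                   # runs only once, after training has ended
print('after training:', optimizer.param_groups[0]['lr'])  # 0.09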

One of the official example snippets is as follows:

optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
scheduler = ExponentialLR(optimizer, gamma=0.9)

for epoch in range(20):
    for input, target in dataset:
        optimizer.zero_grad()
        output = model(input)
        loss = loss_fn(output, target)
        loss.backward()
        optimizer.step()
    scheduler.step()

As you can see, there are two nested for loops, and scheduler.step() is placed after the inner for loop, i.e. still inside the epoch loop.
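Accordingly, the snippet on the tutorial page presumably just needs the scheduler steps indented into the epoch loop. A rough sketch of the corrected structure (the model, data, and scheduler choices below are illustrative placeholders, not from the tutorial):

import torch

model = torch.nn.Linear(10, 1)                     # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
# one or more schedulers can share the same optimizer
scheduler1 = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.99)
scheduler2 = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(100):
    # train(...) and validate(...) would run here; a dummy update stands in for them
    optimizer.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()
    loss.backward()
    optimizer.step()
    # scheduler steps belong here: once per epoch, after optimizer.step()
    scheduler1.step()
    scheduler2.step()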

@ZhikangNiu
Collaborator
