
remove the redundant lr step

Hamid Shojanazeri, 1 year ago
commit 4f70348b94
1 file changed, 0 insertions, 2 deletions

+ 0 - 2
utils/train_utils.py

@@ -193,8 +193,6 @@ def train(model, train_dataloader,eval_dataloader, tokenizer, optimizer, lr_sche
         else:
             print(f"Epoch {epoch+1}: train_perplexity={train_perplexity:.4f}, train_epoch_loss={train_epoch_loss:.4f}")
             
-        lr_scheduler.step()
-
     avg_train_prep = sum(train_prep)/len(train_prep)
     avg_train_loss = sum(train_loss)/len(train_loss)
     if train_config.run_validation:
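The deleted epoch-level `lr_scheduler.step()` was redundant, presumably because the scheduler is already advanced elsewhere in the training loop; calling `step()` an extra time per epoch makes the learning rate decay faster than the schedule intends. The sketch below illustrates the effect with a minimal stand-in for a PyTorch-style `StepLR` scheduler (a hypothetical simplification, not the real `torch.optim.lr_scheduler` API):

```python
# Hypothetical stand-in for a StepLR-style scheduler: decays the learning
# rate by `gamma` every `step_size` calls to step().
class StepLR:
    def __init__(self, lr, step_size, gamma):
        self.lr = lr
        self.step_size = step_size
        self.gamma = gamma
        self._count = 0

    def step(self):
        self._count += 1
        if self._count % self.step_size == 0:
            self.lr *= self.gamma


def lr_per_epoch(epochs, steps_per_epoch):
    """Record the learning rate at the end of each epoch when the
    scheduler is stepped `steps_per_epoch` times per epoch."""
    sched = StepLR(lr=1e-3, step_size=2, gamma=0.5)
    lrs = []
    for _ in range(epochs):
        for _ in range(steps_per_epoch):
            sched.step()
        lrs.append(sched.lr)
    return lrs


# Stepping once per epoch follows the intended schedule; the extra
# epoch-level step() (the removed line) doubles the step count, so the
# learning rate decays twice as fast.
intended = lr_per_epoch(epochs=4, steps_per_epoch=1)
redundant = lr_per_epoch(epochs=4, steps_per_epoch=2)
```

With these numbers, the intended schedule halves the rate every two epochs, while the double-stepped run halves it every epoch, ending at a quarter of the intended value after four epochs.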