
fix a bug in the config for use_fast_kernels (#121)

Geeta Chauhan, 1 year ago
Parent
Current commit
82e05c46e0
1 changed file with 2 additions and 2 deletions
  1. configs/training.py (+2 −2)

+ 2 - 2
configs/training.py

@@ -33,8 +33,8 @@ class train_config:
     dist_checkpoint_root_folder: str="PATH/to/save/FSDP/model" # will be used if using FSDP
     dist_checkpoint_folder: str="fine-tuned" # will be used if using FSDP
     save_optimizer: bool=False # will be used if using FSDP
-    use_fast_kernels: bool = False, # Enable using SDPA from PyTroch Accelerated Transformers, make use Flash Attention and Xformer memory-efficient kernels
+    use_fast_kernels: bool = False # Enable using SDPA from PyTroch Accelerated Transformers, make use Flash Attention and Xformer memory-efficient kernels
 
     
     
-    
+
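For context (not part of the commit itself): the bug is the trailing comma after the field default. In Python, `False,` is a one-element tuple `(False,)`, so the dataclass field default is a truthy tuple rather than the boolean `False`, and any `if train_config.use_fast_kernels:` check would pass even with the flag "off". A minimal sketch of the before/after behavior, using a simplified stand-in for the real `train_config`:

```python
from dataclasses import dataclass


@dataclass
class BuggyConfig:
    # Trailing comma: the default is the tuple (False,), not the bool False.
    # A tuple with one element is truthy, so feature gates on this flag
    # would fire even though it reads as "False".
    use_fast_kernels: bool = False,  # noqa: the annotation is not enforced


@dataclass
class FixedConfig:
    # Without the comma, the default is the plain boolean False.
    use_fast_kernels: bool = False


buggy = BuggyConfig()
fixed = FixedConfig()

print(buggy.use_fast_kernels)        # (False,)
print(bool(buggy.use_fast_kernels))  # True  -- the gate opens by mistake
print(fixed.use_fast_kernels)        # False
print(bool(fixed.use_fast_kernels))  # False
```

Dataclasses do not enforce type annotations at runtime, which is why the buggy version runs silently instead of raising an error; removing the comma is the whole fix.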