Commit History

Author             SHA1        Date        Message
Hamid Shojanazeri  8b0008433c  1 year ago  fix typos
Hamid Shojanazeri  564ef2f628  1 year ago  remove padding logic
Hamid Shojanazeri  277a292fbc  1 year ago  adding autotokenizer
Hamid Shojanazeri  3f2fb9167e  1 year ago  adding notes to model not supporting infilling
sekyonda           1e0e4fb8a9  1 year ago  Update inference.md
sekyondaMeta       37d7151494  1 year ago  Merge branch 'facebookresearch:main' into promptInfo
Hamid Shojanazeri  c62428b99c  1 year ago  setting defaults of temp and top_p
Hamid Shojanazeri  c014ae7cb8  1 year ago  setting BT option to true
Hamid Shojanazeri  4fa44e16d9  1 year ago  add note for python llama not suited for llama infilling
Hamid Shojanazeri  b18a186385  1 year ago  removing the option to take prompt from cli
Hamid Shojanazeri  75991d8795  1 year ago  fix the extra line added and remove take prompt from cli
Hamid Shojanazeri  d28fc9898a  1 year ago  addressing doc comments
Hamid Shojanazeri  a234d1fe0c  1 year ago  fix typos
Hamid Shojanazeri  2d9f4796e8  1 year ago  fixing the output format
Hamid Shojanazeri  1e8ea70b26  1 year ago  adding llama code inference
Geeta Chauhan      82e05c46e0  1 year ago  fix a bug in the config for use_fast_kernels (#121)
Hamid Shojanazeri  971c079aa6  1 year ago  bugfix: remove duplicate load_peft_model (#124)
sekyonda           58c6ae8f99  1 year ago  Update inference.md
hongbo.mo          fcc817e923  1 year ago  bugfix: remove duplicate load_peft_model
Brian Vaughan      3faf005226  1 year ago  fix a bug in the config for use_fast_kernels
Abhilash Majumder  d5f39914e8  1 year ago  Merge branch 'main' into ipex_feature
abhilash1910       82d3ca6e06  1 year ago  Fix bugs in data loading
Geeta Chauhan      03faba661f  1 year ago  Update paddings (#85)
Geeta Chauhan      205e5a4b81  1 year ago  save cpu mem by leveraging FSDP rank0 broadcasting (#77)
Hamid Shojanazeri  85a4ed1b65  1 year ago  Merge branch 'main' into update_paddings
abhilash1910       ed7ba999a9  1 year ago  enable xpu finetuning and inference
lchu               feaa344af3  1 year ago  resolve conflicts
Geeta Chauhan      3f1fef7a00  1 year ago  adding flash attention and xformer memory efficient through PT SDPA (#97)
Hamid Shojanazeri  beab5726cc  1 year ago  add notes for padding
Hamid Shojanazeri  c3a11c4fbe  1 year ago  update to main