| File | Last commit | Commit message | Last updated |
| --- | --- | --- | --- |
| `__init__.py` | 207d2f80e9 | Make code-llama and hf-tgi inference runnable as module | 1 year ago |
| `chat_utils.py` | a414ca6a57 | Update chat format for llama3 | 11 months ago |
| `checkpoint_converter_fsdp_hf.py` | ce9501f22c | remove relative imports | 1 year ago |
| `llm.py` | 81984a9a44 | Remove unnecessary spec format | 1 year ago |
| `model_utils.py` | d51d2cce9c | adding sdpa for flash attn | 1 year ago |
| `prompt_format_utils.py` | 3e710f71f8 | renaming the prompt format file to conform to repo standards | 1 year ago |
| `safety_utils.py` | c0886a0a89 | Fixing typo in self | 1 year ago |