marx / llama-recipes
Branch: main

llama-recipes / src / llama_recipes / inference

Latest commit by Yuanhao (e554c1c8bf): The tokenizer will not add eos_token by default (1 year ago)
File                              Commit       Last commit message                                         Age
..
__init__.py                       207d2f80e9   Make code-llama and hf-tgi inference runnable as module     1 year ago
chat_utils.py                     e554c1c8bf   The tokenizer will not add eos_token by default             1 year ago
checkpoint_converter_fsdp_hf.py   ce9501f22c   remove relative imports                                     1 year ago
model_utils.py                    4c9cc7d223   Move modules into separate src folder                       1 year ago
safety_utils.py                   cf678b9bf0   Adjust imports to package structure + cleaned up imports    1 year ago
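The latest commit on chat_utils.py above notes that the tokenizer will not add eos_token by default. A minimal sketch of what that implies, assuming a Hugging Face transformers tokenizer is used for inference (the model id below is only an illustrative example, not taken from this repository):

    # Minimal sketch, assuming a Hugging Face `transformers` tokenizer.
    # LLaMA-style tokenizers prepend bos_token when encoding but do not
    # append eos_token, so chat/inference code must add it explicitly
    # whenever a turn should be terminated.
    from transformers import AutoTokenizer

    # Illustrative checkpoint only; any LLaMA-family tokenizer behaves the same way.
    tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

    prompt = "Hello, how are you?"
    ids = tokenizer.encode(prompt)                 # bos + prompt tokens, no eos appended
    ids_with_eos = ids + [tokenizer.eos_token_id]  # append eos explicitly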