
Inference

This folder contains inference examples for Llama 2. So far, we have provided support for three methods of inference:

  1. The inference.py script provides support for Hugging Face accelerate, PEFT, and FSDP fine-tuned models (see the first sketch below this list).

  2. The vLLM_inference.py script takes advantage of vLLM's paged attention for low-latency generation (see the second sketch below this list).

  3. The hf-text-generation-inference folder contains information on Hugging Face Text Generation Inference (TGI).
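As a rough illustration of the first path, the sketch below loads a Hugging Face Llama 2 checkpoint together with a PEFT (LoRA) adapter and generates a completion. This is a minimal sketch under stated assumptions, not the actual inference.py implementation; the model name and adapter path are placeholders to replace with your own fine-tuned artifacts.

```python
# Minimal sketch (assumed model id and adapter path), not the actual inference.py logic:
# load a base Llama 2 model plus a PEFT (LoRA) adapter and generate a completion.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_name = "meta-llama/Llama-2-7b-hf"   # hypothetical base checkpoint
adapter_path = "path/to/peft/checkpoint"       # hypothetical fine-tuned adapter

tokenizer = AutoTokenizer.from_pretrained(base_model_name)
model = AutoModelForCausalLM.from_pretrained(
    base_model_name, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(model, adapter_path)
model.eval()

prompt = "Summarize the following dialogue:\n..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```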
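For the vLLM path, the offline generation API looks roughly like the sketch below. Again, this is an illustration rather than the contents of vLLM_inference.py, and the model id is an assumption.

```python
# Minimal sketch (assumed model id) of vLLM's offline generation API,
# which uses paged attention for low-latency, high-throughput decoding.
from vllm import LLM, SamplingParams

llm = LLM(model="meta-llama/Llama-2-7b-chat-hf")
sampling = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=100)

outputs = llm.generate(["What is the capital of France?"], sampling)
for out in outputs:
    print(out.outputs[0].text)
```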

For more in-depth information on inference, including inference safety checks and examples, see the inference documentation here.

System Prompt Update

Observed Issue

We received feedback from the community on our prompt template and we are providing an update to reduce the false refusal rates seen. False refusals occur when the model incorrectly refuses to answer a question that it should answer, for example because of overly broad instructions to be cautious in how it provides responses.

Updated approach

Based on evaluation and analysis, we recommend the removal of the system prompt as the default setting. Pull request #105 removes the system prompt as the default option, but still provides an example to help enable experimentation for those using it.
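For reference, the Llama 2 chat format wraps the user turn in [INST] ... [/INST] and, when a system prompt is used, prepends a <<SYS>> block inside the first turn. The sketch below (not the exact chat_utils.py code) shows how a prompt might be built with the system prompt as an opt-in rather than the default.

```python
# Minimal sketch (not the exact chat_utils.py implementation): build a single-turn
# Llama 2 chat prompt, adding the <<SYS>> block only when a system prompt is passed in.
from typing import Optional

B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def format_prompt(user_message: str, system_prompt: Optional[str] = None) -> str:
    content = user_message.strip()
    if system_prompt:  # system prompt is opt-in, not the default
        content = B_SYS + system_prompt.strip() + E_SYS + content
    return f"{B_INST} {content} {E_INST}"

# Default: no system prompt.
print(format_prompt("Write a haiku about the sea."))
# Opt-in: experiment with a custom system prompt.
print(format_prompt("Write a haiku about the sea.",
                    system_prompt="You are a concise, helpful assistant."))
```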