
Update examples/examples_with_aws/Prompt_Engineering_with_Llama_2_On_Amazon_Bedrock.ipynb

Update for clarity

Co-authored-by: Hamid Shojanazeri <hamid.nazeri2010@gmail.com>

Eissa Jamil, 1 year ago
parent
commit
f799c98e84

+ 1 - 1
examples/examples_with_aws/Prompt_Engineering_with_Llama_2_On_Amazon_Bedrock.ipynb

@@ -57,7 +57,7 @@
     "\n",
     "In 2023, Meta introduced the [Llama language models](https://ai.meta.com/llama/) (Llama base, Chat, Code Llama, Llama Guard). These are general purpose, state-of-the-art LLMs.\n",
     "\n",
-    "Llama 2 models come in 7 billion, 13 billion, and 70 billion parameter sizes. Smaller models are cheaper to deploy and run (see: deployment and performance); larger models are more capable.\n",
+    "Llama 2 models come in 7 billion, 13 billion, and 70 billion parameter sizes. Smaller models are cheaper to deploy and have lower inference latency (see: deployment and performance); larger models are more capable.\n",
     "\n",
     "#### Llama 2\n",
     "1. `llama-2-7b` - base pretrained 7 billion parameter model\n",