diff --git a/README.md b/README.md
index 8421fc6..f95b3c9 100644
--- a/README.md
+++ b/README.md
@@ -279,7 +279,7 @@ See the snippet below for usage with Transformers:
 import transformers
 import torch
 
-model_id = "meta-llama/Meta-Llama-3-70B-Instruct"
+model_id = "meta-llama/Meta-Llama-3-8B-Instruct"
 
 pipeline = transformers.pipeline(
     "text-generation",
@@ -317,12 +317,12 @@ print(outputs[0]["generated_text"][len(prompt):])
 
 ### Use with `llama3`
 
-Please, follow the instructions in the repository
+Please, follow the instructions in the [repository](https://github.com/meta-llama/llama3)
 
 To download Original checkpoints, see the example command below leveraging `huggingface-cli`:
 
 ```
-huggingface-cli download meta-llama/Meta-Llama-3-70B-Instruct --include "original/*" --local-dir Meta-Llama-3-70B-Instruct
+huggingface-cli download meta-llama/Meta-Llama-3-8B-Instruct --include "original/*" --local-dir Meta-Llama-3-8B-Instruct
 ```
 
 For Hugging Face support, we recommend using transformers or TGI, but a similar command works.