diff --git a/README.md b/README.md
index 523b6f8..5984295 100644
--- a/README.md
+++ b/README.md
@@ -447,7 +447,7 @@ For Hugging Face support, we recommend using transformers or TGI, but a similar
 
 **Overview** Llama 3 was pretrained on over 15 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over 10M human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
 
-**Data Freshness** The pretraining data has a cutoff of March 2023 for the 7B and December 2023 for the 70B models respectively.
+**Data Freshness** The pretraining data has a cutoff of March 2023 for the 8B and December 2023 for the 70B models respectively.
 
 ## Benchmarks