Update README.md (#75)
- Update README.md (129019f3f22ae389b7817877283f184dad062a9b)

Co-authored-by: Akshay L Aradhya <DollarAkshay@users.noreply.huggingface.co>
parent 906e058d22
commit e1945c40cd
@@ -447,7 +447,7 @@ For Hugging Face support, we recommend using transformers or TGI, but a similar
 
 **Overview** Llama 3 was pretrained on over 15 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over 10M human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
 
-**Data Freshness** The pretraining data has a cutoff of March 2023 for the 7B and December 2023 for the 70B models respectively.
+**Data Freshness** The pretraining data has a cutoff of March 2023 for the 8B and December 2023 for the 70B models respectively.
 
 
 ## Benchmarks
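The hunk context above references the README's note that Hugging Face support is recommended via transformers or TGI. As a minimal sketch of what loading the corrected 8B model looks like with transformers, assuming the gated `meta-llama/Meta-Llama-3-8B` repo id and an environment where Hub access has already been granted:

```python
# Minimal sketch: load a Llama 3 8B checkpoint with transformers and generate a completion.
# The repo id and access setup are assumptions for illustration, not part of this commit.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B"  # assumed repo id; gated, requires an accepted license on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half-precision weights so the 8B model fits on a single large GPU
    device_map="auto",           # let transformers place layers across available devices
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The bfloat16 and `device_map="auto"` settings are just convenient defaults here; any precision or device placement supported by transformers works, and TGI can serve the same checkpoint behind an HTTP endpoint instead.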