From 467adac81418ca88ccdd552d1821e10763932fe6 Mon Sep 17 00:00:00 2001
From: Gustavo de Rosa
Date: Tue, 23 Apr 2024 14:18:44 +0000
Subject: [PATCH] Update README.md

---
 README.md | 12 +++---------
 1 file changed, 3 insertions(+), 9 deletions(-)

diff --git a/README.md b/README.md
index 196e0eb..b2d85ca 100644
--- a/README.md
+++ b/README.md
@@ -20,11 +20,7 @@ Phi-1.5 can write poems, draft emails, create stories, summarize texts, write Py
 
 ## How to Use
 
-Phi-1.5 has been integrated in the `transformers` version 4.37.0. If you are using a lower version, ensure that you are doing the following:
-
-* When loading the model, ensure that `trust_remote_code=True` is passed as an argument of the `from_pretrained()` function.
-
-The current `transformers` version can be verified with: `pip list | grep transformers`.
+Phi-1.5 has been integrated in `transformers` version 4.37.0; please ensure that you are using that version or a newer one.
 
 ## Intended Uses
 
@@ -91,8 +87,6 @@ where the model generates the text after the comments.
 
 * Phi-1.5 has not been tested to ensure that it performs adequately for any production-level application. Please refer to the limitation sections of this document for more details.
 
-* If you are using `transformers<4.37.0`, always load the model with `trust_remote_code=True` to prevent side-effects.
-
 ## Sample Code
 
 ```python
@@ -101,8 +95,8 @@ from transformers import AutoModelForCausalLM, AutoTokenizer
 
 torch.set_default_device("cuda")
 
-model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5", torch_dtype="auto", trust_remote_code=True)
-tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1_5", trust_remote_code=True)
+model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5", torch_dtype="auto")
+tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1_5")
 
 inputs = tokenizer('''def print_prime(n):
    """
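
The patch above replaces the old `trust_remote_code=True` guidance with a minimum-version requirement (`transformers >= 4.37.0`). A small stdlib-only sketch of how a script could enforce that check at runtime — the `meets_minimum` helper is illustrative, not part of the patch; in practice you would pass it `transformers.__version__`:

```python
def meets_minimum(installed: str, minimum: str = "4.37.0") -> bool:
    """Return True if `installed` is at least `minimum`.

    Compares only the leading numeric dotted components, so a dev build
    such as "4.40.0.dev0" is handled by ignoring the non-numeric suffix.
    """
    def numeric(version: str) -> list[int]:
        parts = []
        for piece in version.split("."):
            if piece.isdigit():
                parts.append(int(piece))
            else:
                break  # stop at the first non-numeric component
        return parts

    return numeric(installed) >= numeric(minimum)
```

A script could then fail fast with a clear message, e.g. `assert meets_minimum(transformers.__version__), "Phi-1.5 requires transformers >= 4.37.0"`, instead of surfacing a confusing loading error later.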