diff --git a/README.md b/README.md
index 2d60a2f..84d0b62 100644
--- a/README.md
+++ b/README.md
@@ -20,6 +20,8 @@ tags:
 ---
 # `Stable LM 2 1.6B`
 
+Please note: For commercial use, please refer to https://stability.ai/membership
+
 ## Model Description
 
 `Stable LM 2 1.6B` is a 1.6 billion parameter decoder-only language model pre-trained on 2 trillion tokens of diverse multilingual and code datasets for two epochs.
@@ -82,7 +84,8 @@ print(tokenizer.decode(tokens[0], skip_special_tokens=True))
 * **Language(s)**: English
 * **Paper**: [Stable LM 2 1.6B Technical Report](https://drive.google.com/file/d/1JYJHszhS8EFChTbNAf8xmqhKjogWRrQF/view?usp=sharing)
 * **Library**: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
-* **License**: [Stability AI Non-Commercial Research Community License](https://huggingface.co/stabilityai/stablelm-2-1_6b/blob/main/LICENSE). If you'd like to use this model for commercial products or purposes, please contact us [here](https://stability.ai/membership) to learn more.
+* **License**: [Stability AI Non-Commercial Research Community License](https://huggingface.co/stabilityai/stablelm-2-1_6b/blob/main/LICENSE).
+* **Commercial License**: to use this model commercially, please refer to https://stability.ai/membership
 * **Contact**: For questions and comments about the model, please email `lm@stability.ai`
 
 ### Model Architecture
@@ -120,7 +123,7 @@ The model is pre-trained on the aforementioned datasets in `bfloat16` precision,
 
 ### Intended Use
 
-The model is intended to be used as a foundational base model for application-specific fine-tuning. Developers must evaluate and fine-tune the model for safe performance in downstream applications.
+The model is intended to be used as a foundational base model for application-specific fine-tuning. Developers must evaluate and fine-tune the model for safe performance in downstream applications. For commercial use, please refer to https://stability.ai/membership.
 
 ### Limitations and Bias
 ​