diff --git a/README.md b/README.md
index 8a28d1d..abd0073 100644
--- a/README.md
+++ b/README.md
@@ -5,8 +5,8 @@ MIT License

-TorchScale is a PyTorch library that allows researchers and developeres to scale up Transformers efficiently and effectively.
-It has the implemetention of fundamental research to improve modeling generality and capability, as well as training stability and efficiency of scaling Transformers.
+TorchScale is a PyTorch library that allows researchers and developers to scale up Transformers efficiently and effectively.
+It has the implementation of fundamental research to improve modeling generality and capability, as well as training stability and efficiency of scaling Transformers.
 - Stability - [**DeepNet**](https://arxiv.org/abs/2203.00555): scaling Transformers to 1,000 Layers and beyond
 - Generality - [**Foundation Transformers (Magneto)**](https://arxiv.org/abs/2210.06423)
@@ -192,4 +192,4 @@ This project may contain trademarks or logos for projects, products, or services
 trademarks or logos is subject to and must follow
 [Microsoft's Trademark & Brand Guidelines](https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks/usage/general).
 Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship.
-Any use of third-party trademarks or logos are subject to those third-party's policies.
\ No newline at end of file
+Any use of third-party trademarks or logos are subject to those third-party's policies.