From 3573227c8867bf0575eb5c849160c2df0d26271d Mon Sep 17 00:00:00 2001
From: gitnlp <36983436+gitnlp@users.noreply.github.com>
Date: Wed, 26 Jul 2023 18:40:30 +0800
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 363b85f..45bc8bb 100644
--- a/README.md
+++ b/README.md
@@ -6,8 +6,8 @@
 
 TorchScale is a PyTorch library that allows researchers and developers to scale up Transformers efficiently and effectively.
-It has the implementation of fundamental research to improve modeling generality and capability as well as training stability and efficiency of scaling Transformers.
+Fundamental research to develop new architectures for foundation models and A(G)I, focusing on modeling generality and capability, as well as training stability and efficiency.
 - Stability - [**DeepNet**](https://arxiv.org/abs/2203.00555): scaling Transformers to 1,000 Layers and beyond
 - Generality - [**Foundation Transformers (Magneto)**](https://arxiv.org/abs/2210.06423): towards true general-purpose modeling across tasks and modalities (including language, vision, speech, and multimodal)
 - Capability - A [**Length-Extrapolatable**](https://arxiv.org/abs/2212.10554) Transformer