Update README.md
commit 3573227c88
parent f58c8247be
README.md
@@ -6,8 +6,8 @@
 </p>
 
 TorchScale is a PyTorch library that allows researchers and developers to scale up Transformers efficiently and effectively.
-It has the implementation of fundamental research to improve modeling generality and capability as well as training stability and efficiency of scaling Transformers.
+Fundamental research to develop new architectures for foundation models and A(G)I, focusing on modeling generality and capability, as well as training stability and efficiency.
 
 - Stability - [**DeepNet**](https://arxiv.org/abs/2203.00555): scaling Transformers to 1,000 Layers and beyond
 - Generality - [**Foundation Transformers (Magneto)**](https://arxiv.org/abs/2210.06423): towards true general-purpose modeling across tasks and modalities (including language, vision, speech, and multimodal)
 - Capability - A [**Length-Extrapolatable**](https://arxiv.org/abs/2212.10554) Transformer
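For context, a minimal sketch of how the library described in this README is typically used, based on TorchScale's documented `EncoderConfig`/`Encoder` API. The `deepnorm` flag shown here is an assumption meant to illustrate enabling the DeepNet-style initialization from the "Stability" bullet; the exact flag name may vary across versions.

```python
# Minimal sketch: build a Transformer encoder with TorchScale.
# Assumes the torchscale.architecture API from the project README.
from torchscale.architecture.config import EncoderConfig
from torchscale.architecture.encoder import Encoder

# deepnorm=True is assumed to enable the DeepNet residual scaling
# referenced in the "Stability" bullet; omit it for the default setup.
config = EncoderConfig(vocab_size=64000, deepnorm=True)
model = Encoder(config)

print(model)
```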