From 41f6ee5687aeaa49fa0259b09dfb875d34ad7e60 Mon Sep 17 00:00:00 2001
From: shumingma
Date: Thu, 17 Nov 2022 01:18:20 -0800
Subject: [PATCH] Update README.md

---
 README.md | 13 +++++--------
 1 file changed, 5 insertions(+), 8 deletions(-)

diff --git a/README.md b/README.md
index 5cd7cec..33e8333 100644
--- a/README.md
+++ b/README.md
@@ -1,14 +1,11 @@
-# Project
+# OpenScale - Transformers at (any) Scale
 
-> This repo has been populated by an initial template to help get you started. Please
-> make sure to update the content to build a great experience for community-building.
+Fundamental research to improve modeling generality and capability, as well as training stability and efficiency, when scaling Transformers.
 
-As the maintainer of this project, please make a few updates:
+- Stability - [**DeepNet**](https://arxiv.org/abs/2203.00555): scaling Transformers to 1,000 layers and beyond
+- Generality - [**Foundation Transformers (Magneto)**](https://arxiv.org/abs/2210.06423)
+- Efficiency - [**X-MoE**](https://arxiv.org/abs/2204.09179): scalable & finetunable sparse Mixture-of-Experts (MoE)
 
-- Improving this README.MD file to provide a great experience
-- Updating SUPPORT.MD with content about this project's support experience
-- Understanding the security reporting process in SECURITY.MD
-- Remove this section from the README
 
 ## Contributing