Commit 79e42cd · Parent(s): af4e0bf

Update README.md
README.md CHANGED
@@ -17,24 +17,16 @@ These findings are baked into our highly efficient model training stack, the Mos
 If you have questions, please feel free to reach out to us on [Twitter](https://twitter.com/mosaicml),
 [Email]([email protected]), or join our [Slack channel](https://join.slack.com/t/mosaicml-community/shared_invite/zt-w0tiddn9-WGTlRpfjcO9J5jyrMub1dg)!

+# [LLM Foundry](https://github.com/mosaicml/llm-foundry/tree/main)
+
+This repo contains code for training, finetuning, evaluating, and deploying LLMs for inference with [Composer](https://github.com/mosaicml/composer) and the [MosaicML platform](https://www.mosaicml.com/training).
+

 # [Composer Library](https://github.com/mosaicml/composer)

 The open source Composer library makes it easy to train models faster at the algorithmic level. It is built on top of PyTorch.
 Use our collection of speedup methods in your own training loop or—for the best experience—with our Composer trainer.

-# [MosaicML Examples Repo](https://github.com/mosaicml/examples)
-
-This repo contains reference examples for training ML models quickly and to high accuracy. It's designed to be easily forked and modified.
-
-It currently features the following examples:
-
-* [ResNet-50 + ImageNet](https://github.com/mosaicml/examples#resnet-50--imagenet)
-* [DeeplabV3 + ADE20k](https://github.com/mosaicml/examples#deeplabv3--ade20k)
-* [GPT / Large Language Models](https://github.com/mosaicml/examples#large-language-models-llms)
-* [BERT](https://github.com/mosaicml/examples#bert)
-
-
 # [StreamingDataset](https://github.com/mosaicml/streaming)

 Fast, accurate streaming of training data from cloud storage. We built StreamingDataset to make training on large datasets from cloud storage as fast, cheap, and scalable as possible.
@@ -49,7 +41,20 @@ With support for major cloud storage providers (AWS, OCI, and GCS are supported
 and designed as a drop-in replacement for your PyTorch [IterableDataset](https://pytorch.org/docs/stable/data.html#torch.utils.data.IterableDataset) class, StreamingDataset seamlessly integrates
 into your existing training workflows.

-# [MosaicML
+# [MosaicML Examples Repo](https://github.com/mosaicml/examples)
+
+This repo contains reference examples for training ML models quickly and to high accuracy. It's designed to be easily forked and modified.
+
+It currently features the following examples:
+
+* [ResNet-50 + ImageNet](https://github.com/mosaicml/examples#resnet-50--imagenet)
+* [DeeplabV3 + ADE20k](https://github.com/mosaicml/examples#deeplabv3--ade20k)
+* [GPT / Large Language Models](https://github.com/mosaicml/examples#large-language-models-llms)
+* [BERT](https://github.com/mosaicml/examples#bert)
+
+
+
+# [MosaicML Platform](https://mcli.docs.mosaicml.com/en/latest/getting_started/installation.html)

 The proprietary MosaicML Platform enables you to easily train large AI models on your data, in your secure environment.

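The Composer section above mentions using the speedup methods in your own training loop or, for the best experience, with the Composer trainer. A minimal sketch of the trainer path is below; the model, the stand-in dataloader, and the two algorithms chosen (BlurPool and LabelSmoothing) are illustrative assumptions and not part of this commit.

```python
# Minimal sketch: wrap a PyTorch model and dataloader with Composer's Trainer
# and apply two of its speedup/quality algorithms. Model and data are placeholders.
import torch
import torchvision
from torch.utils.data import DataLoader, TensorDataset
from composer import Trainer
from composer.algorithms import BlurPool, LabelSmoothing
from composer.models import ComposerClassifier

# Any torch.nn.Module can be wrapped for use with the Trainer.
model = ComposerClassifier(torchvision.models.resnet18(num_classes=10), num_classes=10)

# Stand-in dataloader; in practice this is your real training data.
X = torch.randn(64, 3, 32, 32)
y = torch.randint(0, 10, (64,))
train_dataloader = DataLoader(TensorDataset(X, y), batch_size=16)

trainer = Trainer(
    model=model,
    train_dataloader=train_dataloader,
    max_duration="1ep",  # train for one epoch
    algorithms=[BlurPool(), LabelSmoothing(smoothing=0.1)],  # example speedup methods
)
trainer.fit()
```

Swapping entries in or out of the `algorithms` list is how different speedup recipes are tried without changing the rest of the loop.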
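The StreamingDataset section describes the library as a drop-in replacement for a PyTorch IterableDataset. A minimal sketch of that usage follows, assuming the dataset has already been converted to the library's shard format; the bucket path and cache directory are placeholders, not from this commit.

```python
# Minimal sketch: stream pre-converted shards from object storage with
# StreamingDataset and feed them to a standard PyTorch DataLoader.
from torch.utils.data import DataLoader
from streaming import StreamingDataset

# Streams shards from the remote bucket, caching them locally as needed.
dataset = StreamingDataset(
    remote="s3://my-bucket/my-dataset",  # assumed location of the converted shards
    local="/tmp/streaming-cache",        # assumed local cache directory
    shuffle=True,
)

# Used like any other iterable-style dataset.
train_dataloader = DataLoader(dataset, batch_size=32)
for batch in train_dataloader:
    ...  # each sample is a dict of the fields written when the shards were created
```

Because the dataset behaves like any other iterable-style dataset, the surrounding DataLoader and training loop do not need to change.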