rganti committed · verified
Commit fb4cd02 (1 parent: 45ce02f)

Update README.md

Files changed (1): README.md (+7 −7)
README.md CHANGED

@@ -7,13 +7,13 @@ sdk: static
 pinned: false
 ---
 
-# Foundation Model Stack
+# IBM AI Platform
 
-Foundation Model Stack (fms) is a collection of components developed out of IBM Research used for development, inference, training, and tuning of foundation models leveraging PyTorch native components.
+IBM's AI Platform is a collection of components developed out of IBM Research used for development, inference, training, and tuning of foundation models leveraging PyTorch native components.
 
 ## Optimizations
 
-In FMS, we aim to bring the latest optimizations for pre-training/inference/fine-tuning to all of our models. A few of these optimizations include, but are not limited to:
+In this platform, we aim to bring the latest optimizations for pre-training/inference/fine-tuning to all of our models. A few of these optimizations include, but are not limited to:
 
 - fully compilable models with no graph breaks
 - full tensor-parallel support for all applicable modules developed in fms
@@ -22,11 +22,11 @@ In FMS, we aim to bring the latest optimizations for pre-training/inference/fine
 
 ## Usage
 
-FMS is currently being deployed in [Text Generation Inference Server](https://github.com/IBM/text-generation-inference)
+Components such as speculative decoding have been deployed to [vLLM](https://docs.vllm.ai/en/latest/getting_started/examples/mlpspeculator.html)
 
 ## Repositories
 
-- [foundation-model-stack](https://github.com/foundation-model-stack/foundation-model-stack): Main repository for which all fms models are based
-- [fms-extras](https://github.com/foundation-model-stack/fms-extras): New features staged to be integrated with foundation-model-stack
+- [foundation-model-stack](https://github.com/foundation-model-stack/foundation-model-stack): Main repository for which all AI platform models are based
+- [fms-extras](https://github.com/foundation-model-stack/fms-extras): New features staged to be integrated with our AI platform
 - [fms-fsdp](https://github.com/foundation-model-stack/fms-fsdp): Pre-Training Examples using FSDP wrapped foundation models
-- [fms-hf-tuning](https://github.com/foundation-model-stack/fms-hf-tuning): Basic Tuning scripts for fms models leveraging SFTTrainer
+- [fms-hf-tuning](https://github.com/foundation-model-stack/fms-hf-tuning): Basic Tuning scripts for AI platform models leveraging SFTTrainer
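The vLLM speculative-decoding example linked in the updated Usage section is built on a draft-then-verify loop. As a toy illustration only (pure Python with greedy verification; `target_model`, `draft_model`, and `speculative_step` are stand-in names, not the fms or vLLM implementation), the core accept/reject logic looks roughly like:

```python
def speculative_step(target_model, draft_model, prefix, k):
    """One speculative-decoding step with greedy verification.

    A cheap draft model proposes k tokens; the target model then checks
    them (sequentially here for clarity; real systems verify all k in a
    single batched forward pass) and keeps the longest matching prefix,
    plus one token of its own at the first mismatch or as a bonus.
    """
    # Draft phase: propose k tokens autoregressively with the cheap model.
    draft = []
    ctx = list(prefix)
    for _ in range(k):
        t = draft_model(ctx)
        draft.append(t)
        ctx.append(t)

    # Verify phase: accept draft tokens while they match the target's
    # greedy choice; on the first mismatch, emit the target's correction.
    accepted = []
    ctx = list(prefix)
    for t in draft:
        expected = target_model(ctx)
        if t != expected:
            accepted.append(expected)  # target's correction replaces the draft
            return accepted
        accepted.append(t)
        ctx.append(t)
    accepted.append(target_model(ctx))  # all drafts matched: one bonus token
    return accepted


# Toy demo: the target always emits last-token + 1; the draft agrees but
# caps its proposals at 3, so it diverges once the sequence passes 3.
target = lambda ctx: ctx[-1] + 1
draft = lambda ctx: min(ctx[-1] + 1, 3)
print(speculative_step(target, draft, [0], k=4))  # -> [1, 2, 3, 4]
```

The payoff is that every step emits at least one token (the target's own), and up to k + 1 tokens when the draft guesses well, which is where the speedup comes from.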