
Tensor Parallelism, Pipeline Parallelism, Sequence Parallelism

Open · pretidav opened this issue 7 months ago · 0 comments

It seems that only ZeRO-3/DP-style sharding (i.e. FSDP or HSDP) is supported in LLM Foundry, while other parallelization techniques such as Tensor Parallelism (TP), Pipeline Parallelism (PP), and Sequence Parallelism (or Context Parallelism) are currently not supported.

Is there any plan to implement them anytime soon? At least one of TP or PP, in addition to FSDP, seems required for scaling up LLM training to large parameter counts and large numbers of GPUs.
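For context, a minimal single-process sketch of the arithmetic behind tensor parallelism (Megatron-style column- and row-parallel linear layers). This is purely illustrative and not llm-foundry code; the shard counts and names are assumptions, and the `concatenate`/`sum` steps stand in for the all-gather/all-reduce collectives a real multi-GPU implementation would use.

```python
# Illustrative sketch of tensor-parallel linear layers (not llm-foundry code).
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))     # batch of activations
W = rng.standard_normal((8, 16))    # full (unsharded) weight matrix
y_full = x @ W                      # reference: unsharded forward pass

# Column parallelism: each "rank" holds a slice of W's columns and
# computes a slice of the output; slices are concatenated (all-gather).
col_shards = np.split(W, 2, axis=1)
y_col = np.concatenate([x @ s for s in col_shards], axis=1)

# Row parallelism: each rank holds a slice of W's rows and the matching
# slice of x's columns; partial outputs are summed (all-reduce).
x_parts = np.split(x, 2, axis=1)
row_shards = np.split(W, 2, axis=0)
y_row = sum(xp @ wp for xp, wp in zip(x_parts, row_shards))

assert np.allclose(y_col, y_full)
assert np.allclose(y_row, y_full)
```

Both shardings reproduce the unsharded result exactly, which is why TP composes with FSDP: each rank stores and multiplies only a fraction of the weight matrix.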

pretidav · Mar 14 '25 07:03