torchrec
PyTorch domain library for recommendation systems
Summary: As titled. Differential Revision: D57215579
Summary: As titled. Explained in D57176524. Consolidates the code. Differential Revision: D57214084
Summary: As titled. Integrates the TorchRec autoplanner for uneven sharding via the option use_trec_hetero_auto_planner. The autoplanner is only used when the user explicitly sets this option. Continued testing will be done before we...
Summary: # context * previously this helper function only supported `len(kjt.lengths()) == batch_size * len(kjt.keys())` * now it also supports declaring `len(kjt.lengths())` as a standalone dynamic dim Differential Revision:...
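A minimal plain-Python sketch of the lengths-layout invariant described above. The names and layout here are illustrative only; torchrec's real `KeyedJaggedTensor` API is not reproduced.

```python
# Hypothetical illustration of the KJT lengths layout: for keys
# ["f1", "f2"] and batch_size 3, the lengths list has
# batch_size * num_keys entries, one per (key, batch-element) pair.
keys = ["f1", "f2"]
batch_size = 3

# lengths[i * batch_size + j] = number of values for key i, sample j
lengths = [2, 0, 1,   # f1: sample 0 has 2 values, sample 1 has 0, sample 2 has 1
           1, 1, 3]   # f2: one value each for samples 0 and 1, three for sample 2
assert len(lengths) == batch_size * len(keys)

# values are the flattened jagged entries; their count equals sum(lengths)
values = [10, 11, 12, 20, 21, 22, 23, 24]
assert len(values) == sum(lengths)
```

Treating `len(kjt.lengths())` as a standalone dynamic dim means the exporter no longer has to derive it as the product of two other dims.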
Summary: Added GPU sync tests to simulate gathering metric states onto rank 0 and computing. Tests didn't cover this case before, which has resulted in SEVs in the past...
Summary: Enables easier concat of multiple shards when we call DT.full_tensor() with LocalShardsWrapper. The most important case is checkpointing with state_dict, or any case where we need the global tensor of...
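A hedged sketch of the shard-assembly operation that `DT.full_tensor()` performs during checkpointing. The function and parameter names below are hypothetical, not torchrec's actual `LocalShardsWrapper` API.

```python
# Assemble a global 1-D tensor from local shards placed at known offsets,
# the kind of concat full_tensor() needs for a state_dict checkpoint.
def full_tensor(shards, offsets, global_len):
    """Write each shard at its offset into one flat global list."""
    out = [0] * global_len
    for shard, off in zip(shards, offsets):
        out[off:off + len(shard)] = shard
    return out

# Two rank-local shards of a length-6 "tensor", as a checkpoint sees them.
assert full_tensor([[1, 2, 3], [4, 5, 6]], [0, 3], 6) == [1, 2, 3, 4, 5, 6]
```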
Summary: Add callbacks to planner init, if best plan found apply list of callbacks to it. Differential Revision: D59128469
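The callback hook described above can be sketched in plain Python. The planner, scoring, and callback names here are hypothetical, not torchrec's planner API.

```python
# Pick the best plan among proposals, then run each registered callback
# on it; callbacks may inspect or transform the chosen plan.
def find_best_plan(proposals, score, callbacks=()):
    """Return the highest-scoring plan after applying all callbacks."""
    best = max(proposals, key=score)
    for cb in callbacks:
        best = cb(best)  # each callback returns the (possibly modified) plan
    return best

plans = [{"cost": 5}, {"cost": 2}]
logged = []

def log_plan(plan):
    logged.append(plan)
    return plan

best = find_best_plan(plans, score=lambda p: -p["cost"], callbacks=[log_plan])
assert best == {"cost": 2} and logged == [best]
```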
Summary: Clients requested the ability to call `LazyAwaitable._wait_async(a)` directly in a compiled region. LazyAwaitable.wait_async uses `torch.fx.node.map_aggregate`, which is skipped by dynamo, so it was copied inside torchrec, changing immutable_list -> list, immutable_dict...
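A plain-Python sketch of a `map_aggregate`-style helper: recursively apply a function to every leaf of a nested container, reconstructing the containers as mutable `list`/`dict` (the `immutable_list -> list` change noted above). This mirrors the idea of `torch.fx.node.map_aggregate`, not its exact code.

```python
# Recursively map fn over the leaves of nested lists, tuples, and dicts,
# returning plain mutable containers so dynamo can trace through them.
def map_aggregate(a, fn):
    if isinstance(a, (list, tuple)):
        return type(a)(map_aggregate(x, fn) for x in a)
    if isinstance(a, dict):
        return {k: map_aggregate(v, fn) for k, v in a.items()}
    return fn(a)  # leaf value

assert map_aggregate([1, {"x": (2, 3)}], lambda v: v * 10) == [10, {"x": (20, 30)}]
```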
Summary: # context * the previously landed D59031938 was reverted because the torchscript push schedule is behind * adding `torch.jit.is_scripting()` to guard the exposure. Differential Revision: D59081243