Label tracking meta-issue (edit me to get automatically CC'ed on issues!)
This issue is used by lightning-probot to manage subscriptions to labels. To subscribe yourself to a label, add a line `* label @yourusername`, or add your username (space-separated) to an existing line in the body of this issue. Do not try to subscribe in comments; the bot only parses the initial post. A hypothetical sketch of how such lines are parsed follows the notes below.
This is a copy of https://github.com/pytorch/pytorch/issues/24422.
As a courtesy to others, please do not edit the subscriptions of users who are not you.
Note: some labels have sublabels (e.g. callback). Subscribing to a top-level label does not automatically subscribe you to its sublabels.
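For illustration only, here is a minimal sketch of how the bot could map label lines to subscribers. The actual parsing lives in lightning-probot's auto-cc-bot.ts (linked in a comment below); the function name, regex, and behaviour here are assumptions, not the bot's real code.

```ts
// Hypothetical sketch: map "* label @user1 @user2" lines to subscribers.
function parseSubscriptions(issueBody: string): Map<string, string[]> {
  const subscriptions = new Map<string, string[]>();
  for (const line of issueBody.split("\n")) {
    // Match "* some label @user1 @user2" (or "-" bullets); lines without
    // any @username are skipped, i.e. labels with no subscribers yet.
    const match = line.match(/^\s*[*-]\s+(.+?)\s+((?:@\S+\s*)+)$/);
    if (!match) continue;
    const label = match[1].trim();
    const users = match[2].trim().split(/\s+/);
    // set() overwrites: if the same label appears twice, only the last
    // occurrence wins (see the note about duplicates in the comments).
    subscriptions.set(label, users);
  }
  return subscriptions;
}

// Example: parseSubscriptions("- logger: csv @borda").get("logger: csv")
// yields ["@borda"].
```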
PR status
- ready @borda
- has conflicts
- won’t fix
Issue/PR category
- design @tchaton @justusschock @awaelchli @borda
- breaking change @borda @justusschock
- bug
- code quality
- deprecation @tchaton
- discussion @borda
- docs @borda
- feature @borda
- question
- refactor @justusschock @awaelchli
- release @borda
Issue/PR severity
- priority: 0 @tchaton
- priority: 1 @tchaton
- priority: 2 @tchaton
Issue/PR metadata
- duplicate
- good first issue @borda
- help wanted
- admin
- let’s do it! @tchaton
- waiting on author
- working as intended
- 3rd party
- experimental @borda
- needs triage
- community
Generic labels
- pl
Code section (pl)
- accelerator @justusschock
- accelerator: cpu @justusschock @awaelchli
- accelerator: cuda @justusschock @awaelchli
- accelerator: hpu (external) @jerome-habana
- accelerator: mps @justusschock
- accelerator: tpu @JackCaoG @Liyang90 @gkroiz
- callback @awaelchli
- callback: device stats @awaelchli
- callback: early stopping @awaelchli
- callback: fine-tuning
- callback: gradient accumulation
- callback: lambda function
- callback: lr monitor
- callback: model checkpoint @awaelchli
- callback: model summary @awaelchli
- callback: prediction writer
- callback: pruning
- callback: swa
- callback: timer @awaelchli
- callback: throughput
- ci @borda
- environment @awaelchli
- environment: kubeflow @awaelchli
- environment: lightning @awaelchli
- environment: lsf @awaelchli
- environment: slurm @awaelchli
- environment: torchelastic @awaelchli
- environment: mpi @awaelchli
- hooks @awaelchli @borda @justusschock
- io @borda @justusschock
- lightningcli @mauvilsa
- lightningdatamodule @awaelchli @borda
- fabric @justusschock @awaelchli
- lightningmodule @justusschock @awaelchli @borda
- logger @awaelchli @borda
- logger: comet
- logger: csv @borda
- logger: mlflow
- logger: neptune
- logger: tensorboard @awaelchli
- logger: wandb @awaelchli @morganmcg1 @borisdayma @scottire @parambharat
- logging
- loops @justusschock
- lr scheduler
- optimizer
- plugin @justusschock @awaelchli
- precision: double @justusschock @awaelchli
- precision: amp @justusschock @awaelchli
- precision: bnb @awaelchli
- precision: te @awaelchli
- precision: half @awaelchli
- profiler
- progress bar: rich
- progress bar: tqdm @awaelchli
- progress tracking (internal) @awaelchli
- strategy @justusschock
- strategy: ddp @justusschock @awaelchli
- strategy: deepspeed @awaelchli
- strategy: dp (removed in pl) @justusschock @awaelchli
- strategy: fsdp @awaelchli
- strategy: hpu (external) @jerome-habana
- strategy: hivemind (external)
- strategy: xla @JackCaoG @Liyang90 @gkroiz
- tests @borda
- trainer @justusschock @awaelchli @borda
- trainer: connector @justusschock @awaelchli
- trainer: fit @justusschock @awaelchli
- trainer: argument @justusschock @awaelchli @borda
- trainer: predict
- trainer: test @awaelchli
- trainer: tune @borda
- trainer: validate @awaelchli
- torch.compile
- tuner
Feature (pl)
- fault tolerance @justusschock @awaelchli
- checkpointing @awaelchli
- data handling @justusschock @awaelchli
- distributed @awaelchli
- optimization @borda
- reproducibility @awaelchli
- performance @borda
@Borda do not lock the issue so that the community can request being added/removed.
Is it possible to configure the same for discussions as well? We have labels there.
Probably yes, if GitHub triggers API events for discussions too
https://github.com/carmocca/probot/blob/87ca7b14202ea6afce291e5ee5b9a6c40c2a99df/src/auto-cc-bot.ts#L77-L79
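If they do, a handler could in principle look like the following. This is purely a hypothetical sketch: it assumes a `discussion.labeled` webhook event reaches the app, and it is not part of auto-cc-bot.ts today.

```ts
import { Probot } from "probot";

// Hypothetical sketch only: assumes GitHub delivers "discussion.labeled"
// webhook events to the app, which is exactly the open question above.
export default (app: Probot) => {
  app.on("discussion.labeled" as any, async (context: any) => {
    const label = context.payload.label?.name;
    const number = context.payload.discussion?.number;
    context.log.info(`Discussion #${number} labeled "${label}"`);
    // CC'ing the subscribers parsed from this tracking issue would go here,
    // mirroring what the bot already does for issues and pull requests.
  });
};
```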
Hey @carmocca, could you remove @AyushExel from logger: wandb?
Note that when a label occurs multiple times in the list above, the bottom one overrides the others and not all authors get included. I'm going to merge the two "docs" instances we currently have into one.
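To illustrate with the hypothetical `parseSubscriptions` sketch in the issue body (usernames here are made up):

```ts
// Duplicate label lines overwrite rather than merge, so only the last
// "docs" line's subscribers would get CC'ed.
const body = ["- docs @userA", "- docs @userB @userC"].join("\n");

console.log(parseSubscriptions(body).get("docs"));
// -> ["@userB", "@userC"]; "@userA" is silently dropped.
```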
@carmocca I cannot edit this issue. Would you please remove me, @ninginthecloud, @edward-io, and @jjenniferdai from being tagged? Thanks!
@ananthsub done
Added @JackCaoG, @steventk-g, and @Liyang90 to the TPU accelerator label <3 Thank you for your help!
Hi @carmocca: Can you please remove @manangoel99 from logger:wandb and tag me instead?