
[Type Hints] Add type hint annotations to public APIs

Open megemini opened this issue 1 year ago • 93 comments

commit https://github.com/PaddlePaddle/Paddle/commit/bff34fb14dd9d9111bfcbc2f523a403b6a589d2a

⚠️ Note ⚠️ The python/paddle/distributed module is a bit more difficult; you may want to claim tasks from other modules first 🤟🤟🤟

🔚 Batch 1 🎉

No. File API count Assignee GitHub id PR link
✅A-1 python/paddle/tensor/array.py 4 ✅@zrr1999 PaddlePaddle/Paddle#65009
✅A-2 python/paddle/tensor/attribute.py 7 ✅@zrr1999 PaddlePaddle/Paddle#65255
✅A-3 python/paddle/tensor/creation.py 28 ✅@zrr1999 PaddlePaddle/Paddle#65082
✅A-4 python/paddle/tensor/einsum.py 7 ✅@zrr1999 PaddlePaddle/Paddle#65255
✅A-5 python/paddle/tensor/linalg.py 40 ✅@gouzil PaddlePaddle/Paddle#65274
✅A-6 python/paddle/tensor/logic.py 35 ✅@Asthestarsfalll PaddlePaddle/Paddle#65300
✅A-7 python/paddle/tensor/manipulation.py 77 ✅@Asthestarsfalll PaddlePaddle/Paddle#65351
✅A-8 python/paddle/tensor/math.py 192 ✅@SigureMo PaddlePaddle/Paddle#65073
✅A-9 python/paddle/tensor/ops.py 43 ✅@gouzil PaddlePaddle/Paddle#65249
✅A-10 python/paddle/tensor/random.py 15 ✅@ooooo-create PaddlePaddle/Paddle#65272
✅A-11 python/paddle/tensor/search.py 16 ✅@ooooo-create PaddlePaddle/Paddle#65354
✅A-12 python/paddle/tensor/stat.py 9 ✅@ooooo-create PaddlePaddle/Paddle#65337
✅A-13 python/paddle/tensor/to_string.py 2 ✅@gouzil PaddlePaddle/Paddle#65042
✅A-14 python/paddle/nn/layer/activation.py 11 ✅@ooooo-create PaddlePaddle/Paddle#65372
✅A-15 python/paddle/nn/layer/common.py 20 ✅@megemini PaddlePaddle/Paddle#65197
✅A-16 python/paddle/nn/layer/container.py 2 ✅@SigureMo
🙋@liyongchao911
PaddlePaddle/Paddle#65190
✅A-17 python/paddle/nn/layer/conv.py 7 ✅@liyongchao911 PaddlePaddle/Paddle#65183
✅A-18 python/paddle/nn/layer/distance.py 2 ✅@liyongchao911 PaddlePaddle/Paddle#65127
✅A-19 python/paddle/nn/layer/layers.py 1 ✅@SigureMo
🙋@liyongchao911
PaddlePaddle/Paddle#65190
✅A-20 python/paddle/nn/layer/loss.py 21 ✅@Asthestarsfalll PaddlePaddle/Paddle#65376
✅A-21 python/paddle/nn/layer/norm.py 9 ✅@Asthestarsfalll PaddlePaddle/Paddle#65454
✅A-22 python/paddle/nn/layer/pooling.py 18 ✅@Asthestarsfalll PaddlePaddle/Paddle#65460
✅A-23 python/paddle/nn/layer/rnn.py 2 ✅@Asthestarsfalll PaddlePaddle/Paddle#65375
✅A-24 python/paddle/nn/layer/transformer.py 4 ✅@Asthestarsfalll PaddlePaddle/Paddle#65457
✅A-25 python/paddle/nn/layer/vision.py 4 ✅@Asthestarsfalll PaddlePaddle/Paddle#65455
✅A-26 python/paddle/vision/transforms/transforms.py 22 ✅@ooooo-create PaddlePaddle/Paddle#65378
✅A-27 python/paddle/nn/initializer/assign.py 3 ✅@gouzil PaddlePaddle/Paddle#65206
✅A-28 python/paddle/nn/initializer/bilinear.py 2 ✅@gouzil PaddlePaddle/Paddle#65206
✅A-29 python/paddle/nn/initializer/constant.py 3 ✅@DrRyanHuang PaddlePaddle/Paddle#65095
✅A-30 python/paddle/nn/initializer/dirac.py 2 ✅@gouzil PaddlePaddle/Paddle#65087
✅A-31 python/paddle/nn/initializer/initializer.py 2 ✅@gouzil PaddlePaddle/Paddle#65087
✅A-32 python/paddle/nn/initializer/kaiming.py 5 ✅@gouzil PaddlePaddle/Paddle#65206
✅A-33 python/paddle/nn/initializer/normal.py 5 ✅@gouzil PaddlePaddle/Paddle#65206
✅A-34 python/paddle/nn/initializer/orthogonal.py 2 ✅@DrRyanHuang PaddlePaddle/Paddle#65125
✅A-35 python/paddle/nn/initializer/uniform.py 3 ✅@gouzil PaddlePaddle/Paddle#65206
✅A-36 python/paddle/nn/initializer/xavier.py 4 ✅@gouzil PaddlePaddle/Paddle#65206
✅A-37 python/paddle/optimizer/adadelta.py 2 ✅@ooooo-create PaddlePaddle/Paddle#65464
✅A-38 python/paddle/optimizer/adagrad.py 2 ✅@ooooo-create PaddlePaddle/Paddle#65464
✅A-39 python/paddle/optimizer/adam.py 2 ✅@ooooo-create PaddlePaddle/Paddle#65076
✅A-40 python/paddle/optimizer/adamax.py 2 ✅@ooooo-create PaddlePaddle/Paddle#65236
✅A-41 python/paddle/optimizer/adamw.py 2 ✅@ooooo-create PaddlePaddle/Paddle#65236
✅A-42 python/paddle/optimizer/asgd.py 2 ✅@ooooo-create PaddlePaddle/Paddle#65236
✅A-43 python/paddle/optimizer/lamb.py 2 ✅@gouzil PaddlePaddle/Paddle#65247
✅A-44 python/paddle/optimizer/lbfgs.py 2 ✅@enkilee PaddlePaddle/Paddle#65308
✅A-45 python/paddle/optimizer/momentum.py 2 ✅@enkilee PaddlePaddle/Paddle#65284
✅A-46 python/paddle/optimizer/nadam.py 2 ✅@enkilee PaddlePaddle/Paddle#65273
✅A-47 python/paddle/optimizer/optimizer.py 1 ✅@ooooo-create PaddlePaddle/Paddle#65076
✅A-48 python/paddle/optimizer/radam.py 2 ✅@DrRyanHuang PaddlePaddle/Paddle#65085
✅A-49 python/paddle/optimizer/rmsprop.py 2 ✅@DrRyanHuang PaddlePaddle/Paddle#65085
✅A-50 python/paddle/optimizer/rprop.py 2 ✅@DrRyanHuang PaddlePaddle/Paddle#65085
✅A-51 python/paddle/optimizer/sgd.py 2 ✅@ooooo-create
🙋@DrRyanHuang
PaddlePaddle/Paddle#65076
✅A-52 python/paddle/hapi/model.py 2 ✅@ooooo-create PaddlePaddle/Paddle#65755
✅A-53 python/paddle/hapi/model_summary.py 1 ✅@DrRyanHuang PaddlePaddle/Paddle#65086
✅A-54 python/paddle/nn/functional/activation.py 36 ✅@gsq7474741 PaddlePaddle/Paddle#65191
✅A-55 python/paddle/nn/functional/common.py 15 ✅@gsq7474741 PaddlePaddle/Paddle#65191
✅A-56 python/paddle/nn/functional/conv.py 6 ✅@gsq7474741 PaddlePaddle/Paddle#65191
✅A-57 python/paddle/nn/functional/distance.py 2 ✅@DrRyanHuang PaddlePaddle/Paddle#65071
✅A-58 python/paddle/nn/functional/extension.py 4 ✅@Asthestarsfalll PaddlePaddle/Paddle#65380
✅A-59 python/paddle/nn/functional/flash_attention.py 6 ✅@Asthestarsfalll PaddlePaddle/Paddle#65380
✅A-60 python/paddle/nn/functional/input.py 2 ✅@sunzhongkai588 PaddlePaddle/Paddle#65317
✅A-61 python/paddle/nn/functional/loss.py 29 🙋@gsq7474741
✅@Asthestarsfalll
PaddlePaddle/Paddle#65376
✅A-62 python/paddle/nn/functional/norm.py 6 🙋@gsq7474741
✅@Asthestarsfalll
PaddlePaddle/Paddle#65454
✅A-63 python/paddle/nn/functional/pooling.py 17 ✅@Asthestarsfalll PaddlePaddle/Paddle#65460
✅A-64 python/paddle/nn/functional/sparse_attention.py 1 ✅@Liyulingyue PaddlePaddle/Paddle#65064
✅A-65 python/paddle/nn/functional/vision.py 5 🙋@gsq7474741
✅@Asthestarsfalll
PaddlePaddle/Paddle#65455
✅A-66 python/paddle/base/dygraph/math_op_patch.py 12 ✅@SigureMo PaddlePaddle/Paddle#65201
✅A-67 python/paddle/base/dygraph/tensor_patch_methods.py 20 ✅@SigureMo PaddlePaddle/Paddle#65201
✅A-68 python/paddle/regularizer.py 2 ✅@DrRyanHuang PaddlePaddle/Paddle#65226
✅A-69 python/paddle/optimizer/lr.py 18 ✅@SigureMo PaddlePaddle/Paddle#65209
✅A-70 python/paddle/hub.py 3 ✅@SigureMo PaddlePaddle/Paddle#65238
✅A-71 python/paddle/sysconfig.py 2 ✅@SigureMo PaddlePaddle/Paddle#65238
✅A-72 setup.py & python/setup.pyi 8 ✅@SigureMo PaddlePaddle/Paddle#65244
✅A-73 python/paddle/vision/models/alexnet.py 2 ✅@DrRyanHuang PaddlePaddle/Paddle#65283
✅A-74 python/paddle/vision/models/densenet.py 6 ✅@Asthestarsfalll PaddlePaddle/Paddle#65486
✅A-75 python/paddle/vision/models/googlenet.py 2 ✅@DrRyanHuang PaddlePaddle/Paddle#65290
✅A-76 python/paddle/vision/models/inceptionv3.py 2 ✅@DrRyanHuang PaddlePaddle/Paddle#65292
✅A-77 python/paddle/vision/models/lenet.py 1 ✅@DrRyanHuang PaddlePaddle/Paddle#65283
✅A-78 python/paddle/vision/models/mobilenetv1.py 2 ✅@DrRyanHuang PaddlePaddle/Paddle#65323
✅A-79 python/paddle/vision/models/mobilenetv2.py 2 ✅@DrRyanHuang PaddlePaddle/Paddle#65326
✅A-80 python/paddle/vision/models/mobilenetv3.py 4 ✅@enkilee PaddlePaddle/Paddle#65366
✅A-81 python/paddle/vision/models/resnet.py 14 ✅@Asthestarsfalll PaddlePaddle/Paddle#65487
✅A-82 python/paddle/vision/models/shufflenetv2.py 8 ✅@Asthestarsfalll PaddlePaddle/Paddle#65559
✅A-83 python/paddle/vision/transforms/functional.py 16 ✅@Asthestarsfalll PaddlePaddle/Paddle#65560
✅A-84 python/paddle/vision/models/squeezenet.py 3 ✅@DrRyanHuang PaddlePaddle/Paddle#65332
✅A-85 python/paddle/vision/models/vgg.py 5 ✅@86kkd PaddlePaddle/Paddle#65381
✅A-86 python/paddle/vision/datasets/cifar.py 2 ✅@86kkd PaddlePaddle/Paddle#65386
✅A-87 python/paddle/vision/datasets/flowers.py 1 ✅@enkilee PaddlePaddle/Paddle#65504
✅A-88 python/paddle/vision/datasets/folder.py 2 ✅@enkilee PaddlePaddle/Paddle#65532
✅A-89 python/paddle/vision/datasets/mnist.py 2 ✅@enkilee PaddlePaddle/Paddle#65553
✅A-90 python/paddle/vision/datasets/voc2012.py 1 ✅@Asthestarsfalll PaddlePaddle/Paddle#65567
✅A-91 python/paddle/metric/metrics.py 6 ✅@Asthestarsfalll PaddlePaddle/Paddle#65566
✅A-92 python/paddle/vision/image.py 3 ✅@86kkd PaddlePaddle/Paddle#65386
✅A-93 python/paddle/vision/ops.py 18 ✅@Asthestarsfalll PaddlePaddle/Paddle#65568
✅A-94 python/paddle/signal.py 2 ✅@Asthestarsfalll PaddlePaddle/Paddle#65569
✅A-95 python/paddle/fft.py 22 ✅@Asthestarsfalll PaddlePaddle/Paddle#65570
✅A-96 python/paddle/hapi/callbacks.py 8 ✅@zrr1999 PaddlePaddle/Paddle#65777
✅A-97 python/paddle/io/dataloader/batch_sampler.py 2 ✅@SigureMo PaddlePaddle/Paddle#66005
✅A-98 python/paddle/io/dataloader/dataset.py 8 ✅@SigureMo PaddlePaddle/Paddle#65779
✅A-99 python/paddle/io/dataloader/sampler.py 5 ✅@SigureMo PaddlePaddle/Paddle#66005
✅A-100 python/paddle/io/dataloader/worker.py 1 ✅@ooooo-create PaddlePaddle/Paddle#65645

🔚 Batch 2 🎉

No. File API count Assignee GitHub id PR link
✅B-01 python/paddle/io/reader.py 1 ✅@enkilee PaddlePaddle/Paddle#65587
✅B-02 python/paddle/distribution/bernoulli.py 2 ✅@Asthestarsfalll PaddlePaddle/Paddle#65727
✅B-03 python/paddle/distribution/beta.py 3 ✅@enkilee PaddlePaddle/Paddle#65600
✅B-04 python/paddle/distribution/binomial.py 2 ✅@Asthestarsfalll PaddlePaddle/Paddle#65727
✅B-05 python/paddle/distribution/categorical.py 3 ✅@enkilee PaddlePaddle/Paddle#65838
✅B-06 python/paddle/distribution/cauchy.py 2 ✅@megemini PaddlePaddle/Paddle#65765
✅B-07 python/paddle/distribution/chi2.py 1 ✅@megemini PaddlePaddle/Paddle#65766
✅B-08 python/paddle/distribution/continuous_bernoulli.py 2 ✅@megemini PaddlePaddle/Paddle#65767
✅B-09 python/paddle/distribution/dirichlet.py 2 ✅@megemini PaddlePaddle/Paddle#65768
✅B-10 python/paddle/distribution/distribution.py 1 ✅@megemini PaddlePaddle/Paddle#65769
✅B-11 python/paddle/distribution/exponential.py 3 ✅@megemini PaddlePaddle/Paddle#65770
✅B-12 python/paddle/distribution/exponential_family.py 2 ✅@megemini PaddlePaddle/Paddle#65771
✅B-13 python/paddle/distribution/gamma.py 3 ✅@megemini PaddlePaddle/Paddle#65772
✅B-14 python/paddle/distribution/geometric.py 2 ✅@megemini PaddlePaddle/Paddle#65773
✅B-15 python/paddle/distribution/gumbel.py 2 ✅@megemini PaddlePaddle/Paddle#65774
✅B-16 python/paddle/distribution/independent.py 2 ✅@megemini PaddlePaddle/Paddle#65775
✅B-17 python/paddle/distribution/kl.py 2 ✅@megemini PaddlePaddle/Paddle#65776
✅B-18 python/paddle/distribution/laplace.py 2 ✅@ooooo-create PaddlePaddle/Paddle#65784
✅B-19 python/paddle/distribution/lkj_cholesky.py 1 ✅@ooooo-create PaddlePaddle/Paddle#65785
✅B-20 python/paddle/distribution/lognormal.py 4 ✅@enkilee PaddlePaddle/Paddle#65843
✅B-21 python/paddle/distribution/multinomial.py 3 ✅@enkilee PaddlePaddle/Paddle#65844
✅B-22 python/paddle/distribution/multivariate_normal.py 2 ✅@enkilee PaddlePaddle/Paddle#65847
✅B-23 python/paddle/distribution/normal.py 2 ✅@enkilee PaddlePaddle/Paddle#65849
✅B-24 python/paddle/distribution/poisson.py 2 ✅@enkilee PaddlePaddle/Paddle#65852
✅B-25 python/paddle/distribution/student_t.py 3 ✅@enkilee PaddlePaddle/Paddle#65853
✅B-26 python/paddle/distribution/transform.py 13 ✅@enkilee PaddlePaddle/Paddle#65912
✅B-27 python/paddle/distribution/
transformed_distribution.py
4 ✅@enkilee PaddlePaddle/Paddle#65912
✅B-28 python/paddle/distribution/uniform.py 2 ✅@NKNaN PaddlePaddle/Paddle#65660
✅B-29 python/paddle/distribution/variable.py 2 ✅@NKNaN PaddlePaddle/Paddle#65620
✅B-30 python/paddle/device/__init__.py 22 ✅@enkilee PaddlePaddle/Paddle#66077
✅B-31 python/paddle/device/cuda/__init__.py 14 ✅@enkilee PaddlePaddle/Paddle#66090
✅B-32 python/paddle/device/xpu/__init__.py 1 ✅@enkilee PaddlePaddle/Paddle#66090
✅B-33 python/paddle/amp/amp_lists.py 2 ✅@enkilee PaddlePaddle/Paddle#65633
✅B-34 python/paddle/amp/auto_cast.py 7 ✅@enkilee PaddlePaddle/Paddle#66119
✅B-35 python/paddle/amp/debugging.py 10 ✅@enkilee PaddlePaddle/Paddle#66127
✅B-36 python/paddle/amp/grad_scaler.py 4 ✅@enkilee PaddlePaddle/Paddle#66189
✅B-37 python/paddle/amp/__init__.py 2 ✅@enkilee PaddlePaddle/Paddle#65633
✅B-38 python/paddle/autograd/autograd.py 2 🙋@Turingg
✅@enkilee
PaddlePaddle/Paddle#67179
✅B-39 python/paddle/autograd/saved_tensors_hooks.py 1 🙋@DrRyanHuang
✅@enkilee
PaddlePaddle/Paddle#67179
✅B-40 python/paddle/autograd/backward_mode.py 1 ✅@tlxd PaddlePaddle/Paddle#66277
✅B-41 python/paddle/autograd/ir_backward.py 3 ✅@Luohongzhige PaddlePaddle/Paddle#66890
✅B-42 python/paddle/autograd/py_layer.py 2 ✅@Fripping PaddlePaddle/Paddle#66328
✅B-43 python/paddle/framework/framework.py 2 ✅@zrr1999 PaddlePaddle/Paddle#65777
✅B-44 python/paddle/base/dygraph/base.py 7 ✅@SigureMo PaddlePaddle/Paddle#66006
✅B-45 python/paddle/framework/io.py 4 🙋@Luohongzhige
✅@SigureMo
PaddlePaddle/Paddle#66654
✅B-46 python/paddle/framework/io_utils.py 3 🙋@Luohongzhige
✅@SigureMo
PaddlePaddle/Paddle#66654
✅B-47 python/paddle/framework/random.py 5 ✅@SigureMo PaddlePaddle/Paddle#66306
✅B-48 python/paddle/cost_model/cost_model.py 1 ✅@Luohongzhige PaddlePaddle/Paddle#66890
✅B-49 python/paddle/base/layer_helper.py 1 ✅@ooooo-create PaddlePaddle/Paddle#66639
✅B-50 python/paddle/static/input.py 3 ✅@ooooo-create PaddlePaddle/Paddle#67047
✅B-51 python/paddle/static/io.py 16 ✅@ooooo-create PaddlePaddle/Paddle#67047
✅B-52 python/paddle/onnx/export.py 1 🙋@ooooo-create
✅@enkilee
PaddlePaddle/Paddle#66862
✅B-53 python/paddle/utils/cpp_extension/cpp_extension.py 2 ✅@megemini PaddlePaddle/Paddle#65818
✅B-54 python/paddle/utils/cpp_extension/
extension_utils.py
3 ✅@megemini PaddlePaddle/Paddle#65819
✅B-55 python/paddle/utils/install_check.py 1 ✅@megemini PaddlePaddle/Paddle#65821
✅B-56 python/paddle/utils/lazy_import.py 1 ✅@megemini PaddlePaddle/Paddle#65822
✅B-57 python/paddle/utils/dlpack.py 2 ✅@megemini PaddlePaddle/Paddle#65823
✅B-58 python/paddle/utils/download.py 1 ✅@megemini PaddlePaddle/Paddle#65824
✅B-59 python/paddle/text/viterbi_decode.py 2 ✅@enkilee PaddlePaddle/Paddle#65919
✅B-60 python/paddle/text/datasets/conll05.py 1 ✅@enkilee PaddlePaddle/Paddle#65993
✅B-61 python/paddle/text/datasets/imdb.py 1 ✅@enkilee PaddlePaddle/Paddle#66037
✅B-62 python/paddle/text/datasets/imikolov.py 1 ✅@enkilee PaddlePaddle/Paddle#66040
✅B-63 python/paddle/text/datasets/movielens.py 1 ✅@enkilee PaddlePaddle/Paddle#66054
✅B-64 python/paddle/text/datasets/uci_housing.py 1 ✅@enkilee PaddlePaddle/Paddle#66057
✅B-65 python/paddle/text/datasets/wmt14.py 1 ✅@enkilee PaddlePaddle/Paddle#66058
✅B-66 python/paddle/text/datasets/wmt16.py 1 ✅@enkilee PaddlePaddle/Paddle#66058
✅B-67 python/paddle/sparse/binary.py 10 ✅@megemini PaddlePaddle/Paddle#65865
✅B-68 python/paddle/sparse/creation.py 2 ✅@megemini PaddlePaddle/Paddle#65866
✅B-69 python/paddle/sparse/multiary.py 1 ✅@megemini PaddlePaddle/Paddle#65867
✅B-70 python/paddle/sparse/unary.py 25 ✅@megemini PaddlePaddle/Paddle#65868
✅B-71 python/paddle/sparse/nn/layer/activation.py 4 ✅@megemini PaddlePaddle/Paddle#65869
✅B-72 python/paddle/sparse/nn/layer/conv.py 4 ✅@megemini PaddlePaddle/Paddle#65870
✅B-73 python/paddle/sparse/nn/layer/norm.py 2 ✅@megemini PaddlePaddle/Paddle#65871
✅B-74 python/paddle/sparse/nn/layer/pooling.py 1 ✅@megemini PaddlePaddle/Paddle#65872
✅B-75 python/paddle/sparse/nn/functional/activation.py 4 ✅@megemini PaddlePaddle/Paddle#65873
✅B-76 python/paddle/sparse/nn/functional/conv.py 6 ✅@megemini PaddlePaddle/Paddle#65874
✅B-77 python/paddle/sparse/nn/functional/pooling.py 1 ✅@megemini PaddlePaddle/Paddle#65875
✅B-78 python/paddle/sparse/nn/functional/transformer.py 1 ✅@megemini PaddlePaddle/Paddle#65876
✅B-79 python/paddle/profiler/profiler.py 11 ✅@megemini
🙋@Luohongzhige
PaddlePaddle/Paddle#67391
✅B-80 python/paddle/profiler/profiler_statistic.py 2 ✅@megemini
🙋@Luohongzhige
PaddlePaddle/Paddle#67391
✅B-81 python/paddle/profiler/utils.py 3 ✅@megemini
🙋@Luohongzhige
PaddlePaddle/Paddle#67391
✅B-82 python/paddle/nn/quant/quant_layers.py 13 ✅@megemini PaddlePaddle/Paddle#65803
✅B-83 python/paddle/nn/quant/quantized_linear.py 4 ✅@megemini PaddlePaddle/Paddle#65805
✅B-84 python/paddle/nn/quant/stub.py 1 ✅@megemini PaddlePaddle/Paddle#65807
✅B-85 python/paddle/nn/utils/clip_grad_norm_.py 1 ✅@megemini PaddlePaddle/Paddle#65808
✅B-86 python/paddle/nn/utils/clip_grad_value_.py 1 ✅@megemini PaddlePaddle/Paddle#65809
✅B-87 python/paddle/nn/utils/spectral_norm_hook.py 1 ✅@megemini PaddlePaddle/Paddle#65810
✅B-88 python/paddle/nn/utils/transform_parameters.py 2 ✅@megemini PaddlePaddle/Paddle#65811
✅B-89 python/paddle/nn/utils/weight_norm_hook.py 2 ✅@megemini PaddlePaddle/Paddle#65812
✅B-90 python/paddle/inference/wrapper.py 7 🙋@Luohongzhige
✅@enkilee
PaddlePaddle/Paddle#67366
✅B-91 python/paddle/check_import_scipy.py 1 ✅@crazyxiaoxi PaddlePaddle/Paddle#66280
✅B-92 python/paddle/batch.py 1 ✅@Caogration PaddlePaddle/Paddle#66295
✅B-93 python/paddle/reader/decorator.py 10 ✅@SigureMo PaddlePaddle/Paddle#66305
✅B-94 python/paddle/hapi/dynamic_flops.py 3 🙋@Luohongzhige
✅@enkilee
PaddlePaddle/Paddle#67204
~~✅B-95~~ ~~python/paddle/hapi/static_flops.py~~ 1 🙋@Wizard-ZP
✅B-96 python/paddle/base/framework.py 4 ✅@SigureMo PaddlePaddle/Paddle#66301
✅B-97 python/paddle/distributed/collective.py 4 ✅@enkilee PaddlePaddle/Paddle#66392
✅B-98 python/paddle/distributed/entry_attr.py 3 ✅@enkilee PaddlePaddle/Paddle#66394
✅B-99 python/paddle/distributed/parallel.py 7 ✅@enkilee PaddlePaddle/Paddle#66432
✅B-100 python/paddle/distributed/parallel_with_gloo.py 3 ✅@enkilee PaddlePaddle/Paddle#66473

🔜 Batch 3

No. File API count Assignee GitHub id PR link
✅C-1 python/paddle/distributed/spawn.py 1 ✅@megemini
🙋@Luohongzhige
PaddlePaddle/Paddle#67428
✅C-2 python/paddle/distributed/rpc/rpc.py 7 ✅@megemini
🙋@Luohongzhige
PaddlePaddle/Paddle#67428
✅C-3 python/paddle/distributed/auto_parallel/api.py 4 ✅@megemini
🙋@Luohongzhige
PaddlePaddle/Paddle#67428
~~✅C-4~~ ~~python/paddle/distributed/auto_parallel/
interface.py~~
9 🚧@lwkhahaha PaddlePaddle/Paddle#66710
✅C-5 python/paddle/distributed/auto_parallel/
process_mesh.py
1 ✅@megemini PaddlePaddle/Paddle#66985
✅C-6 python/paddle/distributed/auto_parallel/random.py 1 ✅@megemini PaddlePaddle/Paddle#66985
~~✅C-7~~ ~~python/paddle/distributed/auto_parallel/
strategy.py~~
2 🚧@lwkhahaha PaddlePaddle/Paddle#66710
✅C-8 python/paddle/distributed/auto_parallel/static/
engine.py
1 ✅@megemini PaddlePaddle/Paddle#66989
~~✅C-9~~ ~~python/paddle/distributed/auto_parallel/
placement_type.py~~
3 🚧@lwkhahaha PaddlePaddle/Paddle#66710
✅C-10 python/paddle/distributed/checkpoint/
load_state_dict.py
1 ✅@megemini PaddlePaddle/Paddle#66986
✅C-11 python/paddle/distributed/checkpoint/
save_state_dict.py
1 ✅@megemini PaddlePaddle/Paddle#66986
✅C-12 python/paddle/distributed/communication/
all_gather.py
3 ✅@megemini PaddlePaddle/Paddle#66051
✅C-13 python/paddle/distributed/communication/
all_reduce.py
3 ✅@enkilee PaddlePaddle/Paddle#66505
✅C-14 python/paddle/distributed/communication/
all_to_all.py
3 ✅@enkilee PaddlePaddle/Paddle#66505
✅C-15 python/paddle/distributed/communication/
batch_isend_irecv.py
2 🙋@Lans1ot
✅@enkilee
PaddlePaddle/Paddle#66572
✅C-16 python/paddle/distributed/communication/
broadcast.py
3 🙋@Lans1ot
✅@enkilee
PaddlePaddle/Paddle#66575
✅C-17 python/paddle/distributed/communication/gather.py 2 ✅@BHmingyang PaddlePaddle/Paddle#66276
✅C-18 python/paddle/distributed/communication/group.py 6 🙋@lwkhahaha
✅@Lans1ot
PaddlePaddle/Paddle#67677
✅C-19 python/paddle/distributed/communication/recv.py 3 ✅@Lans1ot PaddlePaddle/Paddle#66694
✅C-20 python/paddle/distributed/communication/reduce.py 3 ✅@megemini
🙋@successfulbarrier
PaddlePaddle/Paddle#66803
✅C-21 python/paddle/distributed/communication/
reduce_scatter.py
3 ✅@Lans1ot
🙋@Whsjrczr
PaddlePaddle/Paddle#66864
✅C-22 python/paddle/distributed/communication/scatter.py 3 ✅@Lans1ot
🙋@Whsjrczr
PaddlePaddle/Paddle#66864
✅C-23 python/paddle/distributed/communication/send.py 3 ✅@Lans1ot
🙋@Whsjrczr
PaddlePaddle/Paddle#66864
✅C-24 python/paddle/distributed/communication/stream/
all_gather.py
1 ✅@Lans1ot
🙋@Whsjrczr
PaddlePaddle/Paddle#66864
✅C-25 python/paddle/distributed/communication/stream/
all_reduce.py
1 🙋@Whsjrczr
✅@enkilee
PaddlePaddle/Paddle#67112
✅C-26 python/paddle/distributed/communication/stream/
all_to_all.py
2 🙋@Whsjrczr
✅@enkilee
PaddlePaddle/Paddle#67112
✅C-27 python/paddle/distributed/communication/stream/
broadcast.py
1 ✅@megemini PaddlePaddle/Paddle#66993
✅C-28 python/paddle/distributed/communication/stream/
gather.py
1 ✅@megemini PaddlePaddle/Paddle#66993
✅C-29 python/paddle/distributed/communication/stream/
recv.py
1 ✅@megemini PaddlePaddle/Paddle#66993
✅C-30 python/paddle/distributed/communication/stream/
reduce.py
1 ✅@megemini PaddlePaddle/Paddle#66993
✅C-31 python/paddle/distributed/communication/stream/
reduce_scatter.py
1 ✅@megemini PaddlePaddle/Paddle#66993
✅C-32 python/paddle/distributed/communication/stream/
scatter.py
1 ✅@megemini PaddlePaddle/Paddle#66993
✅C-33 python/paddle/distributed/communication/stream/
send.py
1 ✅@megemini PaddlePaddle/Paddle#66993
✅C-34 python/paddle/distributed/fleet/fleet.py 4 ✅@Luohongzhige PaddlePaddle/Paddle#67624
~~✅C-35~~ ~~python/paddle/distributed/fleet/launch.py~~ 3 🙋@Luohongzhige
~~✅C-36~~ ~~python/paddle/distributed/fleet/launch_utils.py~~ 1
~~✅C-37~~ ~~python/paddle/distributed/fleet/model.py~~ 2
~~✅C-38~~ ~~python/paddle/distributed/fleet/optimizer.py~~ 2
~~✅C-39~~ ~~python/paddle/distributed/fleet/scaler.py~~ 2
✅C-40 python/paddle/distributed/fleet/base/
distributed_strategy.py
1 ✅@Whsjrczr PaddlePaddle/Paddle#67405
✅C-41 python/paddle/distributed/fleet/base/
role_maker.py
3 ✅@Whsjrczr PaddlePaddle/Paddle#67439
✅C-42 python/paddle/distributed/fleet/base/topology.py 2 ✅@Whsjrczr PaddlePaddle/Paddle#67439
✅C-43 python/paddle/distributed/fleet/base/
util_factory.py
1 ✅@Whsjrczr PaddlePaddle/Paddle#67469
✅C-44 python/paddle/distributed/fleet/data_generator/
data_generator.py
2 ✅@Whsjrczr PaddlePaddle/Paddle#67469
✅C-45 python/paddle/distributed/fleet/dataset/
dataset.py
5 ✅@megemini
🙋@lwkhahaha
PaddlePaddle/Paddle#66749
✅C-46 python/paddle/distributed/fleet/dataset/
index_dataset.py
1 ✅@megemini PaddlePaddle/Paddle#66760
✅C-47 python/paddle/distributed/fleet/utils/fs.py 2 ✅@megemini PaddlePaddle/Paddle#66766
~~✅C-48~~ python/paddle/distributed/fleet/utils/
hybrid_parallel_inference.py
1 🙋@megemini
~~✅C-49~~ python/paddle/distributed/fleet/utils/
mix_precision_utils.py
1 🙋@megemini
✅C-50 python/paddle/distributed/fleet/utils/ps_util.py 1 ✅@megemini PaddlePaddle/Paddle#66770
~~✅C-51~~ python/paddle/distributed/fleet/utils/
sequence_parallel_utils.py
1 🙋@megemini
✅C-52 python/paddle/distributed/fleet/utils/
__init__.py
1 ✅@megemini PaddlePaddle/Paddle#66771
✅C-53 python/paddle/distributed/launch/main.py 1 ✅@megemini PaddlePaddle/Paddle#66994
✅C-54 python/paddle/distributed/sharding/
group_sharded.py
2 ✅@megemini PaddlePaddle/Paddle#66994
✅C-55 python/paddle/incubate/autotune.py 1 ✅@megemini PaddlePaddle/Paddle#66009
✅C-56 python/paddle/incubate/asp/asp.py 6 ✅@megemini PaddlePaddle/Paddle#66010
✅C-57 python/paddle/incubate/asp/
supported_layer_list.py
2 ✅@megemini PaddlePaddle/Paddle#66011
✅C-58 python/paddle/incubate/asp/utils.py 10 ✅@megemini PaddlePaddle/Paddle#66012
✅C-59 python/paddle/incubate/autograd/functional.py 6 ✅@megemini PaddlePaddle/Paddle#66013
✅C-60 python/paddle/incubate/autograd/primapi.py 4 ✅@megemini PaddlePaddle/Paddle#66016
✅C-61 python/paddle/incubate/autograd/primx.py 1 ✅@megemini PaddlePaddle/Paddle#66015
✅C-62 python/paddle/incubate/autograd/utils.py 3 ✅@megemini PaddlePaddle/Paddle#66014
✅C-63 python/paddle/incubate/framework/random.py 3 ✅@megemini PaddlePaddle/Paddle#66028
✅C-64 python/paddle/incubate/layers/nn.py 15 ✅@megemini PaddlePaddle/Paddle#66029
✅C-65 python/paddle/incubate/multiprocessing/
reductions.py
1 ✅@megemini PaddlePaddle/Paddle#66030
✅C-66 python/paddle/incubate/nn/loss.py 1 ✅@uanu2002 PaddlePaddle/Paddle#66270
✅C-67 python/paddle/incubate/nn/functional/
blha_get_max_len.py
1 ✅@MikhayEeer PaddlePaddle/Paddle#66616
✅C-68 python/paddle/incubate/nn/functional/
block_multihead_attention.py
2 🙋@lwkhahaha
✅@Lans1ot
PaddlePaddle/Paddle#67677
✅C-69 python/paddle/incubate/nn/functional/
fused_dot_product_attention.py
1 ✅@MikhayEeer PaddlePaddle/Paddle#66616
✅C-70 python/paddle/incubate/nn/functional/
fused_dropout_add.py
1 ✅@Jeff114514 PaddlePaddle/Paddle#66279
✅C-71 python/paddle/incubate/nn/functional/
fused_ec_moe.py
1 ✅@MikhayEeer PaddlePaddle/Paddle#66616
✅C-72 python/paddle/incubate/nn/functional/
fused_gate_attention.py
1 ✅@MikhayEeer PaddlePaddle/Paddle#66616
✅C-73 python/paddle/incubate/nn/functional/
fused_layer_norm.py
1 ✅@haoyu2022 PaddlePaddle/Paddle#67159
✅C-74 python/paddle/incubate/nn/functional/
fused_matmul_bias.py
3 ✅@Betelgeu PaddlePaddle/Paddle#66656
✅C-75 python/paddle/incubate/nn/functional/
fused_rms_norm.py
1 ✅@haoyu2022 PaddlePaddle/Paddle#67555
✅C-76 python/paddle/incubate/nn/functional/
fused_rotary_position_embedding.py
1 ✅@haoyu2022 PaddlePaddle/Paddle#67556
✅C-77 python/paddle/incubate/nn/functional/
fused_transformer.py
4 🙋@lwkhahaha
✅@Lans1ot
PaddlePaddle/Paddle#67677
✅C-78 python/paddle/incubate/nn/functional/
masked_multihead_attention.py
1 ✅@haoyu2022 PaddlePaddle/Paddle#67558
✅C-79 python/paddle/incubate/nn/functional/swiglu.py 1 ✅@Turingg PaddlePaddle/Paddle#66987
✅C-80 python/paddle/incubate/nn/functional/
variable_length_memory_efficient_attention.py
1 ✅@inaomIIsfarell PaddlePaddle/Paddle#67197
✅C-81 python/paddle/incubate/nn/layer/
fused_dropout_add.py
1 ✅@Lans1ot PaddlePaddle/Paddle#67233
~~✅C-82~~ ~~python/paddle/incubate/nn/layer/
fused_dropout_nd.py~~
1
✅C-83 python/paddle/incubate/nn/layer/fused_ec_moe.py 1 🙋@Whsjrczr
✅@enkilee
PaddlePaddle/Paddle#67143
✅C-84 python/paddle/incubate/nn/layer/fused_linear.py 1 🙋@brcarry
✅@enkilee
PaddlePaddle/Paddle#67147
✅C-85 python/paddle/incubate/nn/layer/
fused_transformer.py
5 🙋@lwkhahaha
✅@enkilee
PaddlePaddle/Paddle#67178
✅C-86 python/paddle/incubate/operators/
graph_khop_sampler.py
1 ✅@inaomIIsfarell PaddlePaddle/Paddle#67197
✅C-87 python/paddle/incubate/operators/
graph_reindex.py
1 ✅@Wizard-ZP PaddlePaddle/Paddle#66475
✅C-88 python/paddle/incubate/operators/
graph_sample_neighbors.py
1 ✅@Sekiro-x PaddlePaddle/Paddle#67104
✅C-89 python/paddle/incubate/operators/
graph_send_recv.py
1 ✅@Sekiro-x PaddlePaddle/Paddle#67104
✅C-90 python/paddle/incubate/operators/resnet_unit.py 2 ✅@inaomIIsfarell PaddlePaddle/Paddle#66793
✅C-91 python/paddle/incubate/operators/
softmax_mask_fuse.py
1 ✅@inaomIIsfarell PaddlePaddle/Paddle#66867
✅C-92 python/paddle/incubate/operators/
softmax_mask_fuse_upper_triangle.py
1 ✅@inaomIIsfarell PaddlePaddle/Paddle#66867
✅C-93 python/paddle/incubate/operators/unzip.py 1 ✅@inaomIIsfarell PaddlePaddle/Paddle#66966
~~✅C-94~~ ~~python/paddle/incubate/optimizer/
distributed_fused_lamb.py~~
1
✅C-95 python/paddle/incubate/optimizer/lookahead.py 1 ✅@inaomIIsfarell PaddlePaddle/Paddle#67448
✅C-96 python/paddle/incubate/optimizer/modelaverage.py 1 ✅@inaomIIsfarell PaddlePaddle/Paddle#67448
~~✅C-97~~ ~~python/paddle/incubate/optimizer/
gradient_merge.py~~
1
~~✅C-98~~ ~~python/paddle/incubate/optimizer/
lars_momentum.py~~
1
✅C-99 python/paddle/incubate/optimizer/lbfgs.py 1 ✅@inaomIIsfarell PaddlePaddle/Paddle#67448
~~✅C-100~~ ~~python/paddle/incubate/optimizer/pipeline.py~~ 1
~~✅C-101~~ ~~python/paddle/incubate/optimizer/recompute.py~~ 1
✅C-102 python/paddle/incubate/optimizer/functional/
bfgs.py
1 ✅@Lans1ot PaddlePaddle/Paddle#67233
✅C-103 python/paddle/incubate/optimizer/functional/
lbfgs.py
1 ✅@Lans1ot PaddlePaddle/Paddle#67233
~~✅C-104~~ ~~python/paddle/incubate/passes/
fuse_resnet_unit_pass.py:fuse_resnet_unit_pass~~
1
✅C-105 python/paddle/incubate/tensor/math.py 4 ✅@Lans1ot PaddlePaddle/Paddle#67233
✅C-106 python/paddle/base/backward.py 2 ✅@megemini PaddlePaddle/Paddle#67132
🚧C-107 python/paddle/base/compiler.py 4 🚧@megemini
🚧@haoyu2022
🙋@lwkhahaha
PaddlePaddle/Paddle#67767
PaddlePaddle/Paddle#67699
~~✅C-108~~ ~~python/paddle/base/data_feed_desc.py~~ 1
✅C-109 python/paddle/base/data_feeder.py 1 ✅@megemini PaddlePaddle/Paddle#67132
~~✅C-110~~ ~~python/paddle/base/dataset.py~~ 3 🙋@lwkhahaha
✅C-111 python/paddle/base/executor.py 3 🙋@lwkhahaha
✅@SigureMo
PaddlePaddle/Paddle#66996
✅C-112 python/paddle/base/initializer.py 1 ✅@SigureMo PaddlePaddle/Paddle#67001
~~✅C-113~~ ~~python/paddle/base/lod_tensor.py~~ 2
✅C-114 python/paddle/base/param_attr.py 2 ✅@SigureMo PaddlePaddle/Paddle#67001
✅C-115 python/paddle/geometric/math.py 4 🙋@haoyu2022
🙋@lwkhahaha
✅@enkilee
PaddlePaddle/Paddle#67644
✅C-116 python/paddle/geometric/reindex.py 2 ✅@successfulbarrier PaddlePaddle/Paddle#66792
✅C-117 python/paddle/geometric/sampling/neighbors.py 2 ✅@successfulbarrier PaddlePaddle/Paddle#66792
✅C-118 python/paddle/geometric/message_passing/
send_recv.py
3 🙋@lwkhahaha
✅@enkilee
PaddlePaddle/Paddle#67644
✅C-119 python/paddle/quantization/config.py 1 ✅@enkilee PaddlePaddle/Paddle#66684
✅C-120 python/paddle/quantization/base_quanter.py 1 ✅@enkilee PaddlePaddle/Paddle#66686
✅C-121 python/paddle/quantization/base_observer.py 1 ✅@enkilee PaddlePaddle/Paddle#66693
✅C-122 python/paddle/quantization/factory.py 1 ✅@enkilee PaddlePaddle/Paddle#66693
✅C-123 python/paddle/quantization/ptq.py 1 ✅@enkilee PaddlePaddle/Paddle#66693
✅C-124 python/paddle/quantization/qat.py 1 ✅@enkilee PaddlePaddle/Paddle#66693
✅C-125 python/paddle/audio/backends/init_backend.py 3 ✅@SigureMo PaddlePaddle/Paddle#67002
✅C-126 python/paddle/audio/backends/wave_backend.py 3 ✅@SigureMo PaddlePaddle/Paddle#67002
✅C-127 python/paddle/audio/datasets/esc50.py 1 ✅@enkilee PaddlePaddle/Paddle#67067
✅C-128 python/paddle/audio/datasets/tess.py 1 ✅@enkilee PaddlePaddle/Paddle#67067
✅C-129 python/paddle/audio/features/layers.py 4 ✅@enkilee PaddlePaddle/Paddle#67079
✅C-130 python/paddle/audio/functional/functional.py 7 ✅@enkilee PaddlePaddle/Paddle#67079
✅C-131 python/paddle/audio/functional/window.py 1 ✅@enkilee PaddlePaddle/Paddle#67079
✅C-132 python/paddle/nn/clip.py 3 ✅@megemini PaddlePaddle/Paddle#67031
✅C-133 python/paddle/nn/decode.py 2 ✅@megemini PaddlePaddle/Paddle#67031
✅C-134 python/paddle/nn/initializer/lazy_init.py::LazyGuard 1 ✅@megemini PaddlePaddle/Paddle#67031
✅C-135 python/paddle/distributed/fleet/layers/mpu/mp_ops.py::split 1 ✅@megemini PaddlePaddle/Paddle#67089
✅C-136 python/paddle/distributed/fleet/recompute/
recompute.py::recompute_sequential
1 ✅@megemini PaddlePaddle/Paddle#67089
✅C-137 python/paddle/distributed/fleet/recompute/
recompute_hybrid.py::recompute_hybrid
1 ✅@megemini PaddlePaddle/Paddle#67089

Note: the API counts above are for reference only; cross-references between modules can inflate the counts.


⭐️ PR submission template ⭐️:

  • // ------- PR title --------
[Typing][A-1] Add type annotations for `paddle/tensor/array.py`

Or for multiple tasks:

[Typing][A-1,A-2,A-3] Add type annotations for `paddle/tensor/*`

⭐️ How to claim ⭐️: please claim tasks by leaving a comment on this issue, e.g.:

【报名】:A-1、A-3

Status legend: ✅: fully migrated, all unit tests pass! 🟢: reviewed and awaiting merge; fully migrated once merged! 🔵: available to claim! 🟡: no further effort needed at this stage; to be pushed in a later phase. 🚧: migration in progress; unit tests not yet passing and review not yet complete.

The normal flow is roughly: 🔵 -> 🚧 -> 🟢 -> ✅

The exceptional flow is: 🔵 -> 🚧 -> 🟡

Dashboard

📊 Tasks 🔵 Claimable 🚧 In progress 🟢 Awaiting merge ✅ Done 🟡 Next phase 🏁 Completion rate
337 0 1 0 336 0 99.7%

In no particular order: @zrr1999(6) @gouzil(12) @Asthestarsfalll(25) @SigureMo(23) @ooooo-create(20) @megemini(85) @liyongchao911(2) @DrRyanHuang(15) @enkilee(65) @gsq7474741(3) @sunzhongkai588(1) @Liyulingyue(1) @86kkd(3) @NKNaN(2) @tlxd(1) @Luohongzhige(3) @Fripping(1) @crazyxiaoxi(1) @Caogration(1) @BHmingyang(1) @Lans1ot(12) @Whsjrczr(5) @uanu2002(1) @MikhayEeer(4) @Jeff114514(1) @haoyu2022(4) @Betelgeu(1) @Turingg(1) @inaomIIsfarell(9) @Wizard-ZP(1) @Sekiro-x(2) @successfulbarrier(2)

megemini avatar Jun 10 '24 10:06 megemini

✨️ Hello everyone! ✨️

This task is a subtask of adding type hints (Type Hints) to the Paddle framework APIs.

That is, take a function that was originally:

def log(x, name=None):
    ...

and annotate it as:

def log(x: paddle.Tensor, name: str | None = None) -> paddle.Tensor:
    ...

Python formally standardized type hints in version 3.5 through PEP 484 – Type Hints; they improve the developer experience and overall code quality. Python 3.8, the lowest version Paddle currently supports, already supports type hints reasonably well, so this task is being launched to complete type annotations for Paddle's current public APIs!

Everyone is welcome to participate! Thank you very much! :) 🎉🎉🎉

The overall workflow is roughly:

  • Claim a task
  • Modify the APIs
  • Submit a PR
  • Wrap up the task

✨ Click the headings below for details! ✨

✨ Claiming a task

Simply reply to this issue with the ID of the task you want to claim, e.g.:

【报名】:A-25

✨ Modifying the APIs

Python's type annotation system is fairly extensive; this task focuses on the following:

  • Add type annotations to the APIs
  • Keep the type annotations consistent with the type descriptions in the docstrings
  • Run type-checking tools over the example code of the relevant APIs
➡️ Reference example

Take the paddle.log API as an example. The original code is:

def log(x, name=None):
    r"""
    Calculates the natural log of the given input Tensor, element-wise.

    .. math::

        Out = \ln(x)

    Args:
        x (Tensor): Input Tensor. Must be one of the following types: int32, int64, float16, bfloat16, float32, float64, complex64, complex128.
        name (str|None): The default value is None. Normally there is no need for user to set this property. For more information, please refer to :ref:`api_guide_Name`

    Returns:
        Tensor: The natural log of the input Tensor computed element-wise.

    Examples:
        ...
    """
    ...

It needs to be changed to:

from __future__ import annotations

...

def log(x: paddle.Tensor, name: str | None = None) -> paddle.Tensor:
    r"""
    Calculates the natural log of the given input Tensor, element-wise.

    .. math::

        Out = \ln(x)

    Args:
        x (Tensor): Input Tensor. Must be one of the following types: int32, int64, float16, bfloat16, float32, float64, complex64, complex128.
        name (str|None, optional): The default value is None. Normally there is no need for user to set this property. For more information, please refer to :ref:`api_guide_Name`

    Returns:
        Tensor: The natural log of the input Tensor computed element-wise.

    Examples:
        ...
    """
    ...

Note the following points here:

  • Add from __future__ import annotations

    Since Paddle's minimum supported Python version is currently 3.8, while this task aims to use annotation features from newer Python versions wherever possible, this import is necessary.

  • Add type annotations to def log(x, name=None)

    Both the input parameters and the return value need to be annotated.

  • Align the types in the docstring with the actual parameter types

    Here, name=None is annotated as name: str | None = None, and the corresponding docstring entry should be name (str|None, optional).

    Note that Python's Optional type and the docstring's optional mean different things:

    • The former means the input may be None
    • The latter means the parameter has a default value; that default may be None or any other value (see the sketch after this list)

    Also, when writing the docstring:

    • Keep Args concise, e.g. paddle.Tensor can be written as Tensor
    • Returns should follow the form return type, description, e.g. Tensor, The natural log of the input Tensor computed element-wise.

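As an illustration of the Optional vs. optional distinction above, here is a minimal sketch using a made-up function with plain Python types (not an actual Paddle API):

from __future__ import annotations

def scale(x: float, factor: float = 2.0, name: str | None = None) -> float:
    """
    Args:
        x (float): Input value.
        factor (float, optional): Scaling factor. Default is 2.0. The docstring says
            ``optional`` because there is a default value, but the annotation stays
            plain ``float`` since None is not accepted.
        name (str|None, optional): Optional name. Default is None. The annotation is
            ``str | None`` because None is accepted.

    Returns:
        float, The scaled value.
    """
    return x * factor
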
For more details, see the "Annotation Q&A" and "Common type reference" sections further down in this issue.

➡️ Related tools
  • https://github.com/megemini/ArgsTyping

This tool can produce a first-pass type annotation, e.g.:

> python args_typing.py -i /home/shun/Documents/Projects/paddle/megemini/Paddle/python/paddle/tensor/math.py

Note: this tool works by parsing Args/Parameters and Returns/Yields in the docstring, so the generated annotations may be wrong or incomplete.

  • tools/type_checking.py

This tool can check the modified APIs locally, e.g.:

> python type_checking.py paddle.abs

Note: this usage depends on PR https://github.com/PaddlePaddle/Paddle/pull/64991 being completed.

✨ Submitting a PR

Each task requires at least one PR:

PR title:

  • [Typing][A-1] Add type annotations for paddle/tensor/array.py

A single PR covering multiple tasks can use:

  • [Typing][A-1,A-2,A-3] Add type annotations for paddle/tensor/*

The xxx placeholders can carry other supplementary information; for example, if you submit several PRs, you can mention the main files covered by each one.

Note: be sure to include [Typing] in the title to trigger the CI pipeline checks. Also, please keep the title format consistent so later bookkeeping is easier.

When submitting a PR, you can copy the template below directly, replace the corresponding xxx parts, and add extra notes if needed:

<!-- TemplateReference: https://github.com/PaddlePaddle/Paddle/wiki/PULL-REQUEST-TEMPLATE--REFERENCE -->
<!-- Demo: https://github.com/PaddlePaddle/Paddle/pull/24810 -->

### PR Category
<!-- One of [ User Experience | Execute Infrastructure | Operator Mechanism | CINN | Custom Device | Performance Optimization | Distributed Strategy | Parameter Server | Communication Library | Auto Parallel | Inference | Environment Adaptation ] -->
User Experience

### PR Types
<!-- One of [ New features | Bug fixes | Improvements | Performance | BC Breaking | Deprecations | Docs | Devs | Not User Facing | Security | Deprecations | Others ] -->
Improvements

### Description
<!-- Describe what you’ve done -->

Type annotations:

- xxx.py
- xxx.py

### Related links
 
- https://github.com/PaddlePaddle/Paddle/issues/65008

@SigureMo @megemini 

Also, because type annotations may depend on other APIs, if your PR is blocked by another API, please let the reviewer know so it can be handled.

✨ Wrapping up a task

After the PR is submitted, confirm the following in the CI pipeline:

  • PR-CI-Static-Check

    • All APIs in the modified files are covered by the type-checking tests
    • All of those tests pass.

Finally, to wrap up the task:

  • Merge/Close the PR in the Paddle repo

Also remind the reviewer to update the task status in this issue ~

At this point, a task is considered fully complete! 🎉🎉🎉

Finally, thanks again to everyone for your participation and contributions ~ 🏆️🏆️🏆️

Reference project:

  • https://github.com/cattidea/paddlepaddle-stubs

Exemplary PRs for reference:

  • https://github.com/PaddlePaddle/Paddle/pull/64867
  • https://github.com/PaddlePaddle/Paddle/pull/64954

Related links:

Python documentation:

@SigureMo @zrr1999 @Asthestarsfalll @gouzil @gsq7474741 @sunzhongkai588 @luotao1

megemini avatar Jun 10 '24 10:06 megemini

Annotation Q&A

Q: Where do I start?

A: Python's type annotation features have kept evolving and by now form a fairly large system.

Start with Python's official documentation, Static Typing with Python, and get familiar with the related PEPs.

Passing the CI checks is the baseline goal.

In addition, Paddle now has a _typing module that consolidates some commonly used shared types, e.g.:

# python/paddle/_typing/layout.py
DataLayout2D: TypeAlias = Literal["NCHW", "NHCW"]
DataLayout3D: TypeAlias = Literal["NCDHW", "NDHWC"]

When annotating, prefer the types from the _typing module so that later maintenance is easier.
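
For example, a minimal sketch of using such an alias in a signature; the helper function and the exact import path are assumptions for illustration, so check the actual re-exports under paddle._typing:

from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from paddle import Tensor
    from paddle._typing import DataLayout2D  # assumed import path

# hypothetical helper, for illustration only
def normalize_image(x: Tensor, data_format: DataLayout2D = "NCHW") -> Tensor:
    ...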

Q: What is the difference between Args in the docstring and the type annotation?

A: Earlier versions of Paddle did not have unified type annotations and instead described parameter types in the docstring. The parameter types in the docstring's Args are there to help users understand; as long as they do not conflict with the type annotation, they can stay concise. For example:

def test(a: int | list[int] | tuple[int, ...]) -> None:
    """
    ...
    
    Args:
        a (int|list|tuple): xxx

    Returns:
        None, xxx

    ...
    """

Q: What if the Args in the docstring and the type annotation disagree?

A: First make sure the type annotation is correct. If the type in the existing docstring Args is wrong, fix it, and also check whether the API's Chinese documentation (i.e. the docs repo) is correct; if an error is found there, submit a separate PR against docs to fix it.

Q: Should I use Union or |?

A: Use | wherever possible.

Since the minimum Python version Paddle supports is 3.8, | can only be used in type annotations, not in expressions, e.g.:

from __future__ import annotations
def test(a: int | str): ...

Whereas in expressions, Union is still used:

from typing import Union
t = Union[int, str]

Q: What if the checks cannot pass?

A: You can use # type: ignore to work around it.

This task uses tools such as mypy to check the example code of each API, which in turn verifies that the type annotations are correct.

While adding annotations, dependency issues between APIs are hard to avoid. If the dependency is a private API or an external API, you can use # type: ignore to skip the corresponding type check, e.g.:

>>> import abcde # type: ignore
>>> print('ok')

Or skip type checking for the entire snippet:

>>> # type: ignore
>>> import abcde
>>> print('ok')

Q: Can the Any type be used?

A: Yes, but avoid it whenever possible.
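
For instance, a minimal sketch of preferring a narrower type over Any (both functions are made up for illustration):

from __future__ import annotations

from typing import Any

# Discouraged: Any effectively turns off checking for this parameter.
def set_option_loose(value: Any) -> None: ...

# Preferred: spell out what is actually accepted.
def set_option(value: bool | int | str) -> None: ...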

Q: What if a circular import error occurs?

A: In that case, the following approaches can help:

  • Add from __future__ import annotations

  • Import the type only under typing.TYPE_CHECKING, e.g.:

    from typing import TYPE_CHECKING
    if TYPE_CHECKING:
        import paddle.xxx as xxx
    
    def tmp() -> xxx: ...
    

    Also, if a type is only used for type hints, prefer importing it under TYPE_CHECKING to avoid unnecessary module imports.

Q: Should I use Tensor or Variable?

A: Prefer Tensor; do not expose the static-graph Variable/Value concepts to users.

For a more detailed discussion, see https://github.com/PaddlePaddle/community/pull/858#discussion_r1564552690

Q: What about a function whose output type differs depending on the input type?

A: In that case, the following approach can be used:

  • Add from typing import overload

  • Declare several functions with the same name and decorate them, e.g.:

    from typing import overload
    
    @overload
    def array_length(array: list[Any]) -> int:...
    
    @overload
    def array_length(array: paddle.Tensor) -> paddle.Tensor:...
    
    def array_length(array): ... # the actual implementation, left unannotated
    
Q: When should I use Sequence, and when list or tuple?

A: Python's PEPs give a hint:

Note: Dict, DefaultDict, List, Set and FrozenSet are mainly useful for annotating return values. For arguments, prefer the abstract collection types defined below, e.g. Mapping, Sequence or AbstractSet.

In other words, use Sequence for inputs and list for return values.

However, if the code calls list methods such as append, or the input is explicitly required to be a list, then Sequence should not be used.
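
A minimal sketch of this convention (the functions are made up for illustration):

from __future__ import annotations

from collections.abc import Sequence

# Accepts any sequence of ints as input, returns a concrete list.
def double_all(values: Sequence[int]) -> list[int]:
    return [v * 2 for v in values]

# This one calls `append`, so it genuinely requires a list.
def append_zero(values: list[int]) -> list[int]:
    values.append(0)
    return values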

Q: When annotating, should I write Tensor or paddle.Tensor?

A: Either is fine.

If paddle.Tensor appears many times in a file, you may use Tensor for brevity, but pay attention to how it is imported:

if TYPE_CHECKING:
    from paddle import Tensor

See the discussion: https://github.com/PaddlePaddle/Paddle/pull/65073#discussion_r1636116450

Q: Should I use paddle.framework.Block or paddle.pir.Block?

A: Use paddle.pir.Block consistently.

See the discussion: https://github.com/PaddlePaddle/Paddle/pull/65095#discussion_r1637570850

megemini avatar Jun 10 '24 10:06 megemini

Type annotation mapping from Python 3.8 to Python 3.9

Old type New type
typing.Tuple tuple
typing.List list
typing.Dict dict
typing.Set set
typing.FrozenSet frozenset
typing.Type type
typing.Deque collections.deque
typing.DefaultDict collections.defaultdict
typing.OrderedDict collections.OrderedDict
typing.Counter collections.Counter
typing.ChainMap collections.ChainMap
typing.Awaitable collections.abc.Awaitable
typing.Coroutine collections.abc.Coroutine
typing.AsyncIterable collections.abc.AsyncIterable
typing.AsyncIterator collections.abc.AsyncIterator
typing.AsyncGenerator collections.abc.AsyncGenerator
typing.Iterable collections.abc.Iterable
typing.Iterator collections.abc.Iterator
typing.Generator collections.abc.Generator
typing.Reversible collections.abc.Reversible
typing.Container collections.abc.Container
typing.Collection collections.abc.Collection
typing.Callable collections.abc.Callable
typing.AbstractSet collections.abc.Set
typing.MutableSet collections.abc.MutableSet
typing.Mapping collections.abc.Mapping
typing.MutableMapping collections.abc.MutableMapping
typing.Sequence collections.abc.Sequence
typing.MutableSequence collections.abc.MutableSequence
typing.MappingView collections.abc.MappingView
typing.KeysView collections.abc.KeysView
typing.ItemsView collections.abc.ItemsView
typing.ValuesView collections.abc.ValuesView
typing.ContextManager contextlib.AbstractContextManager
typing.AsyncContextManager contextlib.AbstractAsyncContextManager
typing.Pattern re.Pattern
typing.re.Pattern re.Pattern
typing.Match re.Match
typing.re.Match re.Match
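
A minimal before/after sketch of this migration (the function is hypothetical; with from __future__ import annotations the new spellings also work on Python 3.8):

from __future__ import annotations

from collections.abc import Sequence

# Old style (typing aliases), still valid but discouraged:
#   from typing import Dict, List
#   def count_words(lines: List[str]) -> Dict[str, int]: ...

# New style, using builtin generics and collections.abc:
def count_words(lines: Sequence[str]) -> dict[str, int]:
    counts: dict[str, int] = {}
    for line in lines:
        for word in line.split():
            counts[word] = counts.get(word, 0) + 1
    return counts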

Common type annotation reference

Variable Annotation
name=None name: str | None = None

megemini avatar Jun 10 '24 12:06 megemini

【报名】:A16,A17,A18,A19

liyongchao911 avatar Jun 12 '24 03:06 liyongchao911

【报名】:A-60

sunzhongkai588 avatar Jun 12 '24 03:06 sunzhongkai588

【报名】:A-2,A-4

zrr1999 avatar Jun 12 '24 04:06 zrr1999

【报名】:A-64

Liyulingyue avatar Jun 12 '24 05:06 Liyulingyue

【报名】:A-47

ooooo-create avatar Jun 12 '24 06:06 ooooo-create

【报名】:A-57

DrRyanHuang avatar Jun 12 '24 07:06 DrRyanHuang

【报名】:A-8

SigureMo avatar Jun 12 '24 08:06 SigureMo

【报名】:A-48、A-49、A-50、A-51

DrRyanHuang avatar Jun 12 '24 08:06 DrRyanHuang

【报名】:A-3

zrr1999 avatar Jun 12 '24 10:06 zrr1999

【报名】:A-3

zrr1999 avatar Jun 12 '24 10:06 zrr1999

@Liyulingyue @liyongchao911 @DrRyanHuang @ooooo-create @zrr1999 @gouzil @sunzhongkai588

@SigureMo previously had an open-source project with standalone stub annotation files; you can refer to the parts that are already annotated there ~

https://github.com/cattidea/paddlepaddle-stubs

megemini avatar Jun 12 '24 13:06 megemini

【报名】:A-29、A-53、A-34

DrRyanHuang avatar Jun 12 '24 14:06 DrRyanHuang

【报名】:A-39、A-51

ooooo-create avatar Jun 14 '24 07:06 ooooo-create

【报名】:A-16、A-19

SigureMo avatar Jun 14 '24 13:06 SigureMo

【报名】:A-15

megemini avatar Jun 14 '24 16:06 megemini

【报名】:A-54、A-55、A-56、A-61、A-62、A-65

gsq7474741 avatar Jun 14 '24 17:06 gsq7474741

【报名】:A-68

DrRyanHuang avatar Jun 17 '24 07:06 DrRyanHuang

【报名】:A-40、A-41、A-42

ooooo-create avatar Jun 17 '24 14:06 ooooo-create

【报名】:A10、A11、A12

ooooo-create avatar Jun 18 '24 10:06 ooooo-create

【报名】:A-5

gouzil avatar Jun 19 '24 03:06 gouzil

【报名】:A-6、A-7

Asthestarsfalll avatar Jun 19 '24 03:06 Asthestarsfalll

【报名】:A-77、A-73

DrRyanHuang avatar Jun 19 '24 07:06 DrRyanHuang

【报名】:A-75、A-76、A-78、A-79、A-84

DrRyanHuang avatar Jun 19 '24 11:06 DrRyanHuang

【报名】:A-20、A-21、A-22、A-23、A-24、A-25

Asthestarsfalll avatar Jun 21 '24 06:06 Asthestarsfalll

【报名】:A-14、A-26、A-37、A-38

ooooo-create avatar Jun 21 '24 07:06 ooooo-create

Folks, I wrote a script that parses docstrings and automatically generates type hints. The code is rough, but in practice it is mostly usable. The output formatting does get mangled, though, so generate it separately and then copy the resulting function signatures over.

The code is as follows:

from __future__ import annotations
import ast
import inspect
import re
import typing
from collections import defaultdict
from types import ModuleType
from typing import Any, Callable, Dict, List, Optional, Tuple, Type

import astor
import paddle
from astpretty import pprint

NoneType = Type[None]

class ReduceMode: ...


class SizeD: ...

# Whether the parameter-name mapping should override the type parsed from the docstring

OVERWRITE = {"kernel_size"}

#  Mapping from parameter names to types
ARGS_NAME_MAPPING: dict[str, list] = {
    "input": [paddle.Tensor],
    "label": [paddle.Tensor],
    "logit": [paddle.Tensor],
    "reduction": [ReduceMode],
    "x": [paddle.Tensor],
    "y": [paddle.Tensor],
    "kernel_size": [SizeD],
}

#  Mapping from type strings in docstrings to Python types
TYPE_MAPPING = {
    "Tensor": paddle.Tensor,
    "float": float,
    "int": int,
    "list": list,
    "tuple": tuple,
    "bool": bool,
    "str": str,
    "string": str,
    "None": NoneType,
    "function": Callable,
    "ParamAttr": paddle.base.param_attr.ParamAttr,
}

# Mapping from method names to return types
RETURN_MAPPING = {
    "__init__" : "None",
    "__str__" : "str",
    "__repr__": "str",
    "forward": "Tensor",
    "extra_repr": "str"

}

for name in typing.__all__:
    if not name[0].isupper():
        continue
    TYPE_MAPPING[name] = getattr(typing, name)


IMPORT_TEMPLETE = """
from typing import Any, Dict, List, Optional, Tuple, Union, Callable, TYPE_CHECKING
from collections.abc import Iterable

if TYPE_CHECKING:
    from paddle import Tensor
"""

DEFAULT_RETURN  = "Tensor"

HAS_IMPORTED = defaultdict(bool)

MODUELS = [paddle.nn.layer.loss]

GLOBAL_FILE_PATH = None
SOURCE_FILE = None

args_pattern = re.compile(
    r"(Args|Parameters)\s*:\s*(.*?)(?=\s*(Returns|Examples|$))", re.DOTALL
)

pattern = re.compile(r"(.*\(.*\)):")


def load_ast(file_path: str):
    with open(file_path, "r") as file:
        code_str = file.read()
    tree = ast.parse(code_str)
    return tree


def find_function_by_name(tree, func_name):
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and node.name == func_name:
            return node


def find_method_in_class(tree, class_name, method_name):
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef) and node.name == class_name:
            for subnode in node.body:
                if isinstance(subnode, ast.FunctionDef) and subnode.name == method_name:
                    return subnode


def write_code(file_path, code):
    with open(file_path, "w") as f:
        f.write("")

    if not HAS_IMPORTED[file_path]:
        with open(file_path, "w") as f:
            f.write(IMPORT_TEMPLETE)
        HAS_IMPORTED[file_path] = True

    with open(file_path, "a") as f:
        f.write(code)
    global SOURCE_FILE
    SOURCE_FILE = file_path


def gname(name):
    return ast.Name(name, ast.Load())


def gsub(name, type_nodes):
    return ast.Subscript(
        value=gname(name),
        slice=ast.Index(
            value=(
                ast.Tuple(elts=type_nodes)
                if not isinstance(type_nodes, ast.AST)
                else type_nodes
            )
        ),
        ctx=ast.Load(),
    )


def gimport():
    pass


def gbitor(node):
    left = node[0]
    for i in range(len(node) - 1):
        right = node[i + 1]
        res = ast.BinOp(op=ast.BitOr(), left=left, right=right)
        left = res
    return res


def gnone():
    return ast.Constant(value=None, kind=None)

def gen_annotation(type_info: List[Any], is_optional):
    try:
        node = [gname(getattr(i, "__name__", str(i).split(".")[-1])) for i in type_info]
    except:
        breakpoint()
    if len(node) > 1:
        node = gbitor(node)
    elif len(node) == 0:
        print("No Valid Type")
        return None
    else:
        node = node[0]
    if is_optional:
        node = gbitor([node, gnone()])

    return node


def get_type_by_name(name: str):
    t = TYPE_MAPPING.get(name, None)
    idx = 0
    while t is None:
        t = getattr(MODUELS[idx], name, None)
        idx += 1
        if idx >= len(MODUELS):
            break
    if t is None:
        print(f"CANNOT FIND TYPE OF {name}")
    return t


def parse_args_from_docstring(docstring):
    if docstring is None:
        print("NO docstring")
        return {}, {}
    matches = args_pattern.findall(docstring)
    if len(matches) == 0:
        print("No Args")
        return {}, {}
    matches: List[str] = pattern.findall(matches[0][1])
    matches = [m.strip() for m in matches]
    args = {}
    is_optional = defaultdict(bool)
    for m in matches:
        k, v = m.split("(", 1)
        k = k.strip()
        v = v[:-1].split(",")[0].split("|")
        v = [get_type_by_name(i) for i in v]
        v = [i for i in v if i is not None]
        if NoneType in v:
            is_optional["k"] = True
            v.remove(NoneType)
        args[k] = v
    return args, is_optional


def determine_by_default_value(): ...


def convert_func(
    m,
    func_name,
    class_name=None,
    args_and_type: Optional[Dict[str, List[Any]]] = None,
):
    if args_and_type is None:
        args_and_type = parse_args_from_docstring(m.__doc__)
    args_and_type, args_optional = args_and_type
    parameters = inspect.signature(m).parameters
    try:
        # file_path = inspect.getsourcefile(m)
        assert SOURCE_FILE is not None
        tree = load_ast(SOURCE_FILE)
    except:
        print(f"can not load ast of {func_name}, {class_name}")
        return
    if class_name is None:
        node = find_function_by_name(tree, func_name)
    else:
        node = find_method_in_class(tree, class_name, func_name)
    if node is None:
        print("Cant no find code difinitions")
        return
    node_args = {i.arg: i for i in node.args.args}
    for param_name, param_info in parameters.items():
        if param_name == "self":
            continue
        if param_name not in args_and_type and param_name not in ARGS_NAME_MAPPING:
            print(f"CANNOT FIND param in docstring {param_name}")
            continue
        try:
            node_arg = node_args[param_name]
        except KeyError:
            return
        default_value = param_info.default
        type_info = []
        if other_info:=ARGS_NAME_MAPPING.get(param_name, False):
            type_info.extend(other_info)
        if param_name not in OVERWRITE:
            type_info.extend(args_and_type.get(param_name, []))
        node_arg.annotation = gen_annotation(
            set(type_info), args_optional.get(param_name, False) or default_value is None
        )
    if func_name in RETURN_MAPPING:
        node.returns = gname(RETURN_MAPPING[func_name])
    elif DEFAULT_RETURN is not None:
        node.returns = gname(DEFAULT_RETURN)
    # code_str = astor.to_source(tree)
    code_str = ast.unparse(tree)
    write_code(GLOBAL_FILE_PATH or file_path, code_str)


def convert_class(m, class_name):
    args_and_type = parse_args_from_docstring(m.__doc__)

    convert_func(m.__init__, "__init__", class_name, args_and_type)
    # __str__ ......

    methods = inspect.getmembers(m, predicate=inspect.isfunction)

    parent_class = m.__mro__[1]
    non_inheritted_methods = [
        (name, func)
        for name, func in methods
        if func != getattr(parent_class, name, None)
    ]

    for name, method in non_inheritted_methods:
        if name == "__init__":
            continue
        if filter_by_name(name):
            continue
        print(f"CONVERT {class_name}.{name}")
        convert_func(method, name, class_name)
        print()


def convert_var(m): ...


def filter_by_name(name: str) -> bool:
    if name.startswith("_") and not name.startswith("__"):
        return True
    if name in [
        "Tensor",
        "Variable",
        "LayerHelper",
        "default_main_program",
        "check_type",
        "_C_ops",
        "TYPE_CHECKING",
        "check_variable_and_dtype",
    ]:
        return True
    if name in typing.__all__:
        return True
    if name.startswith("in"):  # for in_dynamic_or_pir_mode ....
        return True
    return False


def is_function_defined_in_file(m):
    try:
        source_file = inspect.getsourcefile(m)
    except TypeError:
        return False
    return source_file != SOURCE_FILE


def filter(name: str, m) -> bool:
    if filter_by_name(name):
        return True
    # if is_function_defined_in_file(m):
    #     return True
    return False


def convert_module(module: ModuleType, target_file:str):
    assert isinstance(module, ModuleType)
    global SOURCE_FILE, GLOBAL_FILE_PATH
    SOURCE_FILE = module.__file__
    GLOBAL_FILE_PATH = target_file
    members = inspect.getmembers(module)
    for name, m in members:
        if filter(name, m):
            # print(f"SKIP {name}")
            continue
        if inspect.isfunction(m):
            print(f"CONVERT {name}")
            convert_func(m, name)
        elif inspect.isclass(m):
            print(f"CONVERT {name}")
            convert_class(m, name)
        elif inspect.isdatadescriptor(m):
            ...
        else:
            convert_var(m)


if __name__ == "__main__":
    convert_module(paddle.nn.functional.pooling, './pooling.py')

Asthestarsfalll avatar Jun 22 '24 05:06 Asthestarsfalll

【报名】:A-85

86kkd avatar Jun 22 '24 10:06 86kkd