
[JS/WebGPU] Creating devices with subgroup features enabled if possible

Open · jiangzhaoming opened this pull request 1 year ago · 4 comments

This CL makes the WebGPU backend support subgroup features, allowing subgroup optimizations to be used in the future.

Description

With this CL, the WebGPU backend will create devices with the subgroups and subgroups-f16 features (both currently under origin trial in Chrome), or the chromium-experimental-subgroups feature, enabled whenever they are available.
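
For illustration, here is a minimal sketch of this kind of feature detection written against the standard WebGPU API. The feature-name strings follow the description above; the function itself and how it would be wired into the backend are assumptions, not this PR's actual code.

```ts
// Minimal sketch: request a WebGPU device with subgroup features when the
// adapter advertises them. Feature names follow the PR description; the
// surrounding function is hypothetical.
async function createDeviceWithSubgroups(): Promise<GPUDevice> {
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) {
    throw new Error('WebGPU is not available');
  }

  const requiredFeatures: GPUFeatureName[] = [];
  if (adapter.features.has('subgroups')) {
    requiredFeatures.push('subgroups' as GPUFeatureName);
    if (adapter.features.has('subgroups-f16')) {
      requiredFeatures.push('subgroups-f16' as GPUFeatureName);
    }
  } else if (adapter.features.has('chromium-experimental-subgroups')) {
    requiredFeatures.push('chromium-experimental-subgroups' as GPUFeatureName);
  }

  // If none of the features are available, the device is created without
  // them and shaders simply avoid subgroup builtins.
  return adapter.requestDevice({ requiredFeatures });
}
```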

Motivation and Context

This CL would allow WebGPU operator shaders to use subgroup optimizations in the future, which may yield significant speedups.

jiangzhaoming avatar Aug 22 '24 17:08 jiangzhaoming

@qjia7, @fs-eire, @guschmue, @gyagp PTAL

jiangzhaoming avatar Aug 22 '24 17:08 jiangzhaoming

PTAL, thanks

jiangzhaoming avatar Aug 26 '24 06:08 jiangzhaoming

Is there any disadvantage to appending `enable subgroups;` in a program that does not use the subgroup feature?

fs-eire avatar Aug 26 '24 07:08 fs-eire

Is there any disadvantage to appending `enable subgroups;` in a program that does not use the subgroup feature?

For now, I am not aware of any overhead from enabling subgroups.

jiangzhaoming avatar Aug 26 '24 07:08 jiangzhaoming
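
As context for the exchange above about `enable subgroups;`, here is a hedged sketch of a shader that actually uses the directive together with a subgroup builtin. It is illustrative only and not code from this PR; the workgroup size and buffer layout are arbitrary.

```ts
// Illustrative only: a WGSL compute shader that opts into subgroups via the
// `enable` directive and uses the subgroupAdd builtin. On a device created
// without the feature, createShaderModule rejects this source.
const kSubgroupSum = /* wgsl */ `
enable subgroups;

@group(0) @binding(0) var<storage, read> input : array<f32>;
@group(0) @binding(1) var<storage, read_write> output : array<f32>;

@compute @workgroup_size(64)
fn main(@builtin(global_invocation_id) gid : vec3<u32>) {
  // Every invocation receives the sum over its subgroup; bounds checks omitted.
  output[gid.x] = subgroupAdd(input[gid.x]);
}
`;

// 'device' is assumed to be a GPUDevice created with the 'subgroups' feature,
// e.g. via feature detection like the sketch in the PR description above.
function compileSubgroupShader(device: GPUDevice): GPUShaderModule {
  return device.createShaderModule({ code: kSubgroupSum });
}
```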

/azp run Windows ARM64 QNN CI Pipeline,Windows x64 QNN CI Pipeline,Windows CPU CI Pipeline,Windows GPU CUDA CI Pipeline,Windows GPU DML CI Pipeline,Windows GPU Doc Gen CI Pipeline,Windows GPU TensorRT CI Pipeline,ONNX Runtime Web CI Pipeline,Linux CPU CI Pipeline,Linux CPU Minimal Build E2E CI Pipeline

fs-eire avatar Nov 04 '24 11:11 fs-eire

/azp run Linux GPU CI Pipeline,Linux GPU TensorRT CI Pipeline,Linux OpenVINO CI Pipeline,Linux QNN CI Pipeline,MacOS CI Pipeline,orttraining-linux-ci-pipeline,orttraining-linux-gpu-ci-pipeline,onnxruntime-binary-size-checks-ci-pipeline,Big Models,Linux Android Emulator QNN CI Pipeline

fs-eire avatar Nov 04 '24 11:11 fs-eire

/azp run Android CI Pipeline,iOS CI Pipeline,ONNX Runtime React Native CI Pipeline,CoreML CI Pipeline,Linux DNNL CI Pipeline,Linux MIGraphX CI Pipeline,Linux ROCm CI Pipeline

fs-eire avatar Nov 04 '24 11:11 fs-eire

Azure Pipelines successfully started running 1 pipeline(s).

azure-pipelines[bot] avatar Nov 04 '24 11:11 azure-pipelines[bot]

Azure Pipelines successfully started running 1 pipeline(s).

azure-pipelines[bot] avatar Nov 04 '24 11:11 azure-pipelines[bot]

Azure Pipelines successfully started running 1 pipeline(s).

azure-pipelines[bot] avatar Nov 04 '24 11:11 azure-pipelines[bot]

/azp run Windows ARM64 QNN CI Pipeline,Windows x64 QNN CI Pipeline,Windows CPU CI Pipeline,Windows GPU CUDA CI Pipeline,Windows GPU DML CI Pipeline,Windows GPU Doc Gen CI Pipeline,Windows GPU TensorRT CI Pipeline,ONNX Runtime Web CI Pipeline,Linux CPU CI Pipeline,Linux CPU Minimal Build E2E CI Pipeline

fs-eire avatar Nov 07 '24 08:11 fs-eire

/azp run Linux GPU CI Pipeline,Linux GPU TensorRT CI Pipeline,Linux OpenVINO CI Pipeline,Linux QNN CI Pipeline,MacOS CI Pipeline,orttraining-linux-gpu-ci-pipeline,onnxruntime-binary-size-checks-ci-pipeline,Big Models,Linux Android Emulator QNN CI Pipeline,Android CI Pipeline

fs-eire avatar Nov 07 '24 08:11 fs-eire

/azp run iOS CI Pipeline,ONNX Runtime React Native CI Pipeline,CoreML CI Pipeline,Linux DNNL CI Pipeline,Linux MIGraphX CI Pipeline,Linux ROCm CI Pipeline

fs-eire avatar Nov 07 '24 08:11 fs-eire

Azure Pipelines successfully started running 1 pipeline(s).

azure-pipelines[bot] avatar Nov 07 '24 08:11 azure-pipelines[bot]

Azure Pipelines successfully started running 1 pipeline(s).

azure-pipelines[bot] avatar Nov 07 '24 08:11 azure-pipelines[bot]

Azure Pipelines successfully started running 1 pipeline(s).

azure-pipelines[bot] avatar Nov 07 '24 08:11 azure-pipelines[bot]