model-optimization

Add MbyN sparsity schedulers

Open psunn opened this issue 4 years ago • 5 comments

This PR introduces two new schedulers for M-by-N sparsity:
- PolynomialDecayMbyNSparsity
- ConstantMbyNSparsity

ConstantMbyNSparsity

Prunes the model at step prune_step: the m-by-n sparsity masks are calculated only at prune_step, then remain constant and stay fully applied to the weights.

# apply 2 by 4 sparsity with a constant scheduler
# sparsity masks are calculated only at step 1; they remain constant and
# stay fully applied to the weights until training finishes.
pruning_params = {
    'pruning_schedule': tfmot.sparsity.keras.ConstantMbyNSparsity(prune_step=1),
    'sparsity_m_by_n': (2, 4),
}
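To make the constant schedule's behavior concrete, here is a small pure-Python sketch (not the tfmot implementation; the helper name is hypothetical) of a schedule that triggers mask computation exactly once, at prune_step, after which the mask stays frozen:

```python
def constant_mbyn_schedule(prune_step):
    """Sketch of a constant m-by-n schedule (hypothetical helper).

    Returns a predicate telling whether the m-by-n mask should be
    (re)computed at a given training step: with a constant schedule
    the mask is computed exactly once, at prune_step, and is then
    kept fixed and fully applied for the rest of training.
    """
    def should_update_mask(step):
        return step == prune_step
    return should_update_mask

should_update = constant_mbyn_schedule(prune_step=1)
# The mask is recomputed only at step 1; all other steps reuse it.
print([should_update(s) for s in range(4)])
```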

PolynomialDecayMbyNSparsity

Prunes the model in the range [begin_step, end_step], every frequency steps. The m-by-n sparsity masks are updated at the steps selected by _should_prune_in_step, and applied to the weights according to coverage_ratio.

# apply 2 by 4 sparsity with a polynomial decay scheduler
# sparsity masks are updated between steps [1, 10] at every training step,
# and applied partially to the weights according to coverage_ratio.
# coverage_ratio increases linearly (power=1.0) from 0.0 to 1.0;
# at end_step, the 2 by 4 sparsity masks are fully applied to the weights.
pruning_params = {
    'pruning_schedule':
        tfmot.sparsity.keras.PolynomialDecayMbyNSparsity(
            initial_coverage_ratio=0.0, begin_step=1,
            end_step=10, power=1.0, frequency=1),
    'sparsity_m_by_n': (2, 4),
}
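As a sketch of how the coverage ratio might evolve under this configuration, here is an assumed formula modeled on tfmot's existing PolynomialDecay sparsity schedule (the exact computation in this PR may differ): the ratio ramps from initial_coverage_ratio at begin_step to 1.0 at end_step, shaped by power.

```python
def coverage_ratio(step, initial=0.0, final=1.0,
                   begin_step=1, end_step=10, power=1.0):
    """Polynomial ramp of the m-by-n mask coverage ratio.

    Assumed formula, modeled on tfmot's PolynomialDecay schedule:
    the ratio moves from `initial` at begin_step to `final` at
    end_step, following a curve shaped by `power` (1.0 = linear).
    """
    if step <= begin_step:
        return initial
    if step >= end_step:
        return final
    progress = (step - begin_step) / (end_step - begin_step)
    return final - (final - initial) * (1.0 - progress) ** power

# With power=1.0 the ramp is linear from 0.0 at step 1 to 1.0 at step 10.
print([round(coverage_ratio(s), 2) for s in (1, 4, 7, 10)])
```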

(For example, with a 2D 4x4 weight tensor, the scheduled 2 by 4 mask keeps the two largest-magnitude weights in each block of four.)
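A pure-Python sketch of that 4x4 example (illustrative only, not the PR's implementation; the weight values are made up): for each block of four consecutive weights, keep the two with the largest magnitude and zero out the other two.

```python
def two_by_four_mask(block):
    """Return a 0/1 mask keeping the 2 largest-magnitude entries
    out of a block of 4 consecutive weights (2:4 structured sparsity)."""
    assert len(block) == 4
    # Indices of the two largest-magnitude entries in this block.
    keep = sorted(range(4), key=lambda i: abs(block[i]), reverse=True)[:2]
    return [1 if i in keep else 0 for i in range(4)]

# Made-up 4x4 weight tensor; each row is one block of four weights.
weights = [
    [0.1, -0.9, 0.4, 0.05],
    [2.0, 0.3, -0.2, 1.5],
    [-0.7, 0.6, 0.65, -0.1],
    [0.0, 0.8, -0.8, 0.2],
]
mask = [two_by_four_mask(row) for row in weights]
for row in mask:
    print(row)  # each row keeps exactly 2 of its 4 weights
```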

psunn avatar Oct 01 '21 14:10 psunn

Hi, do you have any relevant experiment data you could add to this PR? This seems useful, but it's hard to see how to set the power value.

daverim avatar Oct 07 '21 02:10 daverim

All (the pull request submitter and all commit authors) CLAs are signed, but one or more commits were authored or co-authored by someone other than the pull request submitter.

We need to confirm that all authors are ok with their commits being contributed to this project. Please have them confirm that by leaving a comment that contains only @googlebot I consent. in this pull request.

Note to project maintainer: There may be cases where the author cannot leave a comment, or the comment is not properly detected as consent. In those cases, you can manually confirm consent of the commit author(s), and set the cla label to yes (if enabled on your project).

ℹ️ Googlers: Go here for more info.

google-cla[bot] avatar Oct 13 '21 16:10 google-cla[bot]


CLAs look good, thanks!

ℹ️ Googlers: Go here for more info.

googlebot avatar Oct 13 '21 16:10 googlebot


Since there has been no activity on this issue, I'll close it for now.

I still believe this feature would be useful, and I invite anyone interested to take a look at it. If there is interest, we can reopen the issue. Thank you.

psunn avatar Feb 23 '23 13:02 psunn