tuning_playbook

A playbook for systematically maximizing the performance of deep learning models.

40 tuning_playbook issues

Hello, thank you for the great work. I'm trying to translate this playbook into Chinese to help more people. Should I create a Pull Request for it?

This is a great doc, thanks for putting together so much accumulated wisdom in one place! In the section "[Changing the batch size requires re-tuning most hyperparameters](https://github.com/google-research/tuning_playbook#changing-the-batch-size-requires-re-tuning-most-hyperparameters)" I think it...

Hello, thank you for the great work. I know of this paper: https://arxiv.org/abs/2007.01547, which benchmarks many optimizers across many configurations. I believe this paper will greatly benefit...

FAQs -> What are the update rules for all the popular optimization algorithms? -> Nesterov: a closing parenthesis is missing in the third equation. It's currently $$\theta_{t+1} = \theta_{t} -...
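
For reference, the standard Nesterov momentum updates, with the parenthesis closed in the third equation, would read as follows (a reconstruction from the usual formulation; the playbook's own notation may differ slightly):

$$v_0 = 0$$

$$v_{t+1} = \gamma v_{t} + \nabla \mathcal{L}(\theta_{t})$$

$$\theta_{t+1} = \theta_{t} - \eta \left( \gamma v_{t+1} + \nabla \mathcal{L}(\theta_{t}) \right)$$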

Some links in the table of contents are broken. This pull request will fix them.

Hi, thanks for making such a comprehensive document. I created a PDF version of it. Hope this helps!

### Discussed in https://github.com/google-research/tuning_playbook/discussions/3

Originally posted by **madaan**, January 19, 2023:

Thanks, the playbook looks pretty cool! I am curious about:

> Normalization should be the last operation before the...

e.g., https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.qmc.Halton.html#scipy.stats.qmc.Halton This is likely to be better maintained than the MLCommons code.
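
As an illustration, here is a minimal sketch of drawing quasi-random search points with scipy's Halton sampler; the two-dimensional search space and its bounds below are hypothetical, not taken from the playbook:

```python
from scipy.stats import qmc

# Halton sampler over a 2-D unit hypercube; scrambling decorrelates
# the low-discrepancy sequence across repeated studies.
sampler = qmc.Halton(d=2, scramble=True, seed=0)
unit_points = sampler.random(n=16)  # 16 points in [0, 1)^2

# Hypothetical search space: log10(learning_rate) in [-5, -1],
# momentum in [0.8, 0.99].
scaled = qmc.scale(unit_points, l_bounds=[-5.0, 0.8], u_bounds=[-1.0, 0.99])
trials = [{"learning_rate": 10.0 ** lr, "momentum": m} for lr, m in scaled]
```

Each dictionary in `trials` would then correspond to one training run in the quasi-random study.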

One major typo fixed (Papers -> People). Added commas wherever they were missing.