
Inference Optimization Chapter

Open · Anindyadeep opened this issue 2 years ago · 1 comment

Type

new chapter

Chapter/Page

Something else

Description

Training or running inference on models is fairly easy when the number of parameters is small. But as the scale of parameters and data grows, it becomes increasingly difficult to optimize (in terms of compute and performance). So a dedicated chapter on inference optimization, and the arithmetic behind resource calculations, would be very useful.
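To give a flavour of the resource arithmetic such a chapter could cover, here is a minimal sketch. The model shape is an assumption (roughly Llama-2-7B-like: 32 layers, 32 KV heads, head dimension 128); real models vary.

```python
# Back-of-the-envelope memory estimates for serving a decoder-only LLM.
# Numbers are illustrative only, not tied to any specific model release.

def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Memory to hold the weights alone (no activations, no KV cache)."""
    return n_params * bytes_per_param / 1e9

def kv_cache_gb(n_layers: int, n_kv_heads: int, head_dim: int,
                seq_len: int, batch_size: int, bytes_per_value: float) -> float:
    """KV cache: 2 tensors (K and V) per layer, per head, per token."""
    return (2 * n_layers * n_kv_heads * head_dim
            * seq_len * batch_size * bytes_per_value) / 1e9

# ~7B parameters in fp16 (2 bytes/param) vs int4 (0.5 bytes/param)
print(f"weights fp16: {weight_memory_gb(7e9, 2):.1f} GB")    # ~14.0 GB
print(f"weights int4: {weight_memory_gb(7e9, 0.5):.1f} GB")  # ~3.5 GB

# KV cache for a 4096-token context, batch size 1, fp16 values
print(f"kv cache fp16: {kv_cache_gb(32, 32, 128, 4096, 1, 2):.1f} GB")  # ~2.1 GB
```

Even this simple calculation shows why quantization and KV-cache management matter once context lengths and batch sizes grow.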

Some reference links:

Anindyadeep · Nov 27 '23 07:11

https://twitter.com/amanrsanger/status/1728502445184274710?s=46

filopedraz · Nov 27 '23 07:11