intel-extension-for-pytorch vs. intel-extension-for-transformers
Describe the issue
Hello,
I noticed that intel-extension-for-transformers exists, although intel-extension-for-pytorch also supports deploying LLMs. What is the difference between the two, and which one is best for deploying Llama-2-70B?
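For reference, this is roughly the intel-extension-for-pytorch path I have in mind (a minimal sketch, assuming a recent IPEX release that provides `ipex.llm.optimize`; the checkpoint name and generation settings are just illustrative):

```python
import torch
import intel_extension_for_pytorch as ipex
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example checkpoint only; Llama-2-70B requires accepting Meta's license on Hugging Face.
model_id = "meta-llama/Llama-2-70b-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
model.eval()

# Apply IPEX's LLM-specific inference optimizations.
model = ipex.llm.optimize(model, dtype=torch.bfloat16, inplace=True)

inputs = tokenizer("What is Intel Extension for PyTorch?", return_tensors="pt")
with torch.no_grad(), torch.cpu.amp.autocast(dtype=torch.bfloat16):
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```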
Thanks!