About PowerInfer-2
Prerequisites
Before submitting your issue, please ensure the following:
- [x] I am running the latest version of PowerInfer. Development is rapid, and as of now, there are no tagged versions.
- [x] I have carefully read and followed the instructions in the README.md.
- [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
Feature Description
I have read the paper PowerInfer-2: Fast Large Language Model Inference on a Smartphone. Will the related code be open-sourced?
By the way, the core innovation of the work is using the heterogeneous compute system on a mobile phone to run a large model. Are there any tutorials on how to use the NPU/GPU of the Snapdragon 8 Gen 3?
I, too, am interested in the PowerInfer-2 source code.
I found this: https://x.com/hodlenx/status/1800788808272937297 — they said they were working on open-sourcing it.
Edit: after reading the PowerInfer-2 paper, I believe the fastest way to open-source it would be to create a separate repository, since from what I can see, PowerInfer-2 is a big departure from the original PowerInfer.