api-inference-community
                        Add support for neural compressor models
neural_compressor is going to become a framework, right? (`library_tag` in the README.)
Makes sense to me! Another option could be to have optimum as the `library_tag`, and have neural-compressor / openvino / onnx as tags. If needed, we can also add the model loading logic to optimum (mentioning ONNX and OpenVINO as I would like to add their support as well).
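For illustration, the "optimum as `library_tag`" option could look like this in a model's README front matter. This is a hypothetical sketch; the exact field and tag names below are assumptions, not something decided in this thread:

```yaml
# Hypothetical model card front matter (top of README.md) under the
# "optimum as library_tag" option: optimum is the library tag, and the
# backend (neural-compressor / openvino / onnx) is a plain tag.
library_tag: optimum
tags:
  - neural-compressor
```

Under this scheme, adding OpenVINO or ONNX support later would only mean swapping the backend tag, without introducing a new library per backend.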
What do you think would make more sense @Narsil @osanseviero ?
Both are fine by me. I don't necessarily know the scoping of this vs optimum and such.
Also, ignore the failing tests if they pass locally for you; somehow the CI has issues with Docker signals.
Perfect, thanks @Narsil. I need to wait for updates from the Intel collaboration before merging, so I will change the PR status to draft temporarily.