image-quality-assessment
HTTP Server
Added a Dockerfile which exposes the model via a REST API.
The Flask server runs on port 5005.
URL - POST /query/{model} - model can be aesthetics or technical
Parameters - [ "<image_url_1>", "<image_url_2>" ]
Response -
[
  {
    "image_id": "<image_name_1>",
    "mean_score_prediction": 4.780154329499055
  },
  {
    "image_id": "<image_name_2>",
    "mean_score_prediction": 4.780154329499055
  }
]
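For illustration, the request body and response above can be built and parsed like this (the URLs, image names, and scores here are placeholders, not real data):

```python
import json

# Hypothetical request body: a JSON list of image URLs, as expected by
# POST /query/{model} (URLs are placeholders).
payload = json.dumps([
    "https://example.com/image_1.jpg",
    "https://example.com/image_2.jpg",
])

# A response in the shape shown above (scores are illustrative).
response_text = """
[
  {"image_id": "image_1", "mean_score_prediction": 4.78},
  {"image_id": "image_2", "mean_score_prediction": 5.12}
]
"""

# Parse the response and print each image's mean score.
for item in json.loads(response_text):
    print(item["image_id"], item["mean_score_prediction"])
```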
Hi, thanks a lot for the PR! It's a great idea to add a Flask server to the repo - there are a few things that need to be addressed in your PR:
- Every time a request is sent to an endpoint, the respective model needs to be loaded, which is inefficient. I would suggest having only one prediction endpoint, in which the model only gets loaded once. You can achieve this e.g. with
def load_model(config):
    global model
    model = Nima(config['base_model_name'])
    model.build()
    model.nima_model.load_weights(config['weights_file'])
    model.nima_model._make_predict_function()  # https://github.com/keras-team/keras/issues/6462
    model.nima_model.summary()

...

if __name__ == '__main__':
    load_model(config)
    app.run(host='0.0.0.0', port=PORT)
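To make the load-once pattern concrete, here is a minimal runnable sketch with the Keras/Flask parts replaced by stubs (StubModel, the config keys, and the dummy score are all illustrative, not the repo's actual API):

```python
model = None  # module-level handle, populated once at startup


class StubModel:
    """Placeholder for Nima; simulates loading weights and predicting."""

    def __init__(self, base_model_name):
        self.base_model_name = base_model_name
        self.weights_loaded = False

    def load_weights(self, weights_file):
        self.weights_loaded = True  # a real model would read the file here

    def predict(self, image_url):
        return 4.78  # dummy score; real inference would happen here


def load_model(config):
    # Called once before the server starts, so request handlers
    # never pay the model-loading cost again.
    global model
    model = StubModel(config["base_model_name"])
    model.load_weights(config["weights_file"])


def predict_handler(image_urls):
    # What a single prediction endpoint would do: reuse the global model.
    return [
        {"image_id": url, "mean_score_prediction": model.predict(url)}
        for url in image_urls
    ]


load_model({"base_model_name": "MobileNet", "weights_file": "weights.hdf5"})
print(predict_handler(["a.jpg", "b.jpg"]))
```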
The model should be provided as an argument when starting the Flask server. This has the downside that both models are not available in the same server, but it makes it a lot more flexible for other users to serve their own models.
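Passing the model as a startup argument could look like this sketch (the flag name and choices are assumptions, not the PR's actual interface):

```python
import argparse

# Build a CLI that selects which model the server should load at startup.
parser = argparse.ArgumentParser(description="Serve one NIMA model")
parser.add_argument(
    "--model",
    choices=["aesthetics", "technical"],
    required=True,
    help="which model to load once at startup",
)

# Example invocation; a real server would call parser.parse_args(),
# then load_model(...) for the chosen model before app.run(...).
args = parser.parse_args(["--model", "aesthetics"])
print(args.model)  # → aesthetics
```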
- There are a lot of unneeded imports in src.server.py
- Please add a description in the README of how to build the Docker image and serve predictions
Many thanks!
Hi, thanks for considering it! Please have a look at the server.py changes and let me know if anything else needs to be done!
Up :D
It needed a few more bug fixes, but I'm using this now as well. Sorry not to have pushed the fixes, but I needed some personal customisations too. Anyway, thanks. Hope this gets merged soon!
Hi guys, thank you for the great work and for sharing it! Is there an executable file to run the Flask server?