ml-commons
[META] ML Inference Processor Flexibility Enhancements
Is your feature request related to a problem?
- [ ] Currently, the ML inference search response processor can only add prediction results as new fields on the returned documents. Enable an option to save the model prediction to the extension (`ext`) part of the search response instead.
- [ ] For rerank use cases, the query_text needs to be passed to the model input dataset along with the documents from the search response. Enable an option to add query_text to the model_config.
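A rough sketch of how the two options might look together in an `ml_inference` search response processor configuration (the `ext.ml_inference` output prefix, the field names, and the JSONPath expression are illustrative assumptions, not confirmed API):

```json
PUT /_search/pipeline/my_rerank_pipeline
{
  "response_processors": [
    {
      "ml_inference": {
        "model_id": "<rerank_model_id>",
        "input_map": [
          { "documents": "passage_text" }
        ],
        "output_map": [
          { "ext.ml_inference.similarity_score": "similarity" }
        ],
        "model_config": {
          "query_text": "$.query.term.passage_text.value"
        }
      }
    }
  ]
}
```

Here the `output_map` target is prefixed to direct the prediction into the search response extension rather than into each document, and `model_config.query_text` forwards the query text to the model alongside the documents being reranked.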
What solution would you like?
A clear and concise description of what you want to happen.
What alternatives have you considered?
A clear and concise description of any alternative solutions or features you've considered.
Do you have any additional context?
Add any other context or screenshots about the feature request here.
Closing this issue as the features are implemented in 2.18.