azureml-examples
How to add a custom metric to a text generation pipeline and enable MLflow tracking
Describe your suggestion
I am following this example of a text generation pipeline for fine-tuning.
I see from the YAML API that there is an evaluation config parameters argument for computing metrics.
However, it isn't clear how this argument should be structured. If I have my own custom compute_metrics function, how do I structure the pipeline arguments to use it? (The evaluation_config argument is of type uri_file, while evaluation_config_params takes a JSON-serialized string.)
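For context, here is roughly how I imagined wiring this up with the azure-ai-ml Python SDK, pulling the pipeline component from the registry as the example does. The config keys and the compute_metrics hook below are my own placeholders, not something I found documented, so this is only a sketch of what I am trying to do:

```python
import json

from azure.ai.ml import Input, MLClient
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
# The example pulls the text generation pipeline component from the shared "azureml" registry.
registry_ml_client = MLClient(credential, registry_name="azureml")
pipeline_component_func = registry_ml_client.components.get(
    name="text_generation_pipeline", label="latest"
)

# My guess: write the metrics configuration to a local JSON file and pass it as the
# uri_file input. The keys below are placeholders; I could not find the expected
# schema, nor where a custom compute_metrics function is supposed to plug in.
eval_config = {
    "metrics": ["rouge1", "rougeL"],
    "custom_metrics_script": "my_compute_metrics.py",  # hypothetical hook
}
with open("evaluation_config.json", "w") as f:
    json.dump(eval_config, f)

finetune_job = pipeline_component_func(
    # ...other fine-tuning inputs from the example (model, train/validation/test data)...
    evaluation_config=Input(type="uri_file", path="./evaluation_config.json"),
    # My guess for the JSON-serialized string argument:
    evaluation_config_params=json.dumps({"tokenizer_config": {"truncation": True}}),
)
```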
A related question: is there a way to turn on MLflow tracking for this pipeline so that the metrics get logged?
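What I had in mind for that second question is the usual workspace MLflow setup below, but I am not sure whether (or on which run) the evaluation metrics from this pipeline would actually show up; the run id is a placeholder:

```python
import mlflow
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Point MLflow at the workspace tracking server (requires the azureml-mlflow package),
# then try to read metrics back from the submitted pipeline run.
tracking_uri = ml_client.workspaces.get(ml_client.workspace_name).mlflow_tracking_uri
mlflow.set_tracking_uri(tracking_uri)

run = mlflow.get_run("<pipeline-run-id>")  # placeholder: id of the submitted pipeline job
print(run.data.metrics)
```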
Additional details
No response