Metrics
Machine learning evaluation metrics, implemented in Python, R, Haskell, and MATLAB / Octave
https://github.com/benhamner/Metrics/blob/9a637aea795dc6f2333f022b0863398de0a1ca77/Python/ml_metrics/average_precision.py#L32 Hello: I notice that the result of running `apk([1, 1, 1], [1, 1, 1], 3)` is not 1. I wonder if it should be `if p in actual...
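The reported behavior can be reproduced with a minimal reimplementation of `apk`. This is a sketch reconstructed from memory of the linked commit, not a verbatim copy, so details may differ from the exact source:

```python
def apk(actual, predicted, k=10):
    """Average precision at k, following the logic of the linked
    ml_metrics implementation (reconstructed sketch; may differ in
    detail from the exact commit)."""
    if len(predicted) > k:
        predicted = predicted[:k]
    score = 0.0
    num_hits = 0.0
    for i, p in enumerate(predicted):
        # Duplicate predictions are skipped, but the denominator
        # below still uses len(actual), which includes duplicates.
        if p in actual and p not in predicted[:i]:
            num_hits += 1.0
            score += num_hits / (i + 1.0)  # precision at cutoff i+1
    if not actual:
        return 0.0
    return score / min(len(actual), k)

print(apk([1, 1, 1], [1, 1, 1], 3))  # 0.333..., not 1.0
print(apk([1, 2, 3], [1, 2, 3], 3))  # 1.0 when items are distinct
```

With duplicated items, only the first occurrence counts as a hit while the denominator `min(len(actual), k)` still counts all three, which is why the call returns 1/3 rather than 1.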
When trying to install with pip in a virtual environment, it throws the error below. >pip install ml_metrics Collecting ml_metrics Using cached ml_metrics-0.1.4.tar.gz (5.0 kB) Preparing metadata (setup.py) ... error...
* We get this error when we try to install ml_metrics through pip ``` pip install ml_metrics Collecting ml_metrics Downloading ml_metrics-0.1.4.tar.gz (5.0 kB) Preparing metadata (setup.py) ... error error: subprocess-exited-with-error... ```
This PR fixes #49 According to the [Wikipedia page of Average Precision](https://en.wikipedia.org/wiki/Evaluation_measures_(information_retrieval)#Average_precision) the equation is defined as follows: `AP = (sum over k of P(k) * rel(k)) / (number of relevant items)`, where `P(k)` is the precision at cutoff `k` and `rel(k)` is an indicator function equaling 1 if the item...
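The Wikipedia definition referenced above can be sketched directly. This is an illustrative helper (not part of ml_metrics) that takes a 0/1 relevance list in ranked order:

```python
def average_precision(rel):
    """Average precision per the Wikipedia definition:
        AP = sum_k P(k) * rel(k) / (number of relevant items)
    where P(k) is precision at cutoff k and rel(k) is 1 if the
    item at rank k is relevant. Illustrative helper only."""
    hits = 0
    score = 0.0
    for k, r in enumerate(rel, start=1):
        if r:
            hits += 1
            score += hits / k  # P(k), counted only when rel(k) == 1
    total_relevant = sum(rel)
    return score / total_relevant if total_relevant else 0.0

print(average_precision([1, 0, 1]))  # (1/1 + 2/3) / 2 = 5/6
```

Note the denominator is the total number of relevant items, not the number retrieved, which is the distinction at issue in #49.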
# After running the install in my anaconda3 environment > pip install ml_metrics Collecting ml_metrics Requirement already satisfied: numpy in /home/westwood/anaconda3/lib/python3.7/site-packages (from ml_metrics) (1.15.1) Requirement already satisfied: pandas in /home/westwood/anaconda3/lib/python3.7/site-packages...
Closes #52
There is a small typo in Python/ml_metrics/custom/kdd_average_precision.py. Should read `precision` rather than `prescision`.
Optimized code
It should be `return score / num_hits` rather than `return score / min(len(actual), k)`.
Multiplying n_pos and n_neg can result in integer overflow. A little algebraic manipulation can avoid the multiplication, removing the possibility of overflow.
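One way to see the manipulation: the rank-sum (Mann-Whitney) form of AUC divides by the product `n_pos * n_neg`, but dividing by each factor separately is algebraically equivalent and never forms the product. The sketch below is illustrative, not the ml_metrics implementation; Python integers are arbitrary-precision, so the overflow risk it guards against arises in fixed-width-integer languages such as R or MATLAB:

```python
def auc_rank(actual, posterior):
    """Rank-based AUC for binary labels (1 = positive, 0 = negative).
    Illustrative sketch. The final line rearranges
        (r_pos - n_pos*(n_pos+1)/2) / (n_pos * n_neg)
    into a form that divides by n_pos and n_neg separately,
    avoiding the potentially overflowing product n_pos * n_neg."""
    n = len(posterior)
    order = sorted(range(n), key=lambda i: posterior[i])
    ranks = [0.0] * n
    i = 0
    while i < n:
        # Group ties and assign each the average 1-based rank.
        j = i
        while j + 1 < n and posterior[order[j + 1]] == posterior[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for t in range(i, j + 1):
            ranks[order[t]] = avg
        i = j + 1
    n_pos = sum(1 for a in actual if a == 1)
    n_neg = len(actual) - n_pos
    r_pos = sum(r for r, a in zip(ranks, actual) if a == 1)
    # Naive form: (r_pos - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
    # Overflow-safe rearrangement:
    return (r_pos / n_pos - (n_pos + 1) / 2.0) / n_neg
```

For example, `auc_rank([1, 0], [0.9, 0.1])` gives 1.0 and `auc_rank([1, 0], [0.5, 0.5])` gives 0.5, matching the naive formula.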