Suggestion: Adding machine learning loss functions in the maths folder
I would like to suggest adding loss functions to this repo.
The loss function estimates how well a particular algorithm models the provided data.
Add Loss Functions to: machine_learning/loss_functions
Task List:
- [x] Cross-Entropy
- [x] Hinge loss
- [x] Huber loss.
- [x] MSE
- [ ] NLL
- [x] MAE
- [ ] Marginal Ranking
- [x] KL Divergence
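For context, the kind of function proposed above can be sketched in Rust with a minimal MSE example (the function name and slice-based signature are illustrative assumptions, not the repo's final API):

```rust
/// Mean squared error: the average of squared differences between
/// predicted and actual values. Hypothetical signature for illustration.
pub fn mse_loss(predicted: &[f64], actual: &[f64]) -> f64 {
    assert_eq!(predicted.len(), actual.len());
    let sum: f64 = predicted
        .iter()
        .zip(actual.iter())
        .map(|(p, a)| (p - a).powi(2))
        .sum();
    sum / predicted.len() as f64
}
```

For example, `mse_loss(&[1.0, 2.0, 3.0], &[1.0, 2.0, 4.0])` averages the squared residuals `0, 0, 1` to give `1/3`.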
Hi @GreatRSingh, thanks for the suggestion. Just to give an update: some of them are already implemented in the maths section, please check there. I have taken this up to add as many functions as possible.
@Navaneeth-Sharma Can you put up a list of loss functions that you are going to implement and which you have already implemented?
Sorry @GreatRSingh, I think I misread "loss" as activation functions. Loss functions aren't implemented yet, but I have plans to do that, especially the basic ones like Cross-Entropy, MSE, RMSE. You can take up some of those if you'd like to contribute.
@Navaneeth-Sharma ok I will do that.
Hi, just a heads-up so that we don't implement the same losses: I will try to take up these loss functions
- Cross-Entropy loss (bin class, multi class)
- Hinge loss
- Huber loss

Let me know if any of them are already taken @GreatRSingh
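For reference, the binary hinge loss from the list above can be sketched like this, assuming labels in {-1.0, +1.0} (the name and signature are assumptions):

```rust
/// Binary hinge loss: mean of max(0, 1 - y * f(x)) over the samples,
/// where labels y are expected to be -1.0 or +1.0.
pub fn hinge_loss(predictions: &[f64], labels: &[f64]) -> f64 {
    assert_eq!(predictions.len(), labels.len());
    predictions
        .iter()
        .zip(labels.iter())
        .map(|(p, y)| (1.0 - p * y).max(0.0))
        .sum::<f64>()
        / predictions.len() as f64
}
```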
@Navaneeth-Sharma No, none of them are taken yet.
I will take up MSE, NLL, MAE, MarginRanking, KLDivergence.
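Of the losses claimed here, KL divergence is perhaps the least familiar; a possible sketch (assuming both slices are valid probability distributions with positive `q` entries, and with a hypothetical signature):

```rust
/// Kullback-Leibler divergence D(P || Q) = sum over i of p_i * ln(p_i / q_i).
/// Terms with p_i == 0 contribute zero by convention and are skipped.
pub fn kl_divergence(p: &[f64], q: &[f64]) -> f64 {
    assert_eq!(p.len(), q.len());
    p.iter()
        .zip(q.iter())
        .filter(|(pi, _)| **pi > 0.0)
        .map(|(pi, qi)| pi * (pi / qi).ln())
        .sum()
}
```

The divergence is zero when the distributions match and positive otherwise.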
@Navaneeth-Sharma @siriak I have added a list of loss functions in the description. Keep suggesting others if needed.
This issue has been automatically marked as abandoned because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Working on adding Hinge Loss
Are all these functions taken ?
No, you can start working on them.
Opened a PR to add KL divergence loss as a sub task of this issue. #656 Kindly review.
I have opened a PR that implements the Huber loss function, mentioned in this issue: #697 :rocket:. Please take a look at the PR :hugs:.
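For readers unfamiliar with it, Huber loss blends MSE and MAE: quadratic for small residuals, linear for large ones. A minimal sketch (signature and parameter names are assumptions, not necessarily what the PR uses):

```rust
/// Huber loss with threshold `delta`: 0.5 * r^2 for |r| <= delta,
/// delta * (|r| - 0.5 * delta) otherwise, averaged over the samples.
pub fn huber_loss(predicted: &[f64], actual: &[f64], delta: f64) -> f64 {
    assert_eq!(predicted.len(), actual.len());
    predicted
        .iter()
        .zip(actual.iter())
        .map(|(p, a)| {
            let r = (p - a).abs();
            if r <= delta {
                0.5 * r * r
            } else {
                delta * (r - 0.5 * delta)
            }
        })
        .sum::<f64>()
        / predicted.len() as f64
}
```

The linear branch makes the loss far less sensitive to outliers than plain MSE.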
Are there any functions that can still be worked on?
I think NLL and Marginal Ranking from the list are still not implemented
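Of those two, NLL is the simpler: given the model's predicted probability for the true class of each sample, it averages the negative log of those probabilities. A possible sketch (the input representation is an assumption; implementations often take logits or full probability vectors instead):

```rust
/// Negative log-likelihood: mean of -ln(p) over the predicted
/// probabilities assigned to each sample's true class.
pub fn nll_loss(true_class_probs: &[f64]) -> f64 {
    assert!(!true_class_probs.is_empty());
    true_class_probs.iter().map(|p| -p.ln()).sum::<f64>()
        / true_class_probs.len() as f64
}
```

Confident, correct predictions (probabilities near 1) drive the loss toward zero; low probabilities on the true class are penalized heavily.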
Finished implementing both, but I have a question regarding the PR. Should I open two separate PRs, or one PR containing both algorithms?
2 separate PRs please
When opening a PR, please make sure that your PR is on a separate branch, like feat/ml/loss/nll{marginal_ranking}, instead of the master branch (on your fork). This helps keep the git history clean and avoids accidents when merging code from the original repository into your fork. Please commit changes with meaningful messages that reflect the changes to the source code, rather than making many redundant commits for a single change.