# Machine-Learning-Explained
Learn the theory, math and code behind different machine learning algorithms and techniques.
This repository contains explanations and implementations of machine learning algorithms and concepts. The explanations are also available as articles on my website.
## Machine Learning Algorithms
- Linear Regression
- Logistic Regression
- K Nearest Neighbors
- Decision Tree
- KMeans
- Mean Shift
- DBSCAN
- Random Forest
- AdaBoost
- Gradient Boosting
- Principal Component Analysis (PCA)
- Kernel PCA
- Linear Discriminant Analysis (LDA)
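As a taste of what these from-scratch implementations look like, here is a minimal NumPy sketch of the first algorithm on the list, linear regression fit with batch gradient descent. The function name and hyperparameters are illustrative, not the repository's actual API:

```python
import numpy as np

def linear_regression_gd(X, y, lr=0.1, epochs=1000):
    """Fit y ≈ Xw + b by batch gradient descent on the mean squared error."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        y_pred = X @ w + b
        error = y_pred - y
        w -= lr * (2 / n) * (X.T @ error)  # dMSE/dw
        b -= lr * (2 / n) * error.sum()    # dMSE/db
    return w, b
```

Run on points drawn from y = 2x + 1, this recovers weights close to w = 2, b = 1.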
## Optimizers
- Gradient Descent
- Adagrad
- Adadelta
- RMSprop
- Adam
- AdaMax
- Nadam
- AMSGrad
- AdamW
- QHM
- QHAdam
- RAdam
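All of the optimizers above are variations on the basic gradient descent update rule. As an illustration, a minimal sketch of a single Adam step, which combines bias-corrected first and second moment estimates of the gradient (the `state` dict layout is an assumption made for this sketch, not the repository's interface):

```python
import numpy as np

def adam_step(theta, grad, state, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m)
    and squared gradient (v), each corrected for initialization bias."""
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad**2
    m_hat = state["m"] / (1 - beta1 ** state["t"])  # bias-corrected first moment
    v_hat = state["v"] / (1 - beta2 ** state["t"])  # bias-corrected second moment
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps)
```

Applied repeatedly to the gradient of f(x) = x², the parameter is driven toward the minimum at 0. Variants like AdaMax, Nadam, AMSGrad, and AdamW modify how `v` is tracked, how momentum enters the step, or how weight decay is applied.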
## Activation Functions
- ELU
- GELU
- Leaky ReLU
- Mish
- ReLU
- SELU
- Sigmoid
- SiLU
- Softmax
- Softplus
- Tanh
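Several of these activations are one-liners; a minimal NumPy sketch of a few of them (illustrative helpers, not the repository's code):

```python
import numpy as np

def sigmoid(x):
    """Squash to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Zero out negative inputs."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Like ReLU, but with a small slope alpha for negative inputs."""
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    """Normalize a vector of logits into a probability distribution."""
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()
```

For example, `sigmoid(0.0)` gives 0.5 and `softmax` of two equal logits gives `[0.5, 0.5]`.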
## Metrics
- Binary Cross Entropy
- Categorical Cross Entropy
- Accuracy Score
- Confusion Matrix
- Precision
- Recall
- F1-Score
- Receiver Operating Characteristic (ROC)
- Area under the ROC curve (AUC)
- Hinge Loss
- KL Divergence
- Brier Score
- Mean Squared Error
- Mean Squared Logarithmic Error
- Mean Absolute Error
- Mean Absolute Percentage Error
- Median Absolute Error
- Cosine Similarity
- R2 Score
- Tweedie Deviance
- D^2 Score
- Huber Loss
- Log Cosh Loss
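Precision, recall, and F1-score all derive from the four cells of the binary confusion matrix; a minimal sketch of that relationship (function names are illustrative, not the repository's API):

```python
import numpy as np

def confusion_counts(y_true, y_pred):
    """Cells of the binary confusion matrix (positive class = 1)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))
    fp = int(np.sum((y_true == 0) & (y_pred == 1)))
    fn = int(np.sum((y_true == 1) & (y_pred == 0)))
    tn = int(np.sum((y_true == 0) & (y_pred == 0)))
    return tp, fp, fn, tn

def precision_recall_f1(y_true, y_pred):
    """Precision = tp/(tp+fp), recall = tp/(tp+fn), F1 = their harmonic mean."""
    tp, fp, fn, _ = confusion_counts(y_true, y_pred)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```

Accuracy, ROC, and AUC are likewise built from these counts, evaluated either once or across a sweep of decision thresholds.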
## Ensemble Methods
- Averaging
- Bagging
- Blending
- Majority Vote
- Stacking
- Stacking Retrained
- Weighted Average
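The two simplest combiners on this list can be sketched in a few lines of NumPy: majority vote for classifiers and weighted averaging for regressors (illustrative helpers, not the repository's code; `predictions` is assumed to have shape models × samples):

```python
import numpy as np

def majority_vote(predictions):
    """Per-sample most common class label across models."""
    preds = np.asarray(predictions)
    result = []
    for col in preds.T:  # one column per sample
        values, counts = np.unique(col, return_counts=True)
        result.append(values[np.argmax(counts)])
    return np.array(result)

def weighted_average(predictions, weights):
    """Per-sample weighted mean of continuous model outputs."""
    preds = np.asarray(predictions, dtype=float)
    w = np.asarray(weights, dtype=float)
    return (w[:, None] * preds).sum(axis=0) / w.sum()
```

Bagging, blending, and stacking extend these ideas by training the base models on resampled data or by learning the combining function itself.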
## Contributing
Contributions to Machine-Learning-Explained are always welcome, whether in the form of code or documentation changes. For contribution guidelines, please see the CONTRIBUTING.md file.
## License
This project is licensed under the MIT License - see the LICENSE.md file for details.