Machine Learning Algorithms from Scratch
This is an ongoing project that showcases various supervised and unsupervised machine learning algorithms coded from scratch using basic Python libraries such as NumPy, pandas, and Matplotlib.
Part-1: Simple Linear Regression using various Gradient Descent Approaches
The goal is to build a Simple Linear Regression model from scratch using various Gradient Descent approaches: Stochastic, Batch, and Mini-Batch Gradient Descent.
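A minimal sketch of the idea, not the notebook's exact code: a single fit function whose `batch_size` argument switches between the three Gradient Descent variants. The function name and hyperparameters are illustrative assumptions.

```python
import numpy as np

def fit_simple_linear(x, y, lr=0.01, epochs=100, batch_size=16):
    """Fit y ~ w*x + b by minimizing mean squared error with gradient descent."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    w, b = 0.0, 0.0
    rng = np.random.default_rng(0)
    for _ in range(epochs):
        idx = rng.permutation(len(x))                # reshuffle samples each epoch
        for start in range(0, len(x), batch_size):
            batch = idx[start:start + batch_size]
            err = (w * x[batch] + b) - y[batch]      # prediction error on the mini-batch
            w -= lr * 2 * np.mean(err * x[batch])    # d(MSE)/dw
            b -= lr * 2 * np.mean(err)               # d(MSE)/db
    return w, b

# batch_size=1 gives Stochastic GD, batch_size=len(x) gives Batch GD,
# anything in between gives Mini-Batch GD.
```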
Part-2: Multiple Linear Regression
The goal is to build a Multiple Linear Regression model from scratch using the Mini-Batch Gradient Descent approach.
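A minimal sketch of the same update vectorized over several features; the function name and hyperparameters are assumptions, not the notebook's code.

```python
import numpy as np

def fit_multiple_linear(X, y, lr=0.01, epochs=200, batch_size=32):
    """Fit y ~ X @ w + b with mini-batch gradient descent on the MSE loss."""
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    rng = np.random.default_rng(0)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            err = X[batch] @ w + b - y[batch]
            w -= lr * 2 * X[batch].T @ err / len(batch)  # MSE gradient w.r.t. w
            b -= lr * 2 * err.mean()                     # MSE gradient w.r.t. b
    return w, b
```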
Part-3: Polynomial Regression
The goal is to build a Polynomial Regression model from scratch using the Mini-Batch Gradient Descent approach.
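A minimal sketch of the key step: polynomial regression reduces to linear regression on expanded features. The `fit_multiple_linear` referenced in the usage comment is the illustrative Part-2 sketch above, not the notebook's code.

```python
import numpy as np

def polynomial_features(x, degree):
    """Column-stack x, x^2, ..., x^degree for a 1-D input vector."""
    x = np.asarray(x, dtype=float)
    return np.column_stack([x ** p for p in range(1, degree + 1)])

# Illustrative usage with the Part-2 sketch:
# X_poly = polynomial_features(x, degree=3)
# w, b = fit_multiple_linear(X_poly, y)
```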
Part-4: Logistic Regression
The goal is to build a Logistic Regression model from scratch using the Mini-Batch Gradient Descent approach.
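A minimal sketch assuming binary 0/1 labels and the cross-entropy loss; names and hyperparameters are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=200, batch_size=32):
    """Fit P(y=1|x) = sigmoid(X @ w + b) with mini-batch gradient descent."""
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    rng = np.random.default_rng(0)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            err = sigmoid(X[batch] @ w + b) - y[batch]   # gradient of cross-entropy
            w -= lr * X[batch].T @ err / len(batch)
            b -= lr * err.mean()
    return w, b
```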
Part-5: Regularization: Ridge, Lasso and Elastic Net
The goal is to create a naive framework for building linear and logistic regression models with L1 (Lasso), L2 (Ridge), and combined L1 and L2 (Elastic Net) penalties. The framework is then used to build and fit various models with and without regularization using different datasets.
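A minimal sketch of how the penalties could enter a gradient-descent update; `alpha` and `l1_ratio` are illustrative parameter names and may not match the framework in the notebook.

```python
import numpy as np

def penalty_gradient(w, alpha=0.1, l1_ratio=0.5):
    """Gradient (sub-gradient for L1) of the regularization term.

    l1_ratio=1 -> Lasso, l1_ratio=0 -> Ridge, values in between -> Elastic Net.
    """
    l1 = alpha * l1_ratio * np.sign(w)        # sub-gradient of alpha * |w|
    l2 = alpha * (1 - l1_ratio) * 2 * w       # gradient of alpha * ||w||^2
    return l1 + l2

# Inside a gradient-descent step the penalty is simply added to the data gradient:
# w -= lr * (data_gradient + penalty_gradient(w))
```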
Part-6: K-Nearest Neighbors
The goal is to build a K-Nearest Neighbor classification model from scratch.
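A minimal sketch of the classifier using Euclidean distance and a majority vote; the function name and `k` are illustrative.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, X_test, k=5):
    """Predict each test point as the majority class of its k nearest neighbors."""
    X_train, y_train = np.asarray(X_train, dtype=float), np.asarray(y_train)
    preds = []
    for x in np.asarray(X_test, dtype=float):
        dists = np.linalg.norm(X_train - x, axis=1)  # distance to every training point
        nearest = np.argsort(dists)[:k]              # indices of the k closest points
        preds.append(Counter(y_train[nearest]).most_common(1)[0][0])
    return np.array(preds)
```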
Part-7: Naive Bayes Classifier
The goal is to build a Gaussian Naive Bayes classifier from scratch. We will use the Rice Seed classification dataset to build a model and evaluate its performance.
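A minimal sketch of the Gaussian Naive Bayes idea (the Rice Seed dataset itself is not loaded here): per-class feature means, variances, and log-priors combined with the Gaussian log-likelihood.

```python
import numpy as np

def gnb_fit(X, y):
    """Return per-class (mean, variance, log-prior) statistics."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        stats[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, np.log(len(Xc) / len(X)))
    return stats

def gnb_predict(stats, X):
    """Pick the class with the highest log-posterior for each sample."""
    preds = []
    for x in np.asarray(X, dtype=float):
        scores = {}
        for c, (mean, var, log_prior) in stats.items():
            log_lik = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)
            scores[c] = log_prior + log_lik
        preds.append(max(scores, key=scores.get))
    return np.array(preds)
```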
Part-8: Support Vector Classifier
The goal is to build a naive Support Vector Classifier from scratch.
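A minimal sketch of one way to train a naive linear Support Vector Classifier: sub-gradient descent on the soft-margin hinge loss, assuming labels encoded as -1/+1. Names and hyperparameters are illustrative.

```python
import numpy as np

def fit_linear_svc(X, y, lr=0.001, lam=0.01, epochs=500):
    """Minimize lam*||w||^2 + hinge loss with per-sample sub-gradient updates."""
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in range(n):
            margin = y[i] * (X[i] @ w + b)
            if margin >= 1:                          # outside the margin: only shrink w
                w -= lr * 2 * lam * w
            else:                                    # inside the margin or misclassified
                w -= lr * (2 * lam * w - y[i] * X[i])
                b += lr * y[i]
    return w, b
```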
Part-9: Decision Tree Classifier
The goal is to build a naive Decision Tree Classifier from scratch.
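A minimal sketch of the core of a Decision Tree classifier: Gini impurity plus an exhaustive search for the best (feature, threshold) split. Function names are illustrative; the full tree is built by applying this split recursively.

```python
import numpy as np

def gini(y):
    """Gini impurity of a label array."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Return the (feature, threshold) pair that minimizes weighted Gini impurity."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    best = (None, None, np.inf)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            left = X[:, f] <= t
            right = ~left
            if left.sum() == 0 or right.sum() == 0:
                continue
            score = (left.sum() * gini(y[left]) + right.sum() * gini(y[right])) / len(y)
            if score < best[2]:
                best = (f, t, score)
    return best
```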
Part-10: Random Forest Classifier
The goal is to build a naive Random Forest Classifier from scratch.
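A minimal sketch of the Random Forest idea: bootstrap sampling plus a majority vote over many trees. `tree_factory` is a placeholder that should return a decision tree object with `fit`/`predict` methods, e.g. one built as in Part-9 (ideally with random feature sub-sampling at each split).

```python
import numpy as np
from collections import Counter

def fit_forest(X, y, tree_factory, n_trees=25):
    """Train n_trees trees, each on a bootstrap sample of the training data."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    rng = np.random.default_rng(0)
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), size=len(X))   # sample with replacement
        tree = tree_factory()
        tree.fit(X[idx], y[idx])
        trees.append(tree)
    return trees

def forest_predict(trees, X):
    """Majority vote across the ensemble for each sample."""
    votes = np.array([t.predict(X) for t in trees])  # shape: (n_trees, n_samples)
    return np.array([Counter(col).most_common(1)[0][0] for col in votes.T])
```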