maykulkarni/Machine-Learning-Notebooks


Helpful Jupyter notebooks that I compiled while learning Machine Learning and Deep Learning from various sources on the Internet.

### NumPy

  1. NumPy Basics

### Pre-processing

  1. Feature Selection: Imputing missing values, Encoding, Binarizing.

  2. Feature Scaling: Min-Max Scaling, Normalizing, Standardizing.

  3. Feature Extraction: CountVectorizer, DictVectorizer, TfidfVectorizer.
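A minimal sketch of two of the steps above, using scikit-learn with small made-up data: Min-Max scaling rescales each column to [0, 1], and CountVectorizer turns raw text into a document-term count matrix.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.feature_extraction.text import CountVectorizer

# Min-Max scaling: rescale each feature column to the [0, 1] range
X = np.array([[1.0, 200.0], [2.0, 300.0], [3.0, 400.0]])
X_scaled = MinMaxScaler().fit_transform(X)

# CountVectorizer: turn raw text into token-count features
docs = ["the cat sat", "the cat sat on the mat"]
vec = CountVectorizer()
counts = vec.fit_transform(docs)  # sparse document-term matrix
```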

### Regression

  1. Linear & Multiple Regression

  2. Backward Elimination: Method of Backward Elimination, P-values.

  3. Polynomial Regression

  4. Support Vector Regression

  5. Decision Tree Regression

  6. Random Forest Regression

  7. Robust Regression using Theil-Sen Regression

  8. Pipelines in Scikit-Learn
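As a sketch of how polynomial regression and scikit-learn pipelines fit together, the toy example below (noiseless data for y = x², invented for illustration) chains PolynomialFeatures into LinearRegression:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Noiseless y = x^2 data; a degree-2 pipeline should fit it exactly
X = np.arange(10).reshape(-1, 1).astype(float)
y = X.ravel() ** 2

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)
pred = model.predict([[12.0]])  # extrapolates beyond the training range
```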

### Classification

  1. Logistic Regression

  2. Regularization

  3. K Nearest Neighbors

  4. Support Vector Machines

  5. Naive Bayes

  6. Decision Trees
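A quick sketch of the first two topics above: logistic regression on a synthetic dataset, where the `C` parameter is scikit-learn's inverse regularization strength (all data and values here are illustrative).

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic binary classification problem
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# C is the inverse regularization strength: smaller C = stronger penalty
clf = LogisticRegression(C=1.0)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)  # accuracy on the held-out split
```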

### Clustering

  1. KMeans

  2. Minibatch KMeans

  3. Hierarchical Clustering

  4. Application of Clustering - Image Quantization

  5. Application of Clustering - Outlier Detection
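A minimal KMeans sketch on two synthetic, well-separated blobs (data invented for illustration); the algorithm should assign each blob its own cluster label:

```python
import numpy as np
from sklearn.cluster import KMeans

# Two well-separated Gaussian blobs; KMeans should recover the grouping
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)),
               rng.normal(5, 0.3, (50, 2))])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels = km.labels_  # one cluster id per point
```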

### Model Evaluation

  1. Cross Validation and its types

  2. Confusion Matrix, Precision, Recall

  3. R Squared

  4. ROC Curve, AUC

  5. Silhouette Distance
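The confusion-matrix metrics above can be sketched on hand-made labels: precision is TP / (TP + FP) and recall is TP / (TP + FN), with the matrix laid out as true classes in rows and predicted classes in columns.

```python
from sklearn.metrics import confusion_matrix, precision_score, recall_score

# Toy ground truth and predictions (illustrative values)
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

cm = confusion_matrix(y_true, y_pred)        # rows = true, cols = predicted
precision = precision_score(y_true, y_pred)  # TP / (TP + FP)
recall = recall_score(y_true, y_pred)        # TP / (TP + FN)
```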

### Association Rule Learning

  1. Apriori Algorithm

  2. Eclat Model
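The core of Apriori can be sketched in plain Python on a toy basket of transactions (items and threshold invented for illustration): count the support of 1-itemsets, then build candidate 2-itemsets only from the frequent ones (the Apriori pruning property).

```python
from itertools import combinations

# Toy transactions; min_support means "appears in at least half the baskets"
transactions = [{"milk", "bread"}, {"milk", "eggs"},
                {"milk", "bread", "eggs"}, {"bread", "eggs"}]
min_support = 0.5

def support(itemset):
    # fraction of transactions containing every item in the itemset
    return sum(itemset <= t for t in transactions) / len(transactions)

items = sorted({i for t in transactions for i in t})
frequent1 = [frozenset([i]) for i in items
             if support(frozenset([i])) >= min_support]

# Apriori property: candidate 2-itemsets come only from frequent 1-itemsets
candidates2 = [a | b for a, b in combinations(frequent1, 2)]
frequent2 = [c for c in candidates2 if support(c) >= min_support]
```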

### Reinforcement Learning

  1. Upper Confidence Bound Algorithm

  2. Thompson Sampling
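Thompson Sampling for Bernoulli bandits can be sketched in a few lines of NumPy (the arm payout rates below are made up): keep a Beta posterior per arm, sample once from each posterior, and pull the arm whose sample is largest.

```python
import numpy as np

rng = np.random.default_rng(42)
true_rates = [0.2, 0.5, 0.8]  # hypothetical arm payout probabilities
wins = np.ones(3)             # Beta(1, 1) uniform prior per arm
losses = np.ones(3)
pulls = np.zeros(3, dtype=int)

for _ in range(2000):
    # sample one success rate from each arm's Beta posterior
    samples = rng.beta(wins, losses)
    arm = int(np.argmax(samples))
    reward = rng.random() < true_rates[arm]
    pulls[arm] += 1
    # update the chosen arm's posterior with the observed outcome
    if reward:
        wins[arm] += 1
    else:
        losses[arm] += 1
```

Over time the posterior of the best arm concentrates, so it gets pulled far more often than the others.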

### Natural Language Processing

  1. Sentiment Analysis
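A minimal sentiment-analysis sketch, assuming a bag-of-words + Naive Bayes setup on tiny invented data (real notebooks would use a proper corpus):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy labelled reviews: 1 = positive, 0 = negative
texts = ["great movie loved it", "terrible plot hated it",
         "wonderful acting great fun", "awful boring terrible"]
labels = [1, 0, 1, 0]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(texts, labels)
pred = clf.predict(["great wonderful movie", "terrible awful plot"])
```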
### Deep Learning

  1. What are Activation Functions

  2. Vanilla Neural Network

  3. Backpropagation Derivation

  4. Backpropagation in Python

  5. Convolutional Neural Networks

  6. Long Short Term Memory Neural Networks (LSTM)
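The vanilla-network and backpropagation topics above can be sketched in pure NumPy: a two-layer sigmoid network trained on XOR with hand-written gradients (architecture, learning rate, and iteration count are illustrative, not tuned).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units, sigmoid output
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
lr = 1.0

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

loss_before = np.mean((forward(X)[1] - y) ** 2)

for _ in range(5000):
    # forward pass
    h, out = forward(X)
    # backward pass: gradients of mean squared error via the chain rule
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

loss_after = np.mean((forward(X)[1] - y) ** 2)
```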

### Sources

  1. Machine Learning by Andrew Ng
  2. Machine Learning A-Z
  3. Deep Learning A-Z
  4. Neural Networks by Geoffrey Hinton
  5. Scikit-learn Cookbook (Second Edition) by Julian Avila et al.