Week 1 |
1 |
Artificial Intelligence, Introduction to Machine Learning, its implications with real-world examples, the concept of self-learning |
|
2 |
Types of learning (Supervised learning (Classification, Regression), Unsupervised learning, Reinforcement learning) |
Week 2 |
3 |
Introduction to Datasets, Statistical Analysis of data (Descriptive & Predictive), Data Collection (Numerical data, Categorical data) |
|
4 |
Data Preprocessing and its techniques (Data cleaning, Data integration, Data augmentation, Data reduction, Data transformation) |
Week 3 |
5 |
Introduction to Python and its libraries |
|
6 |
Implementation of data preprocessing using Python |
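As a minimal sketch of this lab, two of the preprocessing techniques listed in Lecture 4 (data cleaning via mean imputation, data transformation via min-max scaling) can be shown in plain Python; the column values here are made up for illustration, not from a real dataset:

```python
# Sketch: two common preprocessing steps on a tiny in-memory column.
# The values below are illustrative, not from a real dataset.

def impute_mean(values):
    """Replace None entries with the mean of the observed values (data cleaning)."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

def min_max_scale(values):
    """Rescale values to the [0, 1] range (data transformation)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

ages = [25, None, 35, 45]        # a numerical column with a missing entry
cleaned = impute_mean(ages)      # missing entry replaced by the mean, 35.0
scaled = min_max_scale(cleaned)  # all values now lie in [0, 1]
```

In practice the lab would apply the same two steps through pandas/scikit-learn, but the logic is the same.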
Week 4 |
7 |
Regression problems: Linear Regression (linear equation, slope of the line, relationship between attributes, intercept, ordinary least squares, residual error) |
|
8 |
Implementation of simple linear regression; evaluating the relationship between attributes using plots. |
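A minimal sketch of the ordinary-least-squares fit from Lecture 7, on made-up points that lie exactly on y = 2x + 1 (plotting the scatter and fitted line, e.g. with matplotlib, is left to the lab):

```python
# Sketch: ordinary least squares for simple linear regression y = intercept + slope*x.
# The points are synthetic and lie exactly on y = 2x + 1.

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]

x_mean = sum(xs) / len(xs)
y_mean = sum(ys) / len(ys)

# slope = covariance(x, y) / variance(x)
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
        sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean

# residual error: observed y minus predicted y, per point
residuals = [y - (intercept + slope * x) for x, y in zip(xs, ys)]
```

Since the synthetic points are perfectly linear, every residual here is zero; real data would leave non-zero residuals that the plot makes visible.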
Week 5 |
9 |
Multiple linear regression (Dummy variable, multicollinearity, dummy variable trap, building a model using backward elimination) |
|
10 |
Implementation of multiple linear regression using the statsmodels library in Python. |
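The lab itself uses a statistics library, but as a library-light sketch of the same fit, the design matrix can be assembled by hand and solved with NumPy's least-squares routine (the data are synthetic, generated from y = 1 + 2·x1 + 3·x2; a 0/1 dummy column for a categorical feature would be appended the same way, dropping one level to avoid the dummy variable trap):

```python
import numpy as np

# Sketch: multiple linear regression as a least-squares solve of X * beta = y.
# Synthetic data: y = 1 + 2*x1 + 3*x2 exactly, so the fit recovers [1, 2, 3].

x1 = np.array([0.0, 1.0, 2.0, 3.0])
x2 = np.array([1.0, 0.0, 1.0, 2.0])
y = 1 + 2 * x1 + 3 * x2

# design matrix: a column of ones for the intercept, then the predictors
# (a dummy-encoded categorical would simply add more 0/1 columns here)
X = np.column_stack([np.ones_like(x1), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # [intercept, coef_x1, coef_x2]
```

statsmodels' `OLS` additionally reports the p-values needed for the backward-elimination procedure from Lecture 9.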
Week 6 |
11 |
Polynomial Regression and its implementation (degree of the polynomial) |
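A short sketch of the idea: polynomial regression is linear regression on powers of x, and the degree is a modelling choice. The data below are synthetic, generated from a known quadratic so the fit recovers it:

```python
import numpy as np

# Sketch: degree-2 polynomial regression on synthetic data from y = x^2 - 2x + 1.

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
y = x ** 2 - 2 * x + 1

# fit a quadratic; coefficients come back highest power first
coeffs = np.polyfit(x, y, deg=2)  # recovers [1, -2, 1]
```

Choosing too high a degree would fit these points just as well but generalize worse, which is the trade-off the lecture's "degree of the polynomial" discussion addresses.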
|
12 |
Logistic Regression intuition and a real-world example |
Week 7 |
13 |
K-nearest neighbor intuition and a solved example using a real-world dataset. |
|
14 |
Implementation of K-NN using Python |
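A from-scratch sketch of the K-NN classifier: sort the training points by distance to the query, take the k nearest, and vote. The 2-D points and labels are made up for illustration:

```python
import math
from collections import Counter

# Sketch: k-nearest-neighbour classification on a tiny made-up 2-D dataset.
train = [((1.0, 1.0), "A"), ((1.5, 2.0), "A"), ((5.0, 5.0), "B"), ((6.0, 5.5), "B")]

def knn_predict(point, train, k=3):
    # sort training points by Euclidean distance to the query point
    neighbours = sorted(train, key=lambda item: math.dist(point, item[0]))[:k]
    # majority vote among the k nearest labels
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

prediction = knn_predict((1.2, 1.5), train, k=3)  # query near the "A" cluster
```

The lab would swap the toy list for a real dataset (and typically scale the features first, since K-NN is distance-based).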
Week 8 |
1 hour |
Mid Term |
Week 9 |
15 |
Decision Tree (Entropy, information gain) |
|
16 |
Mathematical implementation using a dataset in Python |
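The two splitting criteria from Lecture 15 reduce to a few lines of Python. The labels below are a made-up binary classification column, chosen so a perfect split gives the maximum gain of 1 bit:

```python
import math
from collections import Counter

# Sketch: entropy and information gain, the splitting criteria behind decision trees.

def entropy(labels):
    """Shannon entropy in bits of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent, children):
    """Entropy of the parent node minus the weighted entropy of the child splits."""
    n = len(parent)
    weighted = sum(len(ch) / n * entropy(ch) for ch in children)
    return entropy(parent) - weighted

parent = ["yes", "yes", "no", "no"]  # 50/50 labels: entropy = 1 bit
gain = information_gain(parent, [["yes", "yes"], ["no", "no"]])  # pure split
```

A decision tree builder evaluates `information_gain` for every candidate split and greedily picks the largest.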
Week 10 |
17 |
|
|
18 |
|
Week 11 |
19 |
Supervised algorithm: Naïve Bayes (conditional probability, Bayes' theorem); deriving the theorem mathematically |
|
20 |
Support Vector Machine (linearly separable data, non-linearly separable data); linear SVM implementation using an example |
Week 12 |
21 |
Kernel function intuition (RBF kernel, sigmoid kernel); implementation of SVM |
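The lab would normally implement SVM through a library, but a from-scratch sketch of the linear case is feasible: train by sub-gradient descent on the hinge loss. The two 2-D clusters, learning rate, and epoch count below are all illustrative choices:

```python
# Sketch: a linear SVM trained with sub-gradient descent on the hinge loss.
# Data are two made-up, linearly separable 2-D clusters with labels +1 / -1.

X = [(1.0, 1.0), (1.5, 0.5), (4.0, 4.0), (4.5, 3.5)]
y = [-1, -1, 1, 1]

w = [0.0, 0.0]
b = 0.0
lr, lam = 0.1, 0.01  # learning rate and regularization strength (illustrative)

for _ in range(200):
    for xi, yi in zip(X, y):
        margin = yi * (w[0] * xi[0] + w[1] * xi[1] + b)
        if margin < 1:  # point inside the margin: move the hyperplane
            w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
            b += lr * yi
        else:           # correctly classified with margin: only shrink weights
            w = [wj - lr * lam * wj for wj in w]

def predict(xi):
    return 1 if w[0] * xi[0] + w[1] * xi[1] + b >= 0 else -1
```

For non-linearly separable data, this is where the kernel trick comes in: the dot products above are replaced by a kernel function such as the RBF.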
|
22 |
Unsupervised learning, Clustering (k-means clustering intuition, k-means++, the elbow method); implementation using an example |
Week 13 |
23 |
Implementation of k-means; EM algorithm/DBSCAN intuition |
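A plain-Python sketch of the k-means loop: alternate an assignment step and an update step. The six 2-D points form two obvious made-up clusters, and the initial centroids are fixed by hand (k-means++ would instead spread the initial picks apart):

```python
import math

# Sketch: plain k-means on a tiny made-up 2-D dataset with two clear clusters.
points = [(1.0, 1.0), (1.5, 2.0), (1.2, 0.8), (8.0, 8.0), (8.5, 7.5), (9.0, 8.2)]

def kmeans(points, centroids, iters=10):
    for _ in range(iters):
        # assignment step: each point joins its nearest centroid's cluster
        clusters = [[] for _ in centroids]
        for p in points:
            idx = min(range(len(centroids)), key=lambda i: math.dist(p, centroids[i]))
            clusters[idx].append(p)
        # update step: each centroid moves to the mean of its cluster
        centroids = [
            tuple(sum(coord) / len(cl) for coord in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

centroids, clusters = kmeans(points, centroids=[(0.0, 0.0), (10.0, 10.0)])
```

Running this for each candidate k and plotting the within-cluster sum of squared distances against k gives the elbow curve from Lecture 22.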
|
24 |
Ensemble learning (Bagging, Boosting); Bagging (bootstrap aggregation, row sampling with replacement), Random Forest |
Week 14 |
25 |
Bias-variance trade-off; Boosting (AdaBoost) |
|
26 |
Stacking; optimization algorithms (Gradient Descent, Stochastic Gradient Descent) |
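Gradient descent itself fits in a few lines. This sketch minimizes a made-up one-dimensional function f(w) = (w - 3)², whose gradient is 2(w - 3); the learning rate and step count are illustrative:

```python
# Sketch: gradient descent on f(w) = (w - 3)^2, gradient f'(w) = 2*(w - 3).
# Starting point, learning rate, and iteration count are illustrative choices.

w = 0.0
lr = 0.1
for _ in range(100):
    grad = 2 * (w - 3)  # gradient of the loss at the current w
    w -= lr * grad      # step downhill, scaled by the learning rate
# w has now converged very close to the minimizer, 3.0
```

Stochastic gradient descent follows the same update but estimates the gradient from one (or a small batch of) training example(s) per step instead of the full dataset.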
Week 15 |
27 |
Evaluation metrics; Reinforcement learning (tuning model complexity) |
|
28 |
Natural Language Processing (bag of words, stemming, lemmatization) |
Week 16 |
29 |
Implementation of NLP on a dataset using Python |
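The bag-of-words model from Lecture 28 can be sketched without any NLP library: build a vocabulary from the corpus, then count each document's words against it. The two-sentence corpus is made up; a real pipeline would stem or lemmatize the tokens before counting:

```python
from collections import Counter

# Sketch: bag-of-words vectors over a tiny made-up corpus.
corpus = ["the cat sat on the mat", "the dog sat"]

# vocabulary: every distinct token across the corpus, in sorted order
vocab = sorted({word for doc in corpus for word in doc.split()})

def bag_of_words(doc):
    """Count vector of the document's tokens against the shared vocabulary."""
    counts = Counter(doc.split())
    return [counts[word] for word in vocab]

vectors = [bag_of_words(doc) for doc in corpus]
```

Each document becomes a fixed-length count vector, which any of the earlier classifiers (Naïve Bayes in particular) can consume directly.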
|
30 |
Neural networks (how the brain works, the neuron, the perceptron) |
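A sketch of the single perceptron from this lecture, trained with the classic perceptron update rule to learn the AND function (a linearly separable target; starting weights, bias, and learning rate are illustrative):

```python
# Sketch: one perceptron with a step activation learning the AND function.
# Initial weights, bias, and learning rate are illustrative choices.

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]
b = 0.0
lr = 0.1

for _ in range(20):  # a few passes over the four examples
    for (x1, x2), target in data:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0  # step activation
        err = target - out
        w[0] += lr * err * x1  # nudge weights toward reducing the error
        w[1] += lr * err * x2
        b += lr * err

predictions = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data]
```

The same unit cannot learn XOR, which motivates stacking neurons into multi-layer networks.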
Week 17 |
2 hours |
Final Term |