Introduction to Artificial Neural Networks
Course Content
Week 1
- Lecture 1 : Introduction to machine learning
- What is Machine learning?
- How does it work?
- How does it differ from conventional programming?
- Different kinds of machine learning algorithms
- Supervised learning
- Unsupervised learning
- Semi-supervised learning
- Reinforcement learning
- One-shot learning
- Few-shot learning
- Active learning
- Incremental learning
- Lecture 2 : Introduction to machine learning (Cont.)
- Different terminologies
- Artificial intelligence
- Data science
- Machine learning
- Natural language processing
- Computer vision
- Predictive modelling
- Generative AI
- Different positions in Artificial intelligence
- Data Scientist
- ML Engineer
- Data Engineer
- MLOps Engineer (Machine Learning Operations)
- ML/DL/NLP Researcher
- Generative AI Expert
Week 2
- Lecture 3 : Introduction to Python
- Setup & Installation
- Installing Python and Setting Up the Environment
- Introduction to Python IDEs (e.g., PyCharm, VS Code)
- Hello World
- Variables
- Variables and Data Types (int, float, string, bool)
- Type Conversion and Casting
- Basic Operators (Arithmetic, Comparison, Logical)
- Working with Strings and String Operations
- Control Structures
- Conditional Statements: if, elif, else
- Loops: for, while
- Nested Loops and Conditions
- Break, Continue, and Pass Statements
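A minimal sketch tying together the basics covered in this lecture (all names and values below are illustrative only):

```python
# Variables and basic data types
count = 3              # int
price_text = "9.99"    # str
is_active = True       # bool

# Type conversion and casting
price = float(price_text)
total = count * price
print("Total:", total)

# Conditional statements: if / elif / else
if total > 20:
    print("Large order")
elif total > 10:
    print("Medium order")
else:
    print("Small order")

# Loops with break, continue, and pass
for i in range(5):
    if i == 3:
        break       # exit the loop early
    if i % 2 == 0:
        continue    # skip even numbers
    print("odd:", i)

n = 0
while n < 3:
    pass            # placeholder statement; does nothing
    n += 1
```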
- Lecture 4 : Python - Data Types
- Data types
- Lists: Creation, Indexing, Slicing, and Modifying
- List Comprehensions
- Tuples: Creation, Indexing, and Immutable Properties
- Common List and Tuple Methods
- Dictionaries: Key-Value Pairs, Accessing, Adding, and Modifying Data
- Common Dictionary Methods
- Sets: Creation, Operations, and Methods
- Working with Complex Data Structures
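A short illustrative sketch of the built-in data structures listed above (the values are made up):

```python
# Lists: creation, indexing, slicing, modifying
nums = [3, 1, 4, 1, 5]
nums[0] = 10
first_three = nums[:3]

# List comprehension
squares = [x * x for x in nums if x > 1]

# Tuples are immutable and support unpacking
point = (2.0, 3.5)
x, y = point

# Dictionaries: key-value pairs
student = {"name": "Ada", "score": 91}
student["passed"] = True              # add a new key
for key, value in student.items():
    print(key, value)

# Sets: unique elements and set operations
a = {1, 2, 3}
b = {3, 4}
print(a | b, a & b, a - b)            # union, intersection, difference

# A more complex nested structure
courses = {"ANN": {"weeks": 12, "topics": ["losses", "optimizers"]}}
print(courses["ANN"]["topics"][0])
```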
Week 3
- Lecture 5 : Python - Functions and packages
- Functions and packages
- Defining and Calling Functions
- Function Parameters and Return Values
- Scope of Variables (Local and Global)
- Introduction to Python Modules and Libraries
- Importing and Using Modules
- File Handling
- Opening, Reading, and Writing Files
- Working with Text Files
- Handling File Exceptions
- File Methods and Context Managers (with statement)
- Introduction to OOP Concepts: Classes and Objects
- Defining Classes and Methods
- Inheritance and Polymorphism
- Encapsulation and Abstraction
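A brief sketch combining functions, file handling with a context manager, and a small class hierarchy (the file name and class names are illustrative):

```python
# Defining a function with parameters and a return value
def area(width, height=1.0):
    """Return the area of a rectangle (height defaults to 1.0)."""
    return width * height

print(area(3.0, 2.0))

# File handling with a context manager; file exceptions handled explicitly
try:
    with open("notes.txt", "w") as f:
        f.write("hello\n")
    with open("notes.txt") as f:
        print(f.read())
except OSError as err:
    print("file error:", err)

# Classes, inheritance, and polymorphism
class Animal:
    def __init__(self, name):
        self.name = name          # encapsulated state

    def speak(self):
        return "..."

class Dog(Animal):                # inheritance
    def speak(self):              # polymorphism: override the parent method
        return f"{self.name} says woof"

print(Dog("Rex").speak())
```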
- Lecture 6 : Python - Data Analysis
- Data Analysis
- Introduction to Pandas and NumPy Libraries
- Working with DataFrames and Series
- Data Cleaning and Manipulation
- Basic Data Visualization using Matplotlib
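A minimal data-analysis sketch, assuming pandas, NumPy, and Matplotlib are installed (column names and values are made up):

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# A small DataFrame with one missing value
df = pd.DataFrame({
    "hours": [1, 2, 3, 4, 5],
    "score": [52, 60, np.nan, 78, 85],
})

# Data cleaning: fill the missing score with the column mean
df["score"] = df["score"].fillna(df["score"].mean())

# Basic manipulation: a derived column and summary statistics
df["score_per_hour"] = df["score"] / df["hours"]
print(df.describe())

# Basic visualization with Matplotlib
plt.plot(df["hours"], df["score"], marker="o")
plt.xlabel("hours")
plt.ylabel("score")
plt.title("Score vs. study hours")
plt.show()
```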
Week 4
- Lecture 7 : Supervised Machine Learning
- Multi-layer perceptron
- Perceptron learning
- Linear activation functions
- Non-linear activation functions
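A minimal NumPy sketch of the perceptron learning rule on a toy AND problem (the learning rate and epoch count are arbitrary choices, not values prescribed by the lecture):

```python
import numpy as np

# Toy AND dataset: inputs and binary targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
lr = 0.1                        # learning rate (arbitrary choice)

def step(z):
    """Step (threshold) activation."""
    return 1 if z > 0 else 0

# Perceptron learning rule: w <- w + lr * (target - prediction) * x
for epoch in range(20):
    for xi, target in zip(X, y):
        pred = step(np.dot(w, xi) + b)
        error = target - pred
        w += lr * error * xi
        b += lr * error

print("weights:", w, "bias:", b)
print("predictions:", [step(np.dot(w, xi) + b) for xi in X])
```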
- Lecture 8 : Loss Functions
- Regression Loss
- Mean Squared Error (MSE)
- Mean Absolute Error (MAE)
- Huber Loss
- Log-Cosh Loss
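Illustrative NumPy implementations of the regression losses above, evaluated on toy targets and predictions:

```python
import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5,  0.0, 2.0, 8.0])

def mse(y, yhat):
    return np.mean((y - yhat) ** 2)

def mae(y, yhat):
    return np.mean(np.abs(y - yhat))

def huber(y, yhat, delta=1.0):
    err = y - yhat
    quad = 0.5 * err ** 2                       # quadratic near zero
    lin = delta * (np.abs(err) - 0.5 * delta)   # linear for large errors
    return np.mean(np.where(np.abs(err) <= delta, quad, lin))

def log_cosh(y, yhat):
    return np.mean(np.log(np.cosh(yhat - y)))

print(mse(y_true, y_pred), mae(y_true, y_pred),
      huber(y_true, y_pred), log_cosh(y_true, y_pred))
```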
Week 5
- Lecture 9 : Loss Functions (cont.)
- Classification Loss
- Binary Cross-Entropy
- Categorical Cross-Entropy
- Sparse Categorical Cross-Entropy
- Kullback-Leibler (KL) Divergence
- Hinge Loss
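Illustrative NumPy versions of the classification losses above (the clipping constant is a common numerical-stability trick, not part of the definitions):

```python
import numpy as np

eps = 1e-12   # small constant to avoid log(0)

def binary_cross_entropy(y, p):
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def categorical_cross_entropy(y_onehot, p):
    p = np.clip(p, eps, 1.0)
    return -np.mean(np.sum(y_onehot * np.log(p), axis=1))

def kl_divergence(p, q):
    p, q = np.clip(p, eps, 1.0), np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q))

def hinge_loss(y_pm1, scores):
    # hinge loss expects labels in {-1, +1}
    return np.mean(np.maximum(0.0, 1.0 - y_pm1 * scores))

y = np.array([1, 0, 1])
p = np.array([0.9, 0.2, 0.7])
print("BCE:", binary_cross_entropy(y, p))
print("hinge:", hinge_loss(np.array([1, -1, 1]), np.array([0.8, -0.4, 0.3])))
```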
- Lecture 10 : Loss Functions (cont.)
- Ranking Loss
- Contrastive Loss
- Triplet Loss
- Other Specialized Loss
- Cosine Similarity Loss
- Focal Loss
- Dice Loss
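A brief NumPy sketch of two of the losses above, triplet loss and Dice loss (the margin and smoothing constant are illustrative defaults):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Pull the anchor toward the positive and push it away from the negative."""
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    return max(0.0, d_pos - d_neg + margin)

def dice_loss(y_true, y_pred, eps=1e-7):
    """1 - Dice coefficient, often used for segmentation masks."""
    intersection = np.sum(y_true * y_pred)
    return 1.0 - (2.0 * intersection + eps) / (np.sum(y_true) + np.sum(y_pred) + eps)

a = np.array([0.0, 1.0])
p = np.array([0.1, 0.9])
n = np.array([1.0, 0.0])
print(triplet_loss(a, p, n))
print(dice_loss(np.array([1, 1, 0]), np.array([0.9, 0.8, 0.1])))
```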
Week 6
- Lecture 11 : Activation Functions
- Types of Activation Functions
- Linear Activation Function
- Step Function (Binary Thresholding)
- Sigmoid Function (Logistic Activation)
- Tanh (Hyperbolic Tangent) Function
- Hard Sigmoid
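Illustrative NumPy implementations of the activations above (the hard sigmoid shown is one common piecewise-linear approximation; exact definitions vary by framework):

```python
import numpy as np

def linear(x):
    return x

def step(x):
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def hard_sigmoid(x):
    # one common piecewise-linear approximation of the sigmoid
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

x = np.linspace(-3, 3, 7)
for fn in (linear, step, sigmoid, tanh, hard_sigmoid):
    print(fn.__name__, np.round(fn(x), 3))
```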
- Lecture 12 : Activation Functions (Cont.)
- Types of Activation Functions
- Softmax
- Swish
- Maxout
- ReLU (Rectified Linear Unit)
- Leaky ReLU
- Parametric ReLU (PReLU)
- Exponential Linear Unit (ELU)
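Illustrative NumPy versions of several activations from this lecture (Maxout and PReLU are omitted here since they carry learnable parameters):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def swish(x):
    return x / (1.0 + np.exp(-x))      # x * sigmoid(x)

def softmax(z):
    z = z - np.max(z)                  # subtract the max for numerical stability
    e = np.exp(z)
    return e / np.sum(e)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x), leaky_relu(x), elu(x), swish(x))
print(softmax(x), softmax(x).sum())    # probabilities sum to 1
```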
Week 7
- Lecture 13 : Optimizers
- Types of Optimizers
- Gradient Descent
- Batch Gradient Descent
- Stochastic Gradient Descent (SGD)
- Mini-batch Gradient Descent
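A minimal sketch of mini-batch gradient descent on a 1-D linear model; setting batch_size to the dataset size gives batch gradient descent, and setting it to 1 gives SGD (the learning rate and batch size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=100)
y = 3.0 * X + 2.0 + rng.normal(scale=0.1, size=100)

def grads(xb, yb, w, b):
    """MSE gradients (up to a constant factor) for yhat = w*x + b."""
    err = (w * xb + b) - yb
    return np.mean(err * xb), np.mean(err)

w, b, lr = 0.0, 0.0, 0.1
batch_size = 16                       # mini-batch size (arbitrary choice)

for epoch in range(50):
    idx = rng.permutation(len(X))     # shuffle each epoch
    for start in range(0, len(X), batch_size):
        sl = idx[start:start + batch_size]
        dw, db = grads(X[sl], y[sl], w, b)
        w -= lr * dw                  # batch_size = len(X) -> batch GD
        b -= lr * db                  # batch_size = 1      -> pure SGD

print("learned w, b:", round(w, 3), round(b, 3))   # close to 3.0 and 2.0
```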
- Lecture 14 : Optimizers (Cont.)
- Types of Optimizers
- Momentum
- Nesterov Accelerated Gradient (NAG)
- Adagrad (Adaptive Gradient)
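A small sketch of the momentum and Adagrad update rules minimizing a toy quadratic (the hyperparameters are arbitrary):

```python
import numpy as np

# Minimize f(w) = w^2, whose gradient is 2w
grad = lambda w: 2.0 * w

# Momentum: accumulate a velocity and step along it
w, v, lr, beta = 5.0, 0.0, 0.1, 0.9
for _ in range(100):
    v = beta * v + grad(w)
    w -= lr * v
print("momentum:", round(w, 4))

# Adagrad: per-parameter step size shrinks with accumulated squared gradients
w, cache, lr, eps = 5.0, 0.0, 0.5, 1e-8
for _ in range(100):
    g = grad(w)
    cache += g ** 2
    w -= lr * g / (np.sqrt(cache) + eps)
print("adagrad:", round(w, 4))
```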
Week 8
- Lecture 15 : Optimizers (Cont.)
- Types of Optimizers
- RMSprop (Root Mean Square Propagation)
- Adadelta
- Adam (Adaptive Moment Estimation)
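A small sketch of the Adam update rule with bias correction, again on a toy quadratic (the hyperparameters are the commonly quoted defaults):

```python
import numpy as np

grad = lambda w: 2.0 * (w - 3.0)           # gradient of f(w) = (w - 3)^2

w = 0.0
m, v = 0.0, 0.0
lr, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8

for t in range(1, 501):
    g = grad(w)
    m = beta1 * m + (1 - beta1) * g        # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * g ** 2   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)           # bias correction
    v_hat = v / (1 - beta2 ** t)
    w -= lr * m_hat / (np.sqrt(v_hat) + eps)

print("Adam converges toward the minimum at w = 3:", round(w, 4))
```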
- Lecture 16 : Optimizers (Cont.)
- Types of Optimizers
- Adamax
- AMSGrad
- SGD with Warm Restarts (SGDR)
Week 9
- Lecture 17 : Model Validation
- Cross-Validation
- Leave-P-Out Cross-Validation
- Bootstrap Sampling
- Hold-Out Method
- Confusion Matrix
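A short validation sketch, assuming scikit-learn is available (the dataset and model here are illustrative choices, not ones mandated by the lecture):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import confusion_matrix

X, y = load_breast_cancer(return_X_y=True)

# Hold-out method: a single train/test split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=5000).fit(X_train, y_train)
print(confusion_matrix(y_test, model.predict(X_test)))

# k-fold cross-validation: average performance over several splits
scores = cross_val_score(LogisticRegression(max_iter=5000), X, y, cv=5)
print("5-fold accuracy:", scores.mean())
```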
- Lecture 18 : Model Validation (Cont.)
- Receiver Operating Characteristic (ROC) Curve and AUC (Area Under the Curve)
- Precision-Recall Curve
- Log-Loss (Logarithmic Loss) / Cross-Entropy Loss
- Mean Squared Error (MSE) and Root Mean Squared Error (RMSE)
- Mean Absolute Error (MAE)
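A quick tour of these metrics with scikit-learn, using toy labels, probabilities, and predictions:

```python
import numpy as np
from sklearn.metrics import (roc_auc_score, precision_recall_curve, log_loss,
                             mean_squared_error, mean_absolute_error)

# Classification metrics on toy probabilities
y_true = np.array([0, 0, 1, 1, 1])
y_prob = np.array([0.1, 0.4, 0.35, 0.8, 0.9])
print("ROC AUC:", roc_auc_score(y_true, y_prob))
print("log-loss:", log_loss(y_true, y_prob))
precision, recall, thresholds = precision_recall_curve(y_true, y_prob)
print("precision/recall points:", list(zip(precision.round(2), recall.round(2))))

# Regression metrics
y = np.array([3.0, 5.0, 2.5])
yhat = np.array([2.8, 5.4, 2.0])
mse = mean_squared_error(y, yhat)
print("MSE:", mse, "RMSE:", np.sqrt(mse), "MAE:", mean_absolute_error(y, yhat))
```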
Week 10
- Lecture 19 : Model Validation (Cont.)
- R-squared (Coefficient of Determination)
- Adjusted R-squared
- F-Score / F-Test
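A small sketch of R², adjusted R², and the F1-score (the number of predictors p below is an assumption made only to illustrate the adjusted-R² formula):

```python
import numpy as np
from sklearn.metrics import r2_score, f1_score

y = np.array([3.0, 5.0, 2.5, 7.0, 4.5])
yhat = np.array([2.8, 5.4, 2.0, 6.5, 4.9])

r2 = r2_score(y, yhat)                     # 1 - SS_res / SS_tot

# Adjusted R² penalizes extra predictors: n samples, p predictors (assumed here)
n, p = len(y), 2
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
print("R²:", round(r2, 3), "adjusted R²:", round(adj_r2, 3))

# F1-score for a binary classifier (harmonic mean of precision and recall)
print("F1:", f1_score([0, 1, 1, 0, 1], [0, 1, 0, 0, 1]))
```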
- Lecture 20 : Model Deployment
- Overview of ML Model Deployment
- Model Deployment Architectures (On-Premise, Cloud, Hybrid)
- Deployment Strategies
- Deployment Workflow
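A minimal deployment sketch exposing a pickled model behind a REST endpoint, assuming Flask and scikit-learn are installed; "model.pkl", the route, and the port are placeholder choices, not a prescribed workflow:

```python
import pickle
from flask import Flask, request, jsonify

app = Flask(__name__)
with open("model.pkl", "rb") as f:       # a previously trained, pickled model (placeholder path)
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # expects a JSON body such as {"features": [[5.1, 3.5, 1.4, 0.2]]}
    features = request.get_json()["features"]
    prediction = model.predict(features).tolist()
    return jsonify({"prediction": prediction})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```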
Week 11
- Lecture 21 : Model Monitoring and Performance Metrics
- Importance of Model Monitoring
- Key Monitoring Metrics
- Types of Monitoring
- Lecture 22 : Model Drift, Retraining, and Continuous Learning
- Understanding Model Drift
- Detecting Drift
- Continuous Learning and Model Retraining
- Tools for Retraining Pipelines
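One simple way to detect feature drift is a two-sample Kolmogorov-Smirnov test, sketched below with SciPy (the reference and production samples and the 0.05 threshold are illustrative):

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
reference = rng.normal(loc=0.0, scale=1.0, size=1000)   # feature values at training time
current = rng.normal(loc=0.5, scale=1.0, size=1000)     # shifted values seen in production

# A small p-value suggests the feature's distribution has drifted
stat, p_value = ks_2samp(reference, current)
print(f"KS statistic={stat:.3f}, p-value={p_value:.4f}")
if p_value < 0.05:                                       # threshold is a policy choice
    print("Drift detected: trigger the retraining pipeline")
```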
Week 12
- Lecture 23 : Model Tracking, Versioning, and Governance
- Importance of Model Tracking and Versioning
- Tools for Model Tracking
- Model Versioning
- Model Governance and Regulatory Compliance
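A minimal tracking sketch, assuming MLflow is the chosen tool (the run name, parameters, and metric values are made up):

```python
import mlflow

# Each run records the parameters, metrics, and artifacts of one training attempt
with mlflow.start_run(run_name="baseline-logreg"):
    mlflow.log_param("learning_rate", 0.1)
    mlflow.log_param("epochs", 20)
    mlflow.log_metric("val_accuracy", 0.93)
    # mlflow.sklearn.log_model(model, "model")   # would also version the trained model itself
```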
- Lecture 24 : AI Ethics and Fairness
- Introduction to AI Ethics
- Fairness in AI
- Bias in AI
- Transparency and Explainability
- Privacy and Data Protection