Generative AI



Original price: Rs.10,000.00. Sale price: Rs.2,000.00.


18% GST Extra.

Generative AI Course: 15 Days, 1 Hour per Day

Day 1: Introduction to AI & Generative AI 

  • The evolution of artificial intelligence: brief overview (symbolic AI -> ML -> DL)
  • The AI/ML/DL/GenAI hierarchy: clear definitions of each and their relationships
  • Introduction to AI; types of AI (Narrow, General, Superintelligence)
  • The impact of AI on everyday life (with examples)
  • Understanding generative AI (initial look): what it is and why it is different

Day 2: Machine Learning & Deep Learning Fundamentals 

  • Understanding machine learning; types of machine learning (supervised, unsupervised, reinforcement, semi-supervised)
  • Understanding deep learning: what makes it "deep"?
  • Machine learning vs. deep learning: key differences and when to use each
  • Uses of generative AI in real life (art, text, music, code)
  • Popular generative AI tools (brief mention: ChatGPT, Midjourney, Stable Diffusion, etc.)

Day 3: Linear Algebra - Vectors & Matrices 

  • Introduction to linear algebra: why it is essential for AI/ML
  • Vectors and vector operations (addition, scalar multiplication, dot product, norm)
  • Vector space (linear space): high-level concept
  • Matrices and matrix operations (addition, scalar multiplication, and the crucial matrix multiplication)
  • Transpose of a matrix and its importance in operations
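As a quick taste of the operations listed above, here is a minimal NumPy sketch; the vectors and matrices are small illustrative values, not course material:

```python
import numpy as np

# Vector operations
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

s = u + v                 # addition: element-wise
scaled = 2.0 * u          # scalar multiplication
dot = u @ v               # dot product: 1*4 + 2*5 + 3*6 = 32
norm = np.linalg.norm(u)  # Euclidean norm: sqrt(1 + 4 + 9)

# Matrix operations
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

C = A + B    # element-wise addition
P = A @ B    # matrix multiplication: rows of A dot columns of B
At = A.T     # transpose: rows become columns
```

Matrix multiplication (`@`) is the operation every neural network layer performs, which is why it gets special emphasis in this session.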

Day 4: Linear Algebra - Special Matrices & Transformations 

  • Types of matrices (identity, diagonal, symmetric, sparse) and their relevance for efficiency
  • Adjoint and inverse of a matrix: conceptual understanding, important for solving systems (no deep computation)
  • Special matrices revisited, with emphasis on their relevance
  • Linear transformations: how matrices transform vectors (fundamental to neural network layers)
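The idea that a matrix is a transformation, and its inverse undoes that transformation, can be seen concretely in a few lines of NumPy (the diagonal matrix below is an illustrative choice):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])   # diagonal matrix: scales x by 2, y by 3

I = np.eye(2)                # identity matrix: leaves any vector unchanged
A_inv = np.linalg.inv(A)     # inverse: A_inv @ A equals the identity

v = np.array([1.0, 1.0])
w = A @ v                    # linear transformation of v
back = A_inv @ w             # the inverse transformation recovers v
```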

Day 5: Probability Theory - Basics & Distributions 

  • Introduction to probability: basic definitions
  • Sample space, outcome, and event (E): clear definitions
  • Probability and probability rules (addition and multiplication rules)
  • Conditional probability and Bayes' theorem (crucial for VAEs and for understanding model uncertainty)
  • Random variables and probability distributions (focus on Bernoulli, Categorical, and Gaussian/Normal, the most relevant for GenAI losses and priors)
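Bayes' theorem can be checked with plain arithmetic. The classic diagnostic-test example below uses hypothetical numbers chosen only to illustrate the formula:

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
# Hypothetical test for a condition affecting 1% of people.
p_disease = 0.01            # prior P(A)
p_pos_given_disease = 0.95  # likelihood P(B|A): test sensitivity
p_pos_given_healthy = 0.05  # false-positive rate

# Total probability of a positive result, P(B):
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior: probability of disease given a positive test
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
```

The posterior comes out to roughly 0.16: even a fairly accurate test yields a modest posterior when the prior is small, which is exactly the kind of intuition Bayes' theorem builds.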

Day 6: Statistics - Data Understanding & Inference 

  • Introduction to statistics: its role in data analysis
  • Descriptive statistics: summarizing data
  • Measures of central tendency (mean, median, mode) and why each is used
  • Measures of dispersion (variance, standard deviation), crucial for data understanding and model evaluation
  • Inferential statistics; descriptive vs. inferential statistics at a high level
  • Estimation and hypothesis testing (very brief conceptual overview of parameter estimation)
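Python's standard-library `statistics` module covers all of the descriptive measures above; the sample data is illustrative:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]

mean = statistics.mean(data)      # arithmetic average: 40 / 8 = 5
median = statistics.median(data)  # middle value: robust to outliers
mode = statistics.mode(data)      # most frequent value
var = statistics.pvariance(data)  # population variance: mean squared deviation
std = statistics.pstdev(data)     # population standard deviation: sqrt(variance)
```

Note the mean (5) and the median (4.5) differ here because the data is skewed by the large value 9; choosing between them is exactly the "why each is used" question from the outline.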

Day 7: Calculus - Optimization & Gradient Descent 

  • Introduction to calculus: why derivatives are critical for learning
  • Derivatives and derivative rules (the chain rule, fundamental for backpropagation)
  • Derivatives of vector and matrix functions: conceptual understanding for multi-variable functions
  • Cost/loss functions: what they are and why we minimize them
  • Optimization algorithms and gradient descent; types of gradient descent (batch, stochastic, mini-batch), the core learning mechanism
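Gradient descent fits in a handful of lines once you have a derivative. The toy cost function below is illustrative, with its minimum placed at w = 3 so convergence is easy to verify:

```python
# Minimize f(w) = (w - 3)^2 with gradient descent.
# Its derivative is f'(w) = 2 * (w - 3); the minimum is at w = 3.
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0    # initial guess
lr = 0.1   # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)   # step in the direction opposite the gradient
```

After 100 steps, w has converged very close to 3. Training a neural network is this same loop, just with millions of parameters and a loss computed over data batches.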

Day 8: Neural Networks and Deep Learning Basics

  • Neural networks: basic structure (input, hidden, and output layers; neurons, weights, biases)
  • The difference between neural networks and classical machine learning, revisited
  • Activation functions (ReLU, Sigmoid, Tanh) and their purpose
  • Forward propagation: how input flows through the network to make a prediction
  • Backward propagation: a conceptual explanation of how errors are used to update weights via gradients
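A forward pass is just matrix multiplications interleaved with activation functions. The tiny network below (2 inputs, 3 hidden units, 1 output) uses hand-picked illustrative weights:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)      # zeroes out negatives

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))  # squashes to (0, 1)

# One hidden layer: 2 inputs -> 3 hidden units -> 1 output
W1 = np.array([[ 0.5, -0.2],
               [ 0.1,  0.4],
               [-0.3,  0.8]])        # hidden-layer weights (3x2)
b1 = np.zeros(3)
W2 = np.array([[0.7, -0.5, 0.2]])    # output-layer weights (1x3)
b2 = np.zeros(1)

x = np.array([1.0, 2.0])
h = relu(W1 @ x + b1)      # hidden activations
y = sigmoid(W2 @ h + b2)   # prediction in (0, 1)
```

Backpropagation runs this computation in reverse, applying the chain rule from Day 7 to attribute the output error to each weight.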

Day 9: Introduction to Generative Models & GANs Basics 

  • Introduction to generative models: what they do and why they are powerful
  • Discriminative vs. generative models: key differences with examples
  • Generative AI recap: re-emphasizing its place
  • Introduction to Generative Adversarial Networks (GANs): the core idea of generator vs. discriminator
  • GAN architecture: the generator and discriminator networks
  • The adversarial process in GANs: the game-theory aspect

Day 10: GANs - Deep Dive & Conceptual Implementation

  • The GAN objective function: the minimax game (conceptual)
  • Loss functions: binary cross-entropy for the discriminator and the generator
  • Applications of GANs: image generation, style transfer, super-resolution
  • Types of GANs: brief mention of DCGAN, WGAN, and CycleGAN for context
  • Limitations of GANs: mode collapse and training instability
  • Implementation using PyTorch: a conceptual walkthrough of a simple GAN's structure (Generator and Discriminator classes, training loop)
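A minimal sketch of the structure covered in the walkthrough, with illustrative network sizes and toy data rather than course materials; it shows the alternating discriminator/generator updates with binary cross-entropy:

```python
import torch
import torch.nn as nn

latent_dim, data_dim = 8, 2   # illustrative sizes

# Generator: maps random noise z to a fake sample
G = nn.Sequential(nn.Linear(latent_dim, 16), nn.ReLU(),
                  nn.Linear(16, data_dim))

# Discriminator: maps a sample to a real/fake probability
D = nn.Sequential(nn.Linear(data_dim, 16), nn.ReLU(),
                  nn.Linear(16, 1), nn.Sigmoid())

bce = nn.BCELoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

real = torch.randn(32, data_dim) + 3.0   # toy "real" data
ones, zeros = torch.ones(32, 1), torch.zeros(32, 1)

for _ in range(5):  # a few illustrative training steps
    # 1) Train D: push real samples toward 1, fakes toward 0
    fake = G(torch.randn(32, latent_dim))
    d_loss = bce(D(real), ones) + bce(D(fake.detach()), zeros)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train G: try to make D label fresh fakes as real
    fake = G(torch.randn(32, latent_dim))
    g_loss = bce(D(fake), ones)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Note the `detach()` in the discriminator step: it stops gradients from flowing into the generator while D is being updated, which is what makes the two updates an adversarial game rather than one joint optimization.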

Day 11: Autoencoders & Introduction to VAEs

  • Introduction to autoencoders: purpose (dimensionality reduction, feature learning)
  • Autoencoder architecture: encoder, latent space, decoder
  • Working principle: the encoding and decoding process
  • Undercomplete vs. overcomplete autoencoders; use cases (denoising, anomaly detection)
  • Advantages of autoencoders
  • Implementation of an autoencoder using PyTorch: a conceptual walkthrough (defining the encoder/decoder, basic training)
  • Introduction to variational autoencoders: how VAEs differ from standard autoencoders (probabilistic latent space)
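A sketch of an undercomplete autoencoder along the lines of the walkthrough; the 8-to-2 bottleneck and the random batch are illustrative:

```python
import torch
import torch.nn as nn

# Undercomplete autoencoder: 8-dim input squeezed through a 2-dim latent space
encoder = nn.Sequential(nn.Linear(8, 4), nn.ReLU(), nn.Linear(4, 2))
decoder = nn.Sequential(nn.Linear(2, 4), nn.ReLU(), nn.Linear(4, 8))

opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
loss_fn = nn.MSELoss()   # reconstruction loss

x = torch.randn(16, 8)        # a toy batch
for _ in range(3):
    z = encoder(x)            # compress to the latent space
    x_hat = decoder(z)        # reconstruct from the latent code
    loss = loss_fn(x_hat, x)  # how far is the reconstruction from the input?
    opt.zero_grad(); loss.backward(); opt.step()
```

The bottleneck forces the encoder to keep only the most informative structure of the input, which is where the dimensionality-reduction and feature-learning uses come from.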

Day 12: VAEs - Deep Dive & Conceptual Implementation 

  • VAE architecture: encoder (mean and variance), latent space (sampling), decoder
  • Loss functions: reconstruction loss plus KL divergence, with the purpose of each explained
  • Applications of VAEs: image generation, disentanglement, latent-space interpolation
  • Implementation of a variational autoencoder (VAE) using PyTorch: a conceptual walkthrough (the reparameterization trick, computing the KL divergence, the full loss)
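The reparameterization trick and the two-part VAE loss fit in a short sketch; the single-linear-layer encoder/decoder and dimensions are simplifications for illustration:

```python
import torch
import torch.nn as nn

# Encoder outputs a mean and log-variance for each latent dimension
enc = nn.Linear(8, 2 * 2)   # 8-dim input -> (mu, logvar) for a 2-dim latent
dec = nn.Linear(2, 8)

x = torch.randn(16, 8)
mu, logvar = enc(x).chunk(2, dim=1)   # split into the two heads

# Reparameterization trick: sample z = mu + sigma * eps, so gradients
# can flow through mu and logvar despite the random sampling step
eps = torch.randn_like(mu)
z = mu + torch.exp(0.5 * logvar) * eps

x_hat = dec(z)

# VAE loss = reconstruction term + KL divergence to the N(0, I) prior
recon = nn.functional.mse_loss(x_hat, x, reduction="sum")
kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
loss = recon + kl
```

The reconstruction term keeps outputs faithful to inputs; the KL term keeps the latent distribution close to a standard normal, which is what makes sampling new data from the prior meaningful.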

Day 13: Autoregressive Models - Statistical & Early Deep Learning 

  • Introduction to autoregressive models: predicting the next value from previous ones
  • Importance of autoregression: time series and sequential data
  • Statistical autoregression models: high-level conceptual understanding of AR, ARMA, and ARIMA (focus on the idea, not the math)
  • Deep learning autoregression models: brief mention of PixelCNN and WaveNet as early deep-learning AR models
  • Recurrent neural networks (RNNs): the basic concept of sequential processing
  • Long short-term memory networks (LSTMs): addressing vanishing gradients in RNNs (conceptual understanding of gates)
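The core autoregressive idea can be shown with a statistical AR(1) process: simulate data where each value depends on the previous one, then recover the dependence from the data. The coefficient 0.8 and noise scale are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# AR(1) process: x_t = phi * x_{t-1} + noise
phi, n = 0.8, 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(scale=0.1)

# Fit phi by least squares: regress x_t on x_{t-1}
prev, curr = x[:-1], x[1:]
phi_hat = (prev @ curr) / (prev @ prev)
```

The estimate `phi_hat` lands close to the true 0.8. Deep autoregressive models (PixelCNN, WaveNet, and later transformers) generalize this "predict the next element from the previous ones" recipe with neural networks instead of a single linear coefficient.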

Day 14: Introduction to Transformers

  • Introduction to transformers: why they emerged (parallelization; handling long-range dependencies better than RNNs)
  • Transformer architecture: the high-level encoder-decoder structure

Day 15: Transformers - Attention & Conceptual Implementation

  • Transformer implementation from scratch using PyTorch: a conceptual breakdown of the key modules (MultiheadAttention, TransformerEncoderLayer, TransformerEncoder, positional encoding) and how they build up the transformer
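A sketch of how those PyTorch modules compose, with illustrative sizes; since PyTorch ships no positional-encoding module, a minimal sinusoidal version is written by hand:

```python
import math
import torch
import torch.nn as nn

d_model, nhead = 32, 4   # illustrative model width and attention heads

# Sinusoidal positional encoding: injects token positions into embeddings,
# since self-attention alone is order-agnostic
def positional_encoding(seq_len, d_model):
    pos = torch.arange(seq_len).unsqueeze(1).float()
    div = torch.exp(torch.arange(0, d_model, 2).float()
                    * (-math.log(10000.0) / d_model))
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(pos * div)
    pe[:, 1::2] = torch.cos(pos * div)
    return pe

# One encoder layer = multi-head self-attention + feed-forward sublayers;
# the full encoder stacks several of them
layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                   batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)

x = torch.randn(8, 10, d_model)           # (batch, sequence, features)
x = x + positional_encoding(10, d_model)  # add position information
out = encoder(x)                          # self-attention over the sequence
```

Each `TransformerEncoderLayer` already wraps `MultiheadAttention` internally, which is why stacking layers with `TransformerEncoder` is enough to build the encoder side of the architecture from Day 14.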

Benefits:
  • Time-saving and cost-effective
  • Training by industry experts and corporate trainers with 10+ years of experience in the field
  • Hands-on practical exposure for better understanding
  • Adds solid value to your professional career
  • Weekend doubt-clearing sessions

For inquiries, call: 8953463074

Online Live Training Program 2025