Neural Networks - Course Details
Deep dive into neural network architectures, backpropagation, and training techniques. Understand how neural networks learn from data.
- Rating: 4.5
What you'll learn
- Perceptrons and activation functions
- Backpropagation algorithm
- Architecture design
- Regularization techniques
- Hyperparameter tuning
Requirements
- Linear algebra basics
- Python programming
Course Content
Foundations
- Introduction to neural networks and artificial neurons
- Perceptron: single-layer network, binary classification, limitations (see the sketch after this list)
- Multilayer Perceptron (MLP): architecture, hidden layers, forward propagation
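To make the perceptron topic concrete, here is a minimal sketch of the classic perceptron learning rule. NumPy and the logical-AND example are illustrative choices, not specified by the course:

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Classic perceptron rule: w += lr * (target - prediction) * x."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0  # step activation
            error = target - pred
            w += lr * error * xi
            b += lr * error
    return w, b

# Linearly separable example: logical AND
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print([1 if xi @ w + b > 0 else 0 for xi in X])  # -> [0, 0, 0, 1]
```

The rule only converges on linearly separable data, which is exactly the limitation the perceptron topic covers; XOR, for instance, needs the hidden layer introduced with the MLP.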
Activation Functions
- Sigmoid, Tanh, ReLU, Leaky ReLU, Softmax
- Choosing the right activation function (quick reference below)
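A quick NumPy reference for the listed activations (a sketch using the standard textbook formulas; the library choice is an assumption):

```python
import numpy as np

def sigmoid(z): return 1.0 / (1.0 + np.exp(-z))
def tanh(z): return np.tanh(z)
def relu(z): return np.maximum(0.0, z)
def leaky_relu(z, alpha=0.01): return np.where(z > 0, z, alpha * z)

def softmax(z):
    # Subtract the max for numerical stability; output sums to 1.
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([-2.0, 0.0, 3.0])
for f in (sigmoid, tanh, relu, leaky_relu, softmax):
    print(f.__name__, f(z))
```

A common rule of thumb the "choosing" topic likely covers: ReLU (or Leaky ReLU) in hidden layers, sigmoid for binary outputs, softmax for multi-class outputs.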
Loss Functions
- Mean Squared Error (MSE), Cross-Entropy, Hinge Loss (compared in the sketch below)
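A side-by-side sketch of the three listed losses in NumPy (the toy labels and predictions are made up for illustration):

```python
import numpy as np

def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true is one-hot; clip predictions so log(0) never occurs.
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.sum(y_true * np.log(y_pred)) / y_true.shape[0]

def hinge(y_true, scores):
    # y_true in {-1, +1}; scores are raw (unsquashed) model outputs.
    return np.mean(np.maximum(0.0, 1.0 - y_true * scores))

y_true = np.array([[1, 0], [0, 1]])          # one-hot labels
y_pred = np.array([[0.8, 0.2], [0.3, 0.7]])  # predicted probabilities
print(mse(y_true, y_pred), cross_entropy(y_true, y_pred))
print(hinge(np.array([1, -1]), np.array([0.9, -0.4])))
```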
Optimization
- Gradient descent, Stochastic Gradient Descent (SGD), Adam, RMSProp (update rules sketched below)
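An illustrative comparison of the SGD and Adam update rules on a toy 1-D quadratic; the hyperparameter defaults are the common textbook values, not the course's:

```python
import numpy as np

def sgd_step(w, grad, lr=0.1):
    return w - lr * grad

def adam_step(w, grad, state, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad        # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - b1 ** t)           # bias correction
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), (m, v, t)

# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w_sgd, w_adam, state = 0.0, 0.0, (0.0, 0.0, 0)
for _ in range(200):
    w_sgd = sgd_step(w_sgd, 2 * (w_sgd - 3))
    w_adam, state = adam_step(w_adam, 2 * (w_adam - 3), state, lr=0.1)
print(w_sgd, w_adam)  # both approach the minimum at 3.0
```

Adam's per-parameter scaling by the second-moment estimate is what distinguishes it (and RMSProp) from plain SGD.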
Regularization and Training Techniques
- Regularization: L1/L2, Dropout
- Learning rate scheduling (see the sketch after this list)
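Minimal sketches of three of these techniques; the L2 coefficient, dropout rate, and step-decay schedule shown are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_grad(w, grad, lam=1e-4):
    # L2 (weight decay) adds lam * w to the gradient of the data loss.
    return grad + lam * w

def dropout(a, p=0.5, training=True):
    # Inverted dropout: zero activations with prob p, rescale survivors
    # so the expected activation is unchanged at test time.
    if not training:
        return a
    mask = (rng.random(a.shape) >= p) / (1.0 - p)
    return a * mask

def step_decay(lr0, epoch, drop=0.5, every=10):
    # One common schedule: halve the learning rate every `every` epochs.
    return lr0 * drop ** (epoch // every)

a = np.ones(8)
print(dropout(a))                                  # ~half zeros, rest 2.0
print([step_decay(0.1, e) for e in (0, 10, 20)])   # [0.1, 0.05, 0.025]
```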
Backpropagation
- Forward propagation and backward propagation
- Weight updates and the chain rule (worked sketch after this list)
- Vanishing and exploding gradients
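To make the chain rule concrete, a from-scratch sketch that trains a tiny sigmoid MLP on XOR with manual backpropagation; the architecture, seed, and learning rate are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR is not linearly separable, so a hidden layer is required.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1 / (1 + np.exp(-z))
lr = 1.0

for epoch in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)       # hidden activations
    out = sigmoid(h @ W2 + b2)     # network output

    # Backward pass: chain rule, error term proportional to the MSE gradient
    d_out = (out - y) * out * (1 - out)   # through the output sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)    # propagated back through W2

    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(out.round(3).ravel())  # typically approaches [0, 1, 1, 0]
```

The repeated multiplication by sigmoid derivatives (at most 0.25) in `d_h` is also a small demonstration of why deep sigmoid networks suffer vanishing gradients.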
Convolutional Neural Networks (CNNs)
- CNN architecture: convolution, pooling, fully connected layers (minimal model sketched below)
- Modern CNN architectures: DenseNet
- Applications in image classification
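The course does not name a deep learning framework; assuming PyTorch for illustration, a minimal CNN of the listed conv/pool/fully-connected form, sized for MNIST-style 28x28 grayscale inputs:

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Conv -> ReLU -> Pool, twice, then a fully connected classifier."""
    def __init__(self, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x28x28 -> 16x28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16x14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = SmallCNN()
dummy = torch.randn(8, 1, 28, 28)  # batch of 8 MNIST-sized images
print(model(dummy).shape)          # torch.Size([8, 10])
```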
Recurrent Neural Networks (RNNs)
- RNN concepts and sequence modeling
- Long Short-Term Memory (LSTM) networks (see the sketch after this list)
- Gated Recurrent Unit (GRU)
- Applications in text, speech, and time-series prediction
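Again assuming PyTorch for illustration, a minimal LSTM sequence classifier in which the final hidden state feeds a linear head; the feature and class counts are placeholder values:

```python
import torch
import torch.nn as nn

class SequenceClassifier(nn.Module):
    """LSTM over a sequence; the last hidden state feeds a linear head."""
    def __init__(self, n_features=1, hidden=32, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):            # x: (batch, seq_len, n_features)
        _, (h_n, _) = self.lstm(x)   # h_n: (1, batch, hidden)
        return self.head(h_n[-1])    # logits: (batch, n_classes)

model = SequenceClassifier()
dummy = torch.randn(4, 50, 1)  # 4 sequences, 50 time steps, 1 feature
print(model(dummy).shape)      # torch.Size([4, 2])
```

Swapping `nn.LSTM` for `nn.GRU` is nearly a one-line change (the GRU has no cell state, so it returns `output, h_n` rather than `output, (h_n, c_n)`).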
Hands-On Implementations
- Perceptron implementation
- MLP classifier for classification tasks
- CNN for digit/image recognition (e.g., MNIST)
- RNN/LSTM for sequence prediction, text, or time-series tasks
- Hyperparameter tuning (layers, neurons, learning rate; search sketch below)
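One common way to tune layer sizes and learning rates is a cross-validated grid search; scikit-learn is an assumption here, chosen because it bundles a small digits dataset and an MLP classifier:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

# Search over layer sizes and learning rates with 3-fold cross-validation.
grid = GridSearchCV(
    MLPClassifier(max_iter=500, random_state=0),
    param_grid={
        "hidden_layer_sizes": [(32,), (64,), (64, 32)],
        "learning_rate_init": [1e-3, 1e-2],
    },
    cv=3,
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))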
Training Diagnostics
- Overfitting vs. underfitting
- Early stopping and batch normalization (early-stopping loop sketched below)
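A framework-agnostic sketch of early stopping with a patience counter; `train_step` and `validate` are hypothetical callables supplied by the caller:

```python
def train_with_early_stopping(train_step, validate, max_epochs=100, patience=5):
    """Stop when validation loss hasn't improved for `patience` epochs."""
    best_loss, best_epoch = float("inf"), 0
    for epoch in range(max_epochs):
        train_step()                 # one epoch of training
        val_loss = validate()        # current validation loss
        if val_loss < best_loss:
            best_loss, best_epoch = val_loss, epoch
            # In practice you would also checkpoint the model weights here.
        elif epoch - best_epoch >= patience:
            print(f"Stopping at epoch {epoch}; best was {best_epoch}")
            break
    return best_epoch, best_loss

# Toy demonstration with a faked validation-loss curve.
losses = iter([1.0, 0.8, 0.7, 0.72, 0.71, 0.73, 0.74, 0.75, 0.9, 1.0])
print(train_with_early_stopping(lambda: None, lambda: next(losses),
                                max_epochs=10, patience=3))  # best epoch 2
```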
Final Project
- End-to-end neural network project integrating CNN and/or RNN
- Focus on dataset preprocessing, model building, evaluation, and optimization
Materials
- 1 document