19CSE456
NNDL final project review PPT
- Abstract
- Mismatch between edge-computing constraints and the computational demands of CNNs
- Model compression using Knowledge Distillation
- ATCam modules
- Feature Map Accumulation (FMA)
- Maximum Intensity Projection (MIP)
- Saliency Maps
- The student model gained an accuracy boost of 9.86% over traditional KD methods
- The student matched and exceeded the teacher model by 1.52% while being only 43.2% of its size
- Literature Survey
- Dropping
- Alippi et al. and Yao S. dropped elements probabilistically
- Molchanov used a more analytical approach (second-order Taylor series)
- Enlarging
- Han et al. – Ghost maps
- Hardware-based Solutions
- Xing Wang – LCNNs on the HiSilicon Hi3519
- Alwyn Burger – FPGA
- Knowledge Distillation proposed by Hinton et al. (a minimal distillation-loss sketch follows this outline)
- Papers by Huang et al., Crowley et al., and Furlanello et al. propose further novel KD methods
- Detailed Proposed Methodology
  - Work-flow diagram
  - Algorithm
  - Details of the deep learning models used
- Experimental Environment and Evaluation metrics
- Results and Discussion
  - Comparison of results in table format
  - Learning curves over the epochs (accuracy, loss, and other metrics)
- Conclusion and Future work
- References
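
Since the project's core compression technique is knowledge distillation, a minimal sketch of the Hinton-style soft-target loss it builds on is given below. This assumes PyTorch; the `teacher`/`student` models, temperature `T = 4.0`, and weight `alpha = 0.5` are illustrative placeholders rather than the project's actual settings, and the ATCam-specific parts (FMA, MIP, saliency maps) are not shown.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hinton-style KD: blend a soft-target KL term with ordinary cross-entropy.

    T     -- temperature that softens both probability distributions
    alpha -- weight on the distillation term vs. the hard-label term
    """
    # Soft targets: KL divergence between softened teacher and student outputs.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients stay comparable to the hard-label term
    # Hard targets: cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Inside the training loop (teacher frozen, student being optimised):
# with torch.no_grad():
#     t_logits = teacher(images)
# loss = distillation_loss(student(images), t_logits, labels)
```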
Important Questions for Lab-based Components
- Construction of MLP with and without using predefined functions (a NumPy from-scratch sketch follows this list)
- Construction of CNN with different hyperparameters.
- Construction of different types of auto-encoders (a minimal Keras sketch follows this list)
- Construction of RNN and LSTM with different hyper-parameters.
- Construction of PNN.
- Different metrics and criteria for evaluating the performance of auto-encoders, CNNs, and other models.
- Visualization of metric curves and feature maps of all the above models (a plotting sketch follows this list).
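
For the "MLP without predefined functions" question, a minimal NumPy sketch with a hand-written forward and backward pass is shown below. The XOR data, layer sizes, sigmoid activation, MSE loss, and learning rate are illustrative choices, not the prescribed lab solution.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: XOR, a standard MLP sanity check.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer (2 -> 4 -> 1), weights initialised small.
W1, b1 = rng.normal(0, 0.5, (2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(0, 0.5, (4, 1)), np.zeros((1, 1))
lr = 0.5

for epoch in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)          # hidden activations
    out = sigmoid(h @ W2 + b2)        # output probabilities

    # Backward pass (MSE loss, sigmoid derivatives written out by hand)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))               # should approach [0, 1, 1, 0]
```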
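For the auto-encoder questions, a minimal Keras sketch of a plain dense auto-encoder is shown below, assuming flattened 28x28 inputs scaled to [0, 1] (e.g. MNIST); the bottleneck width, activations, and loss are illustrative, and other variants (denoising, sparse, convolutional) mainly change the inputs or layers.

```python
from tensorflow.keras import layers, models

# Plain dense auto-encoder: 784 -> 64 -> 784.
inp = layers.Input(shape=(784,))
encoded = layers.Dense(64, activation="relu")(inp)           # bottleneck / code
decoded = layers.Dense(784, activation="sigmoid")(encoded)   # reconstruction

autoencoder = models.Model(inp, decoded)
encoder = models.Model(inp, encoded)      # separate model to inspect the code
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.summary()

# Train to reconstruct the inputs themselves:
# autoencoder.fit(x_train, x_train, epochs=10,
#                 validation_data=(x_val, x_val))
```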
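For visualising metric curves, the sketch below plots training and validation accuracy/loss per epoch from a Keras-style `History.history` dict with matplotlib; the key names (`accuracy`, `val_accuracy`, `loss`, `val_loss`) assume `model.fit(..., validation_data=...)` with accuracy as a compiled metric.

```python
import matplotlib.pyplot as plt

def plot_history(history):
    """history: dict like keras History.history, e.g. model.fit(...).history"""
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

    # Accuracy curve (train vs. validation)
    ax1.plot(history["accuracy"], label="train")
    ax1.plot(history.get("val_accuracy", []), label="val")
    ax1.set(title="Accuracy", xlabel="epoch", ylabel="accuracy")
    ax1.legend()

    # Loss curve (train vs. validation)
    ax2.plot(history["loss"], label="train")
    ax2.plot(history.get("val_loss", []), label="val")
    ax2.set(title="Loss", xlabel="epoch", ylabel="loss")
    ax2.legend()

    plt.tight_layout()
    plt.show()

# Usage (assuming a compiled Keras model):
# hist = model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=20)
# plot_history(hist.history)
```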
| Component | Marks |
|---|---|
| Lab | 40 |
| Theory | 45 |
| Viva | 15 |
Viva questions will be asked from the whole portion
Coding environment: Google Colab
Location: AB2/AB1 lab (exact location will be shared tomorrow)
Theory paper pattern: 5 questions (including problems); 4 × 10 = 40 marks, 1 × 5 = 5 marks. Each question may have subdivisions.
Concentrate more on unit-2 and unit-3 (for both lab and theory)
Problems will come from:
- Perceptron (see the sketch at the end of these notes)
- Convolution and CNN (see the sketch at the end of these notes)
- HNN
- HMM
- CRF
Also, you have to remember one question and one problem for each topic.
Lab split: 15 + 15 + 10
- 15 marks: CNN / RNN
- 15 marks: Auto-encoders
- 10 marks: MLP
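
For perceptron problems, the sketch below applies the standard perceptron learning rule, w <- w + eta*(t - y)*x, to an AND-gate dataset so hand-worked iterations can be cross-checked; the zero initial weights and eta = 1 are illustrative choices.

```python
import numpy as np

# AND gate with a bias input prepended (x0 = 1).
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
t = np.array([0, 0, 0, 1], dtype=float)

w = np.zeros(3)      # [bias, w1, w2], start from zero for easy hand-checking
eta = 1.0            # learning rate

for epoch in range(10):
    for xi, ti in zip(X, t):
        y = 1.0 if w @ xi >= 0 else 0.0     # step activation
        w += eta * (ti - y) * xi            # perceptron update rule
    print(epoch, w)                         # weights after each epoch
```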
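For convolution problems, the sketch below computes a naive 2D cross-correlation (the operation CNN layers actually perform) in NumPy and uses the output-size formula floor((n - f + 2p)/s) + 1; the 5x5 input, 3x3 kernel, stride 1, and zero padding are made-up example values for checking answers by hand.

```python
import numpy as np

def conv2d(x, k, stride=1, padding=0):
    """Naive 'valid' cross-correlation for square inputs and kernels."""
    if padding:
        x = np.pad(x, padding)                    # zero-pad all sides
    n, f = x.shape[0], k.shape[0]
    out_size = (n - f) // stride + 1              # floor((n - f + 2p)/s) + 1, p already applied
    out = np.zeros((out_size, out_size))
    for i in range(out_size):
        for j in range(out_size):
            patch = x[i * stride:i * stride + f, j * stride:j * stride + f]
            out[i, j] = np.sum(patch * k)         # elementwise multiply and sum
    return out

# Example: 5x5 input, 3x3 kernel, stride 1, no padding -> 3x3 output.
x = np.arange(25, dtype=float).reshape(5, 5)
k = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]], dtype=float)
print(conv2d(x, k))
```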