
19CSE456

NNDL final project review PPT

  1. Abstract
    • Incompatible nature of edge computing and CNNs
    • Model compression using Knowledge Distillation
    • ATCam modules
    • Feature Map Accumulation (FMA)
    • Maximum Intensity Projection (MIP)
    • Saliency Maps
    • Student model gained an accuracy boost of 9.86% over traditional KD methods
    • Student matched and exceeded the Teacher model by 1.52% while being 43.2% of its size
  2. Literature Survey
    • Dropping
      • Alippi et al. and Yao S. dropped elements probabilistically
      • Molchanov used a more analytical approach (second-order Taylor series)
    • Enlarging
      • Han et al. – Ghost maps
    • Hardware-based solutions
      • Xing Wang – LCNNs on the HiSilicon Hi3519
      • Alwyn Burger – FPGA
    • Knowledge Distillation, proposed by Hinton et al.
      • Papers by Huang et al., Crowley et al., and Furlanello et al. propose some novel methods.
  3. Detailed Proposed Methodology
    a. Work-flow diagram
    b. Algorithm
    c. Utilized Deep Learning model details
  4. Experimental Environment and Evaluation metrics
  5. Results and Discussion
    a. Comparison of results in table format
    b. Learning curves over the epochs (Accuracy, Loss, and other metrics)
  6. Conclusion and Future work
  7. References
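Since the project centres on Knowledge Distillation, a minimal sketch of a Hinton-style KD loss may be useful for the review. This is an illustrative NumPy assumption, not the project's actual code; the function names, temperature T, and blend weight alpha are all hypothetical choices:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Hinton-style KD: blend soft cross-entropy against the teacher's
    # temperature-softened distribution with hard cross-entropy on true labels
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    soft = -(p_teacher * log_p_student).sum(axis=-1).mean() * T * T  # T^2 rescales gradients
    log_p = np.log(softmax(student_logits))
    hard = -log_p[np.arange(len(labels)), labels].mean()
    return alpha * soft + (1 - alpha) * hard
```

The student is trained to minimize this combined loss; T and alpha are tuned hyperparameters.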

Important Questions for Lab-based Components

  1. Construction of MLP with and without using predefined functions
  2. Construction of CNN with different hyperparameters
  3. Construction of different types of auto-encoders
  4. Construction of RNN and LSTM with different hyperparameters
  5. Construction of PNN
  6. Different metrics and criteria for evaluating the performance of the auto-encoders, CNN, and others
  7. Visualization of metric curves and feature maps of all the above models
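For question 1, here is a minimal sketch of an MLP built without predefined layer functions: plain NumPy, a 2-4-1 network trained on XOR with full-batch gradient descent. The layer sizes, learning rate, seed, and epoch count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR data: the classic task a single-layer perceptron cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# 2-4-1 network, weights drawn from a small Gaussian
W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: sigmoid output + binary cross-entropy gives d(out) = out - y
    d_out = out - y
    dW2 = h.T @ d_out; db2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * h * (1 - h)   # chain rule through the hidden sigmoid
    dW1 = X.T @ d_h; db1 = d_h.sum(axis=0)
    # gradient descent step
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
```

With these settings the network typically separates XOR; a different seed or hidden width can change convergence.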
Component   Marks
Lab         40
Theory      45
Viva        15

The viva will cover the entire portion

Coding environment: Google Colab

Location: AB2/AB1 lab (exact location will be shared tomorrow)

Theory paper pattern: 5 questions (including problems); 4 × 10 = 40 marks, 1 × 5 = 5 marks. Each question may have subdivisions.

Concentrate more on Unit 2 and Unit 3 (for both lab and theory)

Problems will be coming from

  1. Perceptron
  2. Convolution and CNN
  3. HNN
  4. HMM
  5. CRF
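For the convolution problems, the valid-mode output size is (H − K)/S + 1 per dimension for input H, kernel K, stride S. A from-scratch sketch (cross-correlation, which is what CNN "convolution" layers actually compute; all names here are illustrative):

```python
import numpy as np

def conv2d(image, kernel, stride=1):
    # Valid-mode 2D cross-correlation: slide the kernel, sum elementwise products
    kh, kw = kernel.shape
    ih, iw = image.shape
    oh = (ih - kh) // stride + 1   # output height: (H - K) / S + 1
    ow = (iw - kw) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i * stride:i * stride + kh, j * stride:j * stride + kw]
            out[i, j] = (patch * kernel).sum()
    return out

img = np.arange(16, dtype=float).reshape(4, 4)
result = conv2d(img, np.ones((2, 2)))  # 3x3 output; top-left = 0 + 1 + 4 + 5 = 10
```

Working one output cell by hand, as above, is good practice for the exam-style problems.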

Also, you have to remember one question and one problem for each topic.

Lab split: 15 + 15 + 10
  15 - CNN/RNN
  15 - Auto-encoders
  10 - MLP