Title | Transferring Knowledge between Neural Network Architectures |
Author | Suraj SRINIVAS |
Director of thesis | Dr. François Fleuret
Co-director of thesis | |
Summary of thesis | Deep neural networks are powerful tools for statistical modelling. However, they suffer from several problems, including poor performance on small datasets, high training cost, and large model sizes. Knowledge transfer methods have recently emerged as a unified framework for addressing these problems: representations are transferred from a previously trained "teacher" model to an untrained "student" model. In this research, we propose to build algorithmic tools that improve knowledge transfer between neural networks. We aim to obtain improvements by using architecture-independent quantities, which enable knowledge transfer between neural networks of different architectures. In our initial work we apply one such quantity, the Jacobian, and find that it indeed improves knowledge transfer. A minimal illustrative sketch of a Jacobian-matching loss follows this record. |
Status | middle |
Administrative deadline for the defence | 2021
URL | https://surajsrinivas.wordpress.com/ |
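The summary above describes Jacobian-based knowledge transfer only at a high level. The following is a minimal, hypothetical PyTorch sketch of one way such a loss could be written, assuming teacher and student classifiers that map a batch of inputs to class logits. The function name, the temperature, the weight `alpha`, and the choice of matching only the top-class row of the input Jacobian are illustrative assumptions, not the thesis's actual method.

```python
# Illustrative sketch only: combines standard soft-target distillation with
# matching of input-gradients ("Jacobians") between teacher and student.
import torch
import torch.nn.functional as F


def jacobian_matching_loss(student, teacher, x, temperature=4.0, alpha=1.0):
    """Distillation loss with an additional Jacobian-matching term.

    `student` and `teacher` are modules returning (batch, classes) logits;
    only the student's parameters are meant to be updated.
    """
    x = x.detach().clone().requires_grad_(True)  # enable d(output)/d(input)

    s_logits = student(x)
    t_logits = teacher(x)  # keep the graph: we need the teacher's input-gradient

    # 1) Soft-target term: match softened teacher and student output distributions.
    soft_loss = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=1),
        F.softmax(t_logits / temperature, dim=1).detach(),
        reduction="batchmean",
    ) * temperature ** 2

    # 2) Jacobian term: match gradients of the teacher's top-class logit
    #    w.r.t. the input (one row of the full Jacobian per example).
    top_class = t_logits.argmax(dim=1, keepdim=True)
    s_sel = s_logits.gather(1, top_class).sum()
    t_sel = t_logits.gather(1, top_class).sum()

    s_grad = torch.autograd.grad(s_sel, x, create_graph=True)[0]
    t_grad = torch.autograd.grad(t_sel, x)[0]

    s_grad = F.normalize(s_grad.flatten(1), dim=1)
    t_grad = F.normalize(t_grad.flatten(1), dim=1).detach()
    jacobian_loss = F.mse_loss(s_grad, t_grad)

    return soft_loss + alpha * jacobian_loss
```

In a training loop, this loss would be computed on each batch and backpropagated through the student only; the teacher stays frozen. Because the loss depends only on the models' inputs and outputs, it requires no correspondence between the two architectures' internal layers.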