Keywords: light-weight network, depth estimation, knowledge distillation, real-time depth estimation, auxiliary data

Boosting Light-Weight Depth Estimation Via Knowledge Distillation, https://arxiv.org/abs/2105.06143
Junjie Hu, Chenyou Fan, Hualie Jiang, Xiyue Guo, Yuan Gao, Xiangyong Lu, Tin Lun Lam
Introduction
This repo provides trained models and evaluation code for a lightweight depth estimation model.
We aim to make depth estimation both computationally efficient and accurate.
To this end, we developed a knowledge distillation method that leverages auxiliary labeled or unlabeled data.
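To make the idea concrete, here is a minimal sketch of how such a distillation objective might look. This is an illustrative assumption, not the paper's exact loss: it combines an L1 term matching the student to the teacher's predictions (usable on unlabeled auxiliary images) with an optional supervised term when ground-truth depth is available; the function name, `alpha` weight, and L1 choice are all hypothetical.

```python
import numpy as np

def distillation_loss(student_pred, teacher_pred, gt=None, alpha=0.5):
    """Hypothetical KD objective for depth estimation (illustrative only).

    kd: L1 distance between student and teacher depth maps -- this term
        needs no ground truth, so it also works on auxiliary unlabeled images.
    sup: L1 distance to ground-truth depth, used only when labels exist.
    """
    kd = np.abs(student_pred - teacher_pred).mean()
    if gt is None:
        # Unlabeled auxiliary sample: learn from the teacher alone.
        return kd
    sup = np.abs(student_pred - gt).mean()
    # Weighted mix of supervised and distillation terms.
    return alpha * sup + (1 - alpha) * kd
```

In this sketch, unlabeled auxiliary images contribute only the teacher-mimicking term, which is what allows training to proceed even without the original labeled set.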
We found that:
- Even without access to the original training set, we can still successfully apply KD with enough auxiliary unlabeled samples as long as they