Inter-region affinity distillation for road marking segmentation
Main Authors:
Other Authors:
Format: Conference or Workshop Item
Language: English
Published: 2022
Subjects:
Online Access: https://hdl.handle.net/10356/161798 https://openaccess.thecvf.com/menu
Institution: Nanyang Technological University
Summary: We study the problem of distilling knowledge from a large deep teacher network to a much smaller student network for the task of road marking segmentation. In this work, we explore a novel knowledge distillation (KD) approach that can transfer ‘knowledge’ on scene structure more effectively from a teacher to a student model. Our method is known as Inter-Region Affinity KD (IntRA-KD). It decomposes a given road scene image into different regions and represents each region as a node in a graph. An inter-region affinity graph is then formed by establishing pairwise relationships between nodes based on their similarity in feature distribution. To learn structural knowledge from the teacher network, the student is required to match the graph generated by the teacher. The proposed method shows promising results on three large-scale road marking segmentation benchmarks, i.e., ApolloScape, CULane and LLAMAS, by taking various lightweight models as students and ResNet-101 as the teacher. IntRA-KD consistently brings higher performance gains on all lightweight models, compared to previous distillation methods. Our code is available at https://github.com/cardwing/Codes-for-IntRA-KD.
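The summary above outlines the main steps of IntRA-KD: regions become graph nodes, pairwise similarities form an affinity graph, and the student is trained to match the teacher's graph. As a rough illustration only, the sketch below shows one plausible way to build region nodes by masked average pooling, form a cosine-similarity affinity graph, and penalise the mismatch between student and teacher graphs with an MSE loss. The pooling scheme, similarity measure, loss, and all function names here are assumptions, not the authors' implementation (see the linked repository for that); region masks (e.g., per-class road-marking masks) and same-resolution feature maps are assumed to be given.

```python
# Hedged sketch of an inter-region affinity distillation loss.
# Assumes PyTorch; all names and design choices below are illustrative.
import torch
import torch.nn.functional as F


def region_nodes(feat, masks, eps=1e-6):
    """feat: (B, C, H, W) feature map; masks: (B, R, H, W) binary region masks.
    Returns (B, R, C) region nodes via masked average pooling."""
    m = masks.float().unsqueeze(2)              # (B, R, 1, H, W)
    f = feat.unsqueeze(1)                       # (B, 1, C, H, W)
    summed = (m * f).sum(dim=(-2, -1))          # (B, R, C)
    area = m.sum(dim=(-2, -1)).clamp(min=eps)   # (B, R, 1), avoid divide-by-zero
    return summed / area


def affinity_graph(nodes):
    """nodes: (B, R, C) -> (B, R, R) pairwise cosine-similarity affinities."""
    n = F.normalize(nodes, dim=-1)
    return torch.bmm(n, n.transpose(1, 2))


def inter_region_affinity_kd_loss(student_feat, teacher_feat, masks):
    """Match the student's inter-region affinity graph to the teacher's."""
    a_s = affinity_graph(region_nodes(student_feat, masks))
    a_t = affinity_graph(region_nodes(teacher_feat, masks))
    return F.mse_loss(a_s, a_t.detach())        # no gradient into the teacher


if __name__ == "__main__":
    # Toy usage: masks could come from down-sampled road-marking label maps.
    s = torch.randn(2, 64, 36, 100)    # student features
    t = torch.randn(2, 256, 36, 100)   # teacher features (wider channels are fine)
    masks = torch.rand(2, 5, 36, 100) > 0.5
    print(inter_region_affinity_kd_loss(s, t, masks).item())
```

One convenient property of distilling an R x R affinity graph rather than raw features is that the teacher and student channel widths need not match, which fits the paper's setting of lightweight students paired with a ResNet-101 teacher.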