Knowledge Concentration: Learning 100K Object Classifiers in a Single CNN

Type: Preprint

Publication Date: 2017-11-21

Citations: 25

DOI: https://doi.org/10.48550/arXiv.1711.07607

Locations

  • arXiv (Cornell University)
  • DataCite API

Similar Works

  • Feature Matters: A Stage-by-Stage Approach for Knowledge Transfer (2018). Mengya Gao, Yujun Shen, Quanquan Li, Chen Change Loy, Xiaoou Tang
  • Beyond Classification: Knowledge Distillation using Multi-Object Impressions (2021). Gaurav Kumar Nayak, Monish Keswani, Sharan Seshadri, Anirban Chakraborty
  • An Embarrassingly Simple Approach for Knowledge Distillation (2018). Mengya Gao, Yujun Shen, Quanquan Li, Junjie Yan, Liang Wan, Dahua Lin, Chen Change Loy, Xiaoou Tang
  • Distilling Object Detectors With Global Knowledge (2022). Sanli Tang, Zhongyu Zhang, Zhanzhan Cheng, Jing Lü, Yunlu Xu, Yi Niu, Fan He
  • Distilling Knowledge via Knowledge Review (2021). Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia
  • HEAD: HEtero-Assists Distillation for Heterogeneous Object Detectors (2022). Luting Wang, Xiaojie Li, Yue Liao, Zeren Jiang, Jianlong Wu, Fei Wang, Qian Chen, Si Liu
  • Instance-Conditional Knowledge Distillation for Object Detection (2021). Zijian Kang, Peizhen Zhang, Xiangyu Zhang, Jian Sun, Nanning Zheng
  • Knowledge Distillation for Object Detection via Rank Mimicking and Prediction-Guided Feature Imitation (2022). Gang Li, Xiang Li, Yujie Wang, Shanshan Zhang, Yichao Wu, Liang Ding
  • Hybrid Knowledge Routed Modules for Large-scale Object Detection (2018). Chenhan Jiang, Hang Xu, Xiaodan Liang, Liang Lin
  • Better Teacher Better Student: Dynamic Prior Knowledge for Knowledge Distillation (2022). Zengyu Qiu, Xinzhu Ma, Kun-Lin Yang, Chunya Liu, Jun Hou, Shuai Yi, Wanli Ouyang
  • Knowledge Distillation for Object Detection via Rank Mimicking and Prediction-guided Feature Imitation (2021). Gang Li, Xiang Li, Yujie Wang, Shanshan Zhang, Yichao Wu, Liang Ding
  • Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation (2021). Mingi Ji, Seungjae Shin, Seunghyun Hwang, Gibeom Park, Il-Chul Moon
  • Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation (2023). Shengcao Cao, Mengtian Li, James Hays, Deva Ramanan, Yu-Xiong Wang, Liang-Yan Gui
  • Localization Distillation for Object Detection (2022). Zhaohui Zheng, Rongguang Ye, Ping Wang, Jun Wang, Dongwei Ren, Wangmeng Zuo
  • Localization Distillation for Dense Object Detection (2022). Zhaohui Zheng, Rongguang Ye, Ping Wang, Dongwei Ren, Wangmeng Zuo, Qibin Hou, Ming-Ming Cheng
  • Frequency Attention for Knowledge Distillation (2024). Cuong The Pham, Van Anh Nguyen, Trung Le, Dinh Phung, Gustavo Carneiro, Thanh-Toan Do