DropBlock: A regularization method for convolutional networks

Type: Preprint

Publication Date: October 2018 (per arXiv identifier 1810.12890)

Citations: 414

DOI: https://doi.org/10.48550/arxiv.1810.12890

Locations

  • arXiv (Cornell University)
  • DataCite API
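
For context, DropBlock zeroes out contiguous square regions of a feature map rather than independent units, which suits convolutional layers where nearby activations are spatially correlated and standard dropout is therefore less effective. Below is a minimal single-channel NumPy sketch of the idea; the function and parameter names are illustrative, not the paper's released API, and the seed-probability `gamma` follows the paper's expected-drop-rate formula:

```python
import numpy as np

def dropblock(x, block_size=3, drop_prob=0.1, rng=None):
    """Simplified DropBlock for a 2-D feature map x of shape (H, W).

    Contiguous block_size x block_size regions are zeroed instead of
    independent units, forcing the network to rely on context outside
    the dropped blocks. Names here are illustrative, not the paper's code.
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w = x.shape
    # gamma: per-position seed probability chosen so the expected fraction
    # of dropped units approximates drop_prob (as derived in the paper).
    gamma = (drop_prob / block_size**2) * (h * w) / (
        (h - block_size + 1) * (w - block_size + 1))
    # Sample block top-left corners only where a full block fits.
    seed = np.zeros((h, w), dtype=bool)
    seed[:h - block_size + 1, :w - block_size + 1] = (
        rng.random((h - block_size + 1, w - block_size + 1)) < gamma)
    # Expand each sampled seed into a block_size x block_size zero region.
    mask = np.ones((h, w))
    for y, xc in zip(*np.nonzero(seed)):
        mask[y:y + block_size, xc:xc + block_size] = 0.0
    # Rescale kept activations to preserve the expected activation sum.
    kept = mask.sum()
    if kept > 0:
        x = x * mask * (mask.size / kept)
    return x
```

With `drop_prob=0` the mask is all ones and the input passes through unchanged; at training time the paper additionally schedules the drop probability from 0 up to its target value.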

Similar Works

  • DropFilter: A Novel Regularization Method for Learning Convolutional Neural Networks (2018) by Hengyue Pan, Hui Jiang, Xin Niu, Yong Dou
  • DropCluster: A structured dropout for convolutional networks (2020) by Liyan Chen, Philip Gautier, Sergül Aydöre
  • DropFilter: Dropout for Convolutions (2018) by Zhengsu Chen, Jianwei Niu, Qi Tian
  • Effective and Efficient Dropout for Deep Convolutional Neural Networks (2019) by Shaofeng Cai, Jinyang Gao, Meihui Zhang, Wei Wang, Gang Chen, Beng Chin Ooi
  • R-Block: Regularized Block of Dropout for convolutional networks (2023) by Liqi Wang, Qiya Hu
  • AutoDropout: Learning Dropout Patterns to Regularize Deep Networks (2021) by Hieu Pham, Quoc V. Le
  • Avoiding Overfitting: A Survey on Regularization Methods for Convolutional Neural Networks (2022) by Claudio Filipi Gonçalves dos Santos, João Paulo Papa
  • FocusedDropout for Convolutional Neural Network (2022) by Minghui Liu, Tianshu Xie, Xuan Cheng, Jiali Deng, Meiyi Yang, Xiaomin Wang, Ming Liu
  • FocusedDropout for Convolutional Neural Network (2021) by Tianshu Xie, Minghui Liu, Jiali Deng, Xuan Cheng, Xiaomin Wang, Ming Liu
  • Shakeout: A New Approach to Regularized Deep Neural Network Training (2019) by Guoliang Kang, Jun Li, Dacheng Tao
  • MaxDropout: Deep Neural Network Regularization Based on Maximum Output Values (2020) by Claudio Filipi Goncalves Santos, Danilo Colombo, Mateus Roder, João Paulo Papa
  • MaxDropout: Deep Neural Network Regularization Based on Maximum Output Values (2021) by Claudio Filipi Goncalves do Santos, Danilo Colombo, Mateus Roder, João Paulo Papa
  • TargetDrop: A Targeted Regularization Method for Convolutional Neural Networks (2022) by Hui Zhu, Xiaofang Zhao
  • TargetDrop: A Targeted Regularization Method for Convolutional Neural Networks (2020) by Hui Zhu, Xiaofang Zhao
  • Dropout Reduces Underfitting (2023) by Zhuang Liu, Zhiqiu Xu, Joseph W. Jin, Zhiqiang Shen, Trevor Darrell
  • Generalized Dropout (2016) by Suraj Srinivas, R. Venkatesh Babu
  • Dropout with Tabu Strategy for Regularizing Deep Neural Networks (2018) by Zongjie Ma, Abdul Sattar, Jun Zhou, Qingliang Chen, Kaile Su

Works That Cite This (131)

  • An overview of mixing augmentation methods and augmentation strategies (2022) by Dominik Lewy, Jacek Mańdziuk
  • Beyond Dropout: Feature Map Distortion to Regularize Deep Neural Networks (2020) by Yehui Tang, Yunhe Wang, Yixing Xu, Boxin Shi, Chao Xu, Chunjing Xu, Chang Xu
  • TransFER: Learning Relation-aware Facial Expression Representations with Transformers (2021) by Fanglei Xue, Qiangchang Wang, Guodong Guo
  • Instance-Aware, Context-Focused, and Memory-Efficient Weakly Supervised Object Detection (2020) by Zhongzheng Ren, Zhiding Yu, Xiaodong Yang, Ming-Yu Liu, Yong Jae Lee, Alexander G. Schwing, Jan Kautz
  • Exploring Complementary Strengths of Invariant and Equivariant Representations for Few-Shot Learning (2021) by Mamshad Nayeem Rizve, Salman Khan, Fahad Shahbaz Khan, Mubarak Shah
  • Deeply-Supervised Knowledge Synergy (2019) by Dawei Sun, Anbang Yao, Aojun Zhou, Hao Zhao
  • Scheduled DropHead: A Regularization Method for Transformer Models (2020) by Wangchunshu Zhou, Tao Ge, Furu Wei, Ming Zhou, Ke Xu
  • Convolutional Neural Networks With Dynamic Regularization (2020) by Yi Wang, Zhen-Peng Bian, Junhui Hou, Lap‐Pui Chau
  • Tensor Dropout for Robust Learning (2021) by Arinbjörn Kolbeinsson, Jean Kossaifi, Yannis Panagakis, Adrian Bulat, Animashree Anandkumar, Ioanna Tzoulaki, Paul M. Matthews
  • NATS-Bench: Benchmarking NAS Algorithms for Architecture Topology and Size (2021) by Xuanyi Dong, Lu Liu, Katarzyna Musiał, Bogdan Gabryś