Smooth activations and reproducibility in deep networks

Type: Preprint

Publication Date: 2020-01-01

Citations: 7

DOI: https://doi.org/10.48550/arxiv.2010.09931

Locations

  • arXiv (Cornell University)
  • DataCite API

Similar Works

  • Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs) (2015). Djork-Arné Clevert, Thomas Unterthiner, Sepp Hochreiter
  • Piecewise Linear Units Improve Deep Neural Networks (2021). Jordan Inturrisi, Suiyang Khoo, Abbas Z. Kouzani, Riccardo M. Pagliarella
  • When does gradient descent with logistic loss interpolate using deep networks with smoothed ReLU activations? (2021). Niladri S. Chatterji, Philip M. Long, Peter L. Bartlett
  • Gradients explode - Deep Networks are shallow - ResNet explained (2017). George Philipp, Dawn Song, Jaime Carbonell
  • Hysteresis Activation Function for Efficient Inference (2024). Moshe Kimhi, Idan Kashani, Avi Mendelson, Chaim Baskin
  • Householder-Absolute Neural Layers For High Variability and Deep Trainability (2021). Yueyao Yu, Yin Zhang
  • E-swish: Adjusting Activations to Different Network Depths (2018). Eric Alcaide
  • Farkas layers: don't shift the data, fix the geometry (2019). Aram-Alexandre Pooladian, Chris Finlay, Adam M. Oberman
  • Adaptive Blending Units: Trainable Activation Functions for Deep Neural Networks (2018). Leon René Sütfeld, Flemming Brieger, Holger Finger, Sonja Füllhase, Gordon Pipa
  • Stable and Robust Deep Learning By Hyperbolic Tangent Exponential Linear Unit (TeLU) (2024). Alfredo Fernandez, Ankur Mali
  • FReLU: Flexible Rectified Linear Units for Improving Convolutional Neural Networks (2017). Suo Qiu, Xiangmin Xu, Bolun Cai
  • Flexible Rectified Linear Units for Improving Convolutional Neural Networks (2017). Suo Qiu, Bolun Cai
  • Empirical Evaluation of Rectified Activations in Convolutional Network (2015). Bing Xu, Naiyan Wang, Tianqi Chen, Mu Li
  • Rational neural networks (2020). Nicolas Boullé, Yuji Nakatsukasa, Alex Townsend
  • Learning specialized activation functions with the Piecewise Linear Unit (2021). Yucong Zhou, Zezhou Zhu, Zhao Zhong
  • Data-aware customization of activation functions reduces neural network error (2023). Fuchang Gao, Boyu Zhang