
Activation function


In computational networks, the activation function of a node defines the output of that node given an input or a set of inputs. A standard computer chip circuit can be seen as a digital network of activation functions whose output is either 1 ("on") or 0 ("off") depending on the input. This is similar to the behavior of the linear perceptron in neural networks. However, only nonlinear activation functions allow such networks to compute nontrivial problems using a small number of nodes. In artificial neural networks, this function is also called the transfer function.
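A minimal sketch of this definition in NumPy (the inputs, weights, and bias below are arbitrary example values, not taken from this article): a node's output is the activation function applied to a weighted sum of its inputs, and replacing the binary step with a logistic activation turns the hard 0/1 gate into a smooth nonlinear unit.

    import numpy as np

    def binary_step(z):
        # Digital-circuit-like activation: 1 if z >= 0, else 0.
        return np.where(z >= 0.0, 1.0, 0.0)

    def logistic(z):
        # Logistic (sigmoid) activation: squashes z into (0, 1).
        return 1.0 / (1.0 + np.exp(-z))

    def node_output(inputs, weights, bias, activation):
        # One node: apply the activation to the weighted sum of the inputs.
        return activation(np.dot(weights, inputs) + bias)

    x = np.array([0.5, -1.2, 3.0])   # example inputs
    w = np.array([0.4, 0.3, -0.2])   # example weights
    b = 0.1                          # example bias

    print(node_output(x, w, b, binary_step))  # hard 0/1 output, like a perceptron
    print(node_output(x, w, b, logistic))     # smooth output in (0, 1)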

Functions

The following activation functions are functions of one fold x from the previous layer or layers, i.e. functions of a single variable. For each one, the defining equation, its derivative, and its range are listed below; a short code sketch of a few of them follows the table.

Identity: f(x) = x. Derivative f'(x) = 1. Range (−∞, ∞).
Binary step: f(x) = 0 for x < 0 and 1 for x ≥ 0, i.e. f(x) = H(x). Derivative f'(x) = 0 for x ≠ 0 (undefined at x = 0). Range {0, 1}.
Logistic function (also called the sigmoid or soft step): f(x) = σ(x) = 1 / (1 + e^(−x)). Derivative f'(x) = f(x)(1 − f(x)). Range (0, 1).
Hyperbolic tangent (tanh): f(x) = tanh(x) = (e^x − e^(−x)) / (e^x + e^(−x)). Derivative f'(x) = 1 − f(x)^2. Range (−1, 1).
Arctangent: f(x) = tan^(−1)(x). Derivative f'(x) = 1 / (x^2 + 1). Range (−π/2, π/2).
Softsign[1][2]: f(x) = x / (1 + |x|). Derivative f'(x) = 1 / (1 + |x|)^2. Range (−1, 1).
Inverse square root unit (ISRU)[3]: f(x) = x / √(1 + αx^2). Derivative f'(x) = (1 / √(1 + αx^2))^3. Range (−1/√α, 1/√α).
Rectified linear unit (ReLU): f(x) = 0 for x < 0 and x for x ≥ 0, i.e. f(x) = max(0, x). Derivative f'(x) = 0 for x < 0 and 1 for x ≥ 0. Range [0, ∞).
Leaky ReLU: f(x) = 0.01x for x < 0 and x for x ≥ 0. Derivative f'(x) = 0.01 for x < 0 and 1 for x ≥ 0. Range (−∞, ∞).
Parametric ReLU (PReLU)[4]: f(α, x) = αx for x < 0 and x for x ≥ 0. Derivative f'(α, x) = α for x < 0 and 1 for x ≥ 0. Range (−∞, ∞).
Randomized leaky ReLU (RReLU)[5]: f(α, x) = αx for x < 0 and x for x ≥ 0, where α is a stochastic variable drawn from a uniform distribution at training time and fixed to the expectation value of that distribution at test time. Derivative f'(α, x) = α for x < 0 and 1 for x ≥ 0. Range (−∞, ∞).
Exponential linear unit (ELU)[6]: f(α, x) = α(e^x − 1) for x ≤ 0 and x for x > 0. Derivative f'(α, x) = f(α, x) + α for x ≤ 0 and 1 for x > 0. Range (−α, ∞).
Scaled exponential linear unit (SELU)[7]: f(α, x) = λα(e^x − 1) for x < 0 and λx for x ≥ 0, with λ = 1.0507 and α = 1.67326. Derivative f'(α, x) = λα e^x for x < 0 and λ for x ≥ 0. Range (−λα, ∞).
S-shaped rectified linear unit (SReLU)[8]: f(x) = t_l + a_l(x − t_l) for x ≤ t_l, x for t_l < x < t_r, and t_r + a_r(x − t_r) for x ≥ t_r, where t_l, a_l, t_r, a_r are parameters. Derivative f'(x) = a_l for x ≤ t_l, 1 for t_l < x < t_r, and a_r for x ≥ t_r. Range (−∞, ∞).
Inverse square root linear unit (ISRLU)[3]: f(x) = x / √(1 + αx^2) for x < 0 and x for x ≥ 0. Derivative f'(x) = (1 / √(1 + αx^2))^3 for x < 0 and 1 for x ≥ 0. Range (−1/√α, ∞).
Adaptive piecewise linear (APL)[9]: f(x) = max(0, x) + Σ_{s=1..S} a^s max(0, −x + b^s), where the a^s and b^s are learned parameters. Derivative f'(x) = H(x) − Σ_{s=1..S} a^s H(−x + b^s). Range (−∞, ∞).
SoftPlus[10]: f(x) = ln(1 + e^x). Derivative f'(x) = 1 / (1 + e^(−x)) = σ(x). Range (0, ∞).
Bent identity: f(x) = (√(x^2 + 1) − 1) / 2 + x. Derivative f'(x) = x / (2√(x^2 + 1)) + 1. Range (−∞, ∞).
Sigmoid-weighted linear unit (SiLU)[11] (also called Swish[12]): f(x) = x σ(x). Derivative f'(x) = f(x) + σ(x)(1 − f(x)). Range approximately [−0.28, ∞).
SoftExponential[13]: f(α, x) = −ln(1 − α(x + α)) / α for α < 0, x for α = 0, and (e^(αx) − 1) / α + α for α > 0. Derivative f'(α, x) = 1 / (1 − α(α + x)) for α < 0 and e^(αx) for α ≥ 0. Range (−∞, ∞).
Sinusoid: f(x) = sin(x). Derivative f'(x) = cos(x). Range [−1, 1].
Sinc: f(x) = 1 for x = 0 and sin(x)/x for x ≠ 0. Derivative f'(x) = 0 for x = 0 and cos(x)/x − sin(x)/x^2 for x ≠ 0. Range approximately [−0.217, 1].
Gaussian: f(x) = e^(−x^2). Derivative f'(x) = −2x e^(−x^2). Range (0, 1].

Notes:
- H denotes the Heaviside (unit) step function.
- σ denotes the logistic function.
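As a concrete illustration, the sketch below implements a few of the single-variable activations from the table in NumPy; the vectorized forms, the default leak slope of 0.01, and the default α = 1.0 for ELU are choices made for this example rather than values fixed by the table.

    import numpy as np

    def relu(x):
        # ReLU: 0 for x < 0, x for x >= 0 (elementwise maximum with 0).
        return np.maximum(0.0, x)

    def leaky_relu(x, slope=0.01):
        # Leaky ReLU: slope * x for x < 0, x otherwise.
        return np.where(x < 0.0, slope * x, x)

    def elu(x, alpha=1.0):
        # ELU: alpha * (exp(x) - 1) for x < 0, x otherwise.
        return np.where(x < 0.0, alpha * (np.exp(x) - 1.0), x)

    def softplus(x):
        # SoftPlus: ln(1 + exp(x)), a smooth approximation of ReLU.
        return np.log1p(np.exp(x))

    def silu(x):
        # SiLU / Swish: x * sigmoid(x).
        return x / (1.0 + np.exp(-x))

    x = np.linspace(-3.0, 3.0, 7)
    for f in (relu, leaky_relu, elu, softplus, silu):
        print(f.__name__, np.round(f(x), 3))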

The following activation functions are not functions of one fold x from the previous layer or layers; they depend on a whole vector of inputs. A short code sketch of both follows.

Softmax: f_i(x) = e^(x_i) / Σ_{j=1..J} e^(x_j) for i = 1, …, J. Derivative ∂f_i(x)/∂x_j = f_i(x)(δ_ij − f_j(x)), where δ is the Kronecker delta. Range (0, 1).
Maxout[14]: f(x) = max_i x_i. Derivative ∂f/∂x_j = 1 if j = argmax_i x_i and 0 otherwise. Range (−∞, ∞).
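A minimal NumPy sketch of both functions; the max-subtraction in softmax is a standard numerical-stability trick rather than part of the definition, and the weights W and biases b of the maxout unit are arbitrary example values.

    import numpy as np

    def softmax(x):
        # Softmax over a vector: exp(x_i) / sum_j exp(x_j).
        z = x - np.max(x)  # shift by the maximum for numerical stability
        e = np.exp(z)
        return e / np.sum(e)

    def maxout(x, W, b):
        # Maxout unit: the maximum of several affine pre-activations W_i . x + b_i.
        return np.max(W @ x + b)

    x = np.array([1.0, 2.0, 3.0])
    print(softmax(x))                 # entries lie in (0, 1) and sum to 1

    W = np.array([[0.5, -0.2, 0.1],   # two example linear pieces
                  [-0.3, 0.4, 0.2]])
    b = np.array([0.0, 0.1])
    print(maxout(x, W, b))            # the larger of the two pieces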

See also

References

  1. Bergstra, James; Desjardins, Guillaume; Lamblin, Pascal; Bengio, Yoshua. Quadratic polynomials learn better image features. Technical Report 1337. Département d'Informatique et de Recherche Opérationnelle, Université de Montréal. 2009.
  2. Glorot, Xavier; Bengio, Yoshua. Understanding the difficulty of training deep feedforward neural networks (PDF). International Conference on Artificial Intelligence and Statistics (AISTATS'10). Society for Artificial Intelligence and Statistics. 2010.
  3. Carlile, Brad; Delamarter, Guy; Kinney, Paul; Marti, Akiko; Whitney, Brian. Improving Deep Learning by Inverse Square Root Linear Units (ISRLUs). 2017-11-09. arXiv:1710.09967 [cs.LG].
  4. He, Kaiming; Zhang, Xiangyu; Ren, Shaoqing; Sun, Jian. Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification. 2015-02-06. arXiv:1502.01852 [cs.CV].
  5. Xu, Bing; Wang, Naiyan; Chen, Tianqi; Li, Mu. Empirical Evaluation of Rectified Activations in Convolutional Network. 2015-05-04. arXiv:1505.00853 [cs.LG].
  6. Clevert, Djork-Arné; Unterthiner, Thomas; Hochreiter, Sepp. Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs). 2015-11-23. arXiv:1511.07289 [cs.LG].
  7. Klambauer, Günter; Unterthiner, Thomas; Mayr, Andreas; Hochreiter, Sepp. Self-Normalizing Neural Networks. 2017-06-08. arXiv:1706.02515 [cs.LG].
  8. Jin, Xiaojie; Xu, Chunyan; Feng, Jiashi; Wei, Yunchao; Xiong, Junjun; Yan, Shuicheng. Deep Learning with S-shaped Rectified Linear Activation Units. 2015-12-22. arXiv:1512.07030 [cs.CV].
  9. Agostinelli, Forest; Hoffman, Matthew; Sadowski, Peter; Baldi, Pierre. Learning Activation Functions to Improve Deep Neural Networks. 2014-12-21. arXiv:1412.6830 [cs.NE].
  10. Glorot, Xavier; Bordes, Antoine; Bengio, Yoshua. Deep sparse rectifier neural networks (PDF). International Conference on Artificial Intelligence and Statistics. 2011.
  11. Elfwing, Stefan; Uchibe, Eiji; Doya, Kenji. Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning. 2017. arXiv:1702.03118.
  12. Ramachandran, Prajit; Zoph, Barret; Le, Quoc V. Searching for Activation Functions. 2017. arXiv:1710.05941.
  13. Godfrey, Luke B.; Gashler, Michael S. A continuum among logarithmic, linear, and exponential functions, and its potential to improve generalization in neural networks. 7th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management: KDIR. 2016-02-03, 1602: 481–486. Bibcode:2016arXiv160201321G. arXiv:1602.01321.
  14. Goodfellow, Ian J.; Warde-Farley, David; Mirza, Mehdi; Courville, Aaron; Bengio, Yoshua. Maxout Networks. JMLR WCP. 2013-02-18, 28 (3): 1319–1327. Bibcode:2013arXiv1302.4389G. arXiv:1302.4389.