
Alex Graves (computer scientist)

Alex Graves
Alma mater: University of Edinburgh (BSc); IDSIA (Dalle Molle Institute for Artificial Intelligence Research) (PhD)
Known for: Connectionist temporal classification; Neural Turing machine; Differentiable neural computer
Scientific career
Fields: Computer science
Institutions: DeepMind
Doctoral advisor: Jürgen Schmidhuber

Alex Graves is a computer scientist. Before becoming a research scientist at DeepMind, he earned a BSc in theoretical physics at the University of Edinburgh and a PhD in artificial intelligence at IDSIA (the Dalle Molle Institute for Artificial Intelligence Research) under the supervision of Jürgen Schmidhuber[1]. He also did postdoctoral work under Schmidhuber at the Technical University of Munich and under Geoffrey Hinton at the University of Toronto[2].

At IDSIA, Graves trained long short-term memory (LSTM) neural networks with a new method called connectionist temporal classification (CTC)[3]. In certain applications, this approach outperformed traditional speech recognition models[4]. In 2009, his CTC-trained LSTM became the first recurrent neural network to win pattern recognition competitions, taking several prizes in connected handwriting recognition[5][6]. The method has since become widely used; Google, for example, uses CTC-trained LSTMs for speech recognition on smartphones[7][8].
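
CTC lets a recurrent network label unsegmented sequence data by summing over all possible alignments between input frames and target labels. The snippet below is a minimal illustrative sketch of that setup, not Graves's original implementation: it pairs a bidirectional LSTM with the CTC loss as provided by PyTorch (torch.nn.CTCLoss), and the feature size, label alphabet, and batch shapes are assumed purely for demonstration.

  # Minimal sketch (assumed sizes, not the original system): an LSTM trained with CTC.
  import torch
  import torch.nn as nn

  num_features, num_classes, hidden = 26, 28, 128    # assumed; class 0 is the CTC blank
  lstm = nn.LSTM(num_features, hidden, bidirectional=True)
  proj = nn.Linear(2 * hidden, num_classes)
  ctc = nn.CTCLoss(blank=0, zero_infinity=True)

  T, N, S = 50, 4, 20                                # frames, batch size, max label length
  frames = torch.randn(T, N, num_features)           # unsegmented input sequences
  labels = torch.randint(1, num_classes, (N, S))     # target label sequences (no blanks)
  in_lens = torch.full((N,), T, dtype=torch.long)
  lab_lens = torch.randint(5, S + 1, (N,), dtype=torch.long)

  hidden_seq, _ = lstm(frames)                       # (T, N, 2 * hidden)
  log_probs = proj(hidden_seq).log_softmax(-1)       # per-frame label distributions
  loss = ctc(log_probs, labels, in_lens, lab_lens)   # sums over all label-to-frame alignments
  loss.backward()                                    # no frame-level segmentation of targets needed

The key point the sketch shows is that the targets are never aligned to individual frames by hand; the loss marginalises over alignments, which is what made end-to-end training on unsegmented speech and handwriting practical.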

Graves is also the creator of the Neural Turing Machine[9] and the closely related differentiable neural computer[10][11].
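
Both architectures couple a neural network controller to an external memory matrix that is read and written through soft, fully differentiable attention weights. The snippet below is a minimal sketch of one such step, content-based addressing, with the memory size and key-strength parameter chosen purely for illustration; it is not the published architecture.

  # Minimal sketch of content-based addressing over an external memory matrix,
  # the differentiable read used by NTM-style models. Sizes and beta are assumptions.
  import torch
  import torch.nn.functional as F

  memory = torch.randn(128, 20)      # 128 memory slots, 20 features each (assumed)
  key = torch.randn(20)              # lookup key emitted by the controller network
  beta = 5.0                         # key strength: sharpens the attention focus

  sim = F.cosine_similarity(memory, key.unsqueeze(0), dim=1)   # similarity per slot
  weights = torch.softmax(beta * sim, dim=0)                   # soft address over all slots
  read_vector = weights @ memory                               # differentiable read, shape (20,)

Because the read is a weighted sum rather than a hard lookup, gradients flow through the addressing step, so the whole system can be trained end to end with backpropagation.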

References

  1. ^ Alex Graves. Canadian Institute for Advanced Research. Archived from the original on 1 May 2015.
  2. ^ Marginally Interesting: What is going on with DeepMind and Google?. Blog.mikiobraun.de. 28 January 2014. Retrieved 17 May 2016. Archived from the original on 22 May 2016.
  3. ^ Alex Graves, Santiago Fernandez, Faustino Gomez, and Jürgen Schmidhuber (2006). Connectionist temporal classification: Labelling unsegmented sequence data with recurrent neural nets. Proceedings of ICML’06, pp. 369–376.
  4. ^ Santiago Fernandez, Alex Graves, and Jürgen Schmidhuber (2007). An application of recurrent neural networks to discriminative keyword spotting. Proceedings of ICANN (2), pp. 220–229.
  5. ^ Graves, Alex; and Schmidhuber, Jürgen; Offline Handwriting Recognition with Multidimensional Recurrent Neural Networks, in Bengio, Yoshua; Schuurmans, Dale; Lafferty, John; Williams, Chris K. I.; and Culotta, Aron (eds.), Advances in Neural Information Processing Systems 22 (NIPS'22), December 7th–10th, 2009, Vancouver, BC, Neural Information Processing Systems (NIPS) Foundation, 2009, pp. 545–552
  6. ^ A. Graves, M. Liwicki, S. Fernandez, R. Bertolami, H. Bunke, J. Schmidhuber. A Novel Connectionist System for Improved Unconstrained Handwriting Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 5, 2009.
  7. ^ Google Research Blog. The neural networks behind Google Voice transcription. August 11, 2015. By Françoise Beaufays. http://googleresearch.blogspot.co.at/2015/08/the-neural-networks-behind-google-voice.html Archived at the Wayback Machine.
  8. ^ Google Research Blog. Google voice search: faster and more accurate. September 24, 2015. By Haşim Sak, Andrew Senior, Kanishka Rao, Françoise Beaufays and Johan Schalkwyk – Google Speech Team. http://googleresearch.blogspot.co.uk/2015/09/google-voice-search-faster-and-more.html Archived at the Wayback Machine.
  9. ^ Google's Secretive DeepMind Startup Unveils a "Neural Turing Machine". Retrieved 17 May 2016. Archived from the original on 13 August 2016.
  10. ^ Graves, Alex; Wayne, Greg; Reynolds, Malcolm; Harley, Tim; Danihelka, Ivo; Grabska-Barwińska, Agnieszka; Colmenarejo, Sergio Gómez; Grefenstette, Edward; Ramalho, Tiago. Hybrid computing using a neural network with dynamic external memory. Nature. 2016-10-12, 538 (7626): 471–476. Retrieved 23 February 2023. Bibcode:2016Natur.538..471G. ISSN 1476-4687. PMID 27732574. S2CID 205251479. doi:10.1038/nature20101. Archived from the original on 2 October 2022.
  11. ^ Differentiable neural computers | DeepMind. DeepMind. Retrieved 19 October 2016. Archived from the original on 4 July 2023.