Lyndon White, Roberto Togneri, Wei Liu and Mohammed Bennamoun, Neural Representations of Natural Language. Singapore: Springer, 2019. xiv + 122 pages. ISBN: 9789811300615
Published online by Cambridge University Press: 22 May 2020
- Type: Book Review
- Copyright: © The Author(s), 2020. Published by Cambridge University Press