Lyndon White, Roberto Togneri, Wei Liu and Mohammed Bennamoun, Neural Representations of Natural Language. Singapore: Springer, 2019. xiv + 122 pages. ISBN: 9789811300615.
Published online by Cambridge University Press: 22 May 2020