Article
Preserved in Portico. This version is not peer-reviewed.
LSTM Neural Network for Textual Ngrams
Version 1: Received: 23 November 2018 / Approved: 26 November 2018 / Online: 26 November 2018 (10:06:05 CET)
How to cite: D'Souza, S.C. LSTM Neural Network for Textual Ngrams. Preprints 2018, 2018110579. https://doi.org/10.20944/preprints201811.0579.v1
Abstract
Cognitive neuroscience is the study of how the human brain performs tasks such as decision making, language, perception and reasoning. Deep learning is a class of machine learning algorithms built on neural networks, which are designed to model the responses of neurons in the human brain; learning can be supervised or unsupervised. Ngram token models are used extensively in language prediction: they are probabilistic models for predicting the next word or token. As statistical models of word or token sequences, they are called language models (LMs), and they are essential in building language prediction systems. We explore a broader sandbox ecosystem for AI, specifically deep learning applications on unstructured content on the web.
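The abstract's core idea, that an Ngram model predicts the next token from counts of observed sequences, can be sketched with a toy bigram (n = 2) model. This is a minimal illustration in plain Python, assuming a hypothetical toy corpus; the paper itself replaces these raw counts with a learned LSTM.

```python
from collections import Counter, defaultdict

def build_bigram_model(tokens):
    """Count bigram frequencies and normalize them to next-word probabilities."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return {prev: {w: c / sum(nxts.values()) for w, c in nxts.items()}
            for prev, nxts in counts.items()}

def predict_next(model, word):
    """Return the most probable next word, or None for an unseen context."""
    nxts = model.get(word)
    return max(nxts, key=nxts.get) if nxts else None

# Hypothetical toy corpus for illustration only.
corpus = "the brain learns the language the brain uses".split()
model = build_bigram_model(corpus)
print(predict_next(model, "the"))  # "brain" (2 of the 3 continuations of "the")
```

An LSTM plays the same role as `predict_next` here, but instead of a lookup over fixed-length counts it conditions on the whole preceding sequence through its recurrent state, which is what lets it generalize beyond contexts seen verbatim in training.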
Supplementary and Associated Material
https://doi.org/10.6084/m9.figshare.7344092.v2: LSTM Neural Network for Textual Ngrams
Keywords
Deep learning, Cognitive, LSTM, Neural network, Ngrams
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.