Preprint Article, Version 1. Preserved in Portico. This version is not peer-reviewed.

LSTM Neural Network for Textual Ngrams

Version 1 : Received: 23 November 2018 / Approved: 26 November 2018 / Online: 26 November 2018 (10:06:05 CET)

How to cite: D'Souza, S.C. LSTM Neural Network for Textual Ngrams. Preprints 2018, 2018110579.


Cognitive neuroscience is the study of how the human brain performs tasks such as decision making, language, perception, and reasoning. Deep learning is a class of machine learning algorithms built on neural networks, which are designed to model the responses of neurons in the human brain; learning can be supervised or unsupervised. Ngram models are used extensively in language prediction: they are probabilistic models that predict the next word or token in a sequence. As statistical models of word or token sequences, they are known as language models (LMs) and are essential in building language prediction systems. We explore a broader sandbox ecosystem for AI, specifically deep learning applications on unstructured content on the web.
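The paper's own implementation is not reproduced on this page. As background for the abstract's description of ngrams as probabilistic next-token predictors, the following is a minimal sketch of a bigram (n = 2) language model estimated by relative frequency; the corpus and function names are illustrative, not taken from the paper:

```python
from collections import Counter, defaultdict

def train_bigram_lm(tokens):
    """Count adjacent token pairs and estimate P(next | current)
    by relative frequency, the simplest ngram language model."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(tokens, tokens[1:]):
        counts[cur][nxt] += 1
    # Normalize each context's counts into a probability distribution.
    return {
        cur: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
        for cur, nxts in counts.items()
    }

def predict_next(model, word):
    """Return the most probable next token, or None for an unseen context."""
    nxts = model.get(word)
    if not nxts:
        return None
    return max(nxts, key=nxts.get)

# Toy corpus for illustration only.
tokens = "the cat sat on the mat the cat ran".split()
lm = train_bigram_lm(tokens)
print(predict_next(lm, "the"))  # "cat" follows "the" twice, "mat" once -> cat
```

An LSTM, as studied in the paper, replaces these fixed-length count tables with a recurrent network that conditions on the full preceding sequence rather than only the last n-1 tokens.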

Supplementary and Associated Material: LSTM Neural Network for Textual Ngrams


Keywords: Deep learning, Cognitive, LSTM, Neural network, Ngrams


Subject: Computer Science and Mathematics, Artificial Intelligence and Machine Learning

