Preprint Article, Version 1. This version is not peer-reviewed.

LSTM Neural Network for Textual Ngrams

Version 1 : Received: 23 November 2018 / Approved: 26 November 2018 / Online: 26 November 2018 (10:06:05 CET)

How to cite: D'Souza, S.C. LSTM Neural Network for Textual Ngrams. Preprints 2018, 2018110579 (doi: 10.20944/preprints201811.0579.v1).

Abstract

Cognitive neuroscience is the study of how the human brain functions on tasks such as decision making, language, perception, and reasoning. Deep learning is a class of machine learning algorithms based on neural networks, which are designed to model the responses of neurons in the human brain. Learning can be supervised or unsupervised. Ngram token models are used extensively in language prediction: ngrams are probabilistic models used to predict the next word or token. They are statistical models of word or token sequences and are called language models (LMs), and they are essential in building language prediction systems. We explore a broader sandbox ecosystem enabling AI, specifically deep learning applications on unstructured content on the web.
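As a minimal illustration of the ngram idea described in the abstract (a sketch, not the paper's actual model), a bigram language model can be built by counting, for each word, which words follow it, and predicting the most frequent follower. The corpus and function names below are hypothetical examples for exposition.

```python
from collections import Counter, defaultdict

def train_bigram_model(tokens):
    """Count bigram frequencies: for each word, which words follow it and how often."""
    model = defaultdict(Counter)
    for w1, w2 in zip(tokens, tokens[1:]):
        model[w1][w2] += 1
    return model

def predict_next(model, word):
    """Return the most frequent follower of `word`, or None if the word is unseen."""
    followers = model.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Toy corpus: "cat" follows "the" twice, "mat" once.
corpus = "the cat sat on the mat the cat ran".split()
model = train_bigram_model(corpus)
print(predict_next(model, "the"))  # prints "cat"
```

An LSTM-based model replaces these raw counts with a learned recurrent representation, letting the predictor condition on much longer contexts than a fixed-size ngram window.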

Supplementary and Associated Material

https://doi.org/10.6084/m9.figshare.7344092.v2: LSTM Neural Network for Textual Ngrams

Subject Areas

Deep learning, Cognitive, LSTM, Neural network, Ngrams
