Preprint
Article

This version is not peer-reviewed.

Tensor Logic of Embedding Vectors in Neural Networks

Submitted: 05 February 2026

Posted: 06 February 2026


Abstract
Current Artificial Neural Networks based on Large Language Models (LLMs) rely primarily on statistical token prediction and often lack rigorous structural semantic consistency and illocutionary force. This paper introduces the Tensor Functional Language Logic (T-FLL) as a formal bridge between symbolic reasoning and continuous neural manifolds. We redefine linguistic units as functional noemes and propose a mapping of logical operators onto tensor operations. Sentences are translated into noematic formulae, and we show that the attention mechanism driving the semantics of a dialog can be reformulated more efficiently when guided by these formulae. In this way, we outline a path toward more explainable and structurally sound AI architectures.
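The abstract does not specify T-FLL's concrete operator definitions, but the general idea of mapping logical operators onto tensor operations over embedding vectors can be sketched in a common fuzzy-logic style. The choices below (product t-norm for conjunction, probabilistic sum for disjunction, complement for negation) are illustrative assumptions, not the paper's actual construction:

```python
import numpy as np

# Hypothetical sketch: logical operators as elementwise tensor
# operations on embedding vectors whose activations lie in [0, 1].
# These specific operator choices are assumptions for illustration.

def t_and(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Conjunction as an elementwise product (a t-norm)."""
    return a * b

def t_or(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Disjunction as the probabilistic sum (a t-conorm)."""
    return a + b - a * b

def t_not(a: np.ndarray) -> np.ndarray:
    """Negation as the complement, assuming activations in [0, 1]."""
    return 1.0 - a

# Two toy "noeme" embeddings (hypothetical values).
cat = np.array([0.9, 0.1, 0.8])
pet = np.array([0.7, 0.2, 0.9])

conj = t_and(cat, pet)   # elementwise: [0.63, 0.02, 0.72]
disj = t_or(cat, pet)    # elementwise: [0.97, 0.28, 0.98]
```

Because every operator is a differentiable elementwise map, a formula built from them remains a tensor expression that gradient-based training can flow through, which is what makes such a symbolic-to-continuous bridge plausible.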
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.