Submitted:
03 February 2025
Posted:
04 February 2025
Abstract
MycelialNet is a novel deep neural network (DNN) architecture inspired by natural mycelial networks. Mycelia, the vegetative part of fungi, form extensive underground networks that efficiently connect biological entities, transport nutrients and signals, and dynamically adapt to environmental conditions. Drawing inspiration from these properties, MycelialNet integrates dynamic connectivity, self-optimization, and resilience into its artificial structure. This paper explores how mycelial-inspired neural networks can enhance big data analysis, particularly in mineralogy, petrology, and other Earth-science disciplines, where exploration and exploitation must be efficiently balanced during data mining. We validate our approach first on synthetic data and then on a large petrological database of volcanic rock samples, demonstrating superior feature extraction, clustering, and classification capabilities compared with conventional machine learning methods.
Keywords:
1. Introduction
2. Methodology
2.1. Key Components of MycelialNets
1) “MicelialLayer”: this is a dynamic layer that adjusts its connectivity during training, pruning weak connections while regenerating new ones to optimize the learning pathways (a minimal code sketch of this mechanism is given after this list).
2) Dynamic Connectivity: this functionality is inspired by mycelial exploration strategies. The network restructures itself iteratively, mirroring the adaptability of fungal networks.
3) Self-Monitoring Mechanism: the MycelialNet model incorporates self-reflection mechanisms, inspired by the self-awareness of the biological brain discussed in previous works [16]. By adjusting its connectivity ratio based on performance metrics such as accuracy, the model continuously monitors itself and adapts its own architecture to changing conditions in time-varying data sets.
4) Exploration Factor: this additional component encourages the model to explore diverse configurations and hyperparameters. It dynamically balances the exploration/exploitation ratio while the hyperparameter space is searched, with the final goal of setting an optimal MycelialNet architecture.
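The pruning-and-regrowth behavior of the “MicelialLayer” can be illustrated with a short, self-contained sketch. The code below is a minimal NumPy illustration, not the authors' implementation: class and parameter names such as `DynamicMycelialLayer` and `prune_fraction` are hypothetical, and the pruning criterion (smallest absolute weight) is an assumed, commonly used choice.

```python
import numpy as np

class DynamicMycelialLayer:
    """Illustrative dense layer with a binary connectivity mask that is
    pruned and regrown during training (hypothetical names and parameters)."""

    def __init__(self, n_features, n_neurons, prune_fraction=0.1, rng=None):
        self.rng = rng or np.random.default_rng(0)
        self.W = self.rng.normal(0.0, 0.1, size=(n_features, n_neurons))
        self.b = np.zeros(n_neurons)
        self.M = np.ones((n_features, n_neurons))  # binary mask: active connections
        self.prune_fraction = prune_fraction

    def forward(self, a_prev):
        # Forward pass with the dynamically adjusted (masked) weight matrix, ReLU activation
        return np.maximum(0.0, a_prev @ (self.M * self.W) + self.b)

    def restructure(self):
        """Prune the weakest active connections and regrow an equal number
        of currently inactive ones (mycelial-style exploration)."""
        active = np.flatnonzero(self.M)
        n_change = max(1, int(self.prune_fraction * active.size))
        # Prune: deactivate the active connections with the smallest magnitude
        weakest = active[np.argsort(np.abs(self.W.ravel()[active]))[:n_change]]
        self.M.ravel()[weakest] = 0.0
        # Regrow: reactivate the same number of randomly chosen inactive connections
        inactive = np.flatnonzero(self.M == 0)
        regrow = self.rng.choice(inactive, size=min(n_change, inactive.size), replace=False)
        self.M.ravel()[regrow] = 1.0
        self.W.ravel()[regrow] = self.rng.normal(0.0, 0.1, size=regrow.size)
```

In a full training loop, `restructure()` would be invoked periodically (e.g., every few epochs), with the fraction of pruned and regrown connections modulated by the self-monitoring and exploration components described above.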
2.2. Mathematical Formulation
At each training step, the connectivity and weights of the layer are updated through a masked gradient-descent rule:

$$W_t = M_t \odot \left( W_{t-1} - \eta \, \nabla_{W_{t-1}} L_t \right)$$

where:
- $W_t$ is the weight matrix at time $t$,
- $W_{t-1}$ is the weight matrix at the previous time step $t-1$,
- $M_t \in \{0,1\}^{n \times d}$ is a binary mask matrix controlling the active connections at time $t$,
- $n$ is the number of features and $m$ is the number of samples (as anticipated earlier),
- $d$ is the number of neurons in the layer,
- $\odot$ denotes the Hadamard (element-wise) product,
- $\eta$ is the learning rate,
- $\nabla_{W_{t-1}} L_t$ is the gradient of the loss function $L$ at time $t$.

The total loss function is the categorical cross-entropy:

$$L = -\frac{1}{m} \sum_{i=1}^{m} \sum_{j=1}^{k} y_{ij} \log \hat{y}_{ij}$$

where:
- $m$ is the number of samples (e.g., rock samples), as stated earlier,
- $k$ is the number of output classes (e.g., types of rock), as stated earlier,
- $y_{ij}$ is the true label (one-hot encoded, where $y_{ij} = 1$ for the true class and $y_{ij} = 0$ otherwise),
- $\hat{y}_{ij}$ is the predicted probability of the $j$-th class for the $i$-th sample, computed using the Softmax function.

The output of each layer is computed as

$$a^{(l)} = f\!\left( \widetilde{W}^{(l)} a^{(l-1)} + b^{(l)} \right)$$

where:
- $a^{(l-1)}$ is the activation from the previous neuronal layer,
- $\widetilde{W}^{(l)}$ is the dynamically adjusted (masked) weight matrix,
- $b^{(l)}$ is the bias vector,
- $f$ is an activation function (e.g., the Rectified Linear Unit, briefly ReLU, or the sigmoid, as well as other activation functions settable by the user).
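To make the notation concrete, the following minimal sketch (assuming NumPy; the function names `masked_sgd_step` and `cross_entropy` are illustrative and not taken from the paper's code) applies the masked update rule and evaluates the categorical cross-entropy on a small toy example.

```python
import numpy as np

def masked_sgd_step(W_prev, M_t, grad_L, eta=0.01):
    """One masked gradient-descent step: W_t = M_t * (W_{t-1} - eta * dL/dW)."""
    return M_t * (W_prev - eta * grad_L)

def cross_entropy(Y_true, Y_pred, eps=1e-12):
    """Categorical cross-entropy: L = -(1/m) * sum_ij y_ij * log(y_hat_ij)."""
    m = Y_true.shape[0]
    return -np.sum(Y_true * np.log(Y_pred + eps)) / m

# Toy example: m = 2 samples, k = 3 classes (values are illustrative only)
Y_true = np.array([[1, 0, 0],
                   [0, 0, 1]], dtype=float)          # one-hot labels
logits = np.array([[2.0, 0.5, 0.1],
                   [0.2, 0.3, 1.5]])
Y_pred = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # Softmax
print(cross_entropy(Y_true, Y_pred))                  # ~0.385 for these toy values
```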
3. Simulations
3.1. First Synthetic Test
3.2. Addressing Non-Linear Classification Challenges with MycelialNet
4. Test on a Real Data Set of Rock Samples
4.1. Introducing the Test
4.2. The Data Set
4.3. Workflow
4.4. Supervised Learning and Classification Results
5. Discussion
6. Conclusions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Russell, S.; Norvig, P. Artificial Intelligence: A Modern Approach, Global Edition; Pearson Education (Prentice Hall), 2016.
- Raschka, S.; Mirjalili, V. Python Machine Learning: Machine Learning and Deep Learning with Python, scikit-learn, and TensorFlow, 2nd ed.; Packt Publishing, 2017.
- Ravichandiran, S. Deep Reinforcement Learning with Python; Packt Publishing, 2020.
- Ribeiro, C.; Szepesvári, C. Q-learning combined with spreading: Convergence and results. In Proceedings of the ISRF-IEE International Conference: Intelligent and Cognitive Systems (Neural Networks Symposium), 1996; pp. 32–36.
- Barnes, A.E.; Laughlin, K.J. Investigation of methods for unsupervised classification of seismic data. In Expanded Abstracts, SEG Technical Program, Salt Lake City, UT, USA, 2002; pp. 2221–2224. [CrossRef]
- Bestagini, P.; Lipari, V.; Tubaro, S. A machine learning approach to facies classification using well logs. In Expanded Abstracts, SEG Technical Program, Houston, TX, USA, 2017; pp. 2137–2142. [CrossRef]
- Dell’Aversana, P. Comparison of different machine learning algorithms for lithofacies classification from well logs. Bull. Geophys. Oceanogr. 2017, 60, 69–80. [CrossRef]
- Sheldrake, M. Entangled Life: How Fungi Make Our Worlds, Change Our Minds & Shape Our Futures, 1st US ed.; Random House: New York, NY, USA, 2020.
- Damasio, A. Self Comes to Mind: Constructing the Conscious Brain; Pantheon: New York, NY, USA, 2010.
- Edelman, G.M. Neural Darwinism: The Theory of Neuronal Group Selection; Basic Books: New York, NY, USA, 1987; ISBN 0-19-286089-5.
- Edelman, G.M. Bright Air, Brilliant Fire: On the Matter of the Mind; Reprint ed. 1993; Basic Books: New York, NY, USA, 1992; ISBN 0-465-00764-3.
- Tononi, G.; Boly, M.; Massimini, M.; Koch, C. Integrated information theory: From consciousness to its physical substrate. Nat. Rev. Neurosci. 2016, 17, 450–461. [CrossRef]
- Tononi, G.; Edelman, G.M. Consciousness and complexity. Science 1998, 282, 1846–1851.
- Panksepp, J.; Biven, L. The Archaeology of Mind: Neuroevolutionary Origins of Human Emotions; Norton Series on Interpersonal Neurobiology; W. W. Norton: New York, NY, USA, 2012.
- Panksepp, J.; Moskal, J. Dopamine and SEEKING: Subcortical “reward” systems and appetitive urges. In Handbook of Approach and Avoidance Motivation; Elliot, A.J., Ed.; Psychology Press, 2008; pp. 67–87.
- Dell’Aversana, P. Enhancing Deep Learning and Computer Image Analysis in Petrography through Artificial Self-Awareness Mechanisms. Minerals 2024, 14, 247. [CrossRef]
- Dell’Aversana, P. Deep Learning for automatic classification of mineralogical thin sections. Bull. Geophys. Oceanogr. 2021, 62, 455–466. [CrossRef]
- Hall, B. Facies classification using machine learning. Lead. Edge 2016, 35, 906–909. [CrossRef]
- She, Y.; Wang, H.; Zhang, X.; Qian, W. Mineral identification based on machine learning for mineral resources exploration. J. Appl. Geophys. 2019, 168, 68–77.
- Liu, K.; Liu, J.; Wang, K.; Wang, Y.; Ma, Y. Deep learning-based mineral classification in thin sections using convolutional neural network. Minerals 2020, 10, 1096.
- Mamani, M.; Wörner, G.; Sempere, T. Geochemical variations in igneous rocks of the Central Andean orocline (13°S to 18°S): Tracing crustal thickening and magma generation through time and space. GSA Bull. 2010, 122, 162–182. [CrossRef]
| Method | Accuracy |
|---|---|
| Random Forest | 0.625 |
| Logistic Regression | 0.65 |
| Standard Neural Network | 0.69 |
| MycelialNet model | 0.875 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).