Preprint · Review · This version is not peer-reviewed. A peer-reviewed article of this preprint also exists.

Fish Farming 5.0: Advanced Tools for a Smart Aquaculture Management

Submitted: 08 September 2025
Posted: 09 September 2025
Abstract
The principal goal of Precision Fish Farming (PFF) is to use data and new technologies such as sensors, cameras and internet connections to optimise fish-aquaculture operations. PFF improves fish farming operations, making them data-driven, accurate and repeatable, reducing the effects of subjective choices by farmers. Thus, the daily management of operators based on manual practices and experience is shifted to knowledge-based automated processes. Modern sensors and animal bio-markers can be used to monitor environmental conditions, fish behaviour, growth performance and key health indicators in real time, generating large data sets at low cost. The use of artificial intelligence provides useful insights from big data. Machine learning and modelling algorithms predict future outcomes such as fish growth, food requirements or disease risk. The Internet of Things sets up networks between connected devices on the farm for communication. Smart management systems can automatically adjust instruments such as aerators or feeders in response to sensor inputs. This integration between sensors, internet connectivity and automated controls enables real-time precision management.
1. Introduction
Precision livestock production, introduced in the 1990s, is rapidly spreading in the animal production sector, owing to the development of innovative technologies such as (i) sensors capable of providing, in real time and at an affordable cost, large amounts of data related to environmental variables and animal production, (ii) reduction of processing costs, (iii) use of artificial intelligence (AI) algorithms capable of extracting useful information and building predictive models from the data and (iv) the evolution of smart management systems based on the Internet of Things (IoT) [1].
According to Føre et al. [2], the main objectives of precision fish farming (PFF) are to (i) increase the accuracy, precision and repeatability of fish farming operations; (ii) foster autonomous and continuous monitoring of variables related to fish production and (iii) provide reliable decision-support tools to reduce the dependence on manual labour and subjective farmer assessments.
The achievement of these objectives is based on the adoption of innovative technologies in sensors, computer vision and AI integrated in an inter-connected cloud system [3]. At the core of this model lies an IoT platform capable of collecting data from different sources on the farm, analyzing the data and returning useful operational information [4].
Currently, most activities in the different stages of fish production are manually performed by experienced operators. Farmers directly inspect fish visually or using data-acquisition tools such as video cameras and interpret this information on the basis of their experience. Intelligent management systems are mainly used in developed and mature sectors such as Atlantic salmon, rainbow trout, sea bass and sea bream farming [5].

2. PFF Methods in the Aquaculture Sector

2.1. Computer Vision Methods

With the increasing use of optical cameras and computer technology in aquaculture, machine vision systems provide an automated and non-invasive method for analyzing fish characteristics [6]. Computer vision allows images to be acquired and processed mimicking human visual perception [7]. In the fish production phase, the increasing use of underwater cameras has led to the development of various computer vision systems, which can be grouped into four main categories [8].

2.1.1. Artificial Vision Based on Visible Light

Monocular video cameras and stereo vision systems are used for artificial vision [9]. A typical system based on visible light continuously acquires fish images for a set period, allowing constant monitoring. These tools enable automatic fish detection and recognition by analyzing image information at the pixel level. An artificial vision system is based on the following steps: acquisition of image sequences, identification of fish characteristics or behaviours, and data analysis [10]. The optical flow method makes it possible to avoid problems caused by occlusion in the video.

2.1.2. Artificial Vision Based on Infrared Light

This technology involves the acquisition of images using a colour filter during the day and black and white at night [7]. Paustina et al. [11] developed a three-dimensional (3D) near-infrared (NIR) vision system with an accuracy of 98%. The main advantage of infrared light-based systems is that the light is less absorbed and dispersed in water.

2.1.3. Stereo Vision Systems

These systems use two cameras placed at a fixed distance with a slight difference in perspective [6,9]. This configuration allows precise detection and measurement of depth by calculating the position of points in 2D and 3D space using trigonometric formulas [6]. Fish are simultaneously analyzed, providing detailed data on their movement and position, with an estimated margin of error of 3%–5% [7].
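As a minimal sketch of the trigonometric calculation involved, the depth of a point seen by a rectified stereo pair can be recovered from its disparity; the focal length, baseline and pixel coordinates below are illustrative values, not parameters of the cited systems.

```python
def stereo_depth(focal_px: float, baseline_m: float,
                 x_left: float, x_right: float) -> float:
    """Depth (m) of a point from its disparity in a rectified stereo pair.

    With focal length f (pixels) and camera baseline B (metres), a point at
    horizontal pixel positions x_left and x_right has disparity d = x_left -
    x_right and depth Z = f * B / d.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_m / disparity

# Illustrative numbers: f = 800 px, baseline 0.12 m, disparity 16 px -> 6 m
depth_m = stereo_depth(800.0, 0.12, 532.0, 516.0)
```

The fish length can then be derived from the pixel distance between two body landmarks at the recovered depth, which is the basis of the size measurements discussed later.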

2.1.4. Light Detection and Ranging (LiDAR) Technology

LiDAR is a remote sensing system that uses laser beams to measure distances and movements in the aquatic environment with extreme precision.
Image processing methods make it possible to measure fish morphological and dimensional characteristics. The background is removed from the acquired image and numerical data are extracted. The process comprises five main steps: acquisition, digitalisation, enhancement, segmentation and measurement [12]. A machine vision system generally comprises a lighting system, a camera, hardware and software [6].
In the aquaculture sector, these technologies have numerous applications, ranging from feeding monitoring and management to growth assessment and animal welfare management.

2.2. Acoustic Methods

Acoustic sensors can be active or passive.

2.2.1. Active Acoustics

Active acoustics is used to estimate fish biomass, analyze spatial distribution, and track and monitor behaviour. The principle is based on the use of transmitters that emit sound waves of a certain frequency and capture the reflected echo [13]. In the aquaculture sector, the main tools for acoustic telemetry are sonars (sonar, split-beam sonar and multi-beam sonar), echo sounders and underwater microphones.

2.2.1.1. Sonars

Sonars are used for the detection, classification, positioning and tracking of fish. A sonar comprises a transmitter/receiver that detects the reflected signal (echo), even in 3D environments and converts it into an analysable digital image [14]. Sonar is the main method used for tracking and detecting fish on a large scale. The advantage of this technique is that it provides high-resolution video even in murky waters.
Split-beam sonars are capable of tracking fish in three dimensions. A horizontal scanning sonar can be used to increase the sampling volume near the surface [14].
Multi-beam sonars improve the accuracy of fish measurements compared to split beam sonars. However, in the case of crowded groups of fish, problems with occlusion may arise [14].
In recent years, split-beam and dual-frequency sonars have been used to analyze the vertical distribution of fish [14]. Low-frequency sonars can be affected by underwater noise, while high-frequency sonars provide high-resolution images. Images obtained with sonars are similar to those obtained with optical systems. However, this technique provides a greater amount of data and allows the 3D position of fish to be obtained.

2.2.1.2. Echosounder

The echo sounder technique is based on the use of a transducer that transmits sound waves. If these waves encounter a fish, whose density differs from that of water, echoes are generated and converted into an electrical signal [13]. In this way, the movement of fish groups is analyzed. The split-beam broadband echo sounder makes it possible to identify the position of fish groups and analyze the behaviour of individual fish.
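The basic range calculation behind echo sounding is the two-way travel time of the pulse; the sketch below assumes a nominal sound speed of 1500 m/s in seawater, with an illustrative travel time rather than a measured one.

```python
def echo_range(travel_time_s: float, sound_speed_ms: float = 1500.0) -> float:
    """Range (m) to a target from the two-way travel time of an acoustic pulse.

    The pulse travels to the target and back, so the one-way range is
    R = c * t / 2, with c the speed of sound in water (~1500 m/s in seawater).
    """
    return sound_speed_ms * travel_time_s / 2.0

# Illustrative example: a 0.04 s round trip corresponds to a target 30 m away
range_m = echo_range(0.04)
```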

2.2.2. Passive Acoustics

2.2.2.1. Underwater Microphones

Passive acoustic methods use underwater microphones to record sound waves produced by fish. The effectiveness of sound detection depends on signal intensity, distance and the methods used to reduce background noise [14]. These instruments are designed to listen to and record underwater sounds, useful for analyzing fish behaviour or detecting environmental changes. These methods can also be used to estimate fish growth rate and biomass over the long term. The main limitation of methods based on sound recording is that these signals are not emitted continuously but only at night or during the feeding phase.

2.3. Sensor-Based Methods

In recent years, various sensors have been used to monitor water quality, fish behaviour and physiology [15]. Sensor-based methods use less data than image-based methods. Sensors can be used in situations where visual observations are difficult to obtain. Currently, various sensors are used to monitor water quality variables, fish respiratory rate, swimming direction and speed, and physiological stress.

2.3.1. Environmental Sensors

In fish farms, specific sensors are used to monitor water quality [16]. The most common sensors are multi-parameter optical immersion probes [17]. These probes are used for the simultaneous measurement of different chemical and physical water parameters, including pH, temperature, dissolved oxygen (DO), turbidity, ammonia, nitrite and nitrate concentrations [18].

2.3.2. Acoustic Transmitters

Acoustic telemetry can be used to monitor fish spatial distribution and characteristics in real time [19]. These instruments may contain various sensors for measuring pressure, temperature, or acceleration. Accelerometers are based on a transmitter inserted into the fish's body that emits specific acoustic pulses that are recorded and analyzed [20,21]. Accelerometers can also record the tiniest movements of fish. The data obtained, such as respiratory rate, acceleration, or heart rate measurement, can be associated with behavioural and stress-related information [2,20,22,23,24,25,26,27,28]. One limitation of this technique is that the accelerometer is difficult to install in small fish and can cause fish health problems [29,30,31]. The combined use of different sensors can be useful for monitoring fish stress [13]. For example, a gyroscope and an accelerometer can be used in combination to monitor fish behaviour [13].

2.3.3. Biosensors

Biosensors are used to monitor fish behaviour and health [32,33]. They measure biologically active substances such as antibodies, enzymes and microorganisms in fish [5]. Biosensors are able to detect small variations in the resistance and current associated with biologically active substances and convert them into electrical impulses. These measurements are essential for maintaining optimal conditions for fish growth and health.

3. Automatic Monitoring and Data Analysis

A PFF system comprises three main phases: (i) real-time monitoring of environmental and production parameters (sensors and cameras continuously collect data on the farm), (ii) predictive modelling and process control (data are processed to generate forecasts and estimates, and the system is verified to be operating in accordance with set objectives) and (iii) the decision-making phase (Figure 1) [34,35,36].
Each stage of the production process in the PFF system is managed and guided using data and algorithms. It is an automatic control system with input and output data flows. To achieve precise control, it is essential to develop a mathematical model that can integrate all the information from the different sub-systems [15,37]. A practical example is the estimation of certain parameters such as fish feeding, effluent release into the environment or swimming speed, using input data obtained from sonar on the vertical distribution of fish biomass [38]. After design and development, the IoT system needs to be tested to verify its reliability in the field [39]. The functionalities of a process-control system depend on the operating system, data mining algorithms, machine learning (ML) and integrated modules, which might include input/output drivers, process database generators, a human–machine interface, scanning programmes, alarm systems, tag group editors, dynamic data exchange servers, trend analyzers, report and messaging generators and remote diallers [40]. The optimal physical conditions of the rearing environment are defined, and the process-control system constantly monitors target parameters [39]. It automatically activates actuators, sending an alert to farmers in the event of deviation from expected values. A stable, high-quality internet connection is required to ensure the thorough monitoring of the system [41]. All information collected on the farm including meteorological information is sent via the internet to data processing centres. All data, acquired from sensors or external sources (e.g. the web), can be included in the feedback loop and used to make targeted supply and farm-management decisions (Figure 2).
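As an illustration of the alert-and-actuation behaviour described above, one monitoring cycle can be sketched as a simple threshold check; the parameter names, thresholds and actuator commands are hypothetical, not taken from any specific PFF platform.

```python
def control_step(reading: float, low: float, high: float):
    """One cycle of the sensor -> controller -> actuator/alert feedback loop.

    Returns an (actuator_command, alert) pair: the command drives an
    actuator such as an aerator, and the alert (if any) is sent to the farmer
    when the reading deviates from the expected range.
    """
    if reading < low:
        return "activate_aerator", f"ALERT: value {reading} below threshold {low}"
    if reading > high:
        return "reduce_aeration", f"ALERT: value {reading} above threshold {high}"
    return "hold", None

# Illustrative example: dissolved oxygen (mg/L) drops below the set range
command, alert = control_step(4.2, low=5.0, high=9.0)
```

A real process-control system would layer this logic over the database, alarm and trend-analysis modules listed above, but the feedback principle is the same.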

4. Tasks and Models of AI

ML and deep learning (DL) are fundamental tools for optimising the management of big data [42]. The analysis process starts with the collection of a training dataset, which is used to build an initial model. ML is based on learning non-linear relationships between input and output. Instead of explicitly programming these relationships, the ML model automatically learns patterns by observing a large number of features with their labels. Once trained, the model is able to predict the correct output for new inputs. DL, a branch of ML, is based on artificial neural networks (ANNs) and is particularly effective when the volume of data is substantially large. In ML, the statistical method K-means is used for clustering analysis [43]. This makes it possible to separate the outline of the fish's body from the background.
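As an illustrative sketch of how K-means can separate a bright fish silhouette from a darker background, the toy example below clusters pixel intensities into two groups; real systems would run a library implementation (e.g. scikit-learn) over full images, and the pixel values here are invented.

```python
def kmeans_1d(values, iters=20):
    """Two-cluster K-means on a 1-D list of pixel intensities.

    Centroids are initialised at the extremes, then the usual assign/update
    cycle is repeated: each value joins its nearest centroid, and each
    centroid moves to the mean of its cluster.
    """
    c0, c1 = float(min(values)), float(max(values))
    for _ in range(iters):
        cluster0 = [v for v in values if abs(v - c0) <= abs(v - c1)]
        cluster1 = [v for v in values if abs(v - c0) > abs(v - c1)]
        c0 = sum(cluster0) / len(cluster0)
        c1 = sum(cluster1) / len(cluster1)
    return c0, c1

# Invented intensities: dark background pixels vs a bright fish silhouette
pixels = [12, 15, 10, 14, 200, 210, 198, 205]
background_centroid, fish_centroid = kmeans_1d(pixels)
```

Thresholding each pixel by its nearest centroid then yields a binary mask of the fish outline against the background.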
The main ML learning types are mentioned below.

4.1. Supervised Learning

This method uses labelled data (matching input–output) to acquire underlying rules. It is mainly used for the classification and regression of data. In image analysis, the complex structure is disintegrated into several progressive steps: the first level can identify edges, the second level identifies corners and contours and the third level identifies complex shapes, until the complete object is recognised [44,45]. Examples of models include convolutional neural networks (CNNs) for image analysis and recurrent neural networks (RNNs) for sequential data.

4.2. Unsupervised Learning

This method uses unlabelled data to identify patterns and groupings, classifying input resources based on their characteristics.

4.3. Semi-Supervised Learning

This method uses a small amount of labelled data with a large set of unlabelled data to reduce annotation costs.

4.4. Reinforcement Learning

The model learns by interacting with the environment and receiving rewards or penalties based on actions taken.
ML can be used in four main tasks: classification, regression, clustering and dimensionality reduction [46]. The main ML/DL models used in aquaculture are decision trees, support vector machines (SVMs), ANNs, k-nearest neighbours (k-NNs) and CNNs, which are specialised in image and computer vision processing and use convolutional and pooling layers to extract feature hierarchies [47]; RNNs, which are suitable for sequential data thanks to loop connections that allow previous information to be remembered; region-based CNNs (R-CNNs) and long short-term memory (LSTM) networks [46].
The training of a neural network comprises the following steps: estimation of the weights leading to correct predictions; initialisation with random weights; calculation of the output on the training set; measurement of the error with a loss function; updating of the weights via gradient descent and backpropagation; and repetition of the cycle until convergence is reached [43,46]. DL integrates feature extraction and model building into a single end-to-end process, unlike traditional ML, where these steps are separate. Deep hierarchical structures simplify the modelling of complex non-linear relationships. DL is particularly effective with large volumes of data and in managing complex big data. Several DL prediction models might lack robustness in certain applications, but they offer excellent self-learning, generalizability and non-linear approximation capabilities [46,48].
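The training cycle listed above (initialisation, forward pass, loss measurement, gradient-descent update, repetition until convergence) can be sketched in its simplest form by fitting a single weight to toy data; the data, starting weight and learning rate are illustrative choices.

```python
# Toy dataset generated with the true relation y = 2 * x
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.5    # initial weight (stands in for random initialisation)
lr = 0.05  # learning rate

for _ in range(200):  # repeat the cycle until (near) convergence
    # forward pass + mean-squared-error loss gradient: d/dw mean((w*x - y)^2)
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    # gradient-descent update (backpropagation reduces to this in 1-D)
    w -= lr * grad

# w now approximates the true weight 2.0
```

In a real network the same loop runs over millions of weights, with backpropagation computing each weight's gradient through the layer hierarchy.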
A hybrid ML method combines supervised and unsupervised learning [46]. A large amount of unlabelled input data is used together with a small amount of labelled input data. These models are used in various production systems such as floating cages, ponds, hatcheries and intensive aquaculture facilities. Their applications include visual fish recognition, biomass estimation, behavioural monitoring, feeding optimisation and environmental-condition prediction.

5. Using AI in Water Quality Monitoring

Water quality monitoring is critical to the success of fish production activities, as the growth and health of fish closely depend on the conditions of the aquatic environment. The water quality monitoring process involves several steps: (i) collection of environmental data via sensors for temperature, DO, light, pH and other parameters; (ii) transmission of the collected data to a control centre; (iii) analysis of the data on a cloud platform; (iv) sending decisions to the control centre and (v) transmission of feedback to field instruments [49,50]. In intensive aquaculture farms, a first critical point is the control of the farm's physical environment, which includes the following: monitoring of water-quality variables, water distribution, pumping indices, and effluent and waste management.
Four essential process-control models are used in intensive fish production, in ascending order of complexity: data recording systems or closed-loop controllers, programmable logic controllers (PLCs), supervisory control and data acquisition (SCADA) systems and distributed control systems [50]. By analyzing the data from the sensors, AI algorithms can perform the following actions: detect patterns and anomalies in the system; generate timely alerts and send them to the farmer; use predictive models to estimate changes in water quality; and analyze historical data by correlating water quality, weather conditions and fish feeding cycles.
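As a sketch of the "detect patterns and anomalies" step, each new sensor reading can be compared against the recent history with a simple z-score test; the window of readings and the threshold are illustrative choices, not prescribed by any of the cited systems.

```python
import statistics

def is_anomalous(history, reading, z_threshold=3.0):
    """Flag a reading that deviates more than z_threshold standard
    deviations from the recent history of the same sensor."""
    mean = statistics.fmean(history)
    std = statistics.pstdev(history)
    if std == 0:
        return reading != mean
    return abs(reading - mean) / std > z_threshold

# Invented recent water temperatures (°C) from an environmental probe
temps = [24.1, 24.3, 24.0, 24.2, 24.1, 24.2]

sudden_jump = is_anomalous(temps, 27.5)   # abnormal change -> alert the farmer
normal_drift = is_anomalous(temps, 24.2)  # within the usual range -> no alert
```

A production system would combine such statistical checks with the predictive models mentioned above, so that alerts anticipate deviations rather than merely reporting them.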
The main variables in fish farming are temperature and DO [51,52,53]. Temperature directly affects the growth and health of fish [54,55]. AI can detect abnormal temperature changes and send real-time alerts to farmers [56,57], allowing rapid interventions and maintenance of optimal conditions [2]. DO is crucial for optimal fish growth and welfare. Ta and Wei [58] proposed a CNN model to solve the problem of DO estimation in intensive fish farms. Advanced algorithms, based on environmental parameters, optimise farm production variables according to the needs of different species [59,60,61].
Currently, several AI models focus on short-term forecasts [62,63]. Lu et al. [64] developed a water quality monitoring system integrated with AI. For long-term predictions, the future challenge is to exploit spatio-temporal relationships between water quality characteristics and external factors [65,66]. A few models, such as LSTM networks and RNNs, have been reported [56]. RNN models exhibit better performance in estimating DO in both the short and the long term than traditional methods [67].
The critical issues in the implementation of IoT systems in PFF are (i) the lack of standardisation among the different sensors and devices used, (ii) the lack of interoperability between the different systems and (iii) the excessively high installation and maintenance costs [67].

6. Use of AI in Fish Biomass Estimation

In fish production, biomass estimation is an important parameter for assessing the growth rate and health status of fish during the different rearing stages [9].
Traditionally, biomass estimation is performed manually by sampling and weighing fish. However, this method is slow and laborious with a considerable margin of error [9]. In addition, handling fish during the weighing phase can result in stress, with negative consequences such as impeding fish growth. To overcome these limitations, alternative biomass estimation techniques have been developed and studied, including the use of AI [68]. In particular, the combination of computer vision and ML allows more accurate estimation of the size, weight, number and other fish biological parameters [68].
Image processing using a CNN [69] has demonstrated the effectiveness of DL in estimating fish weights and its ability to capture complex patterns and distinctive characteristics between different species. Lopez-Tejeida et al. [70] developed a system that integrated hardware and software with infrared cameras to automatically detect fish and calculate their weights and lengths. Mittún et al. [71] used a system with synchronised converging cameras, which could perform 3D segmentation of fish images. Weight was estimated from the length using the weight–length relation.
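The weight–length relation mentioned above is typically the allometric model W = a·L^b, with the coefficients a and b fitted per species; the values in this sketch are generic placeholders, not coefficients from the cited studies.

```python
def weight_from_length(length_cm: float, a: float = 0.01, b: float = 3.0) -> float:
    """Estimate fish weight (g) from body length (cm) via W = a * L**b.

    a and b are species-specific coefficients fitted from sampled data;
    b near 3 corresponds to roughly isometric growth. The defaults here
    are illustrative placeholders.
    """
    return a * length_cm ** b

# A 30 cm fish under these placeholder coefficients: 0.01 * 30**3 = 270 g
estimated_weight_g = weight_from_length(30.0)
```

In the systems described above, the length input comes from 3D image segmentation, so the whole biomass pipeline remains contact-free.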
Fish counting can be performed with good accuracy using the image segmentation method [9].

7. Use of AI in Fish Feeding Activities

Fish feed is a crucial item of expenditure in the rearing phase, accounting for 40%–50% of total maintenance costs [48]. In addition, it is estimated that ~60% of the feed supplied is dispersed into the water as particulate matter, causing pollution, decreasing DO and releasing harmful substances (ammonia, nitrogen, etc.), which can reduce fish growth. Optimisation of fish rations according to appetite is a crucial factor in maximising farm productivity [48]. A complete feeding monitoring system must include the following components: (i) an image/video/acoustic/biosensor acquisition unit, (ii) a hardware processing unit and (iii) an automated feeding system [72,73]. The main challenge of this feeding method is the seamless integration of the different modules to achieve high accuracy and precision.
Currently, farmers use underwater cameras to manually observe feeding behaviour and adjust the quantities of fish feed [74]. Use of precision feeding systems might improve fish production performance, reducing costs and environmental impact. In its simplest form, the daily ration is calculated using tables based on the number and size of fish. Notably, these systems do not consider crucial dynamic variables such as environmental characteristics and the state of fish health. The current trend is to supplement food table data with continuous monitoring of environmental variables (temperature, DO, etc.) and fish behaviour data. Signals from the sensors allow the feeding of fish to be automatically interrupted or adjusted, improving fish production efficiency. In sea cages, the feedback system is easier to implement than in flow-through farms and recirculation systems [40]. In remote marine cage-farming sites or those exposed to strong currents, wind and waves, daily manual feeding is not possible, necessitating automatic or remote control.
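The table-based ration calculation described above can be sketched as feeding a percentage of estimated biomass per day, with the rate looked up by size class; the class names and percentage rates below are hypothetical placeholders, not values from the article.

```python
# Hypothetical feeding-rate table: % of body weight fed per day by size class
FEED_RATE_PCT = {"fry": 8.0, "fingerling": 5.0, "grower": 2.5, "market": 1.5}

def daily_ration_kg(n_fish: int, mean_weight_g: float, size_class: str) -> float:
    """Daily feed ration (kg) as a table percentage of estimated biomass."""
    biomass_kg = n_fish * mean_weight_g / 1000.0
    return biomass_kg * FEED_RATE_PCT[size_class] / 100.0

# 10,000 grower fish of 250 g = 2,500 kg biomass -> 62.5 kg of feed per day
ration_kg = daily_ration_kg(10_000, 250.0, "grower")
```

The precision-feeding systems discussed above replace these static percentages with corrections driven by sensor and behaviour data.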
The following technologies are used for automatic feeding control: (i) artificial vision (single, stereo, 3D and NIR cameras in low light) to monitor fish feeding, (ii) acoustic systems (sonar and underwater microphones) to detect pellet consumption and fish behaviour and (iii) acoustic telemetry to track fish position and activity levels [9,73].
The aim is to provide the optimal amount of feed to meet fish nutritional requirements. However, the optimal amount of feed is also influenced by both physiological factors and external environmental conditions. The feeding behaviour of fish varies depending on their appetite [75]. The swimming behaviour of fish varies before and after feeding [76]. If fish are hungry, they are more active, swimming with greater frequency and speed [77]. Conversely, after feeding, groups of fish tend to reduce their activity. In addition, it is important to consider variables such as swimming acceleration, turning angles and tail stroke frequency [77]. The sounds produced by fish during feeding are due to the movement of their bodies in the water and the chewing and swallowing of food. Underwater microphones can be used to quantify the duration, frequency and intensity of these sounds. Background noise caused by the environment or the physiological activities of the fish may affect the accuracy of acoustic data. One possible solution to this problem is to use visual and acoustic data in combination [56].

The amount, frequency and timing of feeding depend on an accurate assessment of the fish's hunger and satiety levels. The artificial vision-based method uses images or videos of fish feeding behaviour and determines a model that identifies feeding status. Depending on how the characteristics are extracted, either the traditional method or the deep learning-based method is used. With the traditional method, the images are segmented and the characteristics are extracted manually, while with the latter, this is not necessary. The data can be processed using AI algorithms to determine the feeding states: continue, reduce or stop feeding [4]. The first phase consists of extracting image features and creating a model with a non-linear mapping between input data and target results through continuous iterative training.
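The mapping from a feeding-activity score to the three control states (continue, reduce or stop feeding) can be sketched as a simple threshold rule; the normalised activity index and the thresholds are illustrative stand-ins for the output of a trained model.

```python
def feeding_decision(activity_index: float) -> str:
    """Map a normalised feeding-activity score (0 = no feeding activity,
    1 = intense feeding) to one of the three feeder control states.

    The 0.6 and 0.2 cut-offs are illustrative; in practice they would be
    calibrated per species and system.
    """
    if activity_index > 0.6:
        return "continue"
    if activity_index > 0.2:
        return "reduce"
    return "stop"

# Low measured activity -> fish are likely satiated -> stop feeding
state = feeding_decision(0.15)
```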
The amount of feed given to the fish is adjusted according to their feeding activity [77]. Images or videos of fish feeding behaviour are used in ML methods to build a model that objectively identifies feeding status (Table 1) [56,105].
Uneaten feed, present in the water or at the bottom of a fish farm, can also be used indirectly to identify the fish feeding status. To identify uneaten feed, the most commonly used method is artificial vision; acoustic telemetry can also be used, but it is expensive [106]. Recently, the MobileNet algorithm has been adopted by numerous researchers, as its lightweight classification method allows neural networks with fewer parameters to be used [6,109]. YOLO, CNN and R-CNN models are generally used in fish feeding applications [87,99,108,109,110]. The first is based on a single-stage classification algorithm, while the others are based on two stages. DL methods allow high- and low-level features to be extracted from fish images. Zhou et al. [110] used a CNN model and computer vision to study the feeding intensity of fish, achieving superior performance over traditional methods. Cai et al. [111,112] developed an innovative two-stage approach to fish feeding, using in the initial stage a YOLOv8 model with a multiscale feature extraction module. According to Måløy et al. [74], 3D-CNN and RNN models enable optimal spatial and temporal analyses of fish feeding data, improving the classification of behaviour attributes (feeding/non-feeding patterns). Feng et al. [92] used machine vision and a lightweight 3D ResNet-GloRe method to study fish feeding behaviour and competition for food. Gu et al. [112] studied fish feeding intensity using a multimodal fusion network, while Liu et al. [113] and Wu et al. [83] used methods based on a video transformer. Ma et al. [114] used a time–frequency fusion model to study fish feeding behaviour, based on a six-axis inertial sensor that detects changes on the water surface caused by fish feeding, while Du et al. [115] used a lightweight LC-GhostNet and a multi-feature fusion strategy.
AI can be used to optimise fish feeding schedules based on temperature, DO, feed nutritional values and species-specific biological parameters [116]. Zhao et al. [3] used a machine learning model and two variables (temperature and DO levels) to calculate fish requirements. It is also possible to develop customized plans for each fish, considering genetic characteristics, age and weight [117]. The calculation can be made using a single factor or with multiple factors [77].

8. Stock Assessment of Farmed Fish

In addition to feed management, there exist other potential applications of data mining and ML algorithms at all stages of fish production, spanning from hatchery to harvest. These applications include image processing and pattern recognition for assessing the quality of eggs and the end product. In hatcheries, the separation of diseased or dead eggs and larvae is traditionally performed via manual or semi-automated methods, which are laborious and error prone. To improve the efficiency of these activities, a method based on image analysis and the SVM model was used in a rainbow trout hatchery [1]. The integration between these tools counted, calibrated and sorted the eggs rapidly and accurately, making the process highly suitable for large-scale management [118]. Similarly, counting fish at different production stages, from rearing to marketing, is crucial for the optimisation of farm management. Presently, a few image processing algorithms are used in commercial fish farms [118]. For optimal stock management, data on individual fish characteristics such as length, weight, skin colour and sex are useful. During the various growth phases, computer vision systems and acoustic technologies can offer a practical, real-time alternative to the invasive physical sampling and weighing methods, which involve fish stress, labour and time [34].
Costa et al. [119] developed a system that used optical telemetry with dual underwater cameras to capture the images of fish and analyzed their sizes and shapes. Processing was performed using neural networks, geometric algorithms and Fourier analysis, enabling remote monitoring of the growth rate.

9. Integration of Feeding Practices with Behaviour and Welfare Monitoring

The behaviour of farmed fish is analyzed using various technologies such as artificial vision, acoustics and biosensors. Images and videos are the most commonly used methods, although they are limited in cases of poor lighting and high background noise. Fish behaviour includes both normal behaviours, such as feeding, swimming and aggregation, and abnormal behaviours, such as cannibalism, stress and disease. Poor management of fish farms can lead to stressful situations for fish. It is very important to ensure fish welfare and minimise stress [120,121].
Video cameras can be used to detect behavioural changes in fish related to stress or disease (e.g. decreased swimming activity or abnormal movements) [122,123]. Acoustic telemetry can provide continuous data on individuals, monitoring their physiology and behaviour (heart rates, blood composition, 3D position, swimming rates and food intake) [24,124,125]. The system involves electronic transmitters equipped with sensors, implanted in or fixed on fish. Data are sent via sound signals to underwater receivers. Although this technique requires fish manipulation and occasional surgery, it is the only method for continuous physiological monitoring of an individual fish. ML algorithms allow the analysis of complex fish behaviour patterns, such as the study of swimming trajectories and spatial distribution, providing useful information on optimal density and environmental preferences (Table 2) [1,87,98,122,123,127,132,133,134,135,136,137,138,139].
Recently, various techniques have been used to analyze the abnormal behaviour of individuals or small groups of fish, such as stereoscopic video analysis [6], 3D neural networks [140] and ML analysis [6,76,90,91,129,141]. Data obtained from different sources are integrated and analyzed through multimodal data fusion [56,81,82,95,115].
AI can analyze water quality data (temperature, pH, DO, etc.) to identify correlations with specific diseases [110]. Some models predict disease outbreaks using certain environmental and meteorological variables [142], enabling targeted preventive interventions [7]. RNNs are particularly useful in analyzing sequential data such as videos, capturing temporal variations [108]. Zhao et al. [94] developed an RNN-based method for detecting anomalous behaviour in fish groups. Image processing and computer vision techniques are creating new possibilities for automatic detection of fish disease. The body surface of fish can provide crucial information on the occurrence of new diseases [143]. Lesions, colour changes or abnormal behaviour can be detected by analyzing images captured using video cameras in the farm. A further advantage is associated with disease prevention: predictive models can detect early signs of stress or infection in farmed species by analyzing environmental data, thus enabling preventive intervention [144,145,146].

10. Challenges and Future Prospects

Despite the potential of precision fish farming (PFF), several critical issues must still be addressed before it can be fully implemented.

10.1. Quality and Availability of Real-Time Data

Several fish farms lack simple and reliable tools to continuously monitor key parameters such as weight gain or health status, although new non-invasive and reliable sensors are under development. The real-time availability of accurate data on animal variables is essential; without it, decision support remains limited. Furthermore, predictive models need to accurately represent complex biological responses, and their development requires multi-disciplinary research and large data sets for reliable calibration.
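An example of the kind of biological model requiring such calibration is the thermal growth coefficient (TGC) model widely used for salmonids, which predicts weight from water temperature history: W_t^(1/3) = W_0^(1/3) + (TGC/1000) · Σ T_day. The sketch below implements it with illustrative parameter values; the TGC itself must be calibrated per species, strain and farming system.

```python
def tgc_growth(w0_g, tgc, daily_temps_c):
    """Predict final weight (g) from initial weight, a thermal growth
    coefficient and a list of daily water temperatures (degrees C).
    W_t^(1/3) = W_0^(1/3) + (TGC / 1000) * sum(T_day)."""
    return (w0_g ** (1 / 3) + tgc / 1000 * sum(daily_temps_c)) ** 3

# Illustrative scenario: 100 g fish, TGC = 3.0, 30 days at 12 C
predicted = tgc_growth(100.0, 3.0, [12.0] * 30)
```

Even this simple model shows why real-time data matter: the prediction is driven entirely by the measured temperature series, so gaps or sensor drift propagate directly into the growth forecast.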
10.2. Integration and Standardisation
The aquaculture sector is highly fragmented, with numerous small farms and a wide variety of sensors and proprietary software, which makes it difficult to integrate data flows between different systems. For example, a plant might run one system for water-quality monitoring and another for fish feeding that are not compatible with each other. The absence of common standards for formats, metrics and fish welfare indicators hinders widespread adoption. Technologies should therefore be harmonised through defined operational guidelines, and standardised measurement methods for key variables (e.g. stress indicators) should be validated for application across all species and farming systems [40].
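The integration problem can be illustrated by mapping two vendor-specific sensor payloads onto one shared schema. Both payloads, all field names and the schema itself are hypothetical, invented for this sketch; a real standard would also fix units, timestamp formats and welfare metrics.

```python
import json

# Two hypothetical vendor payloads reporting the same kind of reading
vendor_a = '{"sensor": "DO-1", "value_mgL": 6.8, "ts": 1710000000}'
vendor_b = '{"probe_id": "oxy7", "do_mgL": 7.1, "time": "2024-03-09T16:00:00Z"}'

def to_common(record, vendor):
    """Map vendor-specific field names onto one shared schema
    (schema and mappings are illustrative)."""
    r = json.loads(record)
    if vendor == "A":
        return {"device": r["sensor"], "metric": "dissolved_oxygen",
                "value": r["value_mgL"], "unit": "mg/L", "timestamp": r["ts"]}
    if vendor == "B":
        return {"device": r["probe_id"], "metric": "dissolved_oxygen",
                "value": r["do_mgL"], "unit": "mg/L", "timestamp": r["time"]}
    raise ValueError(f"unknown vendor: {vendor}")
```

Once both feeds share a schema, a single decision-support layer can consume them, which is exactly what the lack of common standards currently prevents.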

10.3. Implementation Costs

The initial investment in sensors, infrastructure and training can be high, but falling prices and increased efficiency often allow costs to be amortised in the medium term. The future of PFF is geared towards high automation: the goal is to evolve towards a digital aquaculture in which the entire farm is monitored and controlled via AI and robotics, with minimal human intervention. Examples of applications currently in the experimental phase include (i) digital twins (virtual models running in parallel with the real farm) that simulate scenarios, optimise decisions and predict outcomes under different operating conditions [147], (ii) advanced AI based on predictive systems that can anticipate health problems days in advance or adapt feeding regimes on an hourly basis to maximise efficiency [40] and (iii) underwater robotics, such as drones that can inspect nets, remove waste or automatically remove diseased fish [148].
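At the simplest end of this automation spectrum sits closed-loop actuation of the kind mentioned in the Abstract, e.g. switching an aerator from dissolved-oxygen readings. The sketch below uses a hysteresis (dead-band) controller so the aerator does not chatter around a single setpoint; the thresholds and the simulated readings are illustrative.

```python
def aerator_control(do_mg_l, aerator_on=False, low=5.0, high=7.0):
    """Hysteresis controller: switch the aerator on below `low` mg/L,
    off above `high`, and hold state inside the dead band.
    Thresholds are illustrative, not species-specific recommendations."""
    if do_mg_l < low:
        return True
    if do_mg_l > high:
        return False
    return aerator_on  # no change inside the dead band

# Simulated hourly dissolved-oxygen readings (mg/L)
readings = [6.2, 5.4, 4.8, 5.6, 6.9, 7.3]
state, log = False, []
for do in readings:
    state = aerator_control(do, aerator_on=state)
    log.append(state)
```

The dead band is the design point: a single-threshold rule would toggle the aerator on every small fluctuation around the setpoint, whereas hysteresis keeps it running until oxygen has clearly recovered.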

11. Conclusions

PFF represents a real paradigm shift, moving the industry towards an intelligent, data-driven production model. The development of next-generation sensors makes it possible to collect production, physiological and behavioural data from fish at the individual and group levels.
The integration of sensors, AI and IoT automation enables PFF to deliver highly controlled and optimised management, with significant benefits in production yield, sustainability and animal welfare. Challenges remain concerning data quality, standardisation and system integration, but rapidly evolving technologies and continuous research are expanding the range of applications. An ever-wider adoption of these precision aquaculture techniques can therefore be expected in the future.

References

  1. Huang, M.; Zhou, Y.G.; Yang, X.G. Optimizing feeding frequencies in fish: a meta-analysis and machine learning approach. Aquaculture 2025, 595, 741678. [Google Scholar] [CrossRef]
  2. Føre, M.; Frank, K.; Norton, T.; Svendsen, E.; Alfredsen, J.A.; Dempster, T.; Eguiraun, H.; Watson, W.; Stahl, A.; Sunde, L.M.; Schellewald, C.; Skøien, K.R.; Alver, M.O.; Berckmans, D. Precision fish farming: a new framework to improve production in aquaculture. Biosyst. Eng. 2018, 173, 176–193. [Google Scholar] [CrossRef]
  3. Zhao, S.; Zhang, S.; Liu, J.; Wang, H.; Zhu, J.; Li, D.; Zhao, R. Application of machine learning in intelligent fish aquaculture: a review. Aquaculture 2021, 540, 736724. [Google Scholar] [CrossRef]
  4. Vo, T.T.E.; Ko, H.; Huh, J.H.; Kim, Y. Overview of smart aquaculture system: focusing on applications of machine learning and computer vision. Electronics 2021, 10, 2882. [Google Scholar] [CrossRef]
  5. Brijs, J.; Føre, M.; Gräns, A.; Clark, T.D.; Axelsson, M.; Johansen, J.L. Biosensing technologies in aquaculture: how remote monitoring can bring us closer to our farm animals. Philos. Trans. R. Soc. B 2021, 376, 20200218. [Google Scholar] [CrossRef]
  6. Zhang, Y.; Xu, C.; Du, R.; Kong, Q.; Li, D.; Liu, C. MSIF-MobileNetV3: an improved MobileNetV3 based on multi-scale information fusion for fish feeding behavior analysis. Aquac. Eng. 2023, 102, 102338. [Google Scholar] [CrossRef]
  7. Saberioon, M.; Gholizadeh, A.; Cisar, P.; Pautsina, A.; Urban, J. Application of machine vision systems in aquaculture with emphasis on fish: state-of-the-art and key issues. Rev. Aquac. 2017, 9, 369–387. [Google Scholar] [CrossRef]
  8. Boudhane, M.; Nsiri, B. Underwater image processing method for fish localization and detection in submarine environment. J. Vis. Commun. Image Represent. 2016, 39, 226–238. [Google Scholar] [CrossRef]
  9. Li, D.; Wang, Z.; Wu, S.; Miao, Z.; Du, L.; Duan, Y. Automatic recognition methods of fish feeding behavior in aquaculture: a review. Aquaculture 2020, 528, 735508. [Google Scholar] [CrossRef]
  10. Desai, N.P.; Balucha, M.F.; Makrariyab, A.; MusheerAziz, R. Image processing model with deep learning approach for fish species classification. Turk. J. Comput. Math. Educ. 2022, 13, 85–89. [Google Scholar]
  11. Huang, M.; Zhou, Y.G.; Yang, X.G. Optimizing feeding frequencies in fish: a meta-analysis and machine learning approach. Aquaculture 2025, 595, 741678. [Google Scholar] [CrossRef]
  12. Føre, M.; Frank, K.; Norton, T.; Svendsen, E.; Alfredsen, J.A.; Dempster, T.; Eguiraun, H.; Watson, W.; Stahl, A.; Sunde, L.M.; Schellewald, C.; Skøien, K.R.; Alver, M.O.; Berckmans, D. Precision fish farming: a new framework to improve production in aquaculture. Biosyst. Eng. 2018, 173, 176–193. [Google Scholar] [CrossRef]
  13. Delgado, M.L.; Smith, N.; Whoriskey, F.; Devitt, S.; Novaczek, E.; Morris, C.J.; Kess, T.; Bradbury, I.; Iverson, S.; Bentzen, P.; Ruzzante, D.E. Northern cod (Gadus morhua) movement: insights from acoustic telemetry and genomics. J. Fish Biol. 2025, 4. [Google Scholar]
  14. Zhao, S.; Zhang, S.; Liu, J.; Wang, H.; Zhu, J.; Li, D.; Zhao, R. Application of machine learning in intelligent fish aquaculture: a review. Aquaculture 2021, 540, 736724. [Google Scholar] [CrossRef]
  15. Vo, T.T.E.; Ko, H.; Huh, J.H.; Kim, Y. Overview of smart aquaculture system: focusing on applications of machine learning and computer vision. Electronics 2021, 10, 2882. [Google Scholar] [CrossRef]
  16. Brijs, J.; Føre, M.; Gräns, A.; Clark, T.D.; Axelsson, M.; Johansen, J.L. Biosensing technologies in aquaculture: how remote monitoring can bring us closer to our farm animals. Philos. Trans. R. Soc. B 2021, 376, 20200218. [Google Scholar] [CrossRef]
  17. Zhang, Y.; Xu, C.; Du, R.; Kong, Q.; Li, D.; Liu, C. MSIF-MobileNetV3: an improved MobileNetV3 based on multi-scale information fusion for fish feeding behavior analysis. Aquac. Eng. 2023, 102, 102338. [Google Scholar] [CrossRef]
  18. Saberioon, M.; Gholizadeh, A.; Cisar, P.; Pautsina, A.; Urban, J. Application of machine vision systems in aquaculture with emphasis on fish: state-of-the-art and key issues. Rev. Aquac. 2017, 9, 369–387. [Google Scholar] [CrossRef]
  19. Boudhane, M.; Nsiri, B. Underwater image processing method for fish localization and detection in submarine environment. J. Vis. Commun. Image Represent. 2016, 39, 226–238. [Google Scholar] [CrossRef]
  20. Li, D.; Wang, Z.; Wu, S.; Miao, Z.; Du, L.; Duan, Y. Automatic recognition methods of fish feeding behavior in aquaculture: a review. Aquaculture 2020, 528, 735508. [Google Scholar] [CrossRef]
  21. Desai, N.P.; Balucha, M.F.; Makrariyab, A.; MusheerAziz, R. Image processing model with deep learning approach for fish species classification. Turk. J. Comput. Math. Educ. 2022, 13, 85–89. [Google Scholar]
  22. Føre, M.; Alfredsen, J.A.; Gronningsater, A. Development of two telemetry-based systems for monitoring the feeding behaviour of Atlantic salmon (Salmo salar L.) in aquaculture sea-cages. Comput. Electron. Agric. 2011, 76, 240–251. [Google Scholar] [CrossRef]
  23. Alfonso, S.; Zupa, W.; Spedicato, M.T.; Lembo, G.; Carbonara, P. Using telemetry sensors mapping the energetic costs in European sea bass (Dicentrarchus labrax) as a tool for welfare remote monitoring in aquaculture. Front. Anim. Sci. 2022, 3, 1–9. [Google Scholar] [CrossRef]
  24. Carbonara, P.; Alfonso, S.; Dioguardi, M.; Zupa, W.; Vazzana, M.; Dara, M.; Spedicato, M.T.; Lembo, G.; Cammarata, M. Calibrating accelerometer data as a promising tool for health and welfare monitoring in aquaculture: case study in European sea bass (Dicentrarchus labrax) in conventional or organic aquaculture. Aquac. Rep. 2021, 21, 100–113. [Google Scholar] [CrossRef]
  25. Føre, M.; Svendsen, E.; Alfredsen, J.A. Using acoustic telemetry to monitor the effects of crowding and delousing procedures on farmed Atlantic salmon (Salmo salar). Aquaculture 2017, 495, 757–765. [Google Scholar] [CrossRef]
  26. Gesto, M.; Zupa, W.; Alfonso, S.; Spedicato, M.T.; Lembo, G.; Carbonara, P. Using acoustic telemetry to assess behavioral responses to acute hypoxia and ammonia exposure in farmed rainbow trout of different competitive ability. Appl. Anim. Behav. Sci. 2020, 230, 105084. [Google Scholar] [CrossRef]
  27. Morgenroth, D.K.; Vaestad, B.; Økland, F.; Finstad, B.; Olsen, R.E.; Svendsen, E.; Rosten, C.; Axelsson, M.; Bloecher, N.; Føre, M.; Gräns, A. Under the sea: How can we use heart rate and accelerometers to remotely assess fish welfare in salmon aquaculture? Aquaculture 2024, 579, 740144. [Google Scholar] [CrossRef]
  28. Rosell-Moll, E.; Piazzon, M.C.; Sosa, J.; Ferrer, M.; Cabruja, E.; Vega, A.; Calduch-Giner, J.A.; Sitja-Bobadilla, A.; Lozano, M.; Montiel-Nelson, J.A.; Afonso, J.M.; Pérez-Sanchez, J. Use of accelerometer technology for individual tracking of activity patterns, metabolic rates and welfare in farmed gilthead sea bream (Sparus aurata) facing a wide range of stressors. Aquaculture 2021, 539, 736609. [Google Scholar] [CrossRef]
  29. Zupa, W.; Alfonso, S.; Gai, F.; Gasco, L.; Spedicato, M.T.; Lembo, G.; Carbonara, P. Calibrating accelerometer tags with oxygen consumption rate of rainbow trout (Oncorhynchus mykiss) and their use in aquaculture facility: A case study. Animals 2021, 11, 1496. [Google Scholar] [CrossRef]
  30. Macaulay, G.; Warren-Myers, F.; Barrett, L.; Oppedal, F.; Føre, M.; Dempster, T. Tag use to monitor fish behaviour in aquaculture: a review of benefits, problems and solutions. Rev. Aquac. 2021, 15, 1565–1582. [Google Scholar] [CrossRef]
  31. Munoz, L.; Aspillaga, E.; Palmer, M.; Saraiva, J.L.; Arechavala-Lopez, P. Acoustic telemetry: a tool to monitor fish swimming behavior in sea-cage aquaculture. Front. Mar. Sci. 2020, 7, 545896. [Google Scholar] [CrossRef]
  32. Palstra, A.P.; Arechavala-Lopez, P.; Xue, Y.; Roque, A. Accelerometry of seabream in a sea-cage: is acceleration a good proxy for activity? Front. Mar. Sci. 2021, 8, 639608. [Google Scholar] [CrossRef]
  33. Andrewartha, S.J.; Elliott, N.G.; McCulloch, J.W.; Frappell, P.B. Aquaculture sentinels: smart-farming with biosensor equipped stock. J. Aquac. Res. Dev. 2015, 7, 100393. [Google Scholar]
  34. Gesto, M.; Hernández, J.; López-Patiño, M.A.; Soengas, J.L.; Míguez, J.M. Is gill cortisol concentration a good acute stress indicator in fish? A study in rainbow trout and zebrafish. Comp. Biochem. Physiol. A Mol. Integr. Physiol. 2015, 188, 65–72. [Google Scholar] [CrossRef]
  35. Chiu, M.C.; Yan, W.M.; Bhat, S.A.; Huang, N.F. Development of smart aquaculture farm management system using IoT and AI-based surrogate models. J. Agric. Food Res. 2022, 9, 00357. [Google Scholar] [CrossRef]
  36. Zhao, J.; Xu, D.; Zhou, C.; Sun, C.; Yang, X. Simulation of collective swimming behavior of fish schools using a modified social force and kinetic energy model. Ecol. Model. 2017, 360, 200–210. [Google Scholar]
  37. Schraml, R.; Hofbauer, H.; Jalilian, E.; Bekkozhayeva, D.; Saberioon, M.; Cisar, P.; Uhl, A. Towards fish individuality-based aquaculture. IEEE Trans. Ind. Inform. 2021, 17, 4356–4366. [Google Scholar] [CrossRef]
  38. Fore, M.; Alver, M.; Alfredsen, J.A.; Marafioti, G.; Senneset, G.; Birkevold, J.; Willumsen, F.V.; Lange, G.; Espmark, A.; Terjesen, B.F. Modelling growth performance and feeding behaviour of Atlantic salmon (Salmo salar L.) in commercial-size aquaculture net pens: model details and validation through full-scale experiments. Aquaculture 2016, 464, 268–278. [Google Scholar] [CrossRef]
  39. Islam, M.M. Real-time IoT dataset of pond water for fish farming (multi-pond). Data Brief 2023, 49, 10911. [Google Scholar]
  40. Prapti, D.R.; Mohamed Shariff, A.R.; Che Man, H.; Ramli, N.M.; Perumal, T.; Shariff, M. Internet of Things (IoT)-based aquaculture: An overview of IoT application on water quality monitoring. Rev. Aquac. 2022, 14, 979–992. [Google Scholar] [CrossRef]
  41. Ma, F.; Fan, Z.; Nikolaeva, A.; Bao, H. Redefining aquaculture safety with artificial intelligence: design innovations, trends and future perspectives. Fishes 2025, 10, 88. [Google Scholar] [CrossRef]
  42. Rastegari, H.; Nadi, F.; Lam, S.S. Internet of Things in aquaculture: a review of the challenges and potential solutions based on current and future trends. Smart Agric. Technol. 2023, 4, 100187. [Google Scholar] [CrossRef]
  43. Mustapha, U.F.; Alhassan, A.W.; Jiang, D.N.; Li, G.L. Sustainable aquaculture development: a review on the roles of cloud computing, internet of things and artificial intelligence (CIA). Rev. Aquac. 2021, 13, 2076–2091. [Google Scholar] [CrossRef]
  44. Sun, M.; Yang, X.F.; Xie, Y.G. Deep learning in aquaculture: a review. J. Comput. 2020, 31, 294–310. [Google Scholar]
  45. Chen, C.; Li, X.; Huang, Y.; Xu, D.; Zhou, C.; Sun, C. Fish behavior classification using image texture features and support vector machines. Comput. Electron. Agric. 2018, 155, 131–138. [Google Scholar]
  46. Qiao, F.; Zhou, C.; Xu, D.; Sun, C.; Yang, X. Automatic analysis of fish location and quantity in aquaculture ponds using image preprocessing and edge detection. Comput. Electron. Agric. 2015, 119, 42–49. [Google Scholar]
  47. Aung, T.; Abdul Razak, R.; Rahiman, M.D.; Nor, A. Artificial intelligence methods used in various aquaculture applications: a systematic literature review. J. World Aquac. Soc. 2025, 56, e13107. [Google Scholar] [CrossRef]
  48. Iqbal, M.A.; Wang, Z.J.; Ali, Z.A. Automatic fish species classification using deep convolutional neural networks. Wirel. Pers. Commun. 2021, 116, 1043–1053. [Google Scholar] [CrossRef]
  49. Huang, Y.P.; Khabusi, S.P. Artificial intelligence of things (AIoT) advances in aquaculture: a review. Processes 2025, 13, 73. [Google Scholar] [CrossRef]
  50. Arepalli, P.G. IoT-based DSTCNN for aquaculture water-quality monitoring. Aquac. Eng. 2024, 108, 102369. [Google Scholar]
  51. Shete, R.P. IoT-enabled real-time WQ monitoring for aquafarming using Arduino measurement. Sensors 2024, 27, 10064. [Google Scholar]
  52. Khan, P.W.; Byun, Y.C. Optimized dissolved oxygen prediction using genetic algorithm and bagging ensemble learning for smart fish farm. IEEE Sens. J. 2023, 23, 15153–15164. [Google Scholar] [CrossRef]
  53. Liu, J.; Zhang, T.; Han, G.J. TD-LSTM: temporal dependence-based LSTM networks for marine temperature prediction. Sensors 2018, 18, 379. [Google Scholar] [CrossRef]
  54. Ren, H.; Wang, X.; Li, W.; Wei, Y.; An, D. Research of dissolved oxygen prediction in recirculating aquaculture systems based on deep belief network. Aquac. Eng. 2020, 90, 102085. [Google Scholar] [CrossRef]
  55. Claireaux, G.; Couturier, C.; Groison, A.L. Effect of temperature on maximum swimming speed and cost of transport in juvenile European sea bass (Dicentrarchus labrax). J. Exp. Biol. 2006, 209, 3420–3428. [Google Scholar] [CrossRef]
  56. Koumoundouros, G.; Sfakianakis, D.G.; Divanach, P.; Kentouri, M. Effect of temperature on swimming performance of sea bass juveniles. J. Fish Biol. 2002, 60, 923–932. [Google Scholar] [CrossRef]
  57. Hu, W.C.; Chen, L.B.; Huang, B.K.; Lin, H.M. A computer vision-based intelligent fish feeding system using deep learning techniques for aquaculture. IEEE Sens. J. 2023, 22, 7185–7194. [Google Scholar] [CrossRef]
  58. Hu, W.C.; Chen, L.B.; Wang, B.H. Design and implementation of a full-time artificial intelligence of things-based water quality inspection and prediction system for intelligent aquaculture. IEEE Sens. J. 2024, 24, 3811–3821. [Google Scholar] [CrossRef]
  59. Kumar, D.S.; Prabhaker, L.C.; Shanmugapriya, T. Water quality evaluation and monitoring model (WQEM) using machine learning techniques with IoT. Water Resour. 2024, 51, 1094–1110. [Google Scholar] [CrossRef]
  60. Baena-Navarro, R.; Carriazo-Regino, Y.; Torres-Hoyos, F.; Pinedo-López, J. Intelligent prediction & continuous monitoring of pond water quality with ML + quantum optimization. Water 2025, 17, 82. [Google Scholar]
  61. Eneh, A.H.; Udanor, C.N.; Ossai, N.I.; Aneke, S.O.; Ugwoke, P.O.; Obayi, A.A.; Ugwuishiwu, C.H.; Okereke, G.E. Improving IoT sensor data quality in aquaculture WQ systems (LoRa/Arduino cases). Sensors 2023, 26, 100625. [Google Scholar]
  62. Arepalli, P.G.; Khetavath, J.N. An IoT framework for quality analysis of aquatic water data using time-series convolutional neural network. Environ. Sci. Pollut. Res. 2023, 30, 125275–125294. [Google Scholar] [CrossRef] [PubMed]
  63. Nayoun, M.N.I.; Hossain, S.A.; Rezaul, K.M.; Siddiquee, K.N.E.A.; Islam, M.S.; Jannat, T. Internet of Things-driven precision in fish farming: A deep dive into automated temperature, oxygen, and pH regulation. Computers 2024, 13, 267. [Google Scholar] [CrossRef]
  64. Lu, H.Y.; Cheng, C.Y.; Cheng, S.C. A low-cost AI buoy system for monitoring water quality at offshore aquaculture cages. Sensors 2022, 22, 4078. [Google Scholar] [CrossRef]
  65. Chen, C.H.; Wu, Y.C.; Zhang, J.X.; Chen, Y.H. IoT-based fish farm water quality monitoring system. Sensors 2022, 22, 6700. [Google Scholar] [CrossRef]
  66. Lin, J.Y.; Tsai, H.; Lyu, W.H. An integrated wireless multi-sensor system for monitoring the water quality of aquaculture. Sensors 2021, 21, 8179. [Google Scholar] [CrossRef] [PubMed]
  67. Flores-Iwasaki, M.; Guadalupe, G.A.; Pachas-Caycho, M.; Chapa-Gonza, S.; Mori-Zabarburú, R.C.; Guerrero-Abad, J.C. IoT sensors for water-quality monitoring in aquaculture: systematic review & bibliometrics (2020–2024). AgriEngineering 2025, 7, 78. [Google Scholar]
  68. Zhang, T.; Yang, Y.; Liu, Y.; Liu, C.; Zhao, R.; Li, D.; Shi, C. Fully automatic system for fish biomass estimation based on deep neural network. Ecol. Inform. 2024, 79, 102399. [Google Scholar] [CrossRef]
  69. Bravata, N.; Kelly, D.; Eickholt, J.; Bryan, J.; Miehls, S. Applications of deep convolutional neural networks to predict length, circumference, and weight from mostly dewatered images of fish. Ecol. Evol. 2020, 10, 9313–9325. [Google Scholar] [CrossRef]
  70. Lopez-Tejeida, S.; Soto-Zarazua, G.M.; Toledano-Ayala, M.; Contreras-Medina, L.M.; Rivas-Araiza, E.A.; Flores-Aguilar, P.S. An improved method to obtain fish weight using machine learning and NIR camera with haar cascade classifier. Appl. Sci. 2023, 13, 69. [Google Scholar] [CrossRef]
  71. Mittún, Ó.F.; Andersen, L.E.J.; Svendsen, M.B.S.; Steffensen, J.F. An inexpensive 3D camera system based on a completely synchronized stereo camera, open-source software, and a Raspberry Pi for accurate fish size, position, and swimming speed. Fishes 2025, 10, 139. [Google Scholar] [CrossRef]
  72. Hu, W.C.; Chen, B.; Huang, B.K. A computer vision-based intelligent fish feeding system using deep learning techniques for aquaculture. IEEE Sens. J. 2022, 22, 7185–7194. [Google Scholar] [CrossRef]
  73. Xiao, Y. Review: computer vision for fish feeding-behaviour analysis & practice. Appl. Anim. Behav. Sci. 2025, 271, 105880. [Google Scholar]
  74. Måløy, H.; Aamodt, A.; Misimi, E. A spatio-temporal recurrent network for salmon feeding action recognition from underwater videos in aquaculture. Comput. Electron. Agric. 2019, 167, 105084. [Google Scholar] [CrossRef]
  75. An, D.; Huang, J.; Wei, Y. A survey of fish behaviour quantification indexes and methods in aquaculture. Rev. Aquac. 2021, 13, 2169–2189. [Google Scholar] [CrossRef]
  76. Chen, I.H.; Georgopoulou, D.G.; Ebbesson, L.O.E.; Voskakis, D.; Lal, P.; Papandroulakis, N. Food anticipatory behaviour on European seabass in sea cages: activity-, positioning- and density-based approaches. Front. Mar. Sci. 2023, 10, 1–14. [Google Scholar] [CrossRef]
  77. Wei, X.; Zhang, Y.; Liu, J.; Zhang, Y.; Li, D. A customized recurrent neural network for fish behavior analysis. Aquaculture 2021, 544, 737140. [Google Scholar]
  78. Zhang, Z.; Zou, B.; Hu, Q.; Li, W. Multimodal knowledge distillation framework for fish feeding behaviour recognition in industrial aquaculture. Biosyst. Eng. 2025, 255, 104170. [Google Scholar] [CrossRef]
  79. Feng, M.; Jiang, P.; Wang, Y.; Hu, S.; Chen, S.; Li, R.; Huang, H.; Li, N.; Zhang, B.; Ke, Q.; Zhang, Y.; Xu, P. YOLO-feed: An advanced lightweight network enabling real-time, high-precision detection of feed pellets on CPU devices and its applications in quantifying individual fish feed intake. Aquaculture 2025, 608, 742700. [Google Scholar] [CrossRef]
  80. Georgopoulou, D.G.; Vouidaskis, C.; Papandroulakis, N. Swimming behavior as a potential metric to detect satiation levels of European seabass in marine cages. Front. Mar. Sci. 2024, 11, 135038. [Google Scholar] [CrossRef]
  81. Cai, Y.; Li, J.; Zhou, X.; Wang, L. A two-stage framework for fish behavior recognition: modified YOLOv8 and ResNet-like model. Aquaculture 2024, 575, 112345. [Google Scholar]
  82. Yang, Y.; Yu, H.; Zhang, X.; Zhang, P.; Tu, W.; Gu, L. Fish behavior recognition based on an audio-visual multimodal interactive fusion network. Aquac. Eng. 2024, 107, 102471. [Google Scholar] [CrossRef]
  83. Wu, S.; Yang, T.; Lin, J.; Li, M.; Chen, X.; Li, D. DeformAtt-ViT: A largemouth bass feeding intensity assessment method based on Vision Transformer with deformable attention. J. Mar. Sci. Eng. 2024, 12, 726. [Google Scholar] [CrossRef]
  84. Ni, W.; Wei, D.; Peng, Z.; Ma, Z.; Zhu, S.; Tang, R.; Tian, X.; Zhao, J.; Ye, Z. An appetite assessment method for fish in outdoor ponds with anti-shadow disturbance. Comput. Electron. Agric. 2024, 221, 108940. [Google Scholar] [CrossRef]
  85. Zhao, H.X.; Cui, H.W.; Qu, K.M. A fish appetite assessment method based on improved ByteTrack and spatiotemporal graph convolutional network. Biosyst. Eng. 2024, 240, 46–55. [Google Scholar] [CrossRef]
  86. Yang, H.; Shi, Y.; Wang, X.; Wang, J.; Jia, B.; Zhou, C.; Ye, H. Detection method of fry feeding status based on YOLO lightweight network by shallow underwater images. Electronics 2022, 11, 3856. [Google Scholar] [CrossRef]
  87. Zeng, Q.; Liu, H.; Sun, Y.; Zhao, W.; Chen, D.; Li, D. Fish behavior recognition using audio spectrum swin transformer network. Aquac. Eng. 2023, 101, 102320. [Google Scholar]
  88. Zheng, K.; Wang, H.; Yang, T.; Liu, M.; Chen, L.; Xu, D. Spatio-temporal attention network for swimming and spatial features of pompano. Sensors 2023, 23, 3124. [Google Scholar]
  89. Du, Y.; Zhang, H.; Chen, X.; Li, Y. Fish broodstock behavior recognition using ResNet50-LSTM. Comput. Electron. Agric. 2022, 198, 106987. [Google Scholar]
  90. Du, Y.; Zhang, H.; Chen, X.; Li, Y. LC-GhostNet: Lightweight multimodal neural network for fish behavior recognition. Comput. Electron. Agric. 2023, 208, 107780. [Google Scholar]
  91. Feng, S.; Yang, X.; Liu, Y.; Zhao, Z.; Liu, J.; Yan, Y.; Zhou, C. Fish feeding intensity quantification using machine vision and a lightweight 3D ResNet-GloRe network. Aquac. Eng. 2022, 98, 102240. [Google Scholar] [CrossRef]
  92. Zhang, L.; Wang, J.; Li, B.; Liu, Y.; Zhang, H.; Duan, Q. A MobileNetV2-SENet-based method for identifying fish school feeding behavior. Aquac. Eng. 2022, 99, 102288. [Google Scholar] [CrossRef]
  93. Wang, H.; Zhang, S.; Zhao, S.L. Real-time detection and tracking of fish abnormal behavior based on improved YOLOv5 and SiamRPN++. Comput. Electron. Agric. 2022, 192, 106512. [Google Scholar] [CrossRef]
  94. Liu, J.; Chen, X.; Zhang, J.; Wang, H.; Li, D. CFFI-ViT: enhanced vision transformer for the accurate classification of fish feeding intensity. J. Mar. Sci. Eng. 2024, 12, 1132. [Google Scholar] [CrossRef]
  95. Zhao, S.; Ding, W.; Zhao, S.; Gu, J. Adaptive neural fuzzy inference system for feeding decision-making of grass carp (Ctenopharyngodon idellus) in outdoor intensive culturing ponds. Aquaculture 2019, 498, 28–36. [Google Scholar] [CrossRef]
  96. Wang, G.X.; Muhammad, A.; Liu, C. Automatic recognition of fish behavior with a fusion of RGB and optical flow data based on deep learning. Animals 2021, 11, 2774. [Google Scholar] [CrossRef]
  97. Ubina, F.C.; Estuar, M.R.J.E.; Ubina, C.D. Optical flow neural network for fish swimming behavior and activity analysis. Appl. Artif. Intell. 2021, 35, 1409–1424. [Google Scholar]
  98. Yang, X.; Zhang, S.; Liu, J.; Gao, Q.; Dong, S.; Zhou, C. Deep learning for smart fish farming: applications, opportunities and challenges. Rev. Aquac. 2021, 13, 66–90. [Google Scholar] [CrossRef]
  99. Saminiano, B. Feeding behavior classification of Nile Tilapia (Oreochromis niloticus) using convolutional neural network. Int. J. Adv. Trends Comput. Sci. Eng. 2020, 9, 259–263. [Google Scholar] [CrossRef]
  100. Zhang, Y.; Wang, J.; Duan, Q. Application of convolutional neural networks (CNN) for fish feeding detection. J. Aquac. Res. Dev. 2020, 11, 543–550. [Google Scholar]
  101. Fernandes, R.; Turra, E.M.; de Alvarenga, R.; Passafaro, T.L.; Lopes, F.B.; Alves, G.F.; Singh, V.; Rosa, G.J. Deep learning-based analysis of fish feeding images using CNNs. Aquac. Rep. 2020, 18, 100426. [Google Scholar]
  102. Zhou, C.; Xu, D.; Sun, C.; Yang, X.; Chen, L. Delaunay triangulation and texture analysis for fish behavior recognition in aquaculture. Aquac. Res. 2018, 49, 1751–1762. [Google Scholar]
  103. Liu, Z.; Li, X.; Fan, L.; Lu, H.; Liu, L.; Liu, Y. Measuring feeding activity of fish in RAS using computer vision. Aquac. Eng. 2014, 60, 20–27. [Google Scholar] [CrossRef]
  104. Adegboye, M.A.; Aibinu, A.M.; Kolo, J.G. Incorporating intelligence in fish feeding system for dispensing feed based on fish feeding intensity. IEEE Access 2020, 8, 91948–91960. [Google Scholar] [CrossRef]
  105. Atoum, Y.; Srivastava, S.; Liu, X.M. Automatic feeding control for dense aquaculture fish tanks. IEEE Signal Process. Lett. 2015, 22, 1089–1093. [Google Scholar] [CrossRef]
  106. Huang, M.; Zhou, Y.G.; Yang, X.G.; Gao, Q.F.; Chen, Y.N.; Ren, Y.C.; Dong, S.L. Optimizing feeding frequencies in fish: a meta-analysis and machine learning approach. Aquaculture 2024, 595, 741678. [Google Scholar] [CrossRef]
  107. Cao, J.; Wang, Y.; Chen, H.; Zhou, C. Enhanced CNN frameworks for identifying feeding behavior in aquaculture. Aquaculture 2023, 561, 738682. [Google Scholar]
  108. Vijayalakshmi, M.; Sasithradevi, A. AquaYOLO: Advanced YOLO-based fish detection for optimized aquaculture pond monitoring. Sci. Rep. 2025, 15, 6151. [Google Scholar] [CrossRef]
  109. Zhou, C.; Xu, D.; Chen, L.; Zhang, S.; Sun, C.; Yang, X.; Wang, Y. Evaluation of fish feeding intensity in aquaculture using a convolutional neural network and machine vision. Aquaculture 2019, 507, 457–466. [Google Scholar] [CrossRef]
  110. Cai, K.; Yang, Z.; Gao, T.; Liang, M.; Liu, P.; Zhou, S.; Pang, H.; Liu, Y. Efficient recognition of fish feeding behavior: a novel two-stage framework pioneering intelligent aquaculture strategies. Comput. Electron. Agric. 2024, 224, 109129. [Google Scholar] [CrossRef]
  111. Gu, X.Y.; Zhao, S.L.; Duan, Y.Q. MMFINet: a multimodal fusion network for accurate fish feeding intensity assessment in recirculating aquaculture systems. Comput. Electron. Agric. 2025, 232, 110138. [Google Scholar] [CrossRef]
  112. Liu, J.; Becerra, A.T.; Bienvenido-Barcena, J.F.; Yang, X.; Zhao, Z.; Zhou, C. CFFI-ViT: enhanced vision transformer for the accurate classification of fish feeding intensity in aquaculture. J. Mar. Sci. Eng. 2024, 12, 1132. [Google Scholar] [CrossRef]
  113. Ma, P.; Yang, X.; Hu, W.; Fu, T.; Zhou, C. Fish feeding behavior recognition using time-domain and frequency-domain signals fusion from six-axis inertial sensors. Comput. Electron. Agric. 2024, 227, 109652. [Google Scholar] [CrossRef]
  114. Du, M.; Cui, X.; Xu, Z.; Bai, J.; Han, W.; Li, J.; Yang, X.; Liu, C.; Wang, D. Harnessing multimodal data fusion to advance accurate identification of fish feeding intensity. Biosyst. Eng. 2024, 246, 135–149. [Google Scholar] [CrossRef]
  115. Nayan, A.A.; Saha, J.; Mozumder, A.N.; Mahmud, K.R.; Al Azad, A.K.; Kibria, M.G. A machine learning approach for early detection of fish diseases by analyzing water quality. Trends Sci. 2021, 18, 351. [Google Scholar] [CrossRef]
  116. O’Donncha, F.; Stockwell, C.L.; Planellas, S.R.; Micallef, G.; Palmes, P.; Webb, C.; Filgueira, R.; Grant, J. Data driven insight into fish behaviour and their use for precision aquaculture. Front. Anim. Sci. 2021, 2, 1–12. [Google Scholar] [CrossRef]
  117. Mandal, A.; Ghosh, A.R. Role of artificial intelligence (AI) in fish growth and health status monitoring: A review on sustainable aquaculture. Aquac. Int. 2024, 32, 2791–2820. [Google Scholar] [CrossRef]
  118. Costa, C.S.; Goncalves, W.N.; Zanoni, V.A.G.; Dos Santos De Arruda, M.; de Araujo Carvalho, M.; Nascimento, E.; Marcato, J.; Diemer, O.; Pistori, H. Counting tilapia larvae using images captured by smartphones. Smart Agric. Technol. 2023, 4, 10016.
  119. Cui, M.; Liu, X.B.; Liu, H.H. Fish tracking, counting and behaviour analysis in digital aquaculture: a comprehensive survey. Rev. Aquac. 2025, 17, e13001.
  120. Sadoul, B.; Alfonso, S.; Cousin, X.; Prunet, P.; Bégout, M.L.; Leguen, I. Global assessment of the response to chronic stress in European sea bass. Aquaculture 2021, 544, 737072.
  121. Carbonara, P.; Alfonso, S.; Zupa, W.; Manfrin, A.; Fiocchi, E.; Pretto, T.; Spedicato, M.T.; Lembo, G. Behavioral and physiological responses to stocking density in sea bream (Sparus aurata): Do coping styles matter? Physiol. Behav. 2019, 212, 112698.
  122. Li, D.L.; Wang, G.X.; Du, L. Recent advances in intelligent recognition methods for fish stress behavior. Aquac. Eng. 2022, 96, 102222.
  123. Kolarevic, J.; Aas-Hansen, Ø.; Espmark, Å.; Baeverfjord, G.; Terjesen, B.F.; Damsgård, B. The use of acoustic acceleration transmitter tags for monitoring of Atlantic salmon swimming activity in recirculating aquaculture systems (RAS). Aquac. Eng. 2016, 72, 30–39.
  124. Martinez-Alpiste, I.; De Tailly, J.B.; Alcaraz-Calero, J.M. Machine learning-based understanding of aquatic animal behaviour in high-turbidity waters. Expert Syst. Appl. 2024, 255, 124804.
  125. Wang, X.; Li, P.; Chen, R.; Zhang, J.; Liu, Z. Appearance-motion autoencoder network (AMA-Net) for behavior recognition of Oplegnathus punctatus. Aquaculture 2023, 569, 739302.
  126. Huang, J.; Yu, X.; Chen, X.; An, D.; Zhou, Y.; Wei, Y. Recognizing fish behavior in aquaculture with graph convolutional network. Aquac. Eng. 2022, 98, 102246.
  127. Kong, L.; Xu, W.; Zhao, J.; Sun, F. Active learning with VGG16 for behavior detection in Oplegnathus punctatus. Aquac. Eng. 2022, 96, 102175.
  128. Hu, C.; Yang, X.; Xu, D.; Zhou, C.; Chen, L. Automated monitoring of fish behavior in recirculating aquaculture systems using edge detection and segmentation methods. Aquac. Eng. 2015, 67, 13–24.
  129. Sadoul, B.; Vijayan, M.M.; Schram, E.; Aluru, N.; Wendelaar Bonga, S.E. Physiological and behavioral responses to multiple stressors in farmed fish: application of imaging and dispersion indices. Aquaculture 2014, 432, 362–370.
  130. Pinkiewicz, T.H.; Purser, G.J.; Williams, R.N. A computer vision system to analyse the swimming behaviour of farmed fish in commercial aquaculture facilities: A case study using cage-held Atlantic salmon. Aquac. Eng. 2011, 45, 20–27.
  131. Han, F.F.; Zhu, J.C.; Liu, B. Fish shoals behavior detection based on convolutional neural network and spatiotemporal information. IEEE Access 2020, 8, 126907–126926.
  132. Hu, J.; Zhao, D.D.; Zhang, Y.F. Real-time nondestructive fish behavior detecting in mixed polyculture system using deep learning and low-cost devices. Expert Syst. Appl. 2021, 178, 115051.
  133. Iqbal, U.; Li, D.L.; Akhter, M. Intelligent diagnosis of fish behavior using deep learning method. Fishes 2022, 7, 201.
  134. Rutz, C.; Bronstein, M.; Raskin, A. Using machine learning to decode animal communication. Science 2023, 381, 152–155.
  135. Saad Saoud, L.; Sultan, A.; Elmezain, M. Beyond observation: deep learning for animal behavior and ecological conservation. Ecol. Inform. 2024, 84, 102893.
  136. Wang, J.H.; Lee, S.K.; Lai, Y.C.; Lin, C.C.; Wang, T.Y.; Lin, Y.R.; Hsu, T.H.; Huang, C.W.; Chiang, C.P. Anomalous behaviors detection for underwater fish using AI techniques. IEEE Access 2020, 8, 224372–224382.
  137. Zhao, Y.X.; Qin, H.X.; Xu, L.A. Review of deep learning-based stereo vision techniques for phenotype feature and behavioral analysis of fish in aquaculture. Artif. Intell. Rev. 2025, 58, 7.
  138. Zheng, T.; Wu, J.F.; Kong, H. A video object segmentation-based fish individual recognition method for underwater complex environments. Ecol. Inform. 2024, 82, 102689.
  139. Long, L.; Johnson, Z.V.; Li, J.; Lancaster, T.J.; Aljapur, V.; Streelman, J.T.; McGrath, P.T. Automatic classification of cichlid behaviors using 3D convolutional residual networks. iScience 2020, 23, 101591.
  140. Yang, H.; Zhou, C.; Shi, Y.; Wang, X.; Wang, J.; Ye, H. BlendMask-VoNetV2: Robust detection of overlapping fish behavior in aquaculture. Comput. Electron. Agric. 2023, 212, 108023.
  141. Huntingford, F.A.; Adams, C.; Braithwaite, V.A.; Kadri, S.; Pottinger, T.G.; Sandøe, P.; Turnbull, J.F. Current issues in fish welfare. J. Fish Biol. 2006, 68, 332–372.
  142. Chakravorty, H. New approach for disease fish identification using augmented reality and image processing technique. IPASJ 2021, 9, 3.
  143. Alfonso, S.; Sadoul, B.; Cousin, X.; Bégout, M.L. Spatial distribution and activity patterns as welfare indicators in response to water quality changes in European sea bass (Dicentrarchus labrax). Appl. Anim. Behav. Sci. 2020, 226, 104974.
  144. Ashley, P.J. Fish welfare: Current issues in aquaculture. Appl. Anim. Behav. Sci. 2007, 104, 199–235.
  145. Bohara, K.; Joshi, P.; Acharya, K.P.; Ramena, G. Emerging technologies revolutionising disease diagnosis and monitoring in aquatic animal health. Rev. Aquac. 2024, 16, 836–854.
  146. Ubina, N.A.; Lan, H.Y.; Cheng, S.C.; Chang, C.C.; Lin, S.S.; Zhang, K.X.; Lu, H.Y.; Cheng, C.Y.; Hsieh, Y.Z. Digital twin-based intelligent fish farming with Artificial Intelligence Internet of Things (AIoT). Smart Agric. Technol. 2023, 5, 100285.
  147. Lim, L.W.K. Implementation of artificial intelligence in aquaculture and fisheries: deep learning, machine vision, big data, internet of things, robots and beyond. J. Comput. Commun. Eng. 2024, 3, 112–118.
Figure 1. The main stages of an intelligent feeding system for fish.
Figure 2. Example of a smart aquaculture farm.
Table 1. AI feeding monitoring methods in aquaculture production.
Method/Model | Data Type | Application | Reference
MMKDR | image based | feeding behaviour & intensity quantification | [78]
YOLO feed | image based | feed intake | [79]
Feedforward neural network (FFN) | water surface fluctuations | feeding behaviour |
YOLOv5 | image based | feeding behaviour (Dicentrarchus labrax) | [80]
MobileViT-SENet | image based | fish density & feeding intensity in outdoor ponds | [68]
YOLOv8 model | image based | fish swimming behaviour and activity degree | [81]
Mul-SEResNet50 | multi information | sound and activity degree (Oncorhynchus mykiss) | [82]
DeformAtt-ViT | image based | swimming behaviour and activity degree (largemouth bass) | [83]
RCNN | video based | activity degree (Ctenopharyngodon idella) | [84]
FishFeed methods | video based | fish density and spatial information | [85]
CNN | image based | feeding behaviour classification | [86]
BlendMask-VoNetV2 | video based | fish swimming behaviour and activity degree | [87]
Audio spectrum swin transformer network | multi information | sound and activity level (Oncorhynchus mykiss) | [88]
STAN | multi information | swimming and spatial features (pompanos) | [89]
MMTM | multi information | sound and activity degree (Oncorhynchus aguabonita) | [56]
LC-GhostNet lightweight network | multi information | sound and activity degree (Oplegnathus punctatus) | [90]
MSIF-MobileNetV3 | image based | swimming behaviour and activity degree (Oplegnathus punctatus) | [6]
ResNet50-LSTM | video based | swimming behaviour and activity degree | [91]
3D ResNet-GloRe | video based | swimming behaviour and activity degree (Oncorhynchus mykiss) | [92]
MobileNetV2-SENet | image based | swimming behaviour and activity degree (Plectropomus leopardus) | [93]
Multi-task network | multi information | group activity level (Oplegnathus punctatus) | [94]
Long-term recurrent convolutional network | image based | feeding behaviour (grass and crucian carps) | [72]
YOLOv4 | image based | water feed detection | [95]
CNN | image based | water feed detection | [96]
CNN | image based | water feed detection | [97]
Customized recurrent neural network | video based | swimming behaviour and activity degree (American black bass) | [77]
Optical flow neural network | image based | fish swimming behaviour and activity degree | [98]
Dual attention network-EfficientNetB2 | image based | feeding behaviour | [99]
Optical flow model | optical flow | feeding behaviour recognition | [76]
CNN | image based | feeding behaviour (tilapia) | [100]
CNN | image based | fish recognition / feeding detection | [101]
CNN | image based | feeding behaviour detection | [102]
Dual-Stream Recurrent Network | video based | spatial and motion information (Atlantic salmon) | [74]
CNN | image based | feeding behaviour (tilapia) | [103]
RNN | sensors | fish growth/environment modeling | [52]
Computer vision feeding index | image based | feeding activity assessment | [104]
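The simplest entries in Table 1, such as the optical-flow model [76] and the computer vision feeding index [104], reduce to measuring how much pixel-level motion occurs between consecutive frames: vigorous feeding produces large inter-frame differences, while a calm school produces almost none. A minimal NumPy sketch of such a frame-differencing activity index is shown below; the function name, normalisation, and synthetic frames are illustrative and not taken from any cited work.

```python
import numpy as np

def activity_index(frames):
    """Mean absolute inter-frame difference, normalised to [0, 1].

    frames: sequence of 2-D uint8 grayscale arrays of equal shape.
    Values near 0 indicate a calm scene; vigorous feeding raises the index.
    """
    diffs = [
        np.abs(frames[i + 1].astype(np.int16) - frames[i].astype(np.int16))
        for i in range(len(frames) - 1)
    ]
    return float(np.mean(diffs) / 255.0)

# Synthetic demo: a static scene vs. a bright "fish" moving between frames.
static = [np.full((64, 64), 30, dtype=np.uint8)] * 5
moving = []
for t in range(5):
    f = np.full((64, 64), 30, dtype=np.uint8)
    f[10 + 5 * t : 20 + 5 * t, 10:20] = 220  # blob displaced each frame
    moving.append(f)

print(activity_index(static))              # no motion -> 0.0
print(activity_index(moving) > 0.0)        # motion detected -> True
```

In a real intelligent feeding system this scalar would be smoothed over time and thresholded (or fed to a classifier) to decide when satiation has been reached and feed delivery should stop.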
Table 2. AI fish behaviour monitoring methods.
Method/Model | Data Type | Application | References
AquaYOLO | image based | fish detection | [109]
AMA-Net | video based | appearance & motion (Oplegnathus punctatus) | [126]
Graph Convolution Networks (GCN) | image based | swimming/spatial features (Oncorhynchus mykiss) | [127]
VGG16 + active learning | image based | swimming behaviour (Oplegnathus punctatus) | [128]
Multi-task network | multi modal | group activity level (Oplegnathus punctatus) | [94]
CNN | image based | fish recognition | [9]
CNN | image based | spatial information (tilapia) | [100]
LeNet-5 | image based | swimming behaviour & activity (tilapia) | [108]
Recurrent network | image based | fish behaviour recognition | [44]
SVM + image texture | image based | fish behaviour analysis | [94]
Social force model | simulation | fish schooling dynamics | [35]
Grayscale + edge detection | image based | fish location & quantity | [45]
Image edge detection + threshold segmentation | image based | behaviour analysis | [129]
Image processing methods | image based | group dispersion & activity index | [130]
Adaptive threshold + edge detection | image based | swimming velocity & direction | [131]
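The classical pipelines at the bottom of Table 2 (grayscale thresholding followed by segmentation, as in [45,129,131]) can be illustrated with a toy example: segment bright fish silhouettes against a darker tank background, then count the connected foreground regions to estimate fish quantity. The sketch below uses only NumPy; the function names and the threshold offset are illustrative assumptions, not details from the cited papers.

```python
import numpy as np

def segment_fish(gray, offset=20):
    """Adaptive global threshold: pixels brighter than (image mean + offset)
    are treated as foreground, i.e. fish against a darker background."""
    return gray.astype(np.int16) > (gray.mean() + offset)

def count_blobs(mask):
    """Count 4-connected foreground components with an iterative flood fill."""
    mask = mask.copy()
    h, w = mask.shape
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y, x]:
                count += 1
                stack = [(y, x)]
                while stack:  # clear this whole component
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and mask[cy, cx]:
                        mask[cy, cx] = False
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return count

# Synthetic frame: dark background (40) with three bright "fish".
frame = np.full((48, 48), 40, dtype=np.uint8)
frame[5:10, 5:15] = 200
frame[20:24, 30:42] = 210
frame[35:40, 10:18] = 190

print(count_blobs(segment_fish(frame)))  # three separated blobs -> 3
```

Tracking the centroids of these blobs across frames is what yields the swimming velocity and direction estimates reported in [131]; deep-learning detectors such as AquaYOLO replace the hand-crafted threshold with learned features but keep the same detect-then-track structure.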
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.