Preprint
Technical Note

This version is not peer-reviewed.

Application of a Lightweight, Open-Hardware Wearable System for Robust Behavior Monitoring in Precision Livestock Farming

Submitted: 19 April 2026
Posted: 21 April 2026


Abstract
Precision livestock farming (PLF) is hindered by high costs, infrastructure demands, and complex deployment. To address these barriers, we developed a practical, open-hardware wearable system for real-time movement tracking and behavior classification in pasture-based livestock. The collar-mounted device integrates a 6-axis IMU, GPS, and a low-power microcontroller within a modular architecture, and uses a novel routerless communication protocol to transmit data directly to a base station—eliminating reliance on network infrastructure. The system supports two operational modes: (1) synchronized data logging for video annotation, or (2) real-time embedded behavior classification. A year-long field trial confirmed robust performance, minimal animal disturbance, and resilience under harsh conditions. Remarkably, the system achieved near-perfect true positive rates (>0.95) for both basic and subtle behaviors using only one hour of annotated video for training, drastically reducing labeling effort. All design assets—CAD files, schematics, firmware, bill of materials, and protocols—are openly released to ensure full reproducibility. This work delivers a validated, scalable, and accessible tool that lowers entry barriers for researchers and developers, enabling rapid deployment and community-driven adaptation for diverse livestock applications.
Subject: Engineering - Bioengineering

1. Introduction

Precision livestock farming (PLF) integrates sensing, connectivity, and data analytics to support real-time monitoring of animal behavior, health, and productivity. As global demand for animal-source foods rises alongside pressures to reduce environmental footprints and labor dependency, PLF has emerged as a critical enabler of sustainable intensification—not only in high-input systems but also in smallholder and pasture-based operations across diverse agroecological zones [1,2]. Behavior monitoring, in particular, serves as a non-invasive proxy for welfare, health status, and feeding efficiency, with applications ranging from estrus detection to early disease diagnosis [3].
Among sensing modalities, inertial measurement units (IMUs) have gained prominence for their ability to discriminate key behaviors—such as grazing, ruminating, resting, and walking—in ruminants under field conditions [4,5]. However, many existing IMU-based systems remain constrained by high energy consumption, reliance on cellular or long-range wireless infrastructure (e.g., 4G, LoRaWAN), and bulky form factors that limit deployment in extensive or topographically complex grazing systems [6]. These limitations hinder scalability in low-resource settings and reduce practicality for multi-animal monitoring without centralized gateways or cloud processing.
To address these challenges, we present CABRA (Collar-Assisted Behavior Recognition Analysis): a lightweight, open-hardware platform designed for robust, infrastructure-free behavior monitoring in ruminants. CABRA combines a collar-mounted node (6-axis IMU + ESP32 microcontroller) with local ESP-NOW communication to enable low-power, real-time data transmission across multiple animals without dependence on internet connectivity or proprietary cloud services. Its lightweight design (~60 g) allows deployment on human-habituated animals, facilitating secure attachment and ground-truth validation via synchronized video (practically limited to ~20 m for clear behavioral observation), while maintaining ESP-NOW connectivity at distances up to 100 m. IMU data are sampled at 20 Hz and can be used either for offline behavioral annotation or, following model training, for on-device classification using lightweight machine learning pipelines.
Initially validated for foraging behavior detection in dairy goats, CABRA is among the first PLF systems to exploit ESP-NOW for scalable, multi-animal telemetry in pasture environments. Critically, the entire platform—including 3D-printable housings, circuit schematics, firmware, and data-processing scripts—is released under open-source licenses to promote reproducibility, adaptation, and equitable access across research and farming communities worldwide.

2. Materials and Methods

The CABRA (Collar-Assisted Behavior Recognition Analysis) system is an open-source, modular platform designed for real-time, infrastructure-free monitoring of livestock behavior in field conditions. Figure 1 depicts its system architecture. The design prioritizes low cost, low power, ease of deployment, and adaptability across species and environments, and eliminates dependency on cloud platforms, cellular networks, or fixed gateways — making it suitable for pasture-based, remote, or topographically complex farms. The platform is fully open-source: hardware designs (KiCad), firmware (C++/Arduino), and data-processing code (Python) are released under the MIT License and archived in the public GitHub repository [7], with a permanent DOI at Zenodo.
CABRA follows a decentralized, peer-to-peer architecture composed of two primary components:
  • Collar-Mounted Sensor Nodes: lightweight units worn by the animals that capture inertial data (and, optionally, GPS position) at 20 Hz and can display real-time summaries on the integrated screen;
  • Base Station Unit: a standalone ESP32-based device that passively listens for data packets via ESP-NOW and logs them.
Data flows unidirectionally from collars to the receiver. No routers, access points, or internet connectivity are required. This design minimizes complexity, reduces power consumption, and enables deployment in areas with limited infrastructure.
The system is optimized for use with human-habituated livestock — animals tolerant of close human presence — allowing secure collar attachment and release, and synchronized video observation within a 20-meter operational radius. All firmware, PCB designs, 3D-printable enclosures, and data processing scripts are publicly available to ensure full reproducibility and community adaptation. Arduino C++ code for both the collar unit and the base station is provided in the public repository.

2.1. Sensor Node Hardware

Each collar node integrates three core components. Figure 2 depicts the circuit diagram:
  • Microcontroller: ESP32 TTGO (LILYGO), featuring a dual-core Xtensa LX6 processor (240 MHz, 520 KB SRAM), Wi-Fi/Bluetooth, and native ESP-NOW support. Selected for its low-power modes, robust wireless stack, GPIO flexibility, and integrated 1.14-inch colour SPI LCD screen.
  • Inertial Measurement Unit (IMU): Bosch BMI160, a 6-axis (3-axis accelerometer + 3-axis gyroscope) MEMS sensor with 16-bit resolution. Configured for ±2g (accelerometer) and ±500°/s (gyroscope) ranges — optimal for capturing fine-grained jaw and head movements associated with foraging and rumination. Sampled at 20 Hz via I²C bus (GPIO 21 = SDA, GPIO 22 = SCL).
  • GPS Module: u-blox NEO-6M, providing NMEA-formatted Position, Velocity, and Time (PVT) sentences [10] via UART (TTL serial, 3.3 V logic, with UART2 pins RXD2 = GPIO 26 and TXD2 = GPIO 27). Used for system time synchronization (described in Section 2.5) and spatial context (accuracy: 2.5 m CEP; update rate: 5 Hz).
All peripheral components interface with the ESP32 via JST 2.0 mm connectors that avoid boot-critical or “strapping” GPIO pins (Espressif Systems, 2023), ensuring reliable boot behavior, enabling safe disconnection during firmware programming, and simplifying field maintenance. Full schematics and PCB layouts are available in the public repository.
Data are transmitted wirelessly to a dedicated receiver unit, a standalone ESP32-based device that logs received data to a microSD card or streams it via USB-UART to a host computer, depending on deployment mode.
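As a concrete illustration of how UTC time is recovered from the NEO-6M, the sketch below parses the time and date fields of an NMEA RMC sentence in Python (the project's host-side language). The sample sentence is illustrative, not from a real CABRA log, and checksum validation is omitted for brevity.

```python
from datetime import datetime, timezone

def parse_rmc_utc(sentence: str) -> datetime:
    """Extract a UTC datetime from a $GPRMC sentence.

    Field 1 holds hhmmss.sss (UTC time of fix); field 9 holds ddmmyy.
    Checksum validation is omitted for brevity.
    """
    fields = sentence.split(",")
    hhmmss = fields[1]          # e.g. "123519.00"
    ddmmyy = fields[9]          # e.g. "230324"
    return datetime(
        year=2000 + int(ddmmyy[4:6]),
        month=int(ddmmyy[2:4]),
        day=int(ddmmyy[0:2]),
        hour=int(hhmmss[0:2]),
        minute=int(hhmmss[2:4]),
        second=int(hhmmss[4:6]),
        tzinfo=timezone.utc,
    )

# Illustrative RMC sentence (coordinates and checksum are placeholders)
rmc = "$GPRMC,123519.00,A,4807.038,N,01131.000,E,022.4,084.4,230324,003.1,W*6A"
print(parse_rmc_utc(rmc).isoformat())  # 2024-03-23T12:35:19+00:00
```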

2.2. Power and Enclosure

  • Battery: 1500 mAh LiPo (3.7 V), providing ~72 hours of continuous operation.
  • Power Management: ESP32 enters light-sleep mode between sensor reads, reducing average current draw to < 30 mA.
  • Enclosure: 3D-printed ABS housing (35 g), designed for IP67 dust/water resistance, with ventilation slots and strain relief for wiring. Collar attachment via adjustable, quick-release nylon strap (total collar weight < 80 g, under 0.5% of body weight for adult goats and cattle).
Field testing confirmed no adverse behavioral effects from collar weight or fit over 14-day deployment periods.
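A rough endurance check against the figures above: at the stated 1500 mAh capacity, the claimed ~72 h runtime implies an average draw of roughly 21 mA, comfortably below the < 30 mA bound. The assumed average current and fully usable capacity are simplifications, not measured values.

```python
capacity_mah = 1500        # LiPo capacity stated in Section 2.2
avg_current_ma = 21        # assumed average draw; the text states < 30 mA with light sleep
usable_fraction = 1.0      # assume full capacity usable down to cutoff (optimistic)

runtime_h = capacity_mah * usable_fraction / avg_current_ma
print(f"{runtime_h:.0f} h")  # ≈ 71 h, consistent with the ~72 h figure
```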

2.3. Data Acquisition and Packet Structure

Raw IMU data (accelerometer and gyroscope, sampled at 20 Hz) is buffered locally on the collar node and transmitted to the receiver in fixed-size packets via ESP-NOW. Each packet contains the following fields:
  • Node ID (8-bit unsigned integer) identifies source animal
  • Date (32-bit Unix timestamp, UTC) derived from GPS
  • Time (32-bit millisecond offset) sub-second precision within the day
  • AccX, AccY, AccZ (float, g’s) linear accelerations
  • GyroX, GyroY, GyroZ (float, °/s) angular velocities
  • CRC16 (16-bit) cyclic redundancy check appended for data integrity verification
Upon receipt, data are error-checked and logged as timestamped CSV records. The provided firmware supports raw USB-UART streaming at 115200 baud. On-collar logging (e.g., via OpenLog) was prototyped during development but ultimately excluded from the final design to minimize weight, power consumption, and potential points of failure — consistent with the system’s goal of enabling long-duration, low-maintenance deployment on free-moving livestock.
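Assuming the field order listed above, the packet can be modeled host-side with Python's struct module. The exact byte layout, endianness, and CRC polynomial used by the firmware are assumptions here (CRC-16/CCITT-FALSE is shown for illustration); the repository firmware is authoritative.

```python
import struct

FMT = "<BII6fH"  # NodeID, Date, Time, AccXYZ, GyroXYZ, CRC16 (little-endian, packed)

def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    """CRC-16/CCITT-FALSE; the polynomial the firmware uses may differ."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if crc & 0x8000 else (crc << 1)
            crc &= 0xFFFF
    return crc

def pack_packet(node_id, date_s, time_ms, acc, gyro):
    payload = struct.pack("<BII6f", node_id, date_s, time_ms, *acc, *gyro)
    return payload + struct.pack("<H", crc16_ccitt(payload))

def unpack_packet(raw: bytes):
    *fields, crc = struct.unpack(FMT, raw)
    if crc16_ccitt(raw[:-2]) != crc:
        raise ValueError("CRC mismatch")
    return fields

pkt = pack_packet(3, 1_713_500_000, 43_200_000, (0.01, -0.02, 0.98), (1.5, -0.3, 0.0))
print(len(pkt), unpack_packet(pkt)[0])  # 35 3 : 35-byte packet from node 3
```

The packed size (35 bytes) is smaller than the ~50 bytes quoted in Section 2.4; the difference is plausibly ESP-NOW framing, but that is not confirmed by the text.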

2.4. Wireless Communication

CABRA utilizes ESP-NOW, a connectionless, low-power, and encrypted peer-to-peer protocol developed by Espressif. This protocol enables direct device-to-device communication, eliminating the need for wireless routers, gateways, or internet infrastructure. This design is ideal for remote pastures and barns with limited connectivity.
The system is engineered for low-duty-cycle operation, transmitting structured data packets of approximately 50 bytes. Under typical field conditions, a single base station can reliably support a herd of up to 30 collared animals. For larger deployments, the open architecture facilitates the use of multiple base stations on non-overlapping channels to maintain data integrity and real-time performance.
Validated Field Performance:
  • Range: Effective communication up to 100 m line-of-sight, confirmed in pasture environments.
  • Latency: Consistently below 10 ms, supporting near real-time monitoring.
  • Robustness: The combination of encryption, peer-to-peer topology, and low-duty-cycle transmission ensures reliable operation amid animal movement and environmental interference.
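A back-of-the-envelope channel-load check for the stated herd size, assuming one ~50-byte packet per 20 Hz sample per collar (the firmware may batch samples, which would lower the packet rate):

```python
collars = 30                 # herd size per base station, Section 2.4
sample_rate_hz = 20          # IMU rate, Section 2.3
packet_bytes = 50            # approximate packet size, Section 2.4

packets_per_s = collars * sample_rate_hz
payload_kbps = packets_per_s * packet_bytes * 8 / 1000
print(packets_per_s, f"{payload_kbps:.0f} kbit/s")  # 600 packets/s, 240 kbit/s
```

The resulting ~240 kbit/s of payload sits well below ESP-NOW's nominal 1 Mbit/s link rate, though framing overhead and contention reduce usable capacity in practice.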

2.5. Video-IMU Synchronization

Temporal alignment between IMU data and video recordings was achieved using a shared GPS-derived time reference as shown in Figure 3. Each collar node acquires UTC-synchronized timestamps from its onboard u-blox NEO-6M GPS module. At the start of each recording session, all video cameras simultaneously recorded a 5–10 s clip of a designated “sync collar” displaying its Node ID and current GPS time on the integrated LCD. The offset between the video system’s clock (Android POSIX time) and GPS time was computed from this clip and applied uniformly to all IMU streams for the session. Video timestamps were extracted using ExifTool (v12.70) [9]. This method enabled drift-free synchronization across all sensors and cameras for sessions lasting 2–4 hours, with accuracy limited by video frame rate (40 ms at 25 fps). A full description of the time reference methodology is provided in Appendix A.
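The offset correction reduces to one subtraction and a uniform shift. The timestamps below are illustrative, not taken from a real session:

```python
# GPS time shown on the sync collar's LCD at the moment of the clip (UTC seconds)
gps_time_s = 1_713_520_800.000
# POSIX timestamp the Android camera assigned to the same video frame
camera_time_s = 1_713_520_797.720

offset_s = gps_time_s - camera_time_s  # camera clock lags GPS by this amount

# Shift every video-derived annotation timestamp onto the GPS time base
video_events_s = [1_713_520_810.0, 1_713_520_845.5]
aligned = [t + offset_s for t in video_events_s]
print(f"offset = {offset_s:+.3f} s")  # offset = +2.280 s
```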

2.6. Behavior Classification Pipeline

The system supports offline and embedded behavior classification using a two-stage pipeline: feature extraction followed by tree-based inference.
Feature extraction was implemented in Python for model development. A sliding window (1.5 s duration, 0.1 s step) was applied to synchronized IMU data to compute time- and frequency-domain features, including statistical descriptors (mean, standard deviation, skewness, etc.), signal magnitude, dominant FFT frequencies, and dynamic features (energy, entropy). The full feature set follows established approaches in livestock behavior recognition [Hammond et al., 2021; Zhang & Poslad, 2018].
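A minimal sketch of the windowing step for a single IMU axis, using the 1.5 s window and 0.1 s step given above. The repository's feature extractor computes a larger set; the five features here are representative examples only.

```python
import numpy as np

FS = 20                      # IMU sampling rate, Hz
WIN = int(1.5 * FS)          # 1.5 s window -> 30 samples
STEP = int(0.1 * FS)         # 0.1 s step   -> 2 samples

def window_features(signal: np.ndarray) -> np.ndarray:
    """Per-window features for one IMU axis: mean, std, skewness,
    signal energy, and dominant FFT frequency (Hz)."""
    rows = []
    for start in range(0, len(signal) - WIN + 1, STEP):
        w = signal[start:start + WIN]
        centered = w - w.mean()
        std = w.std()
        skew = (centered**3).mean() / std**3 if std > 0 else 0.0
        energy = float((w**2).mean())
        spectrum = np.abs(np.fft.rfft(centered))
        freqs = np.fft.rfftfreq(WIN, d=1 / FS)
        dom = float(freqs[spectrum.argmax()])
        rows.append([w.mean(), std, skew, energy, dom])
    return np.array(rows)

# Synthetic 2 Hz "chewing" oscillation on one accelerometer axis (10 s of data)
t = np.arange(0, 10, 1 / FS)
acc_x = 0.2 * np.sin(2 * np.pi * 2.0 * t) + 1.0
feats = window_features(acc_x)
print(feats.shape)  # (86, 5): 86 windows x 5 features
```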
Classification models were trained using Random Forest and XGBoost (scikit-learn and XGBoost libraries). For embedded deployment, trained models were converted to C code using emlearn [Nordby et al., 2019] and custom decision-tree serialization routines, enabling dependency-free execution on the ESP32. The final firmware supports real-time inference of five behavioral classes: resting, walking, selection, grazing, and rumination. All training scripts, feature extractors, and model-conversion tools are available in the public repository.
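The serialization idea can be illustrated in Python: a fitted scikit-learn tree reduces to flat arrays (feature index, threshold, child pointers, leaf class) that a dependency-free C loop can walk. The data below are synthetic, and this sketch stands in for, rather than reproduces, the repository's emlearn-based pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in for windowed IMU features: 300 windows x 5 features, 5 classes
X = rng.normal(size=(300, 5))
y = rng.integers(0, 5, size=300)
X[np.arange(300), y] += 3.0   # make each class separable on one feature

clf = RandomForestClassifier(n_estimators=10, max_depth=5, random_state=0).fit(X, y)

def tree_as_tables(tree):
    """Flatten one decision tree into the arrays a C inference loop needs."""
    t = tree.tree_
    return {
        "feature": t.feature.tolist(),            # -2 marks a leaf
        "threshold": t.threshold.tolist(),
        "left": t.children_left.tolist(),
        "right": t.children_right.tolist(),
        "leaf_class": t.value.argmax(axis=2).ravel().tolist(),
    }

def predict_one(tables, x):
    """Walk the flattened tree exactly as an embedded C loop would."""
    node = 0
    while tables["feature"][node] != -2:
        f, thr = tables["feature"][node], tables["threshold"][node]
        node = tables["left"][node] if x[f] <= thr else tables["right"][node]
    return tables["leaf_class"][node]

tables = tree_as_tables(clf.estimators_[0])
agree = predict_one(tables, X[0]) == clf.estimators_[0].predict(X[:1])[0]
print(agree)  # True: the flattened walk matches sklearn's own prediction
```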

3. Results

We evaluated the performance of five machine learning models—Logistic Regression, Support Vector Machine (SVM), Random Forest, XGBoost, and a Multilayer Perceptron (MLP)—for classifying five key livestock behaviors: grazing, ruminating, walking, resting, and foraging/selection. The assessment was based on data collected across four independent field sessions, which incorporated iterative hardware improvements and increasing IMU sampling rates (from 1 Hz to 20 Hz) to better resolve behavioral transitions under real-world grazing conditions.
For each session and model, stratified k-fold cross-validation was performed, and performance was quantified using class-wise F1-scores and normalized confusion matrices. Tree-based models consistently outperformed linear and neural network alternatives across all deployment phases. In the two final sessions—conducted with the mature CABRA configuration (20 Hz sampling, GPS-synchronized IMU streams, and refined collar ergonomics)—XGBoost achieved exceptional classification accuracy: the mean F1-score across all five behavioral classes exceeded 0.99, and true positive rates were ≥ 98% for every class except walking and grazing. Misclassifications were rare and primarily limited to brief transitional states (e.g., onset of rumination following grazing), consistent with known ambiguities in behavioral ethograms.
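The evaluation protocol corresponds to the following scaffold, with synthetic separable features standing in for the real annotated windows (behavior names taken from Section 2.6; a Random Forest is used here for brevity):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import f1_score, confusion_matrix

BEHAVIORS = ["resting", "walking", "selection", "grazing", "rumination"]

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
y = rng.integers(0, 5, size=500)
X[np.arange(500), y] += 3.0        # synthetic, separable features

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)
per_class_f1 = []
for train_idx, test_idx in skf.split(X, y):
    clf = RandomForestClassifier(n_estimators=50, random_state=1)
    clf.fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    per_class_f1.append(f1_score(y[test_idx], pred, average=None))

mean_f1 = np.mean(per_class_f1, axis=0)
for name, score in zip(BEHAVIORS, mean_f1):
    print(f"{name:>10}: F1 = {score:.3f}")

# Row-normalized confusion matrix for the last fold (true-positive rates on the diagonal)
cm = confusion_matrix(y[test_idx], pred, normalize="true")
```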
These results confirm that the selected feature set—derived from 1.5 s sliding windows of 6-axis IMU data—and the associated preprocessing pipeline effectively encode behaviorally discriminative patterns, even in uncontrolled pasture environments. Critically, the high separability between morphologically similar behaviors (e.g., ruminating vs. resting) demonstrates the system’s capacity for fine-grained behavioral monitoring without reliance on external infrastructure or post-hoc annotation.
Together, these findings validate the feasibility of deploying the full classification pipeline—including feature extraction and tree-based inference—directly on the ESP32 collar node for real-time, edge-based behavior recognition. A complete confusion matrix and per-class sample counts for the final validation session are provided in Figure 4.

4. Discussion

The high accuracy (>98% true positives, F1 > 0.99) achieved by the CABRA system in classifying fine-grained ruminant behaviors—particularly the distinction between ruminating and resting—addresses a long-standing challenge in precision livestock farming (PLF). Prior IMU-based systems have often struggled to resolve such subtle postural and jaw-movement differences without high sampling rates (>30 Hz) or multi-sensor fusion (e.g., accelerometry + acoustics) [3,5]. Our results demonstrate that a 20 Hz 6-axis IMU, combined with a well-engineered feature set and tree-based classifiers, is sufficient for robust discrimination under real-world conditions—striking a favorable balance between performance, power consumption, and computational feasibility on resource-constrained hardware.
This finding is particularly significant in light of the dominant commercial paradigm, where collar-based systems are primarily designed for geolocation and virtual fencing (e.g., Nofence, Pasture.io) rather than behavioral phenotyping. While useful for pasture rotation or boundary control, these platforms typically subsample movement data (often <1 Hz) and provide little to no on-device inference—limiting their utility for welfare monitoring or health surveillance. In contrast, CABRA’s infrastructure-free architecture, real-time edge classification, and open-hardware ethos position it as a complementary or alternative solution for researchers and farmers seeking granular, ethologically meaningful insights without cloud dependency—a critical advantage in remote or low-connectivity regions [8].
Moreover, the system’s modularity aligns with recent calls for interoperable, transparent PLF tools that empower users to inspect, modify, and extend functionality [2]. By releasing full schematics, firmware, and data pipelines under an open license, CABRA lowers barriers to innovation—especially for smallholder or public-sector actors who may lack access to proprietary platforms.
The current deployment focused on daytime, active-phase monitoring, which explains the absence of deep sleep modes in our power management strategy. However, this also highlights a key limitation: long-term autonomy remains constrained by battery capacity and static duty cycling. Future iterations will address this through adaptive sensing—dynamically adjusting sampling rates based on circadian patterns or behavioral state—to extend operational life beyond 72 hours. Similarly, while ESP-NOW provides reliable short-range communication (≤100 m), large-scale pasture systems will benefit from hybrid protocols such as LoRaWAN backhaul, enabling herd-level monitoring without sacrificing the edge-processing advantages demonstrated here.
Looking ahead, the platform’s real potential lies in expanding contextual intelligence. Integrating Ultra-Wideband (UWB) for inter-animal proximity tracking could unlock social behavior analysis and estrus detection in species like sows, where mounting events are brief but critical. Likewise, embedding lightweight anomaly detection models—such as those for predator-induced flight responses in ewes—could transform CABRA from a monitoring tool into a proactive welfare safeguard.
In sum, CABRA exemplifies how accessible, open-source hardware, when combined with thoughtful signal processing and edge AI, can deliver high-fidelity behavioral monitoring without infrastructural overhead. As global livestock systems face mounting pressures to improve sustainability, resilience, and ethical standards, such democratized, scalable solutions will be essential—not just in Europe, but across diverse agroecological and socioeconomic contexts.

Author Contributions

Conceptualization: Jesus A. Baro; Data curation: Jesus A. Baro and Jose A. Bodero; Formal analysis: Jesus A. Baro and Jose A. Bodero; Funding acquisition: Jesus A. Baro and Victor Romero; Investigation: Jesus A. Baro and Jose A. Bodero; Methodology: Jesus A. Baro and Jose A. Bodero; Project administration: Jesus A. Baro; Resources: Jesus A. Baro, Victor Romero and Jose A. Bodero; Software: Jesus A. Baro, Victor Romero and Jose A. Bodero; Supervision: Jesus A. Baro; Validation: Jesus A. Baro and Jose A. Bodero; Visualization: Jesus A. Baro and Jose A. Bodero; Writing – original draft: Jesus A. Baro; Writing – review and editing: Jesus A. Baro and Jose A. Bodero. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The hardware designs (KiCad), firmware (C++/Arduino), and data-processing code (Python) supporting the reported results are openly available in the public GitHub repository [7] under the MIT License, with a permanent DOI at Zenodo.


Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CRC16	16-bit cyclic redundancy check
FFT	Fast Fourier Transform
GPIO	General-purpose input/output
IMU	Inertial measurement unit
MLP	Multilayer perceptron
NMEA	National Marine Electronics Association
PLF	Precision livestock farming
PVT	Position, velocity, and time
SVM	Support vector machine
UART	Universal asynchronous receiver-transmitter
XGBoost	eXtreme Gradient Boosting

Appendix A

Video–IMU Synchronization Protocol

To align IMU data streams (timestamped via GPS) with multi-camera video recordings (timestamped via Android devices), we established a global time offset at the start of each 2–4 h recording session. All collar nodes used u-blox NEO-6M GPS modules to acquire synchronized timestamps (accuracy ±10 ms across nodes, due to common satellite visibility).
At session onset, all cameras simultaneously recorded a 5–10 s video clip of a designated “sync collar” displaying its Node ID and current GPS-derived time on the integrated LCD. Video timestamps were extracted from media files using ExifTool (v12.70) [9]. The temporal offset between the Android device’s POSIX clock and GPS time was computed from this clip and applied uniformly to all IMU streams for the duration of the session.
Although Android devices were offline during field deployments (preventing NTP updates), empirical testing confirmed camera clock drift remained below 100 ms over 4 hours. The resulting synchronization accuracy was limited by video frame rate (≤33 ms at 30 fps), which is sufficient for labeling sub-second behaviors such as chewing bouts or posture transitions.
For background on GPS time references, see [10].

References

  1. Neethirajan, S. Recent Advances in Wearable Sensors for Animal Health Management. Sens Bio-Sens Res. 2017, 12, 15-29. doi:10.1016/j.sbsr.2016.11.004.
  2. Wathes, C.M.; Kristensen, H.H.; Aerts, J.M.; Berckmans, D. Is precision livestock farming an engineer’s daydream or nightmare, an animal’s friend or foe, and a farmer’s panacea or pitfall?. Comput. Electron. Agric. 2008, 64, 2-10. [CrossRef]
  3. Alsaaod, M.; Fadul, M.; Steiner, A. Automatic lameness detection in cattle. Vet J. 2019, 246, 35-44. [CrossRef]
  4. Li, G.; Chai, L. AnimalAccML: An open-source graphical user interface for automated behavior analytics of individual animals using triaxial accelerometers and machine learning. Comput. Electron. Agric. 2023, 209, 107835.
  5. Ruuska, S.; Kajava, S.; Mughal, M.; Zehner, N.; Mononen, J. Validation of a pressure sensor-based system for measuring eating, rumination and drinking behaviour of dairy cattle. Appl. Anim. Behav. Sci. 2012, 174, 19-23. [CrossRef]
  6. Fajardo, B.; Méndez, D. A.; Villagrá, A.; Calvet, S. Development and validation of a triaxial accelerometer for behavior monitoring in Murciano-Granadina goats. In Proceedings of the 76th Annual Meeting of the European Federation of Animal Science, Innsbruck, Austria, 28 August 2025.
  7. Baro, J. A. CABRA Hardware and Firmware Repository [Code]. Available online: https://github.com/knklB/IMUcabra (accessed on 30 December 2025).
  8. Neethirajan, S. Transforming the Adaptation Physiology of Farm Animals through Sensors. Animals (Basel) 2020, 10(9), p. 1512. [CrossRef]
  9. Harvey, P. ExifTool (Version 12.70) [Computer Software]. Available online: https://exiftool.org/ (accessed on 30 December 2025).
  10. Kaplan, E. D.; Hegarty, C. J. Understanding GPS/GNSS: Principles and Applications, 2nd ed.; Artech House: Norwood, MA, USA, 2017; pp. 578-579.
Figure 1. CABRA System Architecture. Collar nodes sample IMU data at 20 Hz and transmit it via a low-power, routerless ESP-NOW protocol to a local base station. This enables infrastructure-free deployment. The base station logs data to an SD card and/or performs real-time behavior classification. A GPS module provides spatiotemporal context.
Figure 2. Collar node circuit diagram.
Figure 3. System-wide time synchronization. A single ‘sync collar’ provides a common reference point. All cameras record its GPS-synchronized LCD time at session start, establishing a global clock offset. All other collars, sharing the same precise GPS time, are automatically synchronized with the cameras.
Figure 4. Detailed confusion matrix and behavior collection statistics for the final 20 Hz session.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.