Preprint – Article (this version is not peer-reviewed)

Evaluation of the Psychometric Scale of Quiet and Passive Quitting Using an Integrated Approach Combining Structural Equation Modeling and Machine Learning

Submitted: 11 February 2026 | Posted: 13 February 2026

Abstract
Purpose: The aim of the study was to evaluate and shorten a psychometric scale measuring quiet quitting and passive quitting while maintaining the quality of measurement and the predictive utility of the instrument.
Design / Methodology / Approach: A hybrid approach was applied, integrating structural equation modeling (SEM) and supervised machine learning (ML). A two-factor measurement model with regression on organizational engagement (UWES-9) was estimated using a sample of 1,040 working respondents. In parallel, the predictive validity of the scale items was assessed using regression algorithms within a cross-validation procedure. The scale was shortened iteratively by eliminating only those items whose removal did not significantly worsen SEM model fit or ML predictive performance.
Findings: The scale was reduced from 14 to 9 items. The reduction improved SEM fit indices (higher CFI and TLI with stable RMSEA) at the cost of only a slight decrease in the predictive validity of the ML models. The results confirm that integrating SEM and ML enables effective shortening of psychometric tools while maintaining their reliability and diagnostic functionality.
Research Limitations / Implications: The study was based on a single external criterion (organizational engagement) and one research sample, which limits the generalizability of the results. Future studies should include other criteria (e.g., burnout, turnover) and independent validation samples.
Practical Implications: The shortened scale reduces respondent burden, shortens survey time, and lowers measurement costs while retaining predictive utility relevant for HR practice and organizational diagnostics.
Social Implications: Improved and more efficient measurement of quiet and passive quitting can support early identification of declining employee engagement, contributing to enhanced quality of work life and human resource management policies.
Originality / Value: The originality of the study lies in proposing an integrated procedure for shortening psychometric scales by combining measurement and predictive criteria, a methodological contribution to research on the development and optimization of measurement tools in management and quality sciences.

Introduction

In recent years, there has been growing interest in the attitude known as quiet quitting, which reflects changes in employee–organization relationships and in the ways individuals experience work. Quiet quitting refers to the intentional reduction of professional effort while formally remaining employed, which inevitably leads to a decline in organizational engagement and work effectiveness. A related attitude is passive quitting, which also involves reduced engagement; in this case, however, the reduction is unintentional and may result from factors such as burnout, indifference, apathy, or exhaustion (Nowak, 2026). The two attitudes thus represent distinct mechanisms of disengagement. In response to increasing interest in these constructs, attempts have been made to operationalize them in the form of psychometric tools. Existing research on such scales, however, has focused primarily on classical measurement criteria (e.g., factor structure, reliability, model fit), often overlooking their predictive utility, that is, their ability to predict important external criteria, especially organizational engagement. As a result, decisions to shorten or modify scales are made solely on the basis of measurement parameters, without considering the potential loss of prognostic value. There is, therefore, a lack of approaches that integrate measurement theory with the predictive perspective characteristic of machine learning methods. The aim of this article is to address this gap by proposing and empirically validating a hybrid procedure for evaluating and shortening a scale measuring quiet and passive quitting, one that combines structural equation modeling (SEM) with supervised machine learning (ML). This approach enables the simultaneous assessment of the measurement quality of latent constructs and of their predictive utility with respect to organizational engagement as an external criterion.
The article includes a literature review, a description of the integrated methodology and procedure, presentation of SEM and ML analysis results, and a discussion of theoretical and methodological conclusions.

Literature Review

The phenomenon of quiet quitting is still so new that there is noticeable disagreement about how to define and interpret it. Quiet quitting refers to a situation in which employees deliberately lower their performance and productivity at work, fulfilling only the minimum requirements of their role, demonstrating minimal effort, motivation, and emotional engagement, and refraining from tasks outside the formal scope of their duties, such as staying late, attending non-mandatory meetings, or taking on additional assignments (Bolino et al., 2024). This behavior is characterized more by psychological distancing than physical resignation and is often associated with disengagement and withdrawal (Belli & Erçelik, 2026; Patel et al., 2025; Yu & Jiao, 2026). According to Scheyett (2023), quiet quitting represents an approach in which “one doesn’t literally quit the job, but simply does the work that is expected in the given role, without going beyond what is required.” Many authors argue that quiet quitting is the same as burnout (Hamouche et al., 2023; Richardson, 2023) or leads to burnout (Galanis et al., 2025; Gün et al., 2025). Another prevalent perspective, however, emphasizes the intentional nature of quiet quitting, suggesting that it is a way to avoid excessive demands at work (Serenko, 2024). Detert (2023) described quiet quitting as “calibrated contributing” to one’s job. Those who quietly quit do not actually want to leave their jobs but continue performing their tasks with minimal effort (Gigol, 2023; Patel et al., 2025; Talukder & Prieto, 2025). An interesting concept that bridges both perspectives is the distinction proposed by Kanwal et al. (2025), who divided quiet quitting into two types, passive quiet quitting and deliberate quiet quitting, each characterized by distinct affective, cognitive, and behavioral dimensions.
Similarly, Nowak (2026; Nowak et al., 2026) introduced a comparable differentiation, defining quiet quitting as a conscious approach, while referring to Kanwal’s passive quiet quitting as “passive quitting.” Consequently, two distinct mechanisms leading to reduced organizational commitment can be identified: quiet quitting (intentional and deliberate) and passive quitting (unintentional and unconscious). This approach is adopted in the present article. Some authors consider quiet quitting a manifestation of deviant workplace behavior (Wu & Wei, 2024). According to Bolino et al. (2024), quiet quitting represents an attitude contrary to organizational citizenship behavior, and it is perceived negatively by supervisors (Patel et al., 2025). Overall, quiet quitting is believed to reduce organizational performance (Moczydłowska, 2024), while also threatening the careers of those who adopt this approach (Serenko, 2024). Importantly, the phenomenon does not exist in isolation: as Yu and Jiao (2026) demonstrate, quiet quitting spreads through employee networks, where behaviors and negative emotions “infect” other members of the organization, amplifying the scale of the problem. Nevertheless, there are also positive interpretations of quiet quitting. For example, Dillard, Cavallo, and Zhang (2025) argue that nothing can be held against employees who fulfill their duties well during working hours, and that the quiet quitting attitude represents a return to humanism in the workplace. Coworkers’ reactions to those performing only the bare minimum evolve from withdrawal of support to open hostility, and this dynamic is closely tied to the relational norms of the team and mechanisms of social contagion. As Bennett et al. (2025) indicate, employees observing quiet quitting among colleagues respond with reduced helping behaviors and a significant increase in initiated incivility.
According to Samnani and Robertson (2025), in environments characterized by strong interpersonal bonds, initial empathy and attempts to help give way, once passive attitudes persist, to a deep sense of betrayal, leading to collective sanctions such as ostracism or deliberate reputation undermining. The widespread adoption of the distinction between quiet quitting and passive quitting could, over time, lead to differentiated responses from coworkers and supervisors depending on the nature of the behavior, and enable more effective interventions.

Methods

In the study, previously validated measurement tools were used to assess quiet quitting and passive quitting (Nowak, 2026; Nowak et al., 2026). The developed scale is as follows:
QQS1. At work, I consciously limit myself to performing only those tasks that fall within my duties and for which I am paid.
QQS2. I consciously do only what is necessary, because maintaining a work-life balance is more important to me than going above and beyond.
QQS3. When I know I am overworked, I consciously limit my effort to only what is necessary to keep my job.
QQS4. Since extra effort is not recognized in my company, I have decided to limit myself to basic duties.
QQS5. I consciously avoid taking on tasks outside my responsibilities, even if they are interesting.
QQS6. Because I don’t feel supported by the company, I have decided not to engage beyond what my duties require.
QQS7. To maintain mental balance, I consciously distance myself from work and don’t engage more than necessary.
PQS1. I am often too tired or overwhelmed to put more than the minimum effort into my work.
PQS2. I care less and less about my job and its results-it’s hard for me to stay engaged.
PQS3. I feel so exhausted by work that it’s hard to care about the quality of what I do.
PQS4. At times, I perform my duties mechanically, without engagement or initiative.
PQS5. I feel that the lack of appreciation discourages me from putting effort into my work.
PQS6. Work no longer brings me satisfaction-I do only what’s necessary to get through the day.
PQS7. I feel that the lack of development and meaning in my work has made me care less and less about what I do.
A detailed description of the validation procedures is presented in (Nowak, 2026). The scales were evaluated in terms of face and content validity: the CVI reached 0.929 (content validity) and 0.914 (clarity), while the CVR was 1.00. Subsequently, an exploratory factor analysis (EFA) was conducted. The floor effect ranged from 3.46% to 13.75%, and the ceiling effect from 6.25% to 13.65%. Low skewness and negative kurtosis indicated distributions close to normal. The KMO value was 0.918, and Bartlett’s test was significant (χ² = 9026.24, p < 0.001), confirming the suitability of factor analysis. Both Kaiser’s criterion and the scree plot indicated two factors (quiet and passive quitting). Principal Axis Factoring with oblique (oblimin) rotation confirmed a two-factor structure consistent with the theoretical framework; the two factors explained over 66% of the variance. Cronbach’s alpha coefficients were 0.910 (quiet quitting) and 0.916 (passive quitting), confirming the high reliability and internal consistency of both scales.
The goal of the method was to shorten the psychometric scale (quiet and passive quitting items) while preserving the quality of latent construct measurement and predictive utility with respect to an external criterion, organizational commitment. A hybrid approach was employed, combining:
  • Structural Equation Modeling (SEM) to assess the quality of the measurement and structural models (Nowak & Zajkowski, 2025)
  • Supervised Machine Learning (ML) to evaluate the quality of criterion prediction based on the scale items (Nowak et al., 2025).
The SEM and ML results are integrated through an iterative item elimination procedure-an item is removed only if it simultaneously meets quality criteria in both approaches. The analysis is based on employee data, where i = 1, …, N denotes individual respondents. Responses were collected for the following: QQ items (QQS1–QQS7), PQ items (PQS1–PQS7), and UWES-9 items (UWES1–UWES9). Organizational commitment, which serves as a component of the SEM model and the label in ML, was calculated as the average of the UWES items. The method consists of three main parts: A (SEM modeling), B (machine learning), and C (integrative scale reduction procedure).
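The reliability coefficients reported above (Cronbach's alpha of 0.910 and 0.916) follow the standard formula relating the sum of item variances to the variance of the scale total. A minimal sketch in Python (illustrative, not the authors' code):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_variance = items.sum(axis=1).var(ddof=1)    # variance of scale totals
    return k / (k - 1) * (1 - item_variances / total_variance)
```

Applied separately to the seven QQS columns and the seven PQS columns of the response matrix, this yields the per-factor reliabilities.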

Part A. SEM Modeling

A1. Measurement Model (CFA)

Two latent factors were assumed: quiet quitting (QQ) and passive quitting (PQ). For each individual i, the observed items are modeled as functions of their respective latent constructs and measurement error components (1):
$$\mathrm{QQS}_{ij} = \nu_j^{(QQ)} + \lambda_j^{(QQ)}\, QQ_i + \varepsilon_{ij}^{(QQ)}, \qquad j \in J_{QQ},$$
$$\mathrm{PQS}_{ij} = \nu_j^{(PQ)} + \lambda_j^{(PQ)}\, PQ_i + \varepsilon_{ij}^{(PQ)}, \qquad j \in J_{PQ},$$
where:
  • $J_{QQ}$, $J_{PQ}$ – sets of items currently retained in the model,
  • $\nu_j^{(QQ/PQ)}$ – intercept (free term),
  • $\lambda_j^{(QQ/PQ)}$ – factor loading,
  • $\varepsilon_{ij}^{(QQ/PQ)}$ – measurement error.
To avoid identification issues during the scale reduction process, a minimum number of indicators per factor is required, i.e., $|J_{QQ}| \ge 3$ and $|J_{PQ}| \ge 3$.
A2. Structural Model
The criterion variable OC (organizational commitment) is explained by the latent factors QQ and PQ as follows:
$$OC_i = \alpha + \beta_{QQ}\, QQ_i + \beta_{PQ}\, PQ_i + \zeta_i,$$
where $\zeta_i$ is the disturbance term.
Correlations between the latent predictors are allowed:
$$\mathrm{Cov}(QQ_i, PQ_i) = \phi.$$
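The measurement and structural equations above can be expressed compactly in lavaan-style syntax, which Python SEM packages such as semopy accept. The snippet below only builds the specification string; fitting would require the survey data frame, so the fitting calls are shown as comments (a sketch, not the authors' code):

```python
# lavaan-style specification of the two-factor CFA with a structural
# regression of organizational commitment (OC) on both latent factors
qq_items = [f"QQS{i}" for i in range(1, 8)]
pq_items = [f"PQS{i}" for i in range(1, 8)]

model_desc = f"""
# measurement model: two correlated latent factors
QQ =~ {' + '.join(qq_items)}
PQ =~ {' + '.join(pq_items)}
# structural model
OC ~ QQ + PQ
"""

# Fitting (assumes `df` holds the item responses and an OC column):
# from semopy import Model, calc_stats
# m = Model(model_desc); m.fit(df)
# stats = calc_stats(m)   # reports CFI, TLI, RMSEA, among others
```

The latent covariance $\phi$ is estimated by default when two factors are defined with `=~`.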
A3. SEM Fit Indices
Standard fit indices are used to assess SEM model quality, particularly CFI and RMSEA:
  • CFI (Comparative Fit Index), in its general form:
$$\mathrm{CFI} = 1 - \frac{\max(\chi^2_M - df_M,\ 0)}{\max(\chi^2_B - df_B,\ 0)},$$
where $M$ denotes the tested model, $B$ the baseline (independence) model, and $df$ the degrees of freedom;
  • RMSEA (Root Mean Square Error of Approximation):
$$\mathrm{RMSEA} = \sqrt{\max\!\left(\frac{\chi^2_M - df_M}{df_M\,(N-1)},\ 0\right)}.$$
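Both indices are simple functions of the model and baseline chi-square statistics; a direct transcription of the two formulas (the numbers in the test below are illustrative, not taken from the study):

```python
import math

def cfi(chi2_m: float, df_m: float, chi2_b: float, df_b: float) -> float:
    # CFI = 1 - max(chi2_M - df_M, 0) / max(chi2_B - df_B, 0)
    return 1.0 - max(chi2_m - df_m, 0.0) / max(chi2_b - df_b, 0.0)

def rmsea(chi2_m: float, df_m: float, n: int) -> float:
    # RMSEA = sqrt(max((chi2_M - df_M) / (df_M * (N - 1)), 0))
    return math.sqrt(max((chi2_m - df_m) / (df_m * (n - 1)), 0.0))
```

With the study's sample, `n` would be 1,040; a model whose chi-square does not exceed its degrees of freedom yields RMSEA = 0.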
In the scale reduction procedure, changes in these fit indices between the current model and a candidate model (after item removal) are used.
Part B. Machine Learning (Prediction of Organizational Commitment Based on Scale Items)

B1. Dependent Variable and Predictors

In the machine learning (ML) part, the problem is formulated as supervised regression:
  • Dependent variable: Organizational commitment (OC),
  • Feature vector: Responses to the QQS and PQS items that remain in the scale:
$$x_i \in \mathbb{R}^p, \qquad p = |J_{QQ}| + |J_{PQ}|.$$
The predictive model can be written as:
$$\widehat{OC}_i = f(x_i; \theta),$$
where $f(\cdot)$ is a regression function (e.g., Random Forest) and $\theta$ denotes the model parameters.

B2. Cross-Validation

Prediction quality is not estimated on the training data but through a K-fold cross-validation procedure. Given a partition of the dataset
$$\{1, \dots, N\} = \bigcup_{k=1}^{K} I_k, \qquad I_k \cap I_l = \emptyset \ (k \ne l),$$
for each fold $k$ the model is trained on $\{i \notin I_k\}$ and evaluated on the held-out fold $\{i \in I_k\}$. The final result is the average over all folds.

B3. Prediction Quality Metric

The main metric used to assess prediction quality is RMSE (Root Mean Square Error), where lower values indicate better predictive performance:
$$\mathrm{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left( OC_i - \widehat{OC}_i \right)^2}.$$
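With scikit-learn, the cross-validated RMSE described in this part can be computed as follows. The data here are synthetic stand-ins for the 14 QQS/PQS items and the UWES-9 mean, not the study's dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 14))                              # stand-in for the items
y = X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=200)  # synthetic criterion

cv = KFold(n_splits=5, shuffle=True, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0)
# the scorer is negated so that greater is better; flip the sign to get RMSE
scores = cross_val_score(model, X, y, cv=cv,
                         scoring="neg_root_mean_squared_error")
rmse_cv = -scores.mean()   # average RMSE over the 5 folds
```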

B4. Selection of the Best ML Model

On the full item set ($p = p_0$), a set of candidate models $\{f_m\}_{m=1}^{M}$ (e.g., ten regression models) is compared. The reference model is selected by minimizing the average cross-validated RMSE:
$$m^* = \operatorname*{arg\,min}_{m \in \{1,\dots,M\}} \mathbb{E}_{CV}\!\left[\mathrm{RMSE}_m\right],$$
where $\mathbb{E}_{CV}[\cdot]$ denotes the mean across folds.
The selected model $f_{m^*}$ is then kept fixed throughout the entire scale reduction process to ensure comparability across iterations.
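Selecting the reference model $m^*$ then amounts to looping over the candidate estimators and taking the one with the lowest mean cross-validated RMSE. A sketch with three of the ten algorithms, again on synthetic data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 14))
y = X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=200)
cv = KFold(n_splits=5, shuffle=True, random_state=0)

candidates = {
    "Ridge": Ridge(),
    "Lasso": Lasso(alpha=0.01),
    "RandomForest": RandomForestRegressor(n_estimators=100, random_state=0),
}
mean_rmse = {
    name: -cross_val_score(est, X, y, cv=cv,
                           scoring="neg_root_mean_squared_error").mean()
    for name, est in candidates.items()
}
best_name = min(mean_rmse, key=mean_rmse.get)   # the reference model m*
```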
Part C. Integrative Greedy Scale Reduction Procedure

C1. General Idea

The scale reduction procedure is iterative. In each iteration, the removal of a single item from the current set of indicators is considered. An item is removed only if:
  • SEM model fit does not deteriorate beyond the defined thresholds, and
  • ML prediction quality does not deteriorate beyond the defined thresholds.
Let $J^{(t)}$ denote the set of items at iteration $t$ (composed of $J_{QQ}^{(t)}$ and $J_{PQ}^{(t)}$). For each candidate item $j \in J^{(t)}$, a reduced item set is constructed:
$$J_{-j}^{(t)} = J^{(t)} \setminus \{j\}.$$
Next, the following are estimated:
  • SEM metrics: $(CFI^{(t)}, RMSEA^{(t)})$ for the current model, and $(CFI_{-j}^{(t)}, RMSEA_{-j}^{(t)})$ for the candidate model without item $j$;
  • ML metric: $RMSE^{(t)}$ and $RMSE_{-j}^{(t)}$ (cross-validated averages).

C2. Acceptance Criteria

The following differences are defined:
$$\Delta CFI_j = CFI_{-j}^{(t)} - CFI^{(t)},$$
$$\Delta RMSEA_j = RMSEA_{-j}^{(t)} - RMSEA^{(t)},$$
$$\Delta RMSE_j = RMSE_{-j}^{(t)} - RMSE^{(t)}.$$
An item $j$ is accepted for removal only if fit and prediction do not deteriorate beyond the tolerances:
$$\Delta CFI_j \ge -\tau_{CFI}, \qquad \Delta RMSEA_j \le \tau_{RMSEA}, \qquad \Delta RMSE_j \le \tau_{RMSE}\!\left(RMSE^{(t)}\right),$$
where $\tau_{CFI} = 0.005$ and $\tau_{RMSEA} = 0.005$, while the tolerance for RMSE takes the form:
$$\tau_{RMSE}\!\left(RMSE^{(t)}\right) = \max\!\left(\tau_{abs},\ \tau_{rel} \cdot RMSE^{(t)}\right),$$
with $\tau_{abs} = 0.01$ and $\tau_{rel} = 0.01$ (i.e., 1% of the current value).
Additionally, an identification constraint is imposed:
$$|J_{QQ}^{(t)}| \ge 3, \qquad |J_{PQ}^{(t)}| \ge 3.$$
It is also assumed that, in the SEM model, the RMSEA must not exceed the conventional acceptability threshold of 0.08.
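The acceptance criteria of section C2 translate directly into a predicate. The sign conventions below encode "fit and prediction may not deteriorate by more than the tolerance"; function and argument names are illustrative, not from the study:

```python
def rmse_tolerance(rmse_current: float,
                   tau_abs: float = 0.01, tau_rel: float = 0.01) -> float:
    # tau_RMSE = max(tau_abs, tau_rel * RMSE^(t))
    return max(tau_abs, tau_rel * rmse_current)

def acceptable(delta_cfi: float, delta_rmsea: float, delta_rmse: float,
               rmse_current: float, n_qq: int, n_pq: int,
               rmsea_candidate: float,
               tau_cfi: float = 0.005, tau_rmsea: float = 0.005) -> bool:
    """Item removal is accepted only if every tolerance and constraint holds."""
    return (delta_cfi >= -tau_cfi                      # CFI may not drop too far
            and delta_rmsea <= tau_rmsea               # RMSEA may not rise too far
            and delta_rmse <= rmse_tolerance(rmse_current)
            and n_qq >= 3 and n_pq >= 3                # identification constraint
            and rmsea_candidate <= 0.08)               # assumed RMSEA ceiling
```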

C3. Selection of Items for Removal in a Given Iteration

If, in iteration $t$, at least one item meets the conditions, the item that maximizes a preference function is selected (for example, favoring no deterioration in ML and improvement in SEM):
$$j^* = \operatorname*{arg\,max}_{j \in A^{(t)}} S(j),$$
where $A^{(t)}$ is the set of items meeting the acceptance criteria, and $S(j)$ may take a hierarchical (lexicographic) form, e.g., first minimizing $RMSE_{-j}^{(t)}$, then minimizing $RMSEA_{-j}^{(t)}$, and finally maximizing $CFI_{-j}^{(t)}$. The item $j^*$ is removed, and the procedure proceeds to iteration $t + 1$.
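The hierarchical preference described here can be implemented as a lexicographic ordering on a tuple key, as sketched below with hypothetical candidate values:

```python
def pick_item(candidates: dict) -> str:
    """candidates maps item name -> (rmse, rmsea, cfi) of the model without it.

    Lexicographic preference: lowest RMSE, then lowest RMSEA, then highest CFI
    (CFI is negated so that min() prefers higher values).
    """
    return min(candidates, key=lambda j: (candidates[j][0],
                                          candidates[j][1],
                                          -candidates[j][2]))

# hypothetical candidate metrics for one iteration
cand = {"QQS3": (0.9143, 0.0765, 0.9499),
        "QQS6": (0.9138, 0.0685, 0.9622)}
best = pick_item(cand)   # "QQS6" wins on the lower RMSE
```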

C4. Stopping Criterion

The procedure ends when:
  • there is no item whose removal simultaneously meets the SEM and ML criteria, or
  • the minimum number of items per factor has been reached.
The integration of SEM and ML allows for a parallel evaluation of two complementary aspects of scale quality:
  1. measurement quality and theoretical consistency (SEM: factor structure, global fit),
  2. predictive utility (ML: informativeness of items in predicting the criterion).
The goal of the procedure is to eliminate redundant or poorly fitting items while limiting the loss of predictive information.
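Parts A–C can be combined into a single greedy loop. The sketch below leaves the expensive steps as hypothetical hooks: `evaluate` would refit the SEM and rerun cross-validation for a given item set, `acceptable` would apply the C2 tolerances, and `prefer` the C3 lexicographic rule (none of these names come from the study):

```python
def greedy_reduce(items, evaluate, acceptable, prefer, min_items=6):
    """Iteratively drop one item at a time while SEM and ML criteria hold.

    evaluate(items)                -> metrics for that item set (hypothetical hook)
    acceptable(cur, cand, items)   -> bool, the C2 tolerance check
    prefer(metrics)                -> sort key, the C3 lexicographic rule
    """
    current = list(items)
    cur_metrics = evaluate(current)
    while len(current) > min_items:
        cands = {}
        for j in current:
            reduced = [x for x in current if x != j]
            m = evaluate(reduced)
            if acceptable(cur_metrics, m, reduced):
                cands[j] = m
        if not cands:
            break                       # stopping criterion C4
        j_star = min(cands, key=lambda j: prefer(cands[j]))
        current.remove(j_star)
        cur_metrics = cands[j_star]
    return current, cur_metrics
```

On the study's data, the loop would terminate after five removals, as reported in the Results section; the toy evaluator in the test below mimics a scale with two uninformative items.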

Results

The empirical part is based on a CAWI survey conducted between May 23–29, 2025, on a sample of 1,040 Polish employees (Ariadna panel). The questionnaire included demographic items, proprietary scales of quiet quitting (QQ) and passive quitting (PQ), as well as the organizational commitment scale (UWES-9). The collected data were used in SEM analyses and machine learning.
Part A. SEM Modeling
In Part A, SEM modeling was conducted. In the base model, QQ and PQ were treated as latent factors measured by 7 items each, and organizational commitment (OC) was represented as an observed average from UWES. The following fit indices were obtained: CFI = 0.9462, TLI = 0.9359, RMSEA = 0.0755. CFI and TLI indicate good fit, while RMSEA points to an acceptable, though suboptimal, level. The two-factor structure with regression on OC thus receives general empirical support, while still allowing for further model optimization (e.g., through the reduction of weakly fitting or redundant items).

Part B. Machine Learning (Prediction Of Organizational Commitment Based On Scale Items)

In the baseline model, the dependent variable was organizational commitment (mean score from UWES-9), and the predictors were items from the QQS and PQS scales. The following methods were applied: linear regression, Ridge, Lasso, Elastic Net, SVR, KNN, as well as ensemble methods: Random Forest, Extra Trees, Gradient Boosting, and HistGradientBoosting (Nowak et al., 2025), using 5-fold cross-validation (Table 1).
The results indicate a significant predictive value of the QQS and PQS items with respect to commitment. The models achieved moderate fit (R² = 0.4361–0.5048), explaining approximately half of the variance in UWES-9. Tree-based methods outperformed linear models. The highest performance was achieved by Random Forest (RMSE = 0.9215, R² = 0.5048, MAE = 0.6791), with results very similar to Gradient Boosting, suggesting a nonlinear and interactional nature of the relationships. This confirms the high predictive utility of the items alongside their theoretical validity (SEM). Due to its lowest RMSE, Random Forest was adopted as the reference model in the scale-shortening procedure.
Part C. Integrative Scale-Shortening Procedure (Greedy Elimination)
The shortening procedure for the QQ and PQ scales was conducted using a greedy algorithm. In each iteration, all remaining items were analyzed by simulating their individual removal. For each variant, new SEM fit indices ($CFI_{-j}$, $RMSEA_{-j}$) were calculated, as well as ML prediction quality (Random Forest, 5-fold CV: $RMSE_{-j}$, $R^2_{-j}$). An item was included in the candidate set only if it simultaneously met the tolerance criteria for both SEM and ML. Selection followed a lexicographic rule: first, RMSE minimization; in case of a tie, lower RMSEA; next, higher CFI. The following items met the elimination criteria, in order of removal: QQS3, QQS6, PQS4, QQS4, and PQS5 (Table 2).
The first two eliminations (QQS3 and QQS6) improved both SEM fit and ML prediction. Higher CFI and TLI and lower RMSE indicate these items added measurement error without meaningful predictive value. Removing PQS4 was prediction-neutral (stable RMSE) while preserving very good SEM fit, suggesting redundancy. Subsequent eliminations (QQS4 and PQS5) produced controlled deterioration in ML with continued SEM improvement, indicating some predictive contribution but within tolerance thresholds. Initially, items weak in both measurement and prediction were removed; later, a classic trade-off emerged between measurement quality and predictive information.
The final scale includes nine items (−36%). SEM fit did not worsen: CFI increased (0.9462 → 0.9665), TLI increased (0.9359 → 0.9543), and RMSEA remained stable (0.0755 → 0.0756). This reflects the removal of items with higher measurement error, weaker loadings, or partial ambiguity. Improved CFI/TLI with unchanged RMSEA suggests better relative fit without greater approximation error; from the SEM perspective, the reduction is beneficial or at least non-detrimental.
In prediction (Random Forest), accuracy declined moderately: RMSE rose from 0.9215 to 0.9383 and R² fell from 0.5048 to 0.4847 (≈ −0.02). The loss is systematic but within the accepted thresholds and typical when reducing the number of predictors. The shortened scale is slightly less predictive but more efficient, reducing survey time, costs, and respondent burden while maintaining or improving measurement quality.

Summary

The article contributes to psychometrics in organizational research by integrating two paradigms of tool evaluation: the measurement-oriented approach (SEM) and the predictive approach (ML). Until now, scale shortening has primarily relied on factor loadings and model fit, neglecting predictive value. The proposed procedure demonstrates that measurement quality and predictive utility are complementary yet distinct, and that scale reduction should therefore be treated as a task of multi-criteria optimization. It has been shown that the constructs of quiet and passive quitting can be measured using a shorter, more coherent tool without deteriorating the theoretical model. The practical implications include reduced respondent burden, shorter survey time, and preserved diagnostic value. Including the predictive criterion mitigates the risk of developing scales that are psychometrically sound but weak in prediction. The procedure is universal in nature and can be applied to the evaluation of other scales related to organizational criteria (e.g., performance, turnover, well-being). Limitations include reliance on a single external criterion, the selection of a specific ML algorithm (Random Forest), heuristic elimination without a guarantee of a global optimum, and the lack of validation on an independent sample. Future research should incorporate multiple criteria, compare optimization strategies, and validate the shortened scale in various organizational contexts, as well as integrate SEM and ML in hybrid or regularized models.

References

  1. Belli, M., & Erçelik, Z. E. (2026). Moral resilience and quiet quitting tendencies among pediatric nurses: A multiple linear regression study. Journal of Pediatric Nursing, 87, 154–159.
  2. Bennett, A. A., Epler, R. T., Thomas, V. L., & Jalil, D. (2025). Exploring coworker perceptions of and reactions to quiet quitting. Human Resource Management, 64(6), 1815–1831.
  3. Bolino, M. C., Klotz, A. C., & Whitney, J. M. (2024). The origin, evolution, and future of organizational citizenship behavior. Academy of Management Collections, 3(3), 35–56.
  4. Detert, J. (2023). Let’s call quiet quitting what it often is: Calibrated contributing. MIT Sloan Management Review, 64(2), 1–3.
  5. Dillard, N., Cavallo, T., & Zhang, P. (2025). A return to humanism: A Multi-Level analysis exploring the positive effects of quiet quitting. Human Resource Development Review, 24(2), 127–156.
  6. Galanis, P., Katsiroumpa, A., Vraka, I., Siskou, O., Konstantakopoulou, O., Katsoulas, T., Moisoglou, I., Gallos, P., & Kaitelidou, D. (2025). The influence of job burnout on quiet quitting among nurses: The mediating effect of job satisfaction. International Journal of Nursing Practice, 31(5), e70057.
  7. Gigol, T. (2023). Quiet quitting, job burnout and turnover intention. International business from east to west: Global risks and opportunities, 18–25.
  8. Gün, I., Balsak, H., & Ayhan, F. (2025). Mediating effect of job burnout on the relationship between organisational support and quiet quitting in nurses. Journal of Advanced Nursing, 81(8), 4644–4652.
  9. Hamouche, S., Koritos, C., & Papastathopoulos, A. (2023). Quiet quitting: relationship with other concepts and implications for tourism and hospitality. International Journal of Contemporary Hospitality Management, 35(12), 4297–4312.
  10. Kanwal, F., Putri, N., Sawhney, G., & Bijlani, A. (2025). Understanding Quiet Quitting: An Exploratory Study of the Concept. Academy of Management Proceedings, 2025(1), 13357.
  11. Moczydłowska, J. M. (2024). Quiet quitting as a challenge for human capital management–the results of qualitative research. Zeszyty Naukowe. Organizacja i Zarzadzanie/Politechnika Ślaska.
  12. Nowak, M. (2026). Spadek zaangażowania organizacyjnego – ciche i bierne odchodzenie. Polskie Towarzystwo Ekonomiczne.
  13. Nowak, M., Wiecek-Janka, E., & Zajkowski, R. (2026). The Impact of Quiet Quitting and Passive Quitting Phenomena on Occupational Burnout.
  14. Nowak, M., & Zajkowski, R. (2025). An Integrated Structural Equation Modelling and Machine Learning Framework for Measurement Scale Evaluation—Application to Voluntary Turnover Intentions. AppliedMath, 5(3), 105.
  15. Nowak, M., Zajkowski, R., & Pawłowska-Nowak, M. (2025). Prediction of the Sustainability Index in Family Firms Using Explainable Artificial Intelligence. Sustainability, 17(16), 7226.
  16. Patel, P. C., Guedes, M. J., Bachrach, D. G., & Cho, Y. (2025). A multidimensional quiet quitting scale: Development and test of a measure of quiet quitting. PloS one, 20(4), e0317624.
  17. Richardson, S. D. (2023). Making the Entrepreneurial Transition: Understanding the Challenges of Women Entre-Employees. Springer Nature.
  18. Samnani, A.-K., & Robertson, K. (2025). More Than a Personal Decision: A Relational Theory of Quiet Quitting. Human Resource Management.
  19. Scheyett, A. (2023). Quiet quitting. Social Work, 68(1), 5–7. Oxford University Press.
  20. Serenko, A. (2024). The human capital management perspective on quiet quitting: recommendations for employees, managers, and national policymakers. Journal of Knowledge Management, 28(1), 27–43.
  21. Talukder, M. F., & Prieto, L. (2025). A “quiet quitting” scale: development and validation. International Journal of Organizational Analysis, 33(6), 1487–1510.
  22. Wu, A., & Wei, W. (2024). Rationalizing quiet quitting? Deciphering the internal mechanism of front-line hospitality employees’ workplace deviance. International Journal of Hospitality Management, 119, 103681.
  23. Yu, Y., & Jiao, X. (2026). Social interactions in organisations: Investigating employee quiet quitting using spatial econometric methods. Tourism Management, 115, 105396.
Table 1. Results of the machine learning process for the baseline model.
Algorithm RMSE R2 MAE
RandomForest 0.9215 0.5048 0.6791
GradientBoosting 0.9227 0.5028 0.6750
ExtraTrees 0.9437 0.4810 0.6983
HistGradientBoosting 0.9441 0.4802 0.6944
Lasso 0.9495 0.4746 0.7194
ElasticNet 0.9496 0.4745 0.7193
Ridge 0.9497 0.4744 0.7193
LinearRegression 0.9497 0.4744 0.7193
KNN 0.9637 0.4590 0.7135
SVR(RBF) 0.9832 0.4361 0.7385
Source: own elaboration.
Table 2. Changes in SEM fit indices (RMSEA, CFI, TLI) and ML prediction metrics (RMSE, R2) for successively removed items from the quiet and passive quitting scales.
Removed item RMSEA_SEM CFI TLI RMSE_ML R2_ML
Baseline model 0.0755 0.9462 0.9359 0.9215 0.5048
QQS3 0.0765 0.9499 0.9392 0.9143 0.5125
QQS6 0.0685 0.9622 0.9532 0.9138 0.5128
PQS4 0.0723 0.9618 0.9515 0.9212 0.5050
QQS4 0.0737 0.9637 0.9524 0.9294 0.4956
PQS5 0.0756 0.9665 0.9543 0.9383 0.4847
Source: own elaboration.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.