Business, Economics and Management


Article
Business, Economics and Management
Econometrics and Statistics

David Veloso-Castello

,

J. Carlos García-Díaz

Abstract: This paper analyzes volatility forecasting in the Spanish electricity spot market over the period 2021–2025, characterized by uncertainty, frequent price jumps, and the increasing occurrence of zero and negative prices. To accommodate these features, electricity prices are shifted to ensure well-defined log-returns, and predictable intraday and seasonal patterns are removed using the Ullrich demeaning procedure. Daily realized volatility measures are constructed from high-frequency data, including jump-robust and noise-robust estimators such as Median Realized Volatility and Realized Kernel. A broad set of volatility models, comprising GARCH-type specifications and multiple extensions of the Heterogeneous Autoregressive (HAR) framework, is evaluated using a coherent out-of-sample forecasting procedure. Model comparison is conducted through the Model Confidence Set methodology based on the QLIKE loss function, which identifies a Superior Set of Models with equal predictive ability. Conditional diagnostics, including out-of-sample \( R^2_{OOS} \) measures and Mincer–Zarnowitz regressions, are subsequently used to characterize forecast accuracy, unbiasedness, and efficiency. The empirical results show that all GARCH models are systematically excluded from the superior set, while HAR-type specifications based on realized volatility dominate. Within this set, a HAR model incorporating Median Realized Volatility, jump components, and day-of-the-week effects delivers the strongest economic performance, achieving an out-of-sample \( R^2_{OOS} \) close to 0.5 with unbiased forecasts. Overall, the findings highlight the importance of long-memory dynamics, discontinuous price movements, and residual weekly seasonality for volatility forecasting in modern electricity markets.
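As a rough illustration of the HAR mechanics and the QLIKE loss used for model comparison, the sketch below fits a HAR regression on a synthetic volatility series. The series, lag structure, and all names are illustrative assumptions, not the paper's data or implementation:

```python
import numpy as np

def har_design(rv, lags=(1, 5, 22)):
    """HAR regressors: daily, weekly, and monthly averages of realized volatility."""
    p = max(lags)
    X = np.column_stack([
        np.array([rv[t - l:t].mean() for t in range(p, len(rv))])
        for l in lags
    ])
    return X, rv[p:]

def qlike(h, rv):
    """QLIKE loss: standard for volatility-forecast comparison; zero for a perfect fit."""
    r = rv / h
    return np.mean(r - np.log(r) - 1.0)

rng = np.random.default_rng(0)
rv = np.abs(np.cumsum(rng.normal(0.0, 0.1, 500))) + 1.0   # synthetic persistent RV series

X, y = har_design(rv)
X1 = np.column_stack([np.ones(len(y)), X])                # add intercept
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)             # OLS fit of the HAR equation
loss = qlike(X1 @ beta, y)
```

The weekly and monthly averages give the model its long-memory flavor; the paper's extensions add jump components and day-of-the-week terms on top of this basic design.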

Article
Business, Economics and Management
Econometrics and Statistics

Marlon Fritz

,

Thomas Gries

,

Yuanhua Feng

Abstract: The most widely used method for trend estimation in economics is the Hodrick-Prescott (HP) filter. The HP filter has several disadvantages, such as the arbitrary and frequency-dependent choice of the smoothing parameter λ, boundary problems, and difficult interpretation when linking to economic theory. We suggest an alternative method that mitigates some of these disadvantages using a purely data-driven, endogenous nonparametric trend estimation. A simulation study and several applications demonstrate the advantages of the nonparametric trend compared to the HP filter. We identify optimal time windows supporting the momentary growth trend. Within this window, economic fundamentals change smoothly and drive the trend.
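For reference, the HP filter that the authors benchmark against has a closed-form solution. The sketch below implements that textbook filter on a toy series (the series and λ are illustrative; this is not the paper's nonparametric estimator):

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott trend: minimize sum (y - tau)^2 + lam * sum (d2 tau)^2.
    Closed form: tau = (I + lam * D'D)^{-1} y, with D the second-difference operator."""
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i], D[i, i + 1], D[i, i + 2] = 1.0, -2.0, 1.0
    trend = np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)
    return trend, y - trend

t = np.linspace(0.0, 1.0, 120)
y = 2.0 * t + 0.1 * np.sin(20 * t)     # smooth trend plus a small cycle
trend, cycle = hp_filter(y)            # lam=1600 is the quarterly-data convention
```

Note that λ must be fixed in advance, which is exactly the arbitrariness the abstract criticizes; a purely linear trend passes through the filter unchanged because its second differences vanish.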

Article
Business, Economics and Management
Econometrics and Statistics

Anjali Chaudhary

,

Nisa Vinodkumar

,

Sayeda Meharunisa

,

Naila Iqbal Qureshi

,

Hena Naaz

,

Shoaib Ansari

Abstract: Achieving carbon neutrality has become a central policy objective for emerging economies, particularly the BRICS countries (Brazil, Russia, India, China, and South Africa), which collectively account for a substantial share of global carbon emissions and energy consumption. The transition toward green energy, rapid technological innovation, and the expansion of green finance mechanisms are increasingly viewed as critical drivers of sustainable development and environmental improvement. However, empirical evidence integrating these three dimensions within a unified analytical framework for BRICS remains limited. This study examines the contribution of green energy transition, technological innovation, and green finance to achieving carbon neutrality in BRICS countries using a Pooled Mean Group Autoregressive Distributed Lag (PMG-ARDL) framework and Dumitrescu–Hurlin panel causality analysis. The results indicate that green energy transition significantly reduces carbon emissions in both the long run (−0.45) and short run (−5.65), emphasizing the importance of shifting toward renewable energy sources. Technological innovation exerts a significant negative effect in the long run (−0.17), reflecting efficiency gains and cleaner production, although its short-run impact remains insignificant. Similarly, green finance improves environmental quality in the long run (−0.10) by supporting low-carbon investments, while short-run effects are statistically insignificant due to adjustment frictions. Economic growth increases emissions in the long run (0.43), confirming the scale effect, whereas trade openness reduces emissions (−0.87), indicating the role of technology diffusion. The error correction term (−0.76) confirms a strong convergence toward long-run equilibrium.
The causality analysis reveals unidirectional causality from green energy transition, technological innovation, and green finance to carbon emissions, while bidirectional causality exists between economic growth and emissions, highlighting a feedback mechanism. Policy implications suggest that BRICS economies should strengthen green financial systems, accelerate renewable energy adoption, promote innovation-driven sustainability, and design growth strategies that decouple economic expansion from environmental degradation.

Article
Business, Economics and Management
Econometrics and Statistics

Alireza Yazdani

Abstract: This paper revisits and extends the machine learning framework for U.S. recession prediction introduced by Yazdani (2020) by incorporating post-pandemic macroeconomic dynamics, an expanded predictor set, and additional machine learning models. Using monthly data from January 1959 through December 2024, recession forecasting is formulated as an imbalanced binary classification problem. We use downsampling for static models and class-weighted loss functions for neural networks, and evaluate model performance using classification metrics robust to rare events. We further examine structural stability across four economic regimes and assess economic value through a dynamic stock–bond allocation strategy. We observe that ensemble tree methods, particularly gradient boosting (XGBoost, LightGBM) and random forests, consistently deliver the strongest discrimination, with out-of-sample AUC above 0.99 and PR-AUC above 0.96. The Transformer achieves probability calibration, and deep sequence models exhibit high discrimination, while performance deteriorates across model classes in the 2020–2024 regime, especially for linear specifications. We also examine the models' risk-adjusted returns. Overall, ensemble trees and Transformers show high predictive power and emerge as complementary tools in macroeconomic recession forecasting.
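The downsampling step used for the static models can be sketched in a few lines. The data below are synthetic stand-ins for the rare recession label (here roughly 8.5% of months), not the paper's dataset:

```python
import random

def downsample(X, y, seed=0):
    """Balance a binary dataset by sampling the majority class down to minority size."""
    rng = random.Random(seed)
    pos = [i for i, label in enumerate(y) if label == 1]
    neg = [i for i, label in enumerate(y) if label == 0]
    minority, majority = (pos, neg) if len(pos) < len(neg) else (neg, pos)
    keep = minority + rng.sample(majority, len(minority))
    rng.shuffle(keep)
    return [X[i] for i in keep], [y[i] for i in keep]

# Synthetic monthly features and a rare positive label (~8.5% of observations)
X = [[float(i)] for i in range(200)]
y = [1 if i % 12 == 0 else 0 for i in range(200)]
Xb, yb = downsample(X, y)   # balanced sample: equal positives and negatives
```

Any classifier trained on `Xb, yb` then sees a balanced problem; the class-weighting used for the neural networks achieves the same rebalancing through the loss function instead of the sample.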

Article
Business, Economics and Management
Econometrics and Statistics

Meiqi Chen

,

Hyukku Lee

Abstract: Urban eco-efficiency (UEE) is fundamental to achieving China's dual-carbon goals. However, literature has overlooked green space carbon sequestration, and linear models fail to capture complex nonlinear relationships. This study integrates green space carbon sinks into the evaluation framework, employing the global super-efficiency EBM model to measure the UEE of 108 cities in the Yangtze River Economic Belt (YREB) from 2012 to 2023. It combines XGBoost-SHAP with Geographically and Temporally Weighted Regression (GTWR) to examine UEE's spatiotemporal dynamics and driving mechanisms. The findings reveal that: (1) UEE in the YREB increased from 1.0760 in 2012 to 1.0990 in 2023, while spatial polarization became more pronounced. (2) Core driving factors exhibited significant nonlinear threshold and interactive effects. Specifically, fiscal decentralization's environmental dividend is contingent on active government intervention to circumvent localized "race to the bottom" behaviors. Furthermore, population density transitions from yielding scale dividends to inducing "crowding effects" beyond optimal capacities—a degradation advanced financial systems appear unable to mitigate. (3) A spatiotemporal misalignment was observed: fiscal decentralization unleashed green institutional dividends downstream (coefficients up to 0.0682), but caused a race to the bottom in middle and upper reaches (extending to -0.6548); excessive population agglomeration in megacities induced a crowding effect eroding early pollution control dividends. This study supports abandoning one-size-fits-all approaches and developing precise, spatiotemporally differentiated low-carbon policies.

Article
Business, Economics and Management
Econometrics and Statistics

Xingwei Hu

,

Caihong Hu

,

Cheng-Kuang Wu

Abstract: This paper derives closed-form expressions for the asymptotic covariance matrices of factor loading and uniqueness estimators obtained from several widely used factor extraction methods, including least squares, principal factor, iterative principal component, alpha factor, and image factor analysis. By treating factor solutions as implicitly defined estimators, the proposed framework characterizes the asymptotic behavior of factor loadings and uniquenesses as explicit functions of the asymptotic covariance matrix of the sample covariance or correlation matrix. This approach avoids reliance on likelihood-based information matrices, numerical differentiation, and resampling methods. Consequently, valid statistical inference is feasible under non-Gaussian sampling, serial dependence, and conditional heteroskedasticity, and can be implemented using heteroskedasticity- and autocorrelation-robust or other sandwich estimators of second moments. The framework naturally accommodates applications in which factor analysis is applied to residual covariance matrices arising from multivariate regressions, panel data models, or structural vector autoregressions (SVARs). Monte Carlo simulations demonstrate accurate finite-sample performance, and an empirical illustration shows how the proposed formulas can be implemented in practice. From an econometric perspective, the results are particularly relevant for settings in which factor structures serve as intermediate objects (such as dynamic factor models, factor-augmented regressions, and SVARs), allowing uncertainty in factor estimates to be coherently propagated into impulse response functions, forecast-error variance decompositions, and other nonlinear functionals used in structural inference.

Article
Business, Economics and Management
Econometrics and Statistics

Julio César Mariños-Alfaro

,

Augusto Aliaga-Miranda

,

Luis Ricardo Flores-Vilcapoma

,

Paulo César Callupe-Cueva

,

Luis Antonio Visurraga-Camargo

,

Alexandra Rivas-Meza

,

Yadira Yanase-Rojas

Abstract: The purpose of this investigation was to analyze the effect of financial structure and fruit-fly control on the development of Small and Medium Enterprises (SMEs) of citrus in the Central Jungle of Peru. Using a quantitative design and a balanced sample of 54 observations, the analysis estimates complementary linear models with interaction terms and restricted cubic spline specifications to capture direct, synergistic, and nonlinear effects. The baseline results show that both financing structure and fruit-fly control exert positive and statistically significant effects on business growth. The interaction term is also positive and significant, indicating that the returns to improved financing rise when phytosanitary management is stronger, and that effective pest control becomes more productive when firms operate with more stable and diversified financial resources. Flexible spline estimates further reveal that these relationships are not constant across the explanatory range, but vary according to firms’ positions within the financial and technological space. Overall, the findings suggest that sustainable growth in citrus SMEs depends on the simultaneous strengthening of rural finance and phytosanitary capabilities under conditions of production risk and market constraints. The study contributes to the agricultural development literature by linking crop protection, farm-level managerial capacity, and enterprise performance in a single empirical framework.

Article
Business, Economics and Management
Econometrics and Statistics

Israel Maingo

,

Leonard Marevhula

Abstract: This study looks into the predictive performance of linear econometric and deep learning methodologies for the South African unemployment rate quarterly data. In this paper, the Autoregressive Integrated Moving Average with exogenous variables (ARIMAX) model was compared to the Long Short-Term Memory (LSTM) network using unemployment rate quarterly data. Exploratory Data Analysis (EDA) suggested that the unemployment rate series is non-stationary, with structural breaks around 2020 and time-varying volatility. Stationarity tests established the need for differencing, whereas diagnostic tests revealed the presence of autocorrelation and ARCH effects in the raw data. The ARIMAX model added labour market covariates, and the differenced Not Economically Active (NEA) variable was statistically significant, whereas Discouraged workers were not. Although the ARIMAX model provided a good in-sample fit, residual diagnostics showed deviations from normality. Out-of-sample forecast analysis revealed moderate predictive accuracy, with relatively substantial forecast errors and widening prediction intervals over time. In contrast, the LSTM model showed significant learning capacity, with early convergence and well-behaved residuals that meet both independence and homoskedasticity criteria. The model achieved significantly lower forecast errors, with RMSE, MAE, and MAPE values much lower than those of the ARIMAX model. Comparative forecast analysis using the Diebold-Mariano (DM) test, the Model Confidence Set (MCS) method, and bootstrap confidence intervals consistently demonstrated the statistical superiority of the LSTM model. The findings give strong evidence that the LSTM model outperformed the ARIMAX model for projecting the South African unemployment rate. The findings emphasise the importance of nonlinear modelling approaches in capturing complex labour market dynamics while also demonstrating the limitations of classical linear models.
These findings also emphasise the importance of using nonlinear machine learning algorithms in macroeconomic forecasting.

Article
Business, Economics and Management
Econometrics and Statistics

Marcin Nowak

Abstract: The increasing use of large language models (LLMs) in enterprises creates a need for the effective selection between lower-cost models and more advanced ones. The aim of the article is to propose a multicriteria decision-making framework for prompt routing to LLMs in an enterprise environment, taking into account organizational preferences regarding cost, response quality, business risk, response time, standardization, and creativity. The study adopts a design-and-evaluation approach. In the design phase, a mechanism was developed in which prompts are assessed according to managerial routing criteria, weighted using the AHP method, and then directed to either a lower-cost or a more powerful model using the SAW method. In the evaluation phase, the solution was tested on a dataset of 100 business prompts and compared with two benchmark strategies: always cheap and always strong. The article’s contribution includes framing LLM routing as a managerial decision-support problem, operationalizing managerial routing criteria, and proposing evaluation metrics such as sufficiency rate, average cost per prompt, cost per sufficient response, and incremental cost of sufficiency gain. The results indicate that the proposed solution improves the cost–quality trade-off, while maintaining an acceptable level of response sufficiency and limiting the cost of query handling.
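A minimal sketch of the SAW routing step under hypothetical AHP-derived weights. The criterion names follow the abstract, but the weights, scores, and threshold below are invented for illustration, not taken from the article:

```python
def saw_route(scores, weights, threshold=0.6):
    """Simple Additive Weighting: normalize the criterion weights, score the
    prompt, and route to the stronger model only when the weighted score
    exceeds the routing threshold."""
    total = sum(weights.values())
    s = sum(weights[c] * scores[c] for c in weights) / total
    return ("strong" if s >= threshold else "cheap"), round(s, 3)

# Hypothetical AHP weights over the managerial routing criteria
weights = {"cost": 0.15, "quality": 0.30, "risk": 0.25,
           "time": 0.10, "standardization": 0.10, "creativity": 0.10}
# Scores in [0, 1] assessed for a single incoming prompt (illustrative values)
scores = {"cost": 0.2, "quality": 0.9, "risk": 0.8,
          "time": 0.5, "standardization": 0.4, "creativity": 0.7}
model, s = saw_route(scores, weights)   # high quality/risk stakes push this prompt upward
```

Routing every prompt through such a score is what lets the framework sit between the "always cheap" and "always strong" benchmark strategies.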

Article
Business, Economics and Management
Econometrics and Statistics

Muhammad Sukri Bin Ramli

Abstract: The global copper market is experiencing a period of fundamental structural volatility, driven by supply chain realignments, geopolitical friend-shoring, and an increasing reliance on the circular economy. To accurately diagnose the current state of this critical mineral, this paper presents a strictly empirical, data-driven algorithmic pipeline, the Apex Empirical Model, applied to recent UN Comtrade transaction ledgers (2020-2025). By utilizing robust machine learning architectures, this research systematically identifies a phenomenon we term Stage-Specific Starvation (SSS) across the upstream, midstream, circular, and downstream stages of the value chain. Integrating Deep Autoencoders, Network Graph Analysis, Holt-Winters Time-Series Forecasting, and Risk-Parity Optimization, the model successfully isolates targeted capital flight via transfer mispricing and maps the exact flow-through volumes of global transshipment hubs. Furthermore, the framework applies network topology to assess systemic vulnerabilities, empirically confirming the existence of a geopolitical price premium, and engineers a continuous mass-balance metric to predict projected smelter capacity adjustments six months into the future. Finally, our resilience metrics mathematically prove the financial arbitrage and stability advantages of secondary scrap integration. Ultimately, this research leverages Causal Inference to introduce Circular Risk Parity (CRP), providing a prescriptive, optimized portfolio allocation that balances risk equally across the supply chain, allowing stakeholders to navigate exogenous supply shocks in the modern copper market.

Article
Business, Economics and Management
Econometrics and Statistics

Carlo Mari

,

Emiliano Mari

Abstract: A locally parametric framework is proposed for Monte Carlo simulation of electricity prices that jointly reproduces the key stylized facts of power markets: mean-reversion, fat tails, asymmetry, and volatility clustering. Following a two-stage pipeline in which mean-reversion is estimated separately from the innovation distribution, the paper focuses on the second stage: simulating the residual innovations via topological conditioning on Natural Visibility Graphs (NVG) built on the observed innovation sequence. At each simulation step, the local structure of the graph is used to identify historically similar market states and to draw the next innovation from a locally fitted distribution. The key methodological contribution is that this topological conditioning mechanism simultaneously determines the local scale, skewness, and tail weight of the innovation distribution — three properties that parametric models such as GARCH must address through separate equations — without any assumption on regime dynamics or transitions. The framework is locally parametric: the number of model parameters grows with the sample size rather than being fixed in advance, and the specific distributional family used as a local working model can be replaced without altering the conditioning mechanism. Applied to two power markets with contrasting distributional characteristics — the Italian Power Exchange (PUN) and PJM West Hub (US) — the framework achieves simultaneous coverage of three distributional statistics (\( \hat\sigma \), \( \hat\gamma \), \( \hat\kappa \)) and the first-order autocorrelation of squared innovations \( \hat\rho_1(\varepsilon_t^2) \) for both markets, with a single neighbourhood size \( k=10 \) and no market-specific re-calibration; more generally, \( k \) serves as the natural adaptation parameter for markets with substantially different distributional characteristics.

Concept Paper
Business, Economics and Management
Econometrics and Statistics

Chhunhong Te

Abstract: Background: Small-scale retail kiosks commonly deploy transactional point-of-sale (POS) systems that capture sales data but lack integrated analytical and forecasting capabilities for operational decision support. This gap limits the ability of small-and-medium enterprise (SME) operators to respond proactively to demand fluctuations. Methods: This study presents the structured analysis, design, implementation, and evaluation of a cloud-deployed self-service kiosk system embedding interactive analytics and demand forecasting modules. The system integrates a Django-based backend, a PostgreSQL relational database, RESTful APIs, a structured demand simulation engine, and three forecasting models: Seasonal ARIMA (SARIMA), XGBoost Regressor, and Scikit-Learn Gradient Boosting Regressor. Forecasting performance was evaluated using rolling backtesting with Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), and Mean Absolute Percentage Error (MAPE). Results: The Gradient Boosting Regressor achieved the highest predictive accuracy (MAE = $93.74, RMSE = $112.65, MAPE = 8.9%), outperforming both XGBoost (MAPE = 10.0%) and SARIMA (MAPE = 10.2%). The proposed architecture demonstrates that Systems Analysis and Design principles can guide the development of an integrated decision-support platform for small retail environments. Machine learning ensemble models more effectively capture nonlinear demand patterns generated by growth and seasonality dynamics than classical time-series models. The system is deployed as a proof-of-concept cloud application accessible at the address listed in the Data Availability section.
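The rolling backtesting protocol described in the Methods is generic and can be sketched independently of the three forecasting models. Everything below (the two baselines and the synthetic demand series) is an illustrative stand-in, not the study's simulation engine:

```python
import math

def rolling_backtest(series, fit_predict, min_train=8):
    """One-step-ahead rolling-origin evaluation: refit on an expanding
    window, forecast the next point, and accumulate MAE / RMSE / MAPE."""
    pairs = []
    for t in range(min_train, len(series)):
        pairs.append((fit_predict(series[:t]), series[t]))
    n = len(pairs)
    mae = sum(abs(p - a) for p, a in pairs) / n
    rmse = math.sqrt(sum((p - a) ** 2 for p, a in pairs) / n)
    mape = 100.0 * sum(abs(p - a) / abs(a) for p, a in pairs) / n
    return mae, rmse, mape

naive = lambda hist: hist[-1]                    # last-value baseline
mean3 = lambda hist: sum(hist[-3:]) / 3          # 3-point moving average
demand = [100 + 5 * (t % 4) for t in range(24)]  # synthetic seasonal demand
mae_n, rmse_n, mape_n = rolling_backtest(demand, naive)
mae_m, rmse_m, mape_m = rolling_backtest(demand, mean3)
```

Plugging SARIMA, XGBoost, or a gradient boosting regressor in as `fit_predict` yields exactly the comparison reported in the Results, with each model refit on the history available at every step.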

Article
Business, Economics and Management
Econometrics and Statistics

Jinhua Sun

,

Lifang Gao

,

Jian Hu

,

Rong Tang

Abstract: The flow of R&D factors serves as a crucial channel for linking regional collaborative innovation resources and plays a significant role in promoting spatial knowledge spillovers, making it an important engine for the in-depth implementation of innovation-driven development strategies. This study takes the Yangtze River Delta urban agglomeration as its research object, utilizing data from 2003-2023, and employs gravity models and dynamic spatial fixed effects models to analyse the impact of R&D factor flows on regional collaborative innovation, as well as the moderating role of intellectual property protection. The study revealed that both the flow of R&D personnel and R&D capital significantly promote regional collaborative innovation, with the flow of R&D capital playing a more prominent role. The intensity of intellectual property protection positively moderates the relationship between the flow of R&D factors and regional collaborative innovation, but a single threshold effect exists, where the moderating effect weakens after exceeding the threshold. The intensity of inter-city collaborative innovation continues to increase, with core cities such as Shanghai, Hangzhou, and Nanjing playing a significant leading role. The emergence of new central cities such as Nantong, Ningbo, and Jiaxing has driven the evolution of collaborative innovation toward a "star-shaped" structure. The mechanism for the flow of R&D factors should be optimized, the intensity of intellectual property protection should be balanced, collaborative innovation between core cities and emerging central cities should be strengthened, and regional innovation infrastructure construction should be enhanced to promote high-quality innovative development in the Yangtze River Delta region.

Article
Business, Economics and Management
Econometrics and Statistics

Shancheng Hu

,

Weiyi Xiang

,

Yichao Wan

Abstract: Using panel data from 278 Chinese cities during 2011–2019, this study examines how digital financial inclusion (DFI) affects economic growth through capital deepening, total factor productivity, and technological innovation. The results reveal substantial heterogeneity across DFI dimensions. Expansion of coverage breadth significantly and robustly promotes city-level economic growth. In contrast, greater usage depth exerts a negative effect, possibly due to regulatory lags in internet credit and insurance that intensify financial risks. The digitisation level shows a positive but statistically insignificant impact, indicating that digital infrastructure has not yet been fully transformed into growth-enhancing productivity. These findings suggest that policy efforts should prioritise broadening DFI coverage while strengthening regulation of usage-related activities, thereby balancing financial innovation with systemic stability.

Article
Business, Economics and Management
Econometrics and Statistics

Milad Javadi

Abstract: This study examines how the depth of temporary price cuts is related to weekly unit sales in the ready-to-eat cereal category of the Dominick’s Finer Foods scanner data for 1989–1997. Promotion depth is modeled as a dose–response treatment in a store × UPC × week panel. The baseline specification regresses ln(1+MOVE) on discount-depth bins while absorbing store × UPC, store × week, and UPC × week fixed effects, with two-way clustered standard errors by store and UPC. The cleaned and trimmed panel contains 4.64 million observations, and the preferred estimating sample uses the top 100 UPCs ranked by cumulative sales. Deeper discounts are consistently associated with larger same-week lift. Relative to the 0–5% bin, the preferred top-100 estimates imply approximately 4.7% higher weekly sales for 5–10% discounts, 10.4% for 10–20% discounts, and 28.0% for discounts above 20%. Additional episode evidence indicates that most promotions are short, although a long right tail motivates treating very long runs as price regimes. A corrected event study around promotion starts shows a large contemporaneous spike but a significantly positive near lead, so those dynamics are best read as descriptive. By contrast, a promotion-end event study documents a persistent post-promotion dip of roughly 2–5% for at least eight weeks after promotions end, consistent with inventory drawdown. The evidence therefore supports strong contemporaneous lift and dynamic payback, although the coarse-bin static design does not, by itself, establish concavity in the depth–sales relationship.

Article
Business, Economics and Management
Econometrics and Statistics

Ntebogang Dinah Moroke

Abstract: This paper develops a deep reinforcement learning (DRL) framework for cryptocurrency portfolio management in which transaction costs are derived from the Riemannian geometry of the underlying volatility model rather than assumed constant. A Proximal Policy Optimisation (PPO) agent is trained on a reward function derived from non-equilibrium thermodynamics: the free-energy Bellman equation, in which (i) transaction costs are the geodesic slippage \( S \) on the Fisher information manifold of a maximum-entropy Markov-switching GARCH model, and (ii) regime-transition costs are the Wasserstein-2 distance \( W_t \) between the calm and turbulent return distributions. The agent is embedded in the WOW-E-W quadrilogy, a four-paper research programme that integrates statistical mechanics, fluid dynamics, Riemannian information geometry, and thermodynamic control into a unified cryptocurrency risk architecture. The PPO agent observes an 11-dimensional state vector \( o_t \) that combines turbulent-regime probabilities \( \hat{\xi}_t(2) \) and parameter estimates \( \hat{\theta}_t \) from a maximum-entropy Markov-switching GARCH model, a viscosity-filtered velocity signal \( h_t \) and gate states \( z_t, r_t \) from a GRU viscosity filter, and the Fisher curvature \( G_t \), Ricci scalar \( \kappa_t \), Betti numbers \( \beta_{0,t}, \beta_{1,t} \), Wasserstein dissipation \( W_t \), and topological alarm \( d_I(t) \) from the Riemannian execution geometry layer. The framework establishes a thermodynamic Carnot bound on portfolio efficiency: \( \eta \le 1 - H_{\mathrm{turb}}/H_{\mathrm{calm}} \), where \( H_{\mathrm{turb}} \) and \( H_{\mathrm{calm}} \) are the maximum-entropy values of the turbulent and calm regime distributions.
Five hypotheses are tested across Bitcoin, Ethereum, Ripple, Litecoin, and Bitcoin Cash over January 2017 to March 2026: the geometric-cost PPO agent achieves higher Sharpe ratio than Buy-and-Hold, Greedy signal-following, and flat-fee PPO baselines (bootstrap p < 0.05 for four of five assets); portfolio turnover is reduced by 56 to 83 percent relative to signal-following; the thermodynamic friction point at which the agent prefers no-trade is asset-specific and ranges from 0.6 percent (Bitcoin) to 1.8 percent (Ethereum), ordered by turbulent half-life (Spearman ρ = 0.94, p = 0.017); a joint topological and geometric circuit breaker reduces Maximum Drawdown by 28 to 38 percent; and ablation confirms that every component of ot contributes a statistically significant performance gain (Diebold-Mariano p < 0.05 for at least four of five assets per component). The framework requires liquid cryptocurrency markets with validated parametric volatility models; transferability to other asset classes requires upstream recalibration and is an explicitly bounded limitation.

Article
Business, Economics and Management
Econometrics and Statistics

Vittorio Maniezzo

,

Lisa Vecchi

Abstract: Detecting changepoints in time series is a fundamental task in statistical modeling and data-driven decision-making. We introduce a novel set partitioning-based model for changepoint detection that leverages combinatorial optimization to identify an optimal set of segments explaining the observed data. Unlike conventional methods based on dynamic programming (DP), which often impose strict structural constraints on the objective function (e.g., additivity), our formulation uses Integer Linear Programming (ILP). This approach offers significantly increased expressiveness and flexibility in the cost structure, accommodating diverse, non-additive loss functions and complex penalty schemes. Our formulation guarantees global optimality under this flexible cost structure, a critical advantage over heuristic or approximate approaches. The model’s design enables high adaptability to different application domains, including finance, bioinformatics, and industrial monitoring. The efficiency of modern MILP solvers, combined with tailored dominance rules, enables the solution of instances with several hundred observations in practical time. Computational results indicate that the approach extends tractability beyond previously studied settings, effectively handling classes of instances whose structural constraints could not be accommodated by existing methods, while retaining robustness and interpretability.
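Exhaustive enumeration on a toy instance shows what the set-partitioning view buys: the objective below adds a non-additive term (the maximum segment cost) that a plain additive DP recursion cannot absorb. This brute-force search stands in for the paper's ILP formulation and only scales to tiny inputs:

```python
from itertools import combinations

def segment_cost(seg):
    """Within-segment fit: squared deviations from the segment mean."""
    m = sum(seg) / len(seg)
    return sum((x - m) ** 2 for x in seg)

def best_partition(data, penalty=1.0):
    """Exact search over all segmentations of `data` (tiny instances only).
    Objective = sum of segment costs + max segment cost + penalty per segment;
    the max term is non-additive, which breaks the standard DP recursion."""
    n = len(data)
    best_obj, best_cuts = None, None
    for k in range(n):                         # number of cuts
        for cuts in combinations(range(1, n), k):
            bounds = [0, *cuts, n]
            segs = [data[bounds[i]:bounds[i + 1]] for i in range(len(bounds) - 1)]
            costs = [segment_cost(s) for s in segs]
            obj = sum(costs) + max(costs) + penalty * len(segs)
            if best_obj is None or obj < best_obj:
                best_obj, best_cuts = obj, list(cuts)
    return best_cuts

cuts = best_partition([0.0, 0.1, -0.1, 5.0, 5.2, 4.9])   # level shift at index 3
```

An ILP encodes the same search with one binary variable per candidate segment and a covering constraint per observation, which is what makes instances with several hundred observations tractable for modern solvers.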

Article
Business, Economics and Management
Econometrics and Statistics

Géza Tóth

,

Tekla Szép

,

Mohammad M. Jaber

Abstract: While total solar PV capacity in Hungary was only 1 MW in 2010, this figure has grown to 7,551 MW by 2024 as a result of the favorable settlement system, available subsidies, and uncertainty caused by the Russian-Ukrainian war. More than 6.4% of households are already prosumers. In our study, we focus on Hungarian districts, examining the spatial patterns of household-scale solar PV systems and the main drivers of technology adoption in 2024. We use the Theil T index to examine spatial heterogeneity and the Moran’s I statistic to test for spatial dependence. Spatial autocorrelation is further explored using maps based on the Local Moran’s I and Local Geary statistics. Finally, a spatial error model is applied to identify the factors influencing the share of household-scale solar PV systems per 100 households. Our results show that the spatial error variable has the largest effect, complemented by household education, the age and size of the building stock, population growth, and the built-up area. This confirms the need for a spatially sensitive policy approach and the importance of space and spatial relations in energy economic studies.
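Global Moran's I itself is compact enough to sketch. The toy adjacency below is a four-unit line graph with invented values, not the Hungarian district topology or data:

```python
def morans_i(values, weights):
    """Global Moran's I: spatial autocorrelation of `values` under a binary
    contiguity matrix `weights` (list of lists). Positive when neighbouring
    units carry similar values, negative when they alternate."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    W = sum(sum(row) for row in weights)
    return (n / W) * (num / den)

# Four districts on a line: 1-2, 2-3, 3-4 adjacency (illustrative)
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
clustered = [1.0, 1.0, 5.0, 5.0]    # similar neighbours -> positive I
i_pos = morans_i(clustered, w)
```

The Local Moran's I and Local Geary maps in the study decompose this global statistic into per-district contributions, which is what reveals the adoption hot spots and cold spots.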

Article
Business, Economics and Management
Econometrics and Statistics

Feridoon Koohi-Kamali

,

Willi Semmler

,

Samuel Owusu

Abstract: This paper addresses the output and employment impacts of a climate self-financed taxation/subsidy policy on CO2 emission-reduction. We model a balanced climate fiscal expenditure by a two-regime CO2-based threshold autoregressive model that separates the periods of rising emissions by negative CO2 log-differences and falling emissions by positive CO2 log-differences. Applied to data sets of 16 OECD countries over 23 years (1995-2018), we find that self-financing of equal amounts of tax and subsidy over the lifespan of the data set produces an outcome in which the CO2-reducing regime dominates, with significant threshold and marginal policy impacts on both output and employment. Panel variance decomposition forecasts show that the policy shock's contribution to total output variance outweighs other effects for up to three years, and its contribution to total employment variance for up to four years. The assessment of a two-regime/threshold model of neutral fiscal policy constitutes our contribution to the literature.

Review
Business, Economics and Management
Econometrics and Statistics

Geoffrey Rothwell

Abstract: While there is some agreement on estimating construction cost contingency for “known unknowns,” there is little consensus on management reserves for “unknown unknowns.” Also, definitions of risk and uncertainty differ between the economics and finance literature and the cost engineering literature. This paper examines how cost engineering guidance on estimating management reserves is interpreted in government-sponsored project cost estimates. This lack of consensus is evident in a specific program: managing, treating, and disposing of 212,000 cubic meters of mixed radioactive and hazardous chemical waste generated by plutonium production at the Hanford Site. Over $30 billion has been invested in treatment facilities, vitrification plants, and laboratories analyzing gases, liquids, sludges, and salt cake from 177 aging storage tanks. The remaining construction and operating costs are highly uncertain, with estimates ranging from $300 billion to $640 billion. Analyses of alternatives for constructing Hanford waste treatment facilities assume 15% contingencies and 40% management reserves. A method is presented to compute the implicit moments of Extreme Value distributions of cost estimates for different options, helping determine whether one alternative’s cost estimate stochastically dominates others. Adopting industry definitions of contingency and management reserves by government agencies could improve construction cost estimation in government-financed programs.
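One way to recover implicit moments, sketched here for the Gumbel (Type I extreme value) family: back out location and scale from two reported cost quantiles, then read off the implied mean and standard deviation. Treating the $300B/$640B figures as the median and 90th percentile of a Gumbel distribution is purely an illustrative assumption, not the paper's calibration:

```python
import math

EULER_GAMMA = 0.5772156649015329   # Euler-Mascheroni constant

def gumbel_from_quantiles(x50, x90):
    """Back out Gumbel location mu and scale beta from the median and the
    90th percentile, using the quantile function x_p = mu - beta*ln(-ln p),
    then return the implied mean (mu + gamma*beta) and sd (pi*beta/sqrt(6))."""
    z50 = -math.log(-math.log(0.5))   # ~0.3665
    z90 = -math.log(-math.log(0.9))   # ~2.2504
    beta = (x90 - x50) / (z90 - z50)
    mu = x50 - beta * z50
    mean = mu + EULER_GAMMA * beta
    sd = math.pi * beta / math.sqrt(6)
    return mean, sd

# Hypothetical cost estimates in $billions (illustrative, not official figures)
mean, sd = gumbel_from_quantiles(300.0, 640.0)
```

Because the Gumbel distribution is right-skewed, the implied mean always sits above the median when the spread is positive, which is one reason point estimates quoted as medians understate expected cost; comparing two options' implied distributions quantile by quantile is then a direct check for stochastic dominance.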



Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated