Preprint
Article

Effective Evolutionary Principles for System-of-Systems: Insights from Agent-Based Modeling in Vehicular Networks

This version is not peer-reviewed.

Submitted: 27 February 2024
Posted: 27 February 2024

Abstract
System-of-systems (SoS) evolution is a complex and unpredictable process. Although various principles to facilitate collaborative SoS evolution have been proposed, there is a lack of experimental data validating their effectiveness. To address these issues, we present an Agent-Based Model (ABM) for SoS evolution in the Internet of Vehicles (IoV), serving as a quantitative analysis tool for SoS research. By integrating multiple complex and rational behaviors of individuals, we aim to simulate real-world scenarios as accurately as possible. To simulate the SoS evolution process, our model employs multiple agents with autonomous interactions and incorporates external environmental variables. Furthermore, we propose three evaluation metrics: evolutionary time, degree of variation, and evolutionary cost, to assess the performance of SoS evolution. Our study demonstrates that enhanced information transparency significantly improves the evolutionary performance of distributed SoS. Conversely, the adoption of uniform standards brings only limited performance enhancement to distributed SoSs. Although our proposed model has limitations, it stands out from other approaches that utilize Agent-Based Modeling to analyze SoS theories: our model focuses on realistic problem contexts and simulates realistic interaction behaviors. This study enhances the comprehension of SoS evolution processes and provides valuable insights for the formulation of effective evolutionary strategies.
Keywords: 
system-of-systems; evolutionary principle; agent-based model; internet of vehicles
Subject: 
Engineering - Control and Systems Engineering

1. Introduction

A system-of-systems (SoS) is a system with a highly complex structure. When the structure of an SoS undergoes transformation, further non-linear changes will occur as a result of its complexity. In order to guide the SoS to achieve directed evolution, researchers have proposed many principles to manage the evolutionary process of the SoS [1,2,3,4].
However, due to the extremely complex nature of the SoS itself, it is challenging to describe the effects of these principles during the actual engineering process, as well as their mechanisms of action [5]. This leads to a lack of understanding when the management of SoS evolution processes is considered in SoS engineering, making it difficult to ensure that the relevant measures are sufficiently accurate and effective.
To address these challenges, we examine the impacts of different SoS evolution principles on the performance of complex systems. As obtaining empirical evidence from a sufficient number of SoSs can be arduous, owing to the scarcity of complex system design data, we propose an alternative approach that creates unique SoS models and simulates design processes based on empirically verified phenomena. We analyzed the efficacy of the agent-based modeling (ABM) method using a case study centered on the Internet of Vehicles. Using this method, researchers can gain valuable insights into how specific factors influence SoSs without the need for extensive empirical data.
The case study detailed in this paper examines the evolution of a telematics SoS dealing with situational awareness problems. As a typical collaborative SoS, the structure of the telematics SoS is decentralized and distributed. The core principle of telematics is interactive communication between different distributed nodes (e.g., vehicles or infrastructure) within a network, facilitating the sharing of information to achieve situational awareness of the environment. Situational awareness refers to the perception of environmental factors under certain temporal and spatial conditions, as well as the prediction of their future trends, by collecting data through on-board sensors, cameras, and other devices [6,7,8]. When a telematics SoS receives new environmental information, nodes with different devices cooperate with each other to collect, process, and transmit relevant information to achieve information sharing in the telematics SoS, thus completing the evolution process. The effect of choosing different evolutionary principles and strategies on the evolutionary performance of SoSs can be studied using the presented framework.
The problem of situational awareness in telematics SoS offers an ideal case to examine the impacts of evolutionary principles on system-of-systems (SoS) performance. The emergent evolution that occurs during the situational awareness process in telematics SoS is often unpredictable. Consequently, this paper aims to address the challenge of modeling this evolutionary process and analyzing the influence of different principles on the information sharing process in the SoS. In this article, we combine SoS evolution theory, agent-based modeling, design optimization, and research on the Internet of Vehicles (IoV) architecture, which is achieved through the following steps: (1) Generating a vehicle networking SoS, (2) simulating the evolutionary process during SoS situational awareness and behaviors of intelligent agents, (3) verifying the effectiveness of the evolutionary principles by adjusting various parameter settings.
The article delves into the intricacies of SoS evolution by examining the application of evolutionary principles and an agent-based model (ABM) in the context of vehicle networking SoS. Employing the ABM, we developed a unique model that generates complex SoS and simulates their evolutionary processes. This was achieved through the execution of Monte Carlo simulations in 150 distinct SoSs, concurrently altering their underlying evolutionary principles. The findings of this comprehensive study highlight the varying effects that different principles have on the evolution performance of SoS. This research not only contributes to a better understanding of SoS evolution, but also emphasizes the importance of selecting appropriate evolutionary principles when designing and optimizing vehicle networking SoSs.

2. Preliminaries

2.1. The Concept of SoS Evolution

In the realm of system engineering, a system-of-systems (SoS) represents a sophisticated arrangement of specialized systems that synergistically pool resources and integrate their capabilities to create a more functional and high-performing system. Within this intricate structure, a constituent system (CS) operates as a component of one or more SoSs, functioning as a complete system with its own objectives and resources. A particularly intriguing manifestation of SoSs is the collaborative SoS, which emerges when constituent systems voluntarily align themselves around a central purpose and collectively determine their implementation and maintenance standards. One salient example of a collaborative SoS is a connected vehicle SoS, designed to achieve situational awareness of the environment by harnessing the collective intelligence and capabilities of various vehicular components. This innovative approach to system integration and collaboration not only enhances the overall performance of the individual systems, but also fosters a safer and more efficient transportation ecosystem.
Evolutionary characteristics were first identified as an inherent part of SoSs by Maier [9]. From a macroscopic perspective, the SoS can be seen as continuously, but slowly evolving [10]. This evolutionary process is incessant [11], meaning that the SoS has no permanent state [12]. From a microscopic perspective, SoS evolution takes place through a series of largely deliberate preservative or adaptive interventions [3], such as upgrades to constituent systems or responses to an ever-changing environment [13,14].
In the evolutionary process of an SoS, a salient challenge is that evolution is not necessarily centrally controlled, and the impetus for change may occur suddenly and dramatically [15]. Therefore, evolution can simultaneously occur at multiple levels and within multiple areas of an SoS. In practice, such evolutions may take place in multiple places at once and on an ongoing basis, and different parallel evolution processes may complement or disrupt each other [16]. Consequently, any unforeseen evolution in the SoS may lead to undesirable emergent phenomena [2]. If such evolution cannot be effectively controlled, the resulting emergent behavior may lead to failures within the SoS development process [17]. Accordingly, the term emergent evolution has been proposed to describe this unpredictable form of evolution [18]. In this paper, we examine the emergent evolution of a system-of-systems from the bottom up.
Based on the abovementioned studies, it is clear that emergent behavior among systems can be reflected in the evolution of the whole system-of-systems and even influence the evolutionary outcome. Therefore, it is necessary to propose appropriate evolutionary measures to manage and guide this bottom-up behavior during the evolution of the system-of-systems. The validation and application of SoS principles can help managers to better design SoS frameworks and manage emergent behaviors in the evolutionary process, thus enhancing the efficiency and effectiveness of SoS evolution.

2.2. Guiding Principles for SoS Evolution

This paper focuses on four of the significant principles that guide the evolution of SoSs.

2.2.1. Facilitate Information Exchange

Facilitating information exchange is an important principle when functionally distinct systems interact within an SoS [19]. Clausing argued that the impact of design reuse can be reduced and the quality of evolution improved by ensuring the effective exchange of information during system evolution [20]. Lock affirmed the positive role of information exchange, and argued that poor levels of information exchange in the interaction between member systems within an SoS can affect the efficiency and reliability of SoS evolution [2]. In addition, Carney argued that the rate of coordinated information exchange is an important aspect of the maintenance of interoperability, and that coordinated information exchange between systems allows the SoS to retain its original characteristics during the evolution process [3]. Indeed, information exchange is an important factor in the ability to build new organizations. Information exchange is the link that facilitates interaction between systems in the process of building an SoS from independent systems [21]. Many properties of systems are closely related to information exchange, including reliability and interoperability [22]. Successful information exchange is considered to be the basis for the achievement of system properties such as interoperability [21].
Information exchange can also influence emergent complexity in the organizational structure when analyzed from a micro perspective. Emergent changes that occur internally, such as unpredictable deviations, unforeseen errors, or significant constraints, can be perceived at an earlier stage through information exchange [23,24,25]. Moreover, information exchange can also motivate agents to form organizations naturally [23]. In this process, frequent information exchange has an effect on the inertia of agents—which may be due to short-term behavior—but, in the long run, may lead to evolutionary trends [26].

2.2.2. Implementing Uniform Standards

Standards can provide a common framework and terminology, making participants more likely to have the same understanding of the same problem [27]. Uniform standards can avoid ambiguity in interpretation, which may increase the level of mutual understanding between systems as they evolve [28]. For example, relying on accepted formal or informal standards in multi-agent systems can lead to mutual understanding [29]. By promoting and enhancing the level of understanding between systems, uniform standards can further influence the evolutionary performance of an SoS. Lock argued that “The agility of SoS evolution can be improved by using and implementing uniform standards that promote understanding between interacting systems” [2]. Carney et al. argued that inter-system agreement determines whether local interoperability relationships can be established [3]. Selberg and Austin argued, from a system design perspective, that the selection of uniform and widely used standards is important for SoS evolution [1].
Therefore, we argue that it will be easier for member systems to learn new knowledge and for constituent systems to adapt to dynamic change if uniform standards or protocols are implemented. A uniform standard will provide a common framework and language, making it easier for the members of an organization to identify and use the information in a database.

2.2.3. Enhancing Transparency of Information

Information transparency, in the context of organizational dynamics, pertains to the degree of information sharing and disclosure [30]. Transparency of information between systems can eliminate inconsistencies between non-verbal and verbal behaviors and build trust between systems, thereby promoting a level of mutual understanding in organizations [31,32]. For example, in human–computer interaction systems, transparency can facilitate mutual understanding between humans and agents, leading to cooperation [33]. In the area of SoS evolution, Lane and Valerdi argued that building trust and transparency between systems helps to gain the support of SoS members to effectively guide the evolution of the SoS [34]. Carney et al. argued that building trust mechanisms between systems (e.g., by establishing information transparency) is particularly necessary for each stage of SoS evolution [3].
Therefore, we argue that enhancing the transparency of information between systems is conducive to building trust between SoS members and ultimately facilitating the evolution of the SoS. In the framework presented in this study, Principle 3 (Enhancing Information Transparency) primarily operates through top-down behavioral control mechanisms, while Principle 1 (Facilitate Information Exchange) is chiefly governed by bottom-up interactive behaviors.

2.2.4. Establishing Common Goals

The term 'Common Goals' denotes the collective objectives or targeted outcomes that multiple constituents or stakeholders within a system seek to realize in order to optimize system functionality. Common goals among the members of an organization help to create synergy within the organization, thus enhancing organizational cohesion [35]. Carney et al. argued that when the constituent systems in an SoS share a common purpose, the closer each system is to that purpose, the greater the SoS's ability to adapt to change [3]. When members of an organization share a common purpose, they are more motivated to seek consensus and, thus, collaborate and cooperate more effectively [36].
Therefore, we argue that establishing a common purpose between constituent systems within an SoS increases the SoS's ability to adapt to change, which ultimately facilitates the evolution of the SoS.

2.3. Agent-Based Modeling

To handle the intricacy of SoS evolution, an agent-based model (ABM) can be used to simulate the SoS, representing agent behaviors and SoS alterations [37]. This technique can be used to simulate the attributes of an SoS (i.e., its types, behaviors, and capabilities) [38]. As a “bottom-up” technique, an ABM uses agents to represent the domain being modeled [39]. Organizational behaviors may then emerge as the cumulative result of individual behaviors. It should be noted that the evolution of an SoS is a complex, non-linear process, and an ABM provides just such a model, requiring each agent to behave in a stochastic, non-linear manner and possess a non-linear ability to adapt over time [40]. Characterized by being autonomous and flexible [41,42], an ABM explains the actions and interactions of agents with a view to assessing their effects on the system as a whole, as in the Kauffman NK model [43].
The autonomy and flexibility of agent-based modeling make it highly suitable for modeling a wide range of distributed SoSs. In the context of this paper, the vehicle SoS under study represents a typical distributed SoS, making it an ideal candidate for the application of the ABM methodology. The pioneering work in utilizing ABM for vehicle research can be attributed to Nagel and Schreckenberg [44], who developed a stochastic discrete automaton model incorporating agent-based simulation (ABS) to simulate individual vehicle nodes. Since then, ABM has garnered significant attention in the realm of vehicular systems research. Notable contributions include studies on traffic theory by Nagel and Flötteröd [45] and Zhu et al. [46], vehicular transportation simulations conducted by Zou and Levinson [47] and Levy et al. [48], and traffic control investigations by Bosse [49], Bui and Camacho [50], and Wang and Lv et al. [51].
Regarding the information-sharing problem addressed in this paper, Shang, Han, et al. employed ABM to establish a transportation network model and a vehicle SoS communication model, aiming to explore the impact of travel information sharing on road networks [52]. Furthermore, the issue of information sharing within vehicular networks using the ABM approach has been explored by Mahdi and Hasson [53], Zia and Shafi et al. [54], and Rathee and Garg et al. [55]. It is important to note that this paper distinguishes itself from previous vehicular system studies by focusing on the intricate interaction behaviors among vehicle nodes and attempting to simulate the unpredictable emergent evolution observed in vehicle SoSs.
At the same time, this paper examines the suitability of ABM for studying organizational evolution. In this paper, each individual of an organization interacts with the others and with the environment through various behaviors. It has been well-documented that the foundational level of an organization can be elucidated by examining the interactions between actors, whether individuals or groups, and their environment [56]. Such characteristics of interactions play a crucial role in managing the emergent evolution of an SoS. Thus, we anticipate that ABM can capture these behavioral characteristics within an organization and facilitate research into SoS evolution issues.
In fact, several articles have utilized the ABM approach to examine the intricate organizational structures involved in SoS evolution. Peppard and Breu et al. used ABM techniques to analyze the evolutionary patterns of the formation mechanisms of collaborative SoSs by simulating the assembly processes of molecules under natural conditions [57]. These results were used by engineers to improve the design and management of SoSs. In addition, Sindiy and DeLaurentis et al. modeled a distributed SoS, defining time-varying performance metrics and using an agent-based model to simulate the evolution of these architectures [58], while Nikolic and Dijkema used ABM techniques to model industrial and infrastructural development in seaport areas, combining the feedback of knowledge processes with SoS evolution simulation [59]. While ABMs are beginning to be used for SoS-related research, there have been few efforts to improve SoS theory-related aspects [36]. The use of ABM to test SoS theory in this paper represents a continued expansion of this field.

3. Methodology

This section presents the conceptualization and implementation of the model utilizing the ODD (Overview, Design concepts, Details) protocol [60].

3.1. Overall Model Structure

In this paper, the subject is an abstract system consisting of a number of agents in three categories (vehicle, infrastructure, and mobile device). These agents can absorb internal and external changes (e.g., changes in the environment or changes in the mission) through different behaviors and interactions, leading to emergent evolution of the SoS. It is assumed that these three categories of agents can adapt spontaneously to the dynamics of the SoS. The structure of the agent-based model is illustrated in Figure 1.

3.2. SoS Evolution

The case assumed in this paper is the evolution of an SoS when dealing with situational awareness problems in telematics. The telematics SoS consists of vehicle nodes, infrastructure nodes (e.g., traffic lights, surveillance cameras, road sensors, wireless network base stations), mobile device nodes (e.g., portable devices such as smartphones and wearable devices), and cloud servers. The core principle of telematics is interactive communication between various distributed nodes (e.g., vehicles and infrastructure) within the network and information sharing, in order to obtain situational awareness of the environment. Situational awareness refers to the perception of environmental factors under specific temporal and spatial conditions, as well as the prediction of their future trends, by collecting data from on-board sensors, cameras, and other devices.
In this model, the three types of facilities in the same area form the Telematics SoS. To achieve situational awareness of the area, different behaviors occur at each node (i.e., constituent system) in response to changes in the environment. The constituent system is represented by an agent. In the experiment, initial relationships between constituent systems in the same sector or in different sectors are established to simulate the relationships between systems in the SoS.
When the Telematics SoS receives new environmental information, the nodes carrying different devices cooperate with each other to collect, process, and transmit relevant information for information sharing in the Telematics SoS, thus completing an evolutionary process. Specifically, in the Telematics SoS, each node within it undergoes independent evolution in response to changes in the external environment through interactive behaviors, which eventually leads to evolution of the SoS.
In this research, the fundamental design concepts of the model primarily pertain to the implementation of information sharing principles. Consequently, the agents in the experimental setup were represented as binary strings comprising numerical values that encapsulate information content. This representation serves as a means to depict their knowledge sets, which follows a well-established modeling approach. The environment and the behavior of the agents in the experiments change these strings and, thus, the properties of the agents. The schematic diagram of the knowledge set is shown in Figure 2.
The Telematics SoS is in a constantly changing environment, where external changes fall into three categories: Natural environment changes, man-made environment changes, and vehicle status changes. Natural environmental changes refer to changes in the external natural environment which have an impact on vehicle performance and driving safety (e.g., weather changes, road conditions, and so on). Man-made environmental changes refer to the impacts of urban planning, population flow, road reconstruction, and other factors on vehicle driving. Vehicle state changes refer to changes in the operating state of the vehicle itself (e.g., engine failure, tire leakage, and so on). Different changes add different knowledge values to the agent’s knowledge set, and the agent’s behavior has different effects under different changes. These knowledge values are assigned to all agents and ultimately affect the evolutionary direction of the SoS.
For example, if a man-made environment change is applied to a class of facilities, an agent in that class will set its knowledge value for the “man-made environment” to 1, meaning that it senses and receives knowledge relating to the change. At this point, the knowledge values of the other agents are set to 0, indicating that they are not receiving knowledge related to that change. As the SoS evolves, the knowledge values of the other agents about the change will eventually change to 1, indicating that the SoS as a whole has fully absorbed the change.
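To make this representation concrete, the following sketch encodes a constituent system's knowledge set as a fixed-length 0/1 vector that an environmental change can write into. It is a minimal illustration, not the authors' implementation: the class name, the 30-bit length, and the chosen bit index are assumptions.

```python
import numpy as np

KNOWLEDGE_BITS = 30  # assumed length of a knowledge set (Table A1 tests 30, 60, 90)

class ConstituentSystem:
    """One agent; its knowledge set is a binary vector, as described above."""
    def __init__(self, category):
        self.category = category                               # "vehicle", "infrastructure", or "mobile_device"
        self.knowledge = np.zeros(KNOWLEDGE_BITS, dtype=int)   # all knowledge values start at 0

    def perceive(self, change_bit):
        # An environmental change (natural, man-made, or vehicle-state) sets
        # the corresponding knowledge value of the affected agent to 1.
        self.knowledge[change_bit] = 1

# Example: a man-made environment change is first sensed by one infrastructure node.
node = ConstituentSystem("infrastructure")
node.perceive(change_bit=5)   # index 5 is a hypothetical "man-made environment" slot
```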

3.3. Agent Behaviors

The agents in this paper are designed as state machine models. In particular, each agent has two states: no knowledge set and existing knowledge set. Agents in the no-knowledge-set state are transformed to the knowledge-set state when they are affected by any of the three types of environmental changes. The agent state transition diagram is shown in Figure 3.
Five spontaneous behaviors are set for the agents in the model, with reference to the behavior of self-directed systems during the evolution of the SoS: communication, negotiation, learning, cooperation, and competition. When the knowledge set exists, the knowledge value of a given bit in the knowledge set can be in one of two states: 0 or 1. When the knowledge value is 0, if any of the five interactive behaviors affects it, it is transformed to 1. The state transition diagram for the knowledge value is shown in Figure 4.
Agents perform the established behaviors spontaneously and participate in the evolution of the SoS. The five spontaneous behaviors of agents are described below.
Communication refers to the communication between various nodes within the telematics SoS through information transfer, such as vehicle–road cooperation between vehicles and road facilities, communication between vehicles, and data exchange between vehicles and cloud servers. Through communication, different nodes can understand each other’s status and needs, thus improving the efficiency and safety of the whole system.
Learning means that the nodes in the Telematics SoS continuously collect and analyze a large amount of data by interacting with the cloud server, as well as using algorithms to improve their own performance and adaptability to the environment. For example, monitoring devices can predict future traffic conditions by analyzing factors such as traffic flow and congestion, then adjust their own monitoring behavior based on these predictions.
Negotiation is the process of reaching a common decision between two or more independent nodes through interaction. For example, nodes must decide among themselves how to allocate resources such as bandwidth, processing power, and so on. The nodes need to adjust their respective behaviors to adapt to environmental changes or to optimize certain metrics (e.g., reducing latency or lowering power consumption) among themselves. During the negotiation process, nodes need to send messages to each other, explain their intentions, exchange preferences, and reach an agreement. The result of negotiation can be an agreement, an allocation of resources, or a change in the way that the nodes behave.
Competition refers to the behavior of nodes competing for limited resources. In the Telematics SoS, individual nodes require access to resources such as data, storage space, and network bandwidth to perform their tasks. As these resources are limited, resource competition between nodes can occur; for example, multiple nodes send data to the central server at the same time, which may result in insufficient bandwidth, ultimately affecting the quality and speed of data transmission.
Cooperation is the act of working together among nodes to achieve a common goal. In the Telematics SoS, individual nodes must work together to accomplish the overall situational awareness task. For example, in self-driving cars, individual sensors need to work together to obtain information about the environment and aggregate this information to the central controller for analysis and processing, which enables the autonomous driving function.
Overall, the behaviors of communication, cooperation, competition, negotiation, and learning are essential factors in the evolution of an SoS. In this model, these behaviors interact and influence each other, acting on the knowledge sets of the agents and ultimately shaping the characteristics and evolutionary direction of the SoS.
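As a rough illustration of how such behaviors could act on the knowledge sets, the sketch below lets a behavior propagate 1-valued knowledge from one agent to another with some occurrence probability. The probability values and function names are placeholders rather than the calibrated settings of the paper's model.

```python
import random
import numpy as np

# Assumed baseline occurrence probabilities for the five behaviors (placeholders).
BEHAVIOR_PROB = {"communication": 0.5, "learning": 0.5, "negotiation": 0.5,
                 "cooperation": 0.5, "competition": 0.5}

def interact(sender_knowledge, receiver_knowledge, behavior, rng=random):
    """If the behavior occurs, any knowledge value of 1 held by the sender is
    copied to the receiver (a 0 bit affected by a behavior becomes 1)."""
    if rng.random() < BEHAVIOR_PROB[behavior]:
        return receiver_knowledge | sender_knowledge   # element-wise OR on 0/1 vectors
    return receiver_knowledge

# Example: a vehicle node communicates a sensed change to a mobile device node.
vehicle = np.array([1, 0, 0, 0, 0])
mobile = np.array([0, 0, 0, 0, 0])
mobile = interact(vehicle, mobile, "communication")
```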

3.4. Principle

The ultimate goal of this study is to investigate the roles of the four principles described in Section 2.2 in the evolution of the SoS. In the model described herein, a crucial design concept is the impact of these principles on the probability of an agent's behaviors and, consequently, on the evolutionary dynamics of the system.
In the considered model, the evolutionary principle of facilitating information exchange influences the communication and negotiation behaviors in the SoS. On one hand, communication is a fundamental behavior within an organization, and is itself a form of information exchange. Encouraging internal information exchange within an organization can improve the quality and fluidity of communication [61]. On the other hand, moderate and accurate information exchange can significantly improve negotiation performance without significant cost to the negotiators who initiate it, resulting in more mutually beneficial negotiation outcomes [62].
In the considered model, the principle of implementing uniform standards can influence learning and competition behaviors in the SoS. Researchers have found that applying uniform and standardized work principles effectively reduces costs, has a positive impact on team member learning, and provides a basis for sustainable team improvement [63]. Furthermore, researchers have found that uniform standards can regulate competition patterns and, thus, energize the entire organization [64]; however, an excessive reliance on standards can also lead to certain monopolistic phenomena [65].
In the considered model, the principle of information transparency can affect cooperation and communication behaviors in the SoS. Information transparency is considered, in some studies, as a tool that helps stakeholders to perceive information as relevant and timely, providing a reliable picture of the organizational reality [66]. The principle of transparency enhances communication between stakeholders and, thus, allows more valuable information to be conveyed [67]. In addition, information transparency within an organization encourages the involvement of members in collaboration and the creation of certain collaborative mechanisms [68].
In the considered model, the principle of establishing common goals can affect cooperation and negotiation behaviors in the SoS. Researchers have concluded that there is a positive relationship between the degree to which members of an organization agree on a common goal and the effectiveness of collaboration toward that goal [69]. Thus, effective collaboration requires a shared, common goal [70]. In addition, what is negotiated among members is a limited common goal. A common goal can create a sense of trust between negotiators [71]. From another point of view, maximizing the common goal effort between the two parties is the focus of each negotiation [72].
In the initial model, each behavior has a defined probability of occurrence, validated to allow the system to evolve in a balanced way; the different principles are then implemented by adjusting the probabilities of occurrence of the behaviors they influence. In this way, the evolutionary principles are studied indirectly in this paper.
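One possible way to encode this indirect mechanism is sketched below: each principle raises the occurrence probabilities of the behaviors it is argued to influence above, while the control group keeps the baseline values. The baseline probabilities and the increment are illustrative assumptions, not the calibrated values of the paper's model.

```python
# Baseline behavior probabilities (placeholders).
BASELINE = {"communication": 0.5, "negotiation": 0.5, "learning": 0.5,
            "cooperation": 0.5, "competition": 0.5}

# Which behaviors each principle influences, following Section 3.4.
PRINCIPLE_EFFECTS = {
    "P1_facilitate_information_exchange":  ["communication", "negotiation"],
    "P2_implement_uniform_standards":      ["learning", "competition"],
    "P3_enhance_information_transparency": ["cooperation", "communication"],
    "P4_establish_common_goals":           ["cooperation", "negotiation"],
}

def behavior_probabilities(principle=None, boost=0.2):
    """Return behavior probabilities with the chosen principle's target behaviors
    made more likely; principle=None reproduces the no-principle control group."""
    probs = dict(BASELINE)
    for behavior in PRINCIPLE_EFFECTS.get(principle, []):
        probs[behavior] = min(1.0, probs[behavior] + boost)
    return probs
```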

3.5. Indicators

In order to assess the impacts of different principles on the system, this subsection introduces misalignment as a reference indicator for the SoS evolutionary process in simulation experiments. Furthermore, to provide a complete picture of performance, this subsection introduces the evolutionary time (ET), degree of variation (DOV), and cost as performance assessment metrics.
Misalignment refers to the degree of mismatch between the knowledge sets of the member systems in the system. We argue that, as the system evolves, differences between member systems will continue to emerge, where higher differences will negatively affect organizational performance. Based on the ABM model constructed in this section, misalignment is viewed as a difference in knowledge between different types of entities in the evolutionary process.
$$RMSE_{VNs,INs} = \sqrt{\frac{\sum_{i=1}^{n}\left(X_{VN,i} - X_{IN,i}\right)^{2}}{n}} \qquad (1)$$
The formula for calculating this indicator is shown in Equation (1), where $X_{VN,i}$ on the right side of the formula denotes the mean value of the $i$th knowledge value of the vehicle nodes (VNs) and $X_{IN,i}$ denotes the mean value of the $i$th knowledge value of the infrastructure nodes (INs). Therefore, $RMSE_{VNs,INs}$ on the left side of the formula indicates the root-mean-squared error between the vehicle nodes and the infrastructure nodes; that is, the difference in value between the two facilities. The smaller the $RMSE_{VNs,INs}$, the smaller the difference between the two facilities and the smaller the negative effect of evolution on the SoS.
The total root-mean-squared error between the three types of facilities is denoted as $Misalignment$, as shown in Equation (2), where $RMSE_{VNs,INs}$ denotes the difference between vehicle nodes (VNs) and infrastructure nodes (INs); $RMSE_{INs,MDNs}$ denotes the difference between infrastructure nodes (INs) and mobile device nodes (MDNs); and $RMSE_{VNs,MDNs}$ denotes the difference between vehicle nodes (VNs) and mobile device nodes (MDNs). Thus, $Misalignment$ is the sum of the root-mean-squared errors between the knowledge sets of the different facilities, indicating the total variance within the system at that moment. This is the basic metric that provides the results of the model, and its role is to indicate the evolution of the SoS.
$$Misalignment = RMSE_{VNs,INs} + RMSE_{INs,MDNs} + RMSE_{VNs,MDNs} \qquad (2)$$
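Equations (1) and (2) translate directly into code. The sketch below is a possible numpy transcription in which each argument is an array of 0/1 knowledge values with one row per agent; the function names are illustrative.

```python
import numpy as np

def rmse(group_a, group_b):
    """Equation (1): RMSE between the per-bit mean knowledge values of two groups."""
    mean_a = group_a.mean(axis=0)   # X_{A,i}: mean of the i-th knowledge value over the group
    mean_b = group_b.mean(axis=0)
    return float(np.sqrt(np.mean((mean_a - mean_b) ** 2)))

def misalignment(vns, ins, mdns):
    """Equation (2): sum of the pairwise RMSEs between vehicle nodes (VNs),
    infrastructure nodes (INs), and mobile device nodes (MDNs)."""
    return rmse(vns, ins) + rmse(ins, mdns) + rmse(vns, mdns)
```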
Based on the misalignment base metric, the first evaluation metric of the model is evolution time (ET), which is the time taken for the misalignment base metric to return to zero (i.e., the total time required to complete the evolution of the SoS). ET describes the basic characteristics of the evolution of the SoS. If the ET is smaller, the time required for SoS evolution is shorter and the evolutionary performance of the model is stronger. As the SoS evolution process in the situational awareness problem is short, it is measured in seconds.
The second evaluation metric of the model is the degree of variation (DOV), the value of which is the integral of the variation over time (i.e., the time-weighted average of the root-mean-squared error among all knowledge sets in the model). The DOV describes the degree of accumulation of the total degree of variation over time in the Telematics SoS. If the DOV is smaller, the degree of variation in the SoS evolution process is smaller and the evolutionary performance of the model is stronger.
$$DOV = \int_{0}^{ET} Misalignment \; dt \qquad (3)$$
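Because the misalignment is sampled at discrete simulation steps, the integral in Equation (3) can be approximated numerically. The sketch below uses the trapezoidal rule and assumes one misalignment sample per second; both are assumptions made for illustration.

```python
import numpy as np

def degree_of_variation(misalignment_series, dt=1.0):
    """Equation (3): integrate the misalignment curve over the evolution time.
    Trapezoidal rule: 0.5 * (m[i] + m[i+1]) * dt, summed over all steps."""
    m = np.asarray(misalignment_series, dtype=float)
    return float(np.sum(0.5 * (m[:-1] + m[1:]) * dt))

# Example: a short synthetic misalignment trace that rises and then decays to zero.
trace = np.array([0.0, 0.4, 0.9, 0.7, 0.3, 0.1, 0.0])
dov = degree_of_variation(trace)
```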
The third metric introduced in this study is cost, which represents the consumption of various resources during the evolution of the SoS. As knowledge transfer in an organization is supported by the consumption of various resources, we therefore assume that the implementation of each code of conduct may incur costs. The case studied in this paper is a situational awareness problem and, as vehicular networking requires the constant transmission of a large amount of data (e.g., vehicle location, speed, acceleration, vehicle status, traffic conditions, and so on), sufficient bandwidth is required to support the transmission of this data. Therefore, we use network bandwidth to represent the resource consumption in this process. The unit is Gbps, which is the amount of data transmitted in gigabits per second, in order to represent the network bandwidth occupied by the task of situational awareness.

3.6. Monte Carlo Simulation and Model Verification

Monte Carlo simulations have recently emerged as a prevalent technique for testing agent-based models (ABMs) and generating statistically significant outcomes under various evaluation metrics [73,74]. By leveraging the capabilities of Monte Carlo simulations, numerous studies have successfully modeled complex organizational relationships and gained valuable insights into the underlying dynamics that govern these systems [75,76,77].
To mitigate the stochastic nature of ABM simulations, we generated and designed several unique complex systems using Monte Carlo simulations, in order to sample the effects of specific behaviors across a large number of complex systems. To fully test the role of each principle in the evolutionary process, four experimental groups and a no-principle control group were set up, according to the four evolutionary principles detailed above. In addition, the amount of environmental change can affect the evolutionary process of the system; therefore, the experiment was set up with three levels, according to the amount of change, which were tested separately. The experiment was run 80 times in each state, in order to reduce the effect of random errors.
With five experimental groups, three variations, and 80 executions per combination, the ABM in this paper was run 5 × 80 = 400 times. The initial variation in each experiment randomly affected a few agents. By utilizing the Monte Carlo method, we were able to generate 400 unique and representative complex SoSs, providing a robust data set for further analysis.
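The experimental design can be expressed as a simple Monte Carlo driver, sketched below. It is schematic only: run_simulation() is a stand-in for the actual ABM (here it returns random placeholder metrics so the loop can execute), and the encoding of the three change levels is assumed.

```python
import random
import statistics

PRINCIPLES = [None,                                  # control group, no principle applied
              "P1_information_exchange", "P2_uniform_standards",
              "P3_information_transparency", "P4_common_goals"]
CHANGE_LEVELS = [1, 2, 3]      # assumed encoding of the three amounts of environmental change
RUNS_PER_CELL = 80

def run_simulation(principle, change_level, seed):
    """Placeholder for one ABM run; the real model would build a randomized SoS,
    apply the principle's behavior probabilities, and return (ET, DOV, cost)."""
    rng = random.Random(seed)
    return (rng.uniform(800, 1300), rng.uniform(3000, 6000), rng.uniform(4000, 5500))

def monte_carlo():
    results = {}
    for principle in PRINCIPLES:
        for level in CHANGE_LEVELS:
            runs = [run_simulation(principle, level, seed) for seed in range(RUNS_PER_CELL)]
            # Average each of the three metrics over the repeated runs.
            results[(principle, level)] = tuple(statistics.mean(metric) for metric in zip(*runs))
    return results
```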
Validation of ABMs is difficult but necessary [78]. Therefore, a sensitivity analysis of the simulation data across ABM experiments was required. For this study, sensitivity tests were performed on parameters that were not directly relevant to the purpose of the experiment, in order to select appropriate control-variable settings (see Table A1).

3.7. Time Complexity Analysis

The time complexity of Agent-Based Modeling (ABM) can pose a significant drawback as the input size of certain parameters increases. In this paper, we focus on the evolution process of the vehicular networking system as our model simulation. The initial amount of variation introduced during this evolution process plays a crucial role in a specific Telematics system. Therefore, this study aims to examine the impact of different initial variation amounts on the simulation time. The results of our tests are presented in Figure 5.
The data from the tests can be fitted to a quadratic function, $y = 1.0561x^{2} - 0.1723x + 55.876$, where $x$ represents the initial amount of variation. By analyzing the trends, we observe that the simulation time increases quadratically with the initial amount of variation. Consequently, the estimated time complexity of the model can be expressed as $O(n^{2})$, where $n$ represents the amount of variation introduced in the model. This implies that the model's efficiency may be compromised when handling large-scale data.
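For reference, such a fit can be reproduced with an ordinary least-squares polynomial fit. In the sketch below the timing measurements are hypothetical; only the fitted form quoted in the text comes from the paper.

```python
import numpy as np

# Hypothetical measurements: simulation time (s) versus initial amount of variation.
initial_variation = np.array([1, 2, 3, 4, 5, 6])
sim_time_seconds = np.array([57.0, 60.0, 65.5, 72.0, 81.5, 93.0])

a, b, c = np.polyfit(initial_variation, sim_time_seconds, deg=2)
# The paper reports y = 1.0561x^2 - 0.1723x + 55.876, i.e. O(n^2) growth.
print(f"fitted model: y = {a:.4f}x^2 {b:+.4f}x + {c:.4f}")
```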

4. Results

The evolution of the SoS is shown in Figure 6, in which the basic misalignment parameter describes the evolution of the system. A summary of the information shown in the figure is provided in the following:
(1)
The misalignment metrics in all four graphs showed an increasing and then decreasing trend. The reason for this phenomenon is that the initial interactions between some of the constituent systems increased the degree of difference between all nodes in the SoS under the influence of the external environment. As the interactions continued, evolution caused the degree of difference between most of the constituent systems to decrease, eventually leading to complete evolution.
(2)
The peak misalignment values in the plot for Principle 2 (Implementing Uniform Standards) occurred earlier than those without the application of the principle. The reason for this phenomenon is that the application of this principle increased the overall efficiency of the system at an early stage, and different nodes in the facilities received more new information in a short period of time, thus creating differences between the constituent systems. Meanwhile, the peak misalignment value in the plot for Principle 1 (Facilitating Information Exchange) was significantly lower than that without the application of the principle, probably because the exchange of information between nodes somewhat mitigated the degree of difference between the constituent systems.
(3)
Compared to the control group, the misalignment values of the SoS with the different principles applied all improved in terms of the rate of decline after reaching the peak. As such, the time to complete SoS evolution was also shorter in all cases. This indicates that the application of different principles can enhance the efficiency of system evolution, to some extent.
In the simulation experiments, the effects of the different principles on the evolution of the system varied. The results of the evaluation of the four principles based on the three indicators described above are shown in Figure 7. The information in the figure is summarized as follows:
(1)
Figure 7a shows the average evolution time of the SoS with the different principles applied. The evolution time for the SoS without applying any principles was 1181.2 s. The evolution times for the systems with Principle 1 and Principle 2 applied were close, at 989.8 s and 965.0 s, respectively (roughly 82–84% of the original time). Meanwhile, the average evolution time with Principle 4 applied was 954.8 s (80.8% of the original time), and the lowest evolution time was obtained with Principle 3, at only 892.8 s (75.6% of the original time).
(2)
Figure 7b shows the degree of variation accumulated during the evolution of the SoS with the different principles applied. All four principles reduced the degree of variation, to varying extents. The smallest reduction was obtained with Principle 2 (Implementing Uniform Standards), which resulted in 83.8% of the baseline degree of variation, while the greatest reduction was achieved with Principle 1 (Facilitating Information Exchange), which resulted in 72.3% of the baseline degree of variation.
(3)
Figure 7c shows the average cost of the SoS evolution process after applying the different principles. The evolution costs of the systems using Principle 3 and Principle 4 were relatively similar, at 4585.5 Gbps and 4559.7 Gbps, respectively, with very low increases in cost. Principle 2 had the highest cost of evolution (5262.3 Gbps), a 16.4% increase compared to the baseline cost.
As can be seen from Figure 7, Principle 2 (Implementing Uniform Standards) was the least effective in improving the situational awareness model in this paper, with a small improvement in evolution performance and a large increase in cost. Meanwhile, Principle 3 (Enhancing Transparency of Information) was the most effective, significantly reducing the evolution time and variability while keeping the evolution cost low. In addition, Principle 1 (Facilitating Information Exchange) presented a good reduction in variation, and Principle 4 (Establishing Common Goals) had the lowest cost.

5. Discussion

5.1. Elaboration of Experimental Outcomes

The model detailed in this paper provides insights into the validation of theoretical principles, allowing for study of the effects of different principles on system evolution and the implementation of complex system-of-systems modeling using the ABM approach. Overall, the experimental findings presented in this paper demonstrate variations in the impacts of the four evolutionary principles on SoS evolution.
The experimental results indicate that the Degree of Variation (DOV) of the Telematics SoS decreases significantly over time after implementing Principle 1 (Facilitating Information Exchange). This outcome corroborates the efficacy of information exchange in mitigating discrepancies during the SoS evolution, aligning with our preliminary hypothesis. Prior research has previously underscored the potential of information exchange to enhance information visibility within transportation systems [79]. Furthermore, the process of information exchange serves to eliminate obstructions between disparate systems, align the interests of stakeholders [80], and alleviate unwarranted variations within systems [81].
From a pragmatic perspective, the enhancement of information exchange is instrumental in resolving prevalent challenges within the SoS, such as conflicting interests and ambiguous accountabilities [82]. These challenges are inherently associated with the disparities present among the constituent systems. Consequently, it is plausible to postulate that the enhancement of SoS performance via information exchange is attributed to the principle's efficacy in bridging differences among system members.
In our study, we made an interesting observation regarding the effect of Principle 2 (Implementing Uniform Standards). Contrary to our initial prediction, we found that implementing uniform standards is actually the most costly principle. This finding is significant because unified standards are generally expected to reduce costs in system architecture [83]. Furthermore, our research also revealed that uniform standards are less effective in improving the performance of the SoS in our experiment. These observations raise an important question: why are uniform standards not as effective in the experimental setting designed in this paper?
We believe that this phenomenon can be attributed to the difficulty of adapting uniform standards to distributed SoSs, particularly when there are significant differences among constituent systems. This phenomenon has been acknowledged in other literature as well. In distributed systems, uniform standards may overlook the heterogeneity of individuals, leading to a decrease in overall system efficiency [84,85]. These findings highlight the need for further investigation into this issue in future research.
Our experimental findings indicate that Principle 3 (Enhancing Information Transparency) outperforms the other principles in minimizing both the duration of SoS evolution and the variability encountered during this progression. We posit that this principle augments the interactive behavior among agents by modulating specific factors. Prior studies have corroborated the efficacy of augmented information transparency in bolstering trust within constituent systems [86,87,88], especially in analogous distributed SoSs [89]. Additionally, increased transparency has been shown to bolster the efficiency of knowledge support, subsequently amplifying the innovative capacity of organizational members [90].
It is noteworthy, however, that the minimal variability observed during the evolutionary process and the shortest evolutionary duration do not necessarily correspond to the least evolutionary cost. This observation suggests that the inherent nature of enhanced information transparency might levy supplementary costs upon the organization [91]. In summation, the SoS exhibits optimal performance following the implementation of Principle 3 (Enhancing Information Transparency). From an applied standpoint, organizational leaders can augment information transparency within the SoS by employing modern information technologies, such as social networking platforms and electronic bulletin boards, or even through conventional means such as performance bulletin boards [90].
In addition to these findings, the simulation results demonstrate that Principle 4 (Establishing Common Goals) has a positive impact on reducing the SoS variation and the time required for SoS evolution, aligning with our expectations prior to the experiment. However, it is noteworthy that establishing common goals makes the evolution of the SoS significantly less costly than the other principles. This result challenges conventional wisdom, as there is no readily apparent logical correlation between the two variables under consideration. Furthermore, there is a lack of relevant research substantiating the relationship between common goals and costs.
Drawing upon the analyzed observations, this study posits a hypothesis delineating the nuanced role of establishing common goals in modulating the cost associated with the evolution of an SoS. This modulation ostensibly occurs through its impact on a constellation of intermediary factors. Extant literature substantiates that the articulation of common goals can attenuate the risks inherent in information exchange processes, thereby cultivating a milieu of trust among constituent members [92,93]. Such trust can decrease conflict-related costs and the need for mutual monitoring within organizations. Furthermore, a consensus on common goals can enhance access to tacit knowledge, streamlining work processes and bolstering decision-making efficiency [94,95,96]. To gain further insights, we plan to conduct additional research investigating this factor in more depth.

5.2. Evaluating the Efficacy and Limitations of the Model

In the realm of telematics, a pioneering study on the information sharing problem was conducted by Shang, Han, and their colleagues, who employed agent-based modeling as their research approach. Their investigation centered around the utilization of the penetration rate as an evaluative metric for gauging the extent of information sharing. While this assessment method offers a broad perspective, it fails to consider the intricate interplay among vehicle nodes. In contrast to prior investigations, this article indirectly captures the interplay of diverse behaviors that are challenging to model accurately. Consequently, it substantiates the influence of various widely adopted principles on information sharing within connected vehicle systems, thereby presenting an innovation in the field.
Experimentation and validation of SoS theory is a difficult area in system-of-systems research, and the ABM method is a unique tool for the verification of theory. When an ABM is used as an experimental tool to study SoS problems, representing the complexity of an SoS involving multiple constituent systems is a difficult problem to solve. The research idea presented in this paper is intended to improve the modeling of agents as much as possible (for example, by incorporating multiple complex and reasonable behaviors) in order to simulate the situation as realistically as possible. On this basis, multiple autonomously interacting agents were used, which is the core advantage of the ABM approach, in order to capture the emergent nature of the SoS and achieve an effective simulation.
Nevertheless, it is important to acknowledge that the ABM model presented in this paper has certain limitations. As discussed in subsection 3.7, the model exhibits high time complexity, which may hinder its efficiency when dealing with more intricate problems. To address this concern, future studies will explore the application of proxy sampling methods or optimization algorithms to mitigate these limitations. Additionally, it is worth noting that certain simplifying assumptions were made in the context of this simulation experiment. To apply a model like the one presented in this paper to more complex SoS problems, researchers may need to add more realistic attributes to the agent.

5.3. Bridging Natural and Social Sciences: A Methodological Discourse

The discourse surrounding the congruencies and divergences in the research methodologies employed in the natural and social sciences has been long-standing. Hayek posited that the inherently uncertain nature of human beings, who are central subjects in social science studies, precludes the social sciences from yielding results analogous to those derived from the natural sciences [97]. This uncertainty stems from cognitive limitations, conflicting interests, diverse value orientations, and reflexivity.
Contrastingly, Popper contended that a uniform set of criteria should be applied when evaluating both the natural and social sciences [98]. Within the realm of systems engineering, there is a recurrent necessity to incorporate considerations from both technical (pertaining to natural sciences) and human (pertaining to social sciences) perspectives, especially when navigating complex systems.
Drawing upon the experimental simulation methodology prevalent in the natural sciences, which emphasizes categorical identification, this study aligns with Popper's perspective to scrutinize social science theorems. The objective is to facilitate a transition from quantitative analysis to a more qualitative approach. It is important to acknowledge that the principles examined in this research are inherently challenging to fully validate or falsify due to the fundamental uncertainty associated with human behavior, a challenge frequently encountered in social science research.
Given this backdrop, the present paper proposes a novel approach to validating social science theories leveraging Agent-Based Modeling. This preliminary exploration seeks to foster a foundation for further empirical investigations in subsequent research endeavors.

6. Conclusions

A system-of-systems is a complex system with a high degree of unpredictability in its evolutionary process. In order to improve the performance of SoS evolution and to achieve a guided evolutionary process, many studies have proposed principles for SoS evolution. However, these principles remain at the theoretical level and lack experimental data to support them. In this study, an agent-based model of the SoS evolution process was developed against the background of an SoS handling the situational awareness problem in vehicular networks, and an attempt was made to validate and study the SoS evolution principles.
The results of the simulation indicate that the application of all four evolutionary principles can enhance the evolutionary performance of the telematics SoS, but with varying effects. Specifically, promoting information exchange between constituent systems can successfully minimize the degree of variation during SoS evolution, while establishing a common objective among constituent systems can substantially reduce the cost of SoS evolution. Regarding the evolutionary problem in the model, the implementation of uniform standards in a collaborative SoS appears less effective, owing to its disregard for individual heterogeneity. Among the considered contexts, enhancing information transparency within the SoS performs best among the four strategies and significantly shortens the time required for the evolution of the telematics SoS.
The aim of this study was to validate the principles of SoS evolution proposed in previous studies, in order to provide evidence for SoS theory through an experimental model and data. The contributions of this paper serve to further deepen understanding of the SoS evolution process through experimental results, as well as providing researchers with a reference for improving SoS evolutionary principles. However, it is important to note that the current model suffers from high time complexity and oversimplification, as discussed in this paper. To address these limitations, we plan to optimize the developed ABM model in the future to strengthen the computational conclusions presented in this study.

Author Contributions

Conceptualization, J.-J.L. and J.-X.L.; methodology, J.-J.L. and J.-X.L.; software, J.-J.L. and J.-X.L.; validation, J.-J.L., J.-X.L. and M.-M.Z.; formal analysis, J.-J.L.; investigation, J.-J.L.; resources, J.-J.L.; data curation, J.-J.L.; writing—original draft preparation, J.-J.L.; writing—review and editing, J.-J.L.; visualization, J.-J.L.; supervision, J.-X.L.; project administration, M.-M.Z.; funding acquisition, M.-M.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Research Project on Organizational Mechanism of Multi-platform Avionics System Reference Architecture.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Data and results of sensitivity tests.

Parameter | Values tested | Finding
Number of knowledge values contained in a single knowledge set | 30, 60, 90 | Evolutionary time increases with the number of knowledge values; little impact on the overall evolutionary trend.
Number of agents in the domain | 20, 30 | Evolutionary time increases with the number of agents; the evolutionary trend is similar.
Number of groups of systems with connectivity in the domain | 1, 2, 4 | Evolutionary time decreases as the number of groups increases; evolutionary performance changes more sharply, while the overall trend remains roughly similar.
Number of goal-related changes | 0, 2, 3 | Evolutionary time increases with the number of changes; little impact on the overall evolutionary trend.
Number of standard-related changes | 0, 2, 3 | Evolutionary time increases with the number of changes; little impact on the overall evolutionary trend.
Number of task-related changes | 0, 2, 3 | Evolutionary time increases with the number of changes; little impact on the overall evolutionary trend.
Probability of information-correspondence behavior within a group | 0.7, 0.9 | Evolutionary time decreases with increasing probability; no effect on the evolutionary trend.
Probability of data-sharing behavior within a group | 0.7, 0.9 | Evolutionary time decreases with increasing probability; no effect on the evolutionary trend.
Probability of consensus-seeking behavior within a group | 0.7, 0.9 | Evolutionary time decreases with increasing probability; no effect on the evolutionary trend.
Probability of information-correspondence behavior in the domain | 0.3, 0.5 | Evolutionary time decreases with increasing probability; no effect on the evolutionary trend.
Probability of data-sharing behavior in the domain | 0.3, 0.5 | Evolutionary time decreases with increasing probability; no effect on the evolutionary trend.
Probability of consensus-seeking behavior in the domain | 0.3, 0.5 | Evolutionary time decreases with increasing probability; no effect on the evolutionary trend.
Number of simulations per experimental condition | 20, 40 | No effect on evolutionary trends.
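For reference, the sensitivity tests summarized in Table A1 follow a standard full-factorial sweep with repeated runs per condition. The sketch below (Python) shows one way such a sweep can be organized; the simulate function is only a stand-in for the actual ABM, and the grid values simply echo a subset of the parameters listed above.

import itertools
import random
import statistics

def simulate(n_agents, knowledge_size, p_exchange, seed):
    """Stand-in for one ABM run; returns a pseudo evolutionary time.
    In practice this would call the full model instead."""
    rng = random.Random(seed)
    return n_agents * knowledge_size / p_exchange * rng.uniform(0.8, 1.2)

GRID = {
    "n_agents": [20, 30],
    "knowledge_size": [30, 60, 90],
    "p_exchange": [0.7, 0.9],
}
RUNS_PER_CONDITION = 20  # cf. "number of simulations per experimental condition"

for combo in itertools.product(*GRID.values()):
    params = dict(zip(GRID, combo))
    times = [simulate(seed=s, **params) for s in range(RUNS_PER_CONDITION)]
    print(params, "mean evolutionary time:", round(statistics.mean(times), 1))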

References

  1. Selberg, S.A.; Austin, M.A. Toward an evolutionary system-of-systems architecture. In Proceedings of the INCOSE International Symposium, Utrecht, The Netherlands, 15–19 June 2008; Volume 18, pp. 1065–1078. [CrossRef]
  2. Lock, R. Developing a methodology to support the evolution of system-of-systems using risk analysis. Syst. Eng. 2012, 15, 62–73. [CrossRef]
  3. Carney, D.; Fisher, D.; Place, P. Topics in Interoperability: System-of-Systems Evolution; Carnegie-Mellon Univ Pittsburgh Pa Software Engineering Inst: Pittsburgh, PA, USA, 2005.
  4. Lane, J.A.; Valerdi, R. Accelerating system-of-systems engineering understanding and optimization through lean enterprise principles. In Proceedings of the 2010 IEEE International Systems Conference, San Diego, CA, USA, 5–8 April 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 196–201.
  5. Nielsen, C.B.; Larsen, P.G.; Fitzgerald, J.; Woodcock, J.; Peleska, J. Systems of systems engineering: Basic concepts, model-based techniques, and research directions. ACM Comput. Surv. (CSUR) 2015, 48, 1–41.
  6. Hildebrandt, N.; Spillmann, C.M.; Algar, W.R.; Pons, T.; Stewart, M.H.; Oh, E.; Susumu, K.; Díaz, S.A.; Delehanty, J.B.; Medintz, I.L. Energy transfer with semiconductor quantum dot bioconjugates: A versatile platform for biosensing, energy harvesting, and other developing applications. Chem. Rev. 2017, 117, 536–711. [CrossRef]
  7. Forbes, S.A.; Beare, D.; Boutselakis, H.; Bamford, S.; Bindal, N.; Tate, J.; Cole, C.G.; Ward, S.; Dawson, E.; Ponting, L.; et al. COSMIC: Somatic cancer genetics at high-resolution. Nucleic Acids Res. 2017, 45, D777–D783. [CrossRef]
  8. Ott, P.A.; Hu, Z.; Keskin, D.B.; Shukla, S.A.; Sun, J.; Bozym, D.J.; Zhang, W.; Luoma, A.; Giobbie-Hurder, A.; Peter, L.; et al. An immunogenic personal neoantigen vaccine for patients with melanoma. Nature, 2017, 547, 217–221. [CrossRef]
  9. Maier, M.W. Architecting principles for systems-of-systems. In Proceedings of the INCOSE 1996 6th Annual International Symposium of the International Council on Systems Engineering, 7–11 July 1996; INCOSE: Boston, MA, USA.
  10. Abbott, R. Open at the top; open at the bottom; and continually (but slowly) evolving. In Proceedings of the 2006 IEEE/SMC International Conference on system-of-systems Engineering, Los Angeles, CA, USA, 26–28 April 2006; IEEE: Piscataway, NJ, USA, 2006; p. 6.
  11. Bloomfield, R.; Gashi, I. Evaluating the Resilience and Security of Boundaryless, Evolving Socio-Technical Systems of Systems; Technical Report; Centre for Software Reliability, City University: London, UK, 2008.
  12. Carlock, P.G.; Fenton, R.E. system-of-systems (SoS) enterprise systems engineering for information-intensive organizations. Syst. Eng. 2001, 4, 242–261.
  13. Crossley, W.A. system-of-systems: An introduction of Purdue University schools of Engineering’s Signature Area. In Engineering Systems Symposium; MIT Engineering Systems Division: Cambridge, MA, USA, 2004.
  14. Despotou, G.; Alexander, R.; Hall-May, M. Key Concepts and Characteristics of Systems of Systems (SoS); DARP-HIRTS; University of York: York, UK, 2003.
  15. Smith, D.; Lewis, G. Systems of Systems: New challenges for maintenance and evolution. In Proceedings of the 2008 Frontiers of Software Maintenance, Beijing, China, 28 September–4 October 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 149–157.
  16. Clausing, D.P.; Andrade, R. Strategic reusability. In Proceedings of the Engineering Design Conference '98: Design Reuse, London, UK, 23–25 June 1998; pp. 98–101.
  17. Vargas, I.G.; Gottardi, T.; Braga, R.T.V. Approaches for integration in system-of-systems: A systematic review. In Proceedings of the 2016 IEEE/ACM 4th International Workshop on Software Engineering for Systems-of-Systems (SESoS), Austin, TX, USA, 16 May 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 32–38.
  18. Chen, P.; Han, J. Facilitating system-of-systems evolution with architecture support. In Proceedings of the 4th International Workshop on Principles of Software Evolution, Vienna, Austria, 10–11 September 2001; pp. 130–133.
  19. Ackoff, R.L. Towards a system-of-systems concepts. Manag. Sci. 1971, 17, 661–671. [CrossRef]
  20. Clausing, D. Reusability in product development. In Proceedings of the Engineering Design Conference '98: Design Reuse, London, UK, 23–25 June 1998; pp. 57–66.
  21. Sahin, F.; Jamshidi, M.; Sridhar, P. A discrete event xml based simulation framework for system-of-systems architectures. In Proceedings of the 2007 IEEE International Conference on system-of-systems Engineering, San Antonio, TX, USA, 16–18 September 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 1–7.
  22. Kazman, R.; Schmid, K.; Nielsen, C.B.; et al. Understanding patterns for system-of-systems integration. In Proceedings of the 2013 8th International Conference on system-of-systems Engineering, Maui, HI, USA, 2–6 June 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 141–146.
  23. Baker, J.; Singh, H. The roots of misalignment: Insights from a system dynamics perspective. In Proceedings of the JAIS Theory Development Workshop, Fort Worth, TX, USA, 13 December 2015; pp. 1–37.
  24. Benbya, H.; McKelvey, B. Using coevolutionary and complexity theories to improve IS alignment: A multi-level approach. J. Inf. Technol. 2006, 21, 284–298. [CrossRef]
  25. Tanriverdi, H.; Lim, S.Y. How to survive and thrive in complex, hypercompetitive, and disruptive ecosystems? The roles of IS-enabled capabilities. In Proceedings of the 38th International Conference on Information Systems, Seoul, Republic of Korea, 10–13 December 2017; pp. 1–21.
  26. Besson, P.; Rowe, F. Strategizing information systems-enabled organizational transformation: A transdisciplinary review and new directions. J. Strateg. Inf. Syst. 2012, 21, 103–124. [CrossRef]
  27. Tversky, A.; Kahneman, D. The framing of decisions and the psychology of choice. In Question Framing and Response Consistency; Hogarth, R., Ed.; Jossey-Bass Inc.: San Francisco, CA, USA, 1982.
  28. Daft, R.L.; Wiginton, J.C. Language and organization. Acad. Manag. Rev. 1979, 4, 179–191.
  29. Lind, J. Specifying agent interaction protocols with standard UML. In Proceedings of the International Workshop on Agent-Oriented Software Engineering, Montreal, QC, Canada, 29 May 2001; Springer: Berlin/Heidelberg, Germany, 2001; pp. 136–147.
  30. Schnackenberg, A.K.; Tomlinson, E.C. Organizational transparency: A new perspective on managing trust in organization-stakeholder relationships. Journal of Management 2016, 42, 1784–1810.
  31. Akkermans, H.; Bogerd, P.; Van Doremalen, J. Travail, transparency and trust: A case study of computer-supported collaborative supply chain planning in high-tech electronics. Eur. J. Oper. Res. 2004, 153, 445–456. [CrossRef]
  32. Foa, U.G.; Foa, E.B.; Schwarz, L.M. Nonverbal communication: Toward syntax, by way of semantics. J. Nonverbal Behav. 1981, 6, 67–83. [CrossRef]
  33. Calvaresi, D.; Najjar, A.; Winikoff, M.; Främling, K. Explainable, Transparent Autonomous Agents and Multi-Agent Systems; Second International Workshop, EXTRAAMAS 2020, Auckland, New Zealand, May 9–13, 2020, Revised Selected Papers; Springer Nature: Berlin/Heidelberg, Germany, 2020.
  34. González, A.; Piel, E.; Gross, H.G.; et al. Testing challenges of maritime safety and security systems-of-systems. In Proceedings of the Testing: Academic & Industrial Conference-Practice and Research Techniques (TAIC PART 2008), Windsor, ON, USA, 29–31 August 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 35–39.
  35. Cvitanovic, C.; Colvin, R. M.; Reynolds, K. J.; Platow, M. J. Applying an organizational psychology model for developing shared goals in interdisciplinary research teams. One Earth. 2020, 2, 75–83. [CrossRef]
  36. Ohnuma, S. Consensus building: Process design toward finding a shared recognition of common goal beyond conflicts. In Handbook of Systems Sciences; Springer: Singapore, 2021; pp. 645–662.
  37. de Amorim Silva, R.; Braga, R.T.V. Simulating systems-of-systems with agent-based modeling: A systematic literature review. IEEE Syst. J. 2020, 14, 3609–3617. [CrossRef]
  38. Kinder, A.; Henshaw, M.; Siemieniuch, C. system-of-systems modelling and simulation—An outlook and open issues. Int. J. Syst. Syst. Eng. 2014, 5, 150–192.
  39. AMour, A.; Kenley, C.R.; Davendralingam, N.; DeLaurentis, D. Agent-based modeling for systems of systems. In Proceedings of the 23rd Annual International Symposium of the International Council on Systems Engineering, Philadelphia, PA, USA, 24–27 June 2013; Volume 1, pp. 250–264.
  40. Tivnan, B.F. Coevolutionary dynamics and agent-based models in organization science. In Proceedings of the 37th Conference on Winter Simulation Conference, Orlando, FL, USA, 4–7 December 2005; pp. 1013–1021.
  41. Wooldridge, M.; Jennings, N.R. Intelligent agents: theory and practice. The knowledge engineering review. 1995, 10, 115–152. [CrossRef]
  42. Macal, C.M.; North, M.J. Tutorial on agent-based modelling and simulation. Journal of Simulation 2010, 4, 151–162. [CrossRef]
  43. Kauffman, S. The Origins of Order: Selforganization and Selection in Evolution; Oxford University Press: Oxford, UK, 1993.
  44. Nagel, K.; Schreckenberg, M. A cellular automaton model for freeway traffic. Journal de Physique. 1992, 2, 2221-2229. [CrossRef]
  45. Nagel, K.; Flötteröd, G. Agent-based traffic assignment: going from trips to behavioral travelers. In Proceedings of the 12th International Conference on Travel Behaviour Research IATBR, Jaipur, India, 14-18 November 2009; pp. 261-294.
  46. Zhu, S; Levinson, D. Do people use the shortest path? An empirical test of Wardrop’s first principle. PloS one, 2015, 10, e0134322. [CrossRef]
  47. Zou, X.; Levinson, D. A Multi-Agent Congestion and Pricing Model. Transportmetrica. 2006, 2, 237-249. [CrossRef]
  48. Levy, N.; Martens, K.; Benenson, I. Exploring cruising using agent-based and analytical models of parking. Transportmetrica A: Transport Science. 2013, 9, 773-797. [CrossRef]
  49. Bosse, S. Self-organising urban traffic control on micro-level using reinforcement learning and agent-based modelling. In Intelligent Systems and Applications: Proceedings of the 2020 Intelligent Systems Conference (IntelliSys), Volume 2; Springer International Publishing, 2021; pp. 745–764.
  50. Bui, K.H.N.; Camacho, D.; Jung, J.E. Real-time traffic flow management based on inter-object communication: A case study at intersection. Mobile Networks and Applications 2017, 22, 613–624.
  51. Wang, J.; Lv, W.; Jiang, Y.; et al. A multi-agent based cellular automata model for intersection traffic control simulation. Physica A: Statistical Mechanics and its Applications 2021, 584, 126356.
  52. Shang, W.; Han, K.; Ochieng, W.; et al. Agent-based day-to-day traffic network model with information percolation. Transportmetrica A: Transport Science 2017, 13, 38–66.
  53. Mahdi, M.A.; Hasson, S.T. Complex agent network approach to model mobility and connectivity in vehicular social networks. J. Eng. Appl. Sci. 2018, 13, 2288–2295.
  54. Zia, K.; Shafi, M.; Farooq, U. Improving recommendation accuracy using social network of owners in social internet of vehicles. Future Internet 2020, 12, 69.
  55. Rathee, G.; Garg, S.; Kaddoum, G.; et al. Trusted computation using ABM and PBM decision models for ITS. IEEE Access 2020, 8, 195788–195798.
  56. Peppard, J.; Breu, K. Beyond alignment: A coevolutionary view of the information systems strategy process. In Proceedings of the 24th International Conference on Information Systems, Seattle, WA, USA, 15–17 December 2003; pp. 61–69.
  57. Baldwin, W.C.; Ben-Zvi, T.; Sauser, B.J. Formation of collaborative system-of-systems through belonging choice mechanisms. IEEE Trans. Syst. Man Cybern.-Part A Syst. Hum. 2011, 42, 793–801. [CrossRef]
  58. Sindiy, O.V.; DeLaurentis, D.A.; Stein, W.B. An agent-based dynamic model for analysis of distributed space exploration architectures. J. Astronaut. Sci. 2009, 57, 579–606. [CrossRef]
  59. Nikolic, I.; Dijkema, G.P.J. Framework for Understanding and Shaping Systems of Systems The case of industry and infrastructure development in seaport regions. In Proceedings of the 2007 IEEE International Conference on system-of-systems Engineering, San Antonio, TX, USA, 16–18 September 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 1–6.
  60. Grimm, V.; Railsback, S.F.; Vincenot, C.E.; Berger, U.; Gallagher, C.; DeAngelis, D.L.; Edmonds, B.; Ge, J.; Giske, J.; Groeneveld, J.; et al. The ODD Protocol for Describing Agent-Based and Other Simulation Models: A Second Update to Improve Clarity, Replication, and Structural Realism. J. Artif. Soc. Soc. Simul. 2020, 23, 7. [CrossRef]
  61. Alkire, A.A.; Collum, M.E.; Kaswan, J. Information exchange and accuracy of verbal communication under social power conditions. J. Personal. Soc. Psychol. 1968, 9, 301. [CrossRef]
  62. Thompson, L.L. Information exchange in negotiation. J. Exp. Soc. Psychol. 1991, 27, 161–179. [CrossRef]
  63. Maginnis, M.A. The impact of standardization and systematic problem solving on team member learning and its implications for developing sustainable continuous improvement capabilities. J. Enterp. Transform. 2013, 3, 187–210. [CrossRef]
  64. Harrison, G.W.; Rutherford, T.F.; Tarr, D.G. Increased competition and completion of the market in the European Union: Static and steady state effects. J. Econ. Integr. 1996, 11, 332–365.
  65. Tassey, G. Standardization in technology-based markets. Res. Policy 2000, 29, 587–602. [CrossRef]
  66. Albu, O.B.; Flyverbom, M. Organizational transparency: Conceptualizations, conditions, and consequences. Bus. Soc. 2019, 58, 268–297. [CrossRef]
  67. Michener, G. Gauging the impact of transparency policies. Public Adm. Rev. 2019, 79, 136–139. [CrossRef]
  68. Noveck, B.S. Rights-based and tech-driven: Open data, freedom of information, and the future of government transparency. Yale Hum. Rights Dev. Law J. 2017, 19, 1–45.
  69. Kolleck, N.; Rieck, A.; Yemini, M. Goals aligned: Predictors of common goal identification in educational cross-sectoral collaboration initiatives. Educ. Manag. Adm. Leadersh. 2020, 48, 916–934.
  70. Thayer-Bacon, B.J.; Brown, S. What “Collaboration” Means: Ethnocultural Diversity’s Impact. United States Department of Education, Office of Educational Research and Improvement.: Washington, DC, 1995.
  71. Weingart, L.R.; Bennett, R.J.; Brett, J.M. The impact of consideration of issues and motivational orientation on group negotiation process and outcome. J. Appl. Psychol. 1993, 78, 504.
  72. Shendell-Falik, N. The art of negotiation. Prof. Case Manag. 2002, 7, 228–230.
  73. Lee, J.-S.; Filatova, T.; Ligmann-Zielinska, A.; Hassani-Mahmooei, B.; Stonedahl, F.; Lorscheid, I.; Voinov, A.; Polhill, G.; Sun, Z.; Parker, D.C. The complexities of agent-based modeling output analysis. J. Artif. Soc. Soc. Simul. 2015, 18. [CrossRef]
  74. Bruch, E.; Atwell, J. Agent-based models in empirical social research. Sociol. Methods Res. 2015, 44, 186–221. [CrossRef]
  75. Meluso, J.; Austin-Breneman, J. Gaming the system: An agent-based model of estimation strategies and their effects on system performance. J. Mech. Des. 2018, 140, 121101. [CrossRef]
  76. Sarjoughian, H.S.; Zeigler, B.P.; Hall, S.B. A layered modeling and simulation architecture for agent-based system development. Proc. IEEE 2001, 89, 201–213. [CrossRef]
  77. Crowder, R.M.; Robinson, M.A.; Hughes, H.P.N.; Sim, Y.-W. The development of an agent-based modeling framework for simulating engineering team work. IEEE Trans. Syst. Man Cybern.-Part A Syst. Hum. 2012, 42, 1425–1439. [CrossRef]
  78. Fioretti, G. Agent-based simulation models in organization science. Organ. Res. Methods 2013, 16, 227–242. [CrossRef]
  79. Jović, M.; Tijan, E.; Žgaljić, D.; et al. Improving maritime transport sustainability using blockchain-based information exchange. Sustainability 2020, 12, 8866.
  80. Deng, S.; Zhou, D.; Wu, G.; et al. Evolutionary game analysis of three parties in logistics platforms and freight transportation companies' behavioral strategies for horizontal collaboration considering vehicle capacity utilization. Complex & Intelligent Systems 2023, 9, 1617–1637.
  81. Love, T.E.; Ehrenberg, N.; Sapere Research Group. Addressing Unwarranted Variation: Literature Review on Methods for Influencing Practice; Health Quality & Safety Commission: New Zealand, 2014.
  82. Hinds, P.J.; Mortensen, M. Understanding conflict in geographically distributed teams: The moderating effects of shared identity, shared context, and spontaneous communication. Organization Science 2005, 16, 290–307. [CrossRef]
  83. Ray, K. One size fits all? Costs and benefits of uniform accounting standards. Journal of International Accounting Research 2018, 17, 1–23.
  84. Latin, H. Ideal versus real regulatory efficiency: Implementation of uniform standards and fine-tuning regulatory reforms. Stanford Law Review 1984, 37, 1267.
  85. Elmer, C.F. The Economics of Vehicle CO2 Emissions Standards and Fuel Economy Regulations: Rationale, Design, and the Electrification Challenge; Technische Universitaet Berlin: Germany, 2016.
  86. Foscht, T.; Lin, Y.; Eisingerich, A.B. Blinds up or down? The influence of transparency, future orientation, and CSR on sustainable and responsible behavior. European Journal of Marketing 2018, 52, 476–498.
  87. Kumar, N.; Ganguly, K.K. External diffusion of B2B e-procurement and firm financial performance: Role of information transparency and supply chain coordination. Journal of Enterprise Information Management 2021, 34, 1037–1060.
  88. Cheng, M.; Liu, G.; Xu, Y.; et al. Enhancing trust between PPP partners: The role of contractual functions and information transparency. Sage Open 2021, 11, 21582440211038245.
  89. Lee, U.K. The effect of information deception in price comparison site on the consumer reactions: An empirical verification. International Journal of Distributed Sensor Networks 2015, 11, 270685.
  90. Che, T.; Wu, Z.; Wang, Y.; et al. Impacts of knowledge sourcing on employee innovation: The moderating effect of information transparency. Journal of Knowledge Management 2019, 23, 221–239.
  91. Wu, Y.; Zhang, K.; Xie, J. Bad greenwashing, good greenwashing: Corporate social responsibility and information transparency. Management Science 2020, 66, 3095–3112.
  92. Chen, Y.H.; Lin, T.P.; Yen, D.C. How to facilitate inter-organizational knowledge sharing: The impact of trust. Information & Management 2014, 51, 568–578.
  93. Verberne, F.M.F.; Ham, J.; Midden, C.J.H. Trust in smart systems: Sharing driving goals and giving information to increase trustworthiness and acceptability of smart systems in cars. Human Factors 2012, 54, 799–810.
  94. Li, J.J.; Poppo, L.; Zhou, K.Z. Relational mechanisms, formal contracts, and local knowledge acquisition by international subsidiaries. Strategic Management Journal 2010, 31, 349–370.
  95. Wang, L.; Song, M.; Zhang, M.; et al. How does contract completeness affect tacit knowledge acquisition? Journal of Knowledge Management 2021, 25, 989–1005.
  96. Wang, N.; Huang, Y.; Fu, Y.; et al. Does lead userness matter for electric vehicle adoption? An integrated perspective of social capital and domain-specific innovativeness. Journal of Consumer Behaviour 2022, 21, 1405–1419.
  97. Hayek, F.A. The counter-revolution of science. Economica 1941, 8, 281–320.
  98. Popper, K. The Logic of Scientific Discovery; Routledge, 2005.
Figure 1. Schematic diagram of the model structure.
Figure 2. Schematic diagram of the knowledge set.
Figure 3. Agent state transition diagram.
Figure 4. State transition diagram for the knowledge value.
Figure 5. Relationship between the initial amount of introduced variation and model simulation time.
Figure 6. Diagrams comparing the evolution of the SoS with and without the application of the various principles: (a) Principle 1 (Facilitating information exchange); (b) Principle 2 (Implementing Uniform Standards); (c) Principle 3 (Enhancing Transparency of Information); and (d) Principle 4 (Establishing Common Goals).
Figure 7. Assessment of the roles of the four principles in the evolution of the SoS: (a) Evolutionary time; (b) degree of variation; and (c) cost.
