Preprint
Article

This version is not peer-reviewed.

Assessing Smart Campus Performance: Framework Consolidation and Validation Through a Delphi Study

A peer-reviewed article of this preprint also exists.

Submitted: 06 November 2024
Posted: 07 November 2024


Abstract
The concept of a smart campus is rapidly gaining traction worldwide, driven by the growth of artificial intelligence (AI) and the Internet of Things (IoT) along with the digital transformation of higher education institutions. While numerous initiatives have been undertaken to enhance the capability of smart campus systems to keep pace with AI advancements, there have been few attempts to develop a cohesive conceptual framework for the smart campus and, to date, limited empirical research has been conducted to validate such a framework. This study bridges this gap by providing the first in-depth assessment of a holistic smart campus conceptual framework. The paper uses a Delphi study approach to validate and consolidate a smart campus assessment framework and to test its robustness for application in university settings. The framework consists of four dimensions, 16 categories, and 48 indicators; these 68 items were validated by experts from across the globe. Two rounds of structured questionnaires were conducted to achieve consensus on the framework. The first round involved 34 experts from diverse geographic and professional backgrounds in the smart campus field. The second round included 21 of the earlier participants, which was sufficient to determine consensus. In total, seven of the 48 indicators were agreed upon after round 1, increasing to 43 after round 2. The results indicate strong agreement among the experts, affirming the framework’s robustness. This study offers an expert-based, interpretive assessment of the development of the smart campus concept, with a particular focus on validating the smart campus framework.

1. Introduction

The rapid advancement of artificial intelligence (AI) has significant implications and transformative impacts across all sectors of society, particularly in education, learning, and research within higher education institutions (HEIs) [1]. Moreover, the COVID-19 pandemic accelerated the digital transformation of HEIs. The development of smart campus frameworks has become important for addressing AI’s influence on university campuses [2,3]. This evolution has progressed to the evaluation of these frameworks for practical implementation within the university context [2,4,5,6].
A digitally transformed university campus, often referred to as a ‘smart campus’ [7,8], draws inspiration from the concept of the ‘smart city’ [9,10]. The smart campus is frequently seen as a scaled-down version of a smart city, benefiting from the research and implementation strategies of its urban counterpart [11]. Although the idea of a smart campus may seem familiar, its full potential is often overlooked due to the relatively recent adoption of digital technologies in many aspects of life. The advent of technologies such as the Internet of Things (IoT), cloud computing, AI, Industry 4.0, and City 4.0 has revolutionized both business and education, driving unprecedented levels of efficiency and speed [12].
HEIs are increasingly integrating advanced technologies to transform teaching, learning, and research experiences for students and staff, offering alternative modes of attendance and collaboration [1,2]. HEI (hereafter, university) administrations worldwide are embracing the progressive digital transformation of their campuses [13]. The smart campus model mirrors the smart city ecosystem by extending beyond the traditional educational functions of teaching, learning, and research [11,14], while also promoting lifelong learning, innovation, entrepreneurship, and Living Laboratories (Living Labs) that offer various programs designed to facilitate the development and implementation of cutting-edge technologies and practices [37–40].
Despite the apparent progress, there remains a significant gap in the establishment of comprehensive systems, frameworks, and policies to manage these rapid developments effectively [2,15]. While the current benefits of smart campuses are evident, they represent only the beginning of a larger revolution that requires strategic planning and action to fully realize its potential.
A pioneering smart campus conceptual framework [16] was developed, incorporating four key normative dimensions adapted from smart cities research [17]: society, economy, environment, and governance. These dimensions are aligned with the core principles of digital technology and big data. The rationale for adopting these dimensions from smart city research stems from the prevailing argument that the smart campus is a scaled-down version of the smart city [18]. This makes the smart campus an ideal testbed, functioning as an urban living lab to explore and apply smart city concepts [11].
A recently developed smart campus assessment framework [19] builds on this foundation. Each of the four normative dimensions is further divided into four categories, resulting in 16 categories in total, as shown in Figure 1 and Table 1. Additionally, each category includes three indicators, leading to a total of 48 indicators for assessment. These indicators are central to the framework, as they represent specific aspects within university settings [20]. The framework provides a hierarchical structure, allowing for a comprehensive and systematic assessment to be conducted.
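To make this hierarchy concrete, the sketch below encodes the framework’s structure as a nested mapping and verifies the item counts. The dimension names come from the framework itself, while the category and indicator labels are placeholders for those listed in Table 1; this is an illustrative encoding, not the authors’ implementation.

```python
# Illustrative sketch (assumed encoding): the four dimensions are from the
# framework; category/indicator labels are placeholders for Table 1 entries.
framework = {
    dim: {f"category_{d}.{c}": [f"indicator_{d}.{c}.{i}" for i in range(1, 4)]
          for c in range(1, 5)}
    for d, dim in enumerate(["Smart Economy", "Smart Society",
                             "Smart Environment", "Smart Governance"], start=1)
}

n_dimensions = len(framework)                                  # 4
n_categories = sum(len(cats) for cats in framework.values())   # 16
n_indicators = sum(len(inds) for cats in framework.values()
                   for inds in cats.values())                  # 48
assert n_dimensions + n_categories + n_indicators == 68        # 68 items in total
```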
Various techniques for group interaction and data collection, such as the nominal group technique, focus groups, brainstorming, and the Delphi method, have been used in research [21,22]. Among these, the Delphi method is particularly effective in achieving consensus among experts, making it well-suited for evaluating the components of a framework. This study engages both academic and professional participants in a structured assessment to reach the desired consensus. In this context, building consensus is challenging, as it involves assessing components with limited evidence in the higher education environment, requiring the consolidation of subjective expert judgments [23,24,25,26]. The study aims to validate the smart campus assessment framework through a combination of qualitative and quantitative expert input, representing the first Delphi study conducted on a smart campus.
The questionnaire was designed to gather both the qualitative and quantitative data necessary for framework validation. Likert-scale questions were used for quantitative responses, while open-ended questions captured qualitative insights [22,27]. The comprehensive questionnaire consisted of 61 questions divided into three sections, addressing the dimensions, categories, and indicators of the framework. The questionnaire design followed a pattern wherein each dimension was covered by one Likert-scale question and two open-ended questions. The 16 categories were assessed through four Likert-scale questions (grouped by dimension) and eight open-ended questions. The 48 indicators were examined using 16 Likert-scale questions (grouped by the 16 categories) and 32 open-ended questions. The final question invited participants to share their overall opinion on the survey, providing valuable feedback for further improvement and refinement of the framework.
To achieve consensus, two rounds of the survey were conducted. In the first round, 1,210 experts were invited to participate globally, with 34 completing the survey. These 34 experts were then invited to the second round, where 21 provided complete responses, resulting in a 62% response rate. Both qualitative and quantitative methods were used to analyze the data, measuring central tendency (mean, median, mode) and dispersion (standard deviation, interquartile range). The consensus level is commonly expressed as the ‘percentage of agreement’ [28,29]. Notably, the 70% consensus level on the framework was reached in the first round. While the data was largely positive, some experts suggested improvements in the questionnaire’s navigation and access to information. These refinements were implemented in the second round, leading to an even stronger consensus, ultimately validating the framework. The ‘eyeball’ technique [30] was used to summarize the experts’ comments in the open-ended questions.
The study demonstrates that the smart campus assessment framework is a robust tool for managing the growing influence of AI in universities. The subsequent sections will cover the methodology, results, discussions, and conclusion.

2. Research Design

2.1. Survey Questionnaire

The questionnaire was designed to gather comprehensive insights into the smart campus assessment framework, with a focus on both qualitative and quantitative data. It was structured into three main sections: Dimensions, Categories, and Indicators. The goal was to collect data that would help to determine if expert consensus could be found on the framework. Participants were asked to rate the importance of each of the 68 components of the framework, which included 4 dimensions, 16 categories, and 48 indicators. For every closed-ended question, two open-ended questions were included, allowing participants to provide additional comments or feedback as needed [27].
To capture both types of data, the questionnaire utilized a combination of Likert-scale questions and open-ended responses. The Likert-scale questions assessed the suitability and adequacy of the recommended dimensions, categories, and indicators. This approach enabled an evaluation of the consensus level among experts by calculating mean scores. The open-ended questions, in contrast, sought to understand the rationale behind the experts’ ratings and invited suggestions for renaming or adding new components to the framework [24,29,31].
An 11-point Likert scale (ranging from 0—none to 10—high) was used to give participants a wider range of selection options, improving the precision of responses. This scale facilitated reporting and allowed for multi-category analysis of consensus levels (0-2: strongly disagree; 3-4: disagree; 5: neutral; 6-7: agree; 8-10: strongly agree). Figure 2 below provides an example of the quantitative and qualitative questions included in the questionnaire.
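As an illustration of this multi-category analysis, the following minimal sketch maps raw 0-10 scores onto the five agreement bands defined above; the function name and the sample scores are ours, for demonstration only.

```python
from collections import Counter

def agreement_band(score: int) -> str:
    """Map a 0-10 Likert score to its agreement band (see Section 2.1)."""
    if not 0 <= score <= 10:
        raise ValueError("score must be on the 11-point scale (0-10)")
    if score <= 2:
        return "strongly disagree"
    if score <= 4:
        return "disagree"
    if score == 5:
        return "neutral"
    if score <= 7:
        return "agree"
    return "strongly agree"

# Hypothetical scores from a handful of experts for one framework item.
scores = [8, 9, 5, 7, 10, 6, 9, 3]
print(Counter(agreement_band(s) for s in scores))
# Counter({'strongly agree': 4, 'agree': 2, 'neutral': 1, 'disagree': 1})
```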

2.2. Delphi Study

This study is, to the authors’ knowledge, the first to apply the Delphi method in smart campus research. The Delphi method promotes consensus by engaging experts in a structured, anonymous feedback process. Each round involves questionnaires based on previous results and gathered information [32,33]. The Delphi technique’s strength lies in integrating qualitative and quantitative data into decision-making [34]. As a systematic approach, it effectively facilitates consensus-building among experts, making it ideal for this study and similar research [24,26].

2.3. Delphi Rounds

The number of Delphi survey rounds depends on when consensus is achieved among participants [29]. In this study, consensus was reached in the first round, but a second round was conducted to further refine and validate the framework. The second-round questionnaire was enhanced, particularly for the 48 indicators, improving navigation and access to information. This refinement minimized conflicting responses, allowing the study to be successfully completed in two rounds. Figure 3 illustrates the Delphi study process.

2.4. Selection of Experts from Relevant Journals and Organizations

For this Delphi study, potential participants were drawn from the authorship of 352 recently published works related to the smart campus concept and the components of the smart campus framework (comprising 4 normative dimensions, 16 categories, and 48 indicators). This resulted in 1,160 potential participants identified based on the relevance of their publications to the smart campus framework, as illustrated in Figure 4. Additionally, 50 participants were directly sourced from organizations, including universities, private companies, and public entities, representing experts with research and management experience relevant to smart campus concepts.

2.5. Delphi Participants

In May 2024, the initial questionnaire was distributed via Qualtrics to 1,210 participants. Response rates were closely monitored, with a follow-up conducted after two weeks to allow ample time for completion. Over two months, 34 fully completed surveys were received. Figure 4 and Figure 5 below illustrate the geographic and professional distribution of participants, showing broad regional representation across the Americas, Europe, Asia, Oceania, and Africa. Asia had the highest participation, while Africa had the lowest.
Figure 5 shows participant distribution across three main sectors: academic, private, and public. The academic sector had the highest representation at 76%, followed by the private sector. The public sector had the smallest presence.
This distribution aligns with the recent rise of the smart campus concept, which has primarily attracted academic research and scholarly interest. Researchers and academics in universities and colleges have been exploring solutions to global challenges in response to the growing influence of AI and IoT. While the private sector has contributed to technology development, it has produced fewer research publications, focusing more on economic opportunities. The public sector, with the least representation, has played a more passive role in research and publications. On average, participants had almost two decades of professional experience, adding substantial expertise to the study.
Table 2 details the experts across the three sectors: academic, public, and private. The academic sector included universities and colleges; the public sector included state organizations; and the private sector included all non-government and business organizations. As shown, the professional expertise covered a wide range of areas related to the smart campus assessment framework, with a substantial average work experience of 19 years.

2.6. Statistical Analysis

Data from expert responses were analyzed using qualitative and quantitative methods. Qualitative analysis focused on evaluating expert suggestions and comments regarding potential revisions to the proposed smart campus assessment framework. Quantitative analysis employed statistical methods to assess the questionnaire’s reliability, internal consistency, and expert agreement levels. In Delphi studies, two key outcomes are typically analyzed: the statistical findings and the consensus levels. Common analytical statistics include measures of central tendency (mean, median, mode) and dispersion (standard deviation, interquartile range). Consensus levels are often defined by the percentage of agreement among experts [28,29].
This study used an 11-point Likert scale, following recommendations [29,35] that a standard deviation (SD) of less than 2 among expert scores indicates strong agreement. While a minimum consensus level of 50% is generally acceptable [29,36], this study set a higher threshold, aiming for a 70% consensus level to ensure robust quality.
Calculations for the mean (M), standard deviation (SD), Cronbach’s alpha, and key metrics such as ‘specific agreement’ (SA), ‘overall agreement’ (OA), ‘lesser agreement,’ and the overall 70% consensus level were performed. The analyses were conducted in an Excel spreadsheet, as shown in Table 3 below.
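As a transparency aid, the sketch below reproduces the core spreadsheet calculations in Python; the response matrix is randomly generated for illustration only, and the Cronbach’s alpha function implements the standard internal-consistency formula rather than any project-specific code.

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Standard Cronbach's alpha; rows are experts, columns are items."""
    k = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)       # per-item variance
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 34 experts scoring the 4 dimensions on the 0-10 scale.
rng = np.random.default_rng(seed=42)
scores = rng.integers(4, 11, size=(34, 4)).astype(float)

mean_scores = scores.mean(axis=0)     # M per item
sds = scores.std(axis=0, ddof=1)      # SD per item (strong agreement if < 2.0)
alpha = cronbach_alpha(scores)        # internal consistency across items
print(mean_scores.round(2), sds.round(2), round(alpha, 2))
```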
Qualitative analysis focused on expert feedback from open-ended questions in two main areas: (a) the rationale behind expert scores, as well as suggestions for adding, modifying, or replacing dimensions, categories, and indicators, and (b) comments on factors they considered important for a smart campus. The ‘eyeball technique’ [30] was used to summarize these qualitative responses and to gauge how close the experts were to reaching consensus.

3. Analysis and Results

3.1. Round 1 Results

3.1.1. Responses to Dimensions

In round 1, 34 experts provided a total of 136 scores across the four dimensions of the framework. The results indicated a high level of agreement, as shown in Table 4 and Figure 6 below. Only 4 scores fell below 5, while 10 were at the midpoint of 5. Notably, 121 scores (89%) were above the midpoint, indicating a strong consensus among the experts.
Open-Ended Responses on Dimensions: The two open-ended questions on the four dimensions revealed strong agreement among 34 participants, with some (9%) reservations about including the Smart Economy dimension.
Overwhelming agreement examples included: “All four dimensions are interconnected”. “The key parameters for each dimension have been clearly identified”. “All dimensions are important. Smart economy and governance are essential for sustaining campus operations”. “Smart Society and Smart Governance are catalysts for enabling smart and smarter systems.”
Reserved comment examples included: “If the primary purpose of a university campus isn’t profitability, the Smart Economy dimension is less important compared to the others”. “I’m comfortable with the proposed dimensions, but ‘Smart Economy’ might seem less fitting in a university context”. “I’ve never encountered the term ‘Smart Society’ before”. “Looking at the aspects under business services, I don’t see a clear business link; they seem more aligned with quality of campus life.”

3.1.2. Responses to Categories

A total of 544 scores were recorded for the 16 Categories, indicating strong agreement among participants (Table 5 and Figure 7). Only 23 scores fell below 5, with 49 at the midpoint of 5. Notably, 472 of the scores (87%) were above the midpoint, demonstrating high consensus among the experts. It was noted that 4 of the 16 categories had a higher standard deviation, which was one of the factors that contributed to the need for a second round.
Open-ended responses on categories: The two open-ended questions regarding the 16 Categories revealed strong overall agreement among the 34 participants. As with the Dimensions, a small minority (5 out of 34, or 15%) expressed reservations, particularly regarding the inclusion of Smart Economy Categories in a university setting. Below are typical responses:
Agreement examples include: “Most of the indicators encourage universities to be responsible and mindful of their societal role by building a smart campus infrastructure.” “In my view, all categories are important, though their relevance may depend on the specific university context.” “There needs to be a paradigm shift towards adopting ‘Smart Concepts’ in all areas of human endeavor.” “The overall framework is impressive and highly comprehensive.”
Reserved comment examples include: “A university is not a business; it is a hub for knowledge development, and research is inherently unpredictable. Universities shouldn’t be managed like businesses.” “For me, not all aspects are equally important.” “Some categories need more specificity—‘Sustainable Development’ feels too broad.” “I don’t believe Recreation Services or Art Services should be classified under Smart Economy.”

3.1.3. Responses to Indicators

For the 48 Indicators, a total of 1,632 scores were recorded. The results again showed strong agreement among participants, as illustrated in Table 6 and Figure 8 below. Only 52 scores were below 5, with 144 at the midpoint of 5. The vast majority, 1,436 scores (88%), were above the midpoint, reflecting a high level of consensus on the indicators. Again, only 2 of the 48 indicators had a high standard deviation; this very low proportion had only a marginal impact on the overall consensus.
Open-ended responses on indicators: The two open-ended questions regarding the 48 Indicators revealed strong overall agreement among the 34 participants, with 4 (12%) expressing concerns, primarily about the inclusion of Smart Economy Indicators in a university setting. However, several participants raised general concerns about the high number of indicators and the lack of easy access to related information in the questionnaire. Typical responses are summarized below.
Overwhelming agreement examples included: “All of these contribute to a smart campus.” “All indicators are important, but institutions will have varying priorities and budget constraints, meaning some areas will be emphasized more than others at different times.” “These indicators are closely tied to quality of life and sustainability on campus, which are critical for a smart campus.” “The 48 indicators of the Smart Campus model are both sufficient and relevant, given the intended outcomes.”
Reserved comments included: “Some of the indicators are difficult to distinguish, such as communication and connection, or the energy-related ones.” “I’m questioning the definition of ‘Smart Campus’ based on some of these indicators. It makes me wonder about the overall concept you’re using.” “There are too many indicators—try to reduce them.” “This feels like an overly ambitious attempt to create an all-encompassing smart campus model with too many indicators.”

3.1.4. Overall Responses

The final responses from the expert participants generally reflected a positive view of the Delphi study. Typical comments include: “A very good survey. Good study.”
“Not easy to navigate through such a large amount of information... it would be helpful to have the framework on a separate screen to better understand and provide more accurate answers and suggestions.” “Very insightful research, but always consider the end user in the design.” “The Smart Campus Model is a smart design, as it aligns with and consolidates the aspirations of the UN SDGs pillars.” “This is a solid effort, and I wish it success. I would encourage a critical interrogation of the research premises and expected outcomes.” “The dimensions and indicators synthesized in this study are highly relevant, though there is a concern about privacy.” “The ‘purpose’ of university campuses and campus life varies greatly.”

3.2. Round 2

3.2.1. Responses to Dimensions

In round 2, responses from the 21 experts yielded a total of 84 scores across the four Dimensions. The results demonstrated a strong consensus, as illustrated in Table 7 and Figure 9. Only one score fell below 5, and another was at the midpoint, while the vast majority—82 scores, or 98%—exceeded the midpoint, indicating a high level of agreement among participants.
Open-ended responses on dimensions: The two open-ended questions regarding the four Dimensions elicited well-informed feedback from expert participants. Below are typical responses, categorized into counter and complimentary statements.
Counter statements include: “Learning isn’t the only mission, but it is central and influences other aspects. Quality of campus life is important, but there might be too much emphasis on diversity and inclusion compared to other elements.” “In my view, business services are slightly less important than the other dimensions.” “As mentioned before, the Smart Economy is the most problematic dimension here. Why turn campuses into business spaces at all?” “Universal social responsibility doesn’t seem as critical as the other dimensions.” “‘Quality of Campus Life’ as a title doesn’t align with its definition, which feels more like ‘personal development’.” “I would revise all the Smart Environment categories and definitions to better reflect real built-environment objectives.” “The Smart Economy, particularly the focus on business services, may need to be reconsidered.”
Complimentary statements include: “Sustainability, resilience, and entrepreneurship are crucial.” “I believe Smart Economy and Smart Governance are more important than the other two dimensions.” “If the economy supports society and fosters entrepreneurship skills in students, then it’s an important aspect of the smart campus.” “Smart Governance scored the highest because it’s the key enabler for achieving a smart campus.” “Smart Economy, in a sense, allows economic priorities to shape the other smart dimensions.” “All dimensions are equally important for the success of a smart campus.” “These dimensions address the key aspects required for the development and functioning of a smart campus.” “Smart Society should be at the heart of decision-making, as it overlaps with all other areas. Governance is second, as sustainable governance is essential for institutions.”

3.2.2. Responses to Categories

For the 16 Categories, a total of 336 scores were recorded, reflecting strong overall agreement, as shown in Table 8 and Figure 10. Only 15 scores fell below 5, with 5 scores at the midpoint. The majority—316 scores, or 94%—were above the midpoint, demonstrating a high level of consensus among participants.
Open-ended responses on categories: The two open-ended questions regarding the 16 Categories revealed thoughtful and informed feedback from the expert participants. Below are typical responses, categorized into counter and complimentary statements. Note that additional comments are considered in the discussion to demonstrate the crucial role of round 2; however, they do not represent an overwhelming response from the 21 experts. Nevertheless, the quantitative analysis demonstrated an overwhelming consensus.
Counter statements included: “If the focus is on diversity and inclusion, perhaps it should be its own category. Otherwise, the definition should be adjusted to give it equal weight with other aspects of campus life quality.” “Everything seems important here, but efficiency in this context doesn’t play the same role as it would in manufacturing environments.” “As mentioned earlier, the Smart Economy remains the most problematic dimension.” “The Smart Economy, especially business services, may need to be reconsidered. Recreation and Art Services might fit better under Smart Society.”
Complimentary statements included: “A university is, in many ways, a business and should be run efficiently and effectively to meet expectations for experience and outcomes”. “I prefer to maintain the same scores as in the previous questionnaire.” “Smart Economy is an important aspect for a successful smart campus”. “I don’t see a need to adjust my scores based on other participants’ input. I believe utility cost savings should be emphasized, as it could be a key rationale for establishing a smart campus.”

3.2.3. Responses to Indicators

A total of 961 scores were recorded for the 48 Indicators, reflecting strong agreement among participants (Table 9 and Figure 11). Only 52 scores fell below 5, with 144 at the midpoint, while the vast majority of scores (765, or 88%) were above the midpoint, demonstrating a high consensus among the experts.
The two open-ended questions on the 48 Indicators generated valuable feedback from the expert participants, with responses categorized as counter and complimentary remarks.
Counter statement examples: “Unfortunately, committees, policies, and regulations are often a necessary evil.” “This seems like a less important factor”. “Automation is useful, but not all university procedures can be automated. Also, contingency plans are needed for breakdowns or emergencies that could interrupt automated processes”. “At this point, there are so many variables that this is starting to feel like a management strategy exercise.” “There are already too many indicators, and many of them might conflict in practice”. “My responses are based on the situation in Papua New Guinea, where art services may not be as important”. “Replace ‘culture of diversity and inclusion’ with ‘a strong and distinctive campus culture’ that instills pride in students”. “Responsible suppliers may not always be the most efficient.” “Automation and Lean Principles allow staff to focus on innovation instead of routine tasks.”
Complimentary statements include: “These parameters are essential for a good system.” “In this section, I don’t feel the need to revise my previous considerations. In general, metrics and indicators serve as references or guidelines but should always be adjusted locally.” “It’s essential that the institution operates as a community, striving to meet the expectations of all stakeholders.” “All these activities are key to the success of a university. Depending on the subjects offered, industry engagement is crucial.” “I prefer to keep the same scores as in the previous questionnaire.” “I revisited my scores.”

3.2.4. Overall Responses

The final responses from the expert participants in Round 2 generally reflected a positive view of the Delphi study. Typical comments included: “The framework is holistic. I have a few suggestions, as mentioned in previous sections.” “I reiterate my earlier recommendation of Alvesson. The aspirational goals are solid, but this is a very lengthy survey, heavily reliant on discourse.” “I’m glad the earlier issues were addressed—responding to the questionnaire was much easier this time.” “Some areas are still challenging to evaluate.” “Congratulations, excellent work in assembling a comprehensive framework.”

3.3. Response Rate

A total of 34 experts provided complete responses in Round 1, and 21 of those 34 experts provided complete responses in Round 2, resulting in a 62% response rate.

3.4. Statistical Analysis Parameters

‘A’ (Agree): scales 6 and 7;
‘SA’ (Strongly Agree): scales 8, 9, and 10;
‘OA’ (Overall Agreement): A + SA (must be >70% for consensus);
Standard deviation (SD): less than 2.0.
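A minimal sketch applying these parameters to a single item’s scores is given below; the data and the function name are illustrative only, not taken from the study’s spreadsheet.

```python
import statistics

def reaches_consensus(scores: list[int]) -> bool:
    """Check one item against the study's criteria: OA ('A' at 6-7 plus
    'SA' at 8-10) above 70%, and standard deviation below 2.0."""
    n = len(scores)
    a = sum(6 <= s <= 7 for s in scores) / n   # 'A' share
    sa = sum(s >= 8 for s in scores) / n       # 'SA' share
    oa = a + sa                                # overall agreement
    return oa > 0.70 and statistics.stdev(scores) < 2.0

# Hypothetical scores from eight experts for one framework item.
print(reaches_consensus([8, 9, 7, 10, 6, 9, 8, 5]))  # True
```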
The analysis results for both rounds can be found in Table 10 below.
Key findings include the following: the overall agreement (OA) levels for all components were above the 70% consensus threshold in both rounds; standard deviations for all components were below 2.0, indicating strong agreement among experts; and consensus was reached for all components in both rounds, confirming the robustness of the smart campus assessment framework.
These results demonstrate the successful application of the Delphi method in validating the framework and achieving consensus among experts.

4. Findings and Discussion

4.1. Delphi Study Overview

This Delphi study marks the first to employ this method for identifying and reaching consensus on a smart campus assessment framework. Conducted over two rounds spanning four months, the study’s results are shown in Figure 12, Figure 13, Figure 14 and Figure 15. Notably, consensus was achieved in Round 1, with 67 of the 68 items scoring an Overall Agreement (OA) above the 70% threshold, as shown in Figure 14. The four smart dimensions of economy, society, environment, and governance were thus observed to constitute the makeup of a smart campus that models a smart university. Moreover, the 16 categories and the 48 indicators provide the essential components for a robust smart campus framework.
A second round was conducted to further validate the framework, following the Delphi methodology.
The mean score increased to a high of 9 in the second round, as shown in Figure 12, demonstrating increased agreement by the experts on the 68 items of the framework.
The standard deviation also improved, dropping below the required threshold of 2.0, as shown in Figure 13. This demonstrated how closely the experts’ responses converged in confirming the 68 items of the framework. There were, however, 5 items out of the 68 in round 1 with SDs higher than 2.0: innovation ecosystem, quality of campus life, automation systems, zero waste, and diversity and inclusion. This is likely due to a degree of misunderstanding by the participants, as the items are by their nature significant. Moreover, in round 2 there were 9 out of 68 items with SDs higher than 2.0: business efficiency, utility cost saving, university social responsibility, sustainable development, zero waste, data governance, lean principles, workforce productivity, and diversity and inclusion. These incidents are marginal and did not have any significant bearing on the overall agreement level in the Delphi analysis.
In addition, the Specific Agreement (SA) for the 48 indicators in Round 1 highlighted the framework’s complexity. Feedback indicated that some participants found the high number of indicators overwhelming, despite the provision of a diagram and detailed description table. Nevertheless, Figure 15 shows an increase across all 68 items, demonstrating improvement in consensus.
Positive feedback on the framework prevailed, with some concerns regarding the Smart Economy dimension due to contextual differences. The framework’s inclusion of Smart Economy represents a forward-looking approach, recognizing the emerging economic potential amidst the evolving higher education landscape.
To improve clarity and presentation, the Round 2 questionnaire was revised, increasing the number of questions from 9 in Round 1 to 60 in Round 2. The 16 categories were grouped under their corresponding dimensions, increasing the category-related questions from 3 to 12 and the indicator questions from 3 to 48. This restructuring aimed to enhance participants’ understanding and facilitate a more informed consensus.
As anticipated, the changes led to improved results in Round 2. The low Specific Agreement of 20 in Round 1 tripled to 60, significantly raising consensus levels. Additionally, Overall Agreement reached a perfect score of 68 items (100%), far exceeding the 70% consensus threshold. The Cronbach’s alpha was notably high and remained consistently high (0.98), indicating strong reliability and consistency among the participants’ responses. This bolstered confidence in the data collected through the Delphi process for the smart campus assessment framework.
In the final feedback from Round 2, participants expressed positive views on the study, although some noted the length of the questionnaire as a concern. This feedback will be considered in future studies, particularly when reviewing the framework’s 48 indicators for potential refinement.

4.2. Research Limitations

This research recognizes several limitations inherent in Delphi studies. One key factor is the response rate of 62%, which, while relatively high, still leaves the potential for non-response bias; a lower response rate could significantly impact the validity and reliability of the results. Additionally, the composition of the expert panel may influence the quality of the survey outcome. For instance, the participants’ geographical concentration in Asia and Europe could introduce regional bias, potentially limiting the generalizability of the findings to other global contexts.
Moreover, there is always the risk that the researchers’ personal views or preconceived notions could inadvertently influence the analysis and conclusions.
The very high Cronbach’s alpha value is also noted as a peculiar limitation, which may have been caused by some participants rushing through the survey or by a lack of comprehension.
To mitigate some of these limitations, the study ensured the involvement of highly credible experts from diverse backgrounds in academia and the private and public sectors. Furthermore, providing participants with feedback from Round 1 allowed for more informed responses in subsequent rounds. The survey’s anonymity also helped reduce social desirability bias and enabled participants to freely express their views on the dimensions, categories, and indicators of the smart campus assessment framework.
While these measures improved the study’s robustness, it is essential to recognize that some bias and limitations are inherent in any Delphi process. Future research could address these limitations by increasing the diversity of the expert panel to ensure broader regional representation and by employing triangulation methods to reduce researcher bias in interpreting findings.

5. Conclusion

This study validated a comprehensive framework for assessing smart campus performance, designed to meet the evolving technological demands and operational complexities of modern higher education institutions. Through a two-round Delphi study, the framework was subjected to rigorous expert scrutiny across diverse academic, public, and private sector backgrounds. The consensus achieved on the framework’s 68 components—comprising four core dimensions, 16 categories, and 48 specific indicators—affirms its robustness and relevance for smart campus applications.
The Delphi process underscored the importance of structured frameworks in navigating the digital transformation of campuses. Experts widely endorsed the framework’s multidimensional approach, which captures essential elements of a smart campus: Smart Society, Smart Economy, Smart Environment, and Smart Governance. This alignment with smart city principles enables the smart campus framework to serve as both an evaluative tool and a roadmap for strategic planning, fostering sustainable and adaptive campus environments.
Notably, the study highlighted several areas that may require further refinement to enhance practical applicability. While consensus on the framework’s overall structure was strong, feedback from participants suggests that the Smart Economy dimension may benefit from additional contextualization to better align with non-commercial academic settings. Adjustments in categorization and indicator definitions could make the framework even more universally applicable, allowing it to accommodate diverse institutional goals without compromising its core intent. Additionally, the study revealed that some indicators could be streamlined to avoid redundancy, ensuring ease of implementation and assessment.
The study’s findings also reflect broader trends in the digital transformation of higher education. The COVID-19 pandemic accelerated the adoption of smart technologies, spotlighting the urgent need for frameworks that can guide this rapid evolution. The validated framework offers institutions a structured approach to assess and optimize their smart campus initiatives, balancing technological innovation with sustainability, inclusivity, and operational efficiency. As higher education institutions embrace AI, IoT, and other advanced technologies, frameworks like this one will be instrumental in measuring progress and identifying areas for further development.
Future research should continue to refine and expand the framework to keep pace with technological advancements and emerging needs in higher education. Periodic reassessments and updates could ensure that the framework remains adaptable to evolving priorities, such as cybersecurity, data governance, and environmental resilience. Moreover, expanding the expert panel to include voices from additional geographic regions and institutional types would further strengthen the framework’s global applicability.
In conclusion, this research presents a foundational tool for smart campus assessment, validated through expert consensus and designed to support higher education institutions in their journey toward becoming more intelligent, responsive, and resilient. By integrating a robust set of indicators across key performance areas, the framework offers a pathway for institutions to strategically plan, implement, and evaluate smart campus initiatives. This pioneering effort lays the groundwork for ongoing research and development in smart campus frameworks, setting a standard for data-driven, inclusive, and sustainable campus transformation. Through continuous enhancement, the framework can evolve to remain relevant, supporting the long-term goal of advancing higher education in a digitally connected and dynamic world.

Author Contributions

K.P.: Data collection, processing, investigation, analysis, and writing—original draft; T.Y.: Supervision, conceptualization, writing—review and editing; M.L., T.W.: Supervision, writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data is available upon request from the corresponding author.

Acknowledgments

The authors thank the editor and anonymous referees for their invaluable comments on an earlier version of the manuscript.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Zhou, L.; Tang, Q. Construction of a Six-Pronged Intelligent Physical Education Classroom Model in Colleges and Universities. Scientific Programming 2022, 1-11. [CrossRef]
  2. Dong, Z.; Zhang, Y.; Yip, C.; Swift, S.; Beswick, K. Smart campus: Definition, framework, technologies, and services. IET Smart Cities 2020, 2, 43–54. [CrossRef]
  3. Al-Shoqran, M.; Shorman, S. A review on smart universities and artificial intelligence. In The Fourth Industrial Revolution: Implementation of Artificial Intelligence for Growing Business Success, Studies in Computational Intelligence; Springer: Cham, Switzerland, 2021.
  4. Pagliaro, F.; Mattoni, B.; Gugliermenti, F.; Bisegna, F.; Azzaro, B.; Tomei, F.; Catucci, S. A roadmap toward the development of Sapienza Smart Campus. In Proceedings of the 2016 IEEE 16th International Conference on Environment and Electrical Engineering (EEEIC), Florence, Italy, (7–10 June 2016).
  5. Aion, N.; Helmandollar, L.; Wang, M.; Ng, J. Intelligent campus (iCampus) impact study. In Proceedings of the 2012 IEEE/WIC/ACMInternational Conferences on Web Intelligence and Intelligent Agent Technology, Macau, China, (4–7 December 2012); pp. 291–295.
  6. Luckyardi, S.; Jurriyati, R.; Disman, D.; Dirgantari, P.D. A Systematic Review of the IoT in Smart University: Model and Contribution. In J Sci Tech 2022, 7, 529–550. [CrossRef]
  7. Hidayat, D.; Sensuse, D. Knowledge Management Model for Smart Campus in Indonesia. Data 2022, 7, 7. [CrossRef]
  8. Horvath, D.; Csordas, T.; Ásvanyi, K.; Faludi, J.; Cosovan, A.; Simay, A.; Komar, Z. Will Interfaces Take Over the Physical Workplace in Higher Education? A Pessimistic View of the Future. JCRE 2021, 24, 108–123. [CrossRef]
  9. Zaballos, A.; Briones, A.; Massa, A.; Centelles, P.; Caballero, V. A Smart Campus’ Digital Twin for Sustainable Comfort Monitoring. Sust 2020, 12, 9196. [CrossRef]
  10. Imbar, R.; Supangkat, S.; Langi, A. Smart Campus Model: A Literature Review. In Proceedings of the 2020 International Conference on ICT for Smart Society (ICISS), IEEE, New York, NY, USA, (19–20 November 2020); pp. 1–7.
  11. Huertas, J.; Mahlknecht, J.; Lozoya-Santos, J.; Uribe, S.; López-Guajardo, E.; Ramirez-Mendoza, R. Campus City Project: Challenge Living Lab for Smart Cities. Appl. Sci. 2021, 11, 23.
  12. Qureshi M., Khan N.; Bahru J.; Ismail F. Digital Technologies in Education 4.0. Does it Enhance the Effectiveness of Learning? A Systematic Literature Review; Int J Int Mob Tech, 2021, 15(4), 31-47.
  13. Zhang, Y. Challenges and Strategies of Student Management in Universities in the Context of Big Data. Mob. Inf. Syst. 2022, 1–10. [CrossRef]
  14. Chen, Z.; Liu Y. Research and Construction of University Data Governance Platform Based on Smart Campus Environment. In Proceedings of the 3rd International Conference on Artificial Intelligence and Advanced Manufacture, Manchester, UK, (23–25 October, 2021); pp. 450–455.
  15. Min-Allah, N.; Alrashed, S. Smart Campus—A Sketch. Sustain. Cities Soc. 2020, 59, 1–15.
  16. Polin, K.; Yigitcanlar, T.; Limb. M.; & Washington, T. The making of smart campus: a review and conceptual framework, Build, 2023, 13, 891. [CrossRef]
  17. Yigitcanlar, T.; Degirmenci, K.; Butler, L.; Desouza, K. What are the key factors affecting smart city transformation readiness? Evidence from Australian cities. Cities. 2022, 120, 103434. [CrossRef]
  18. Awuzie B.; Ngowi A.; Omotayo T.; Obi L.; Akotia J. Facilitating Successful Smart Campus Transitions: A Systems Thinking-SWOT Analysis Approach; Applied Sciences 2021, 11, 2044.
  19. Polin, K.; Yigitcanlar, T.; Washington, T.; Limb, M. Unpacking smart campus assessment: developing a framework via narrative literature review. Sustainability. 2024, 16, 2494. 10.3390/su16062494.
  20. AbuAlnaaj, K.; Ahmed, V.; Saboor, S. A strategic framework for smart campus. In Proceedings of the International Conference on Industrial Engineering and Operations Management, Dubai, United Arab Emirates, (10–12 March 2020); Volume 22, pp. 790–798.
  21. Landeta, J. El método Delphi: Una técnica de previsión de la incertidumbre; Editorial Ariel: Barcelona, Spain, 1999.
  22. Silverman, D. Qualitative Research. London, UK: Sage; 2016.
  23. Avella, J. Delphi panels: Research design, procedures, advantages, and challenges. Int. J. Doct. Stud. 2016, 11(1), 305–321.
  24. Hasson, F.; Keeney, S.; McKenna, H. Research guidelines for the Delphi survey technique. J. Adv. Nurs. 2000, 32(4), 1008–1015. [CrossRef]
  25. Vernon, W. The Delphi technique: A review, I J The Reh; 2009, 16(2), pp. 69–76. [CrossRef]
  26. Young, S.; Jamieson, L. Delivery methodology of the Delphi: A comparison of two approaches. J Pa & Rec Ad 2001, 19(1), 42–58.
  27. Shah K.; Naidoo K.; Loughman J. Development of socially responsive competency frameworks for ophthalmic technicians and optometrists in Mozambique. Clin Exp Optom. 2016; 99 (2), 173-182. [CrossRef]
  28. Diamond, I.; Grant, R.; Feldman, B.; Pencharz, P.; Ling, S.; Moore, A.; Wales, P. Defining consensus: a systematic review recommends methodologic criteria for reporting of Delphi studies. J Clin Epid 2014, 67(4), 401–409. [CrossRef]
  29. Esmaeilpoorarabi, N.; Yigitcanlar, T.; Guaralda, M.; Kamruzzaman, M. Does place quality matter for innovation districts? Determining the essential place characteristics from Brisbane’s knowledge precincts. Land Use Pol. 2018, 79, 734-747. [CrossRef]
  30. McVie, R. A Methodology for Identifying Typologies to Improve Innovation District Outcomes: The case of South East Queensland, Degree of Doctor of Philosophy, Queensland University of Technology, Australia, 2023.
  31. . [CrossRef]
  32. Brady, S. Utilizing and adapting the Delphi method for use in qualitative research. I J Qual Met, 2015, 14(5), 1–6. [CrossRef]
  33. Hanafin, S. Review on literature on the Delphi technique. Department of Children and Youth Affairs, Ireland. (2004). https://www.dcya.gov.ie/documents/publications/Delphi_Technique_A_Literature_Review.pdf.
  34. Ludwig, B. Predicting the future: Have you considered using the Delphi methodology? J Ext, 1997. 35(5).
  35. Schmiedel, T.; Vom Brocke, J.; Recker, J. Which cultural values matter to business process management? Results from a global Delphi study. Bus Pro Man J, 2013, 19(2), 292-317.
  36. Zeeman, H.; Wright, C.; Hellyer, T. (2016). Developing design guidelines for inclusive housing: a multi- stakeholder approach using a Delphi method. Journal of Housing and Built Environment, 2016, 31(4), 761-772. [CrossRef]
  37. Yigitcanlar, T.; Dur, F. Making space and place for knowledge communities: lessons for Australian practice. Australasian Journal of Regional Studies, 2013, 19(1), 36-63.
  38. Metaxiotis, K.; Carrillo, J.; Yigitcanlar, T. Knowledge-based development for cities and societies: integrated multi-level approaches. IGI Global, Hersey, PA, USA, 2010.
  39. Baum, S.; Yigitcanlar, T.; Horton, S.; Velibeyoglu, K, Gleeson, B. The role of community and lifestyle in the making of a knowledge city. Griffith University, Brisbane. 2007.
  40. Pancholi, S.; Yigitcanlar, T.; Guaralda, M. Place making for innovation and knowledge-intensive activities: the Australian experience. Technological Forecasting and Social Change. 2019, 146, 616-625. [CrossRef]
Figure 1. Smart campus framework, adopted from [19].
Figure 2. Sample questions of the survey.
Figure 3. Delphi study diagram.
Figure 4. Round 1 Delphi experts by region.
Figure 5. Round 1 Delphi experts by sector.
Figure 6. Round 1 frequency of choices on dimensions.
Figure 7. Round 1 frequency of choices on categories.
Figure 8. Round 1 frequency of choices on indicators.
Figure 9. Round 2 frequency of choices on dimensions.
Figure 10. Round 2 frequency of choices on categories.
Figure 11. Round 2 frequency of choices on indicators.
Figure 12. Mean score of 68 items.
Figure 13. Standard deviation of 68 items.
Figure 14. Overall agreement consensus level.
Figure 15. Specific agreement consensus level.
Table 1. Indicators of the smart campus assessment framework, adopted from [19].
Dimension | Category | Indicator | Description | Definition
Smart economy | Business services | Retail services | Quantity and quality of food, retail, and other business services | A variety of food, retail and other business services on campus with opening hours during peak periods and to the public during off peak and weekends.
Smart economy | Business services | Recreation services | Quantity and quality of sports and recreation services | A variety of sports and recreation services on campus with opening hours during peak periods and to the public during off peak and weekends.
Smart economy | Business services | Art services | Quantity and quality of art galleries and performance services | A variety of art galleries and performance services on campus with opening hours during peak periods and to the public during off peak and weekends.
Smart economy | Business efficiency | Lean principles | Effective practice of lean principles | Improvement of workplace efficiency for value creation in business on campus.
Smart economy | Business efficiency | Automation systems | Effectiveness of collaborative automation systems | The robust automation system in business on campus.
Smart economy | Business efficiency | Workforce productivity | Level of workforce productivity | The output of workforce activity in business on campus.
Smart economy | Utility cost saving | Energy saving | Effectiveness of energy saving mechanisms | The robustness of energy saving mechanisms on campus.
Smart economy | Utility cost saving | Smart monitoring | Effectiveness of smart monitoring and control mechanisms | The robustness of smart monitoring and control mechanisms on campus.
Smart economy | Utility cost saving | Maintenance | Level of appropriate maintenance and service upgrade | The adequate maintenance and service upgrade of campus facilities to reduce cost.
Smart economy | Innovation ecosystem | R&D support | Level of incentives and support for R&D | Having an R&D promotion culture of incentives and adequate support.
Smart economy | Innovation ecosystem | Innovation hubs | Effectiveness of incubators and accelerators | Provision of incubators and accelerators for innovation on campus.
Smart economy | Innovation ecosystem | Industry engagement | Level of industry engagement and partnership | The extent of industry engagement and participation in innovation.
Smart society | Versatile learning and research | Responsive curriculum | Successful adoption of responsive curriculum | Having curriculum meeting changes in the market and employment sector.
Smart society | Versatile learning and research | Collaborative learning | Offering of operational collaborative teaching and learning resources | Having robust teaching and learning resources.
Smart society | Versatile learning and research | Living labs | Effectiveness of Living Labs for knowledge sharing | Having robust Living Labs for knowledge sharing.
Smart society | University social responsibility | Managing social responsibility | Effective management of the social responsibility agenda | The resilient leadership and successful implementation of the social responsibility agenda.
Smart society | University social responsibility | Teaching social responsibility | Effective integration of social responsibility into teaching | The success in integrating social responsibility into teaching.
Smart society | University social responsibility | Researching social responsibility | Effective integration of social responsibility into research and projects | The success in integrating social responsibility into research and projects.
Smart society | Quality of campus life | Diversity and inclusion | Practice of diversity and inclusion | Presence of a culture of diversity and inclusion on campus.
Smart society | Quality of campus life | Comfort, safety and security | Presence of comfort, safety and security | Having a comfortable, safe and secure campus community.
Smart society | Quality of campus life | Social interactions | Presence of vibrant social interactions | Promotion of vibrant social interactions on campus.
Smart society | Campus community engagement | Communication | Effective communication channels | Means of enabling communication on campus.
Smart society | Campus community engagement | Connection | Level of connectedness among the campus community | The campus community getting connected together through some means.
Smart society | Campus community engagement | Involvement | Level of socially involved members of the campus community | Members of the campus community getting involved along their social identities.
Smart environment | Environmentally friendly services | Sustainable lifestyle | Level of sustainable lifestyle on campus | The livelihood of campus residents which harmonises with the natural environment.
Smart environment | Environmentally friendly services | Responsible suppliers | Level of engagement with environmentally responsible suppliers | Suppliers of goods and services to the university conforming to the environmental sustainability agenda.
Smart environment | Environmentally friendly services | Eco-friendly initiatives | Level of development and practice of eco-friendly initiatives | Eco-friendly development and practice endeavours.
Smart environment | Renewable energy | Energy transformation culture | Effectiveness of energy transformation culture | Shift towards positive changes in campus energy.
Smart environment | Renewable energy | Energy efficiency systems | Effectiveness of energy efficiency system utilisation | Implementation of systems to improve energy efficiency.
Smart environment | Renewable energy | Energy best practices | Level of good practices in renewable energy use, generation, and storage | Promotion of good practices in renewable energy use, generation, and storage.
Smart environment | Sustainable development | Sustainability policy | Effectiveness of environmental sustainability policy | Successful implementation of environmental sustainability policies.
Smart environment | Sustainable development | Social equity | Effective practice of social equity | Campus citizens’ sense of responsibility in sustainable development.
Smart environment | Sustainable development | Partnerships for sustainability | Effective practice of building partnerships for sustainability | Identifying and engaging parties of common interest in sustainability.
Smart environment | Zero waste | Waste reduction programs | Effective utilisation of waste reduction programs | Promotion of initiatives to reduce waste in the overall operation of the university.
Smart environment | Zero waste | Recycling program | Effective utilisation of recycling programs | Promotion of reusing used materials and goods to sustain the operations of the university.
Smart environment | Zero waste | Incentive programs | Effective establishment and practice of incentive programs | Implementation of incentive programs to promote zero waste.
Smart governance | Cybersecurity | Overseeing committee | Effectiveness of the cybersecurity committee | Establishment of a select group of people to be responsible for cybersecurity.
Smart governance | Cybersecurity | Policy and regulations | Effective cybersecurity policy and regulations | Establishment of overall guidelines and rules for cybersecurity implementation.
Smart governance | Cybersecurity | Monitoring programs | Effective monitoring programs for follow-ups | Establishment of means for checking and following up the implementation of cybersecurity.
Smart governance | Data governance | Data governance policies | Effectiveness of data governance policies | Establishment of overall guidelines for the use of data in governance.
Smart governance | Data governance | Data governance processes | Effectiveness of data governance processes | Establishment of means to deal with data for governance.
Smart governance | Data governance | Management structures | Effectiveness of data governance management structures | Establishment of levels of power and authority to control use of data in governance.
Smart governance | Decision-making | Consultation and collaboration | Effective consultation and collaboration | A wide inquisitive and interactive campus community.
Smart governance | Decision-making | Communication practices | Effective communication practices | Presence of robust communication avenues throughout the campus community.
Smart governance | Decision-making | Monitoring and evaluation | Effective monitoring and evaluation mechanisms | The consistent checking and evaluation of campus operations.
Smart governance | Service management | Cataloguing and design | Effectiveness of cataloguing and design | Constant updating of product and services stock and meeting campus stakeholders’ needs.
Smart governance | Service management | Service delivery efficiency | Effectiveness of service delivery efficiency | Means of providing services to campus stakeholders when needed.
Smart governance | Service management | Resource allocation | Effectiveness of resource allocation | Resource distribution reaching all the campus community as required.
Table 2. Salient characteristics of the Delphi experts.
Sectors represented: academic, public, and private. Fields of expertise:
  • Agriculture Science
  • Business Administration
  • Chemistry
  • Construction Engineer and Manager
  • Corporate Governance
  • Education Policy
  • Electrical Engineering
  • Environmental Scientist and Planner
  • Human Resource Management
  • ICT Programming
  • Information and Computer Science
  • Management Scholar, Automotive Technology or Vocational Training
  • Manufacturing Engineering
  • Manufacturing Engineering
  • Policy Governance
  • Public Policy
  • Real Estate Management, Engineering
  • Software Engineer
  • Urban and Regional Planning
  • Urban Design and Architecture
  • Development Consultant and Project Manager
  • Computer Programming
  • University Librarian
  • Public Administrator
  • Computer Science
  • Management Executive
  • Urban Design and Architecture
Average professional experience: 19 years.
Table 3. Analysis spreadsheet.
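For transparency, the descriptive statistics reported in Tables 4–9 can be reproduced directly from the Likert frequency counts. The sketch below is a minimal, illustrative Python routine, assuming the sample (n - 1) variance convention, which matches the published values; the function name likert_stats and the worked example are illustrative only, not part of the study's tooling.

```python
import math

def likert_stats(counts):
    """counts[k] = number of experts who rated the item k on the 0-10 scale."""
    n = sum(counts)
    mean = sum(score * c for score, c in enumerate(counts)) / n
    # Sample variance (n - 1 denominator) reproduces the published values.
    var = sum(c * (score - mean) ** 2 for score, c in enumerate(counts)) / (n - 1)
    return mean, math.sqrt(var), var

# Round 1, "Smart Economy" dimension (Table 4): counts at ratings 0..10
mean, sd, var = likert_stats([0, 0, 0, 0, 2, 3, 1, 3, 6, 12, 7])
print(f"mean={mean:.2f}, sd={sd:.2f}, var={var:.2f}")  # mean=8.12, sd=1.79, var=3.20
```

Running this sketch on the Round 1 Smart Economy row returns a mean of 8.12, a standard deviation of 1.79, and a variance of 3.20, matching the first row of Table 4.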
Table 4. Round 1 dimension data.
Dimension (N = 34): counts of responses at each Likert rating 0–10, followed by Mean, Std Dev, and Var
1. Smart Economy 0 0 0 0 2 3 1 3 6 12 7 8.12 1.79 3.20
2. Smart Society 0 0 0 0 0 1 1 7 7 10 8 8.35 1.30 1.69
3. Smart Environment 0 0 0 1 1 2 0 4 8 12 6 8.15 1.73 2.98
4. Smart Governance 0 0 0 0 0 4 2 3 4 13 8 8.29 1.64 2.70
Cronbach’s Alpha = 0.81
Table 5. Round 1 category data.
Category (N = 34): counts of responses at each Likert rating 0–10, followed by Mean, Std Dev, and Var
1. Business Services 0 0 1 1 0 6 3 6 9 5 3 7.12 1.93 3.74
2. Business Efficiency 0 0 2 0 0 3 8 6 7 5 3 7.06 1.94 3.75
3. Utility Cost Saving 0 1 0 1 0 5 1 7 10 6 3 7.29 1.99 3.97
4. Innovation Ecosystem 0 0 1 1 0 2 1 1 11 11 6 7.47 2.79 7.77
5. Versatile Learning and Research 0 0 0 0 0 3 3 2 8 8 10 8.32 1.61 2.59
6. University Social Responsibility 0 0 1 1 0 2 2 7 6 8 7 7.82 1.98 3.91
7. Quality of Campus Life 0 0 1 1 0 3 3 4 6 6 10 7.88 2.13 4.53
8. Campus Community Engagement 0 0 0 0 2 4 1 5 9 8 5 7.88 1.63 2.65
9. Environmentally Friendly Services 0 0 0 1 0 2 1 3 8 14 5 7.88 1.63 2.65
10. Renewable Energy 0 0 0 0 2 2 0 3 11 8 8 8.21 1.67 2.77
11. Sustainable Development 0 0 0 0 0 4 0 2 9 11 8 8.38 1.52 2.30
12. Zero Waste 0 1 0 1 1 2 2 5 6 10 6 7.74 2.15 4.62
13. Cybersecurity 0 0 0 0 1 3 0 3 8 7 12 8.44 1.69 2.86
14. Data Governance 0 0 0 0 0 3 0 7 4 6 14 8.53 1.62 2.62
15. Decision Making 0 0 0 0 1 1 1 4 7 11 9 8.47 1.48 2.20
16. Service Management 0 0 0 0 1 4 1 4 7 9 8 8.09 1.75 3.05
Cronbach’s Alpha = 0.90
Table 6. Round 1 indicator data.
Indicator (N = 34): counts of responses at each Likert rating 0–10, followed by Mean, Std Dev, and Var
1. Retail Services 0 1 0 2 3 5 4 5 6 7 1 6.59 2.18 4.73
2. Recreation Services 0 1 0 0 1 3 5 5 13 5 1 7.18 1.77 3.12
3. Art Services 0 1 0 1 2 4 9 5 6 5 1 6.59 1.94 3.76
4. Lean Principles 1 0 0 0 1 6 7 8 3 7 1 6.74 1.96 3.84
5. Automation Systems 0 1 0 0 1 2 5 7 8 5 5 7.62 1.89 3.58
6. Workforce Productivity 0 1 0 0 0 3 3 7 8 10 2 7.32 2.10 4.41
7. Energy Saving 0 0 0 1 1 2 0 7 7 11 5 7.97 1.71 2.94
8. Smart Monitoring 0 0 1 0 1 2 2 7 3 11 7 7.94 1.94 3.75
9. Maintenance 0 0 0 0 0 3 3 8 6 9 5 7.88 1.51 2.29
10. R&D Support 0 0 0 0 2 2 3 2 9 11 5 7.97 1.70 2.88
11. Innovation Hubs 0 0 1 0 0 3 2 2 5 15 6 8.18 1.83 3.36
12. Industry Engagement 0 1 0 0 0 1 2 0 7 16 7 8.47 1.76 3.11
13. Responsive Curriculum 0 0 0 0 1 3 2 4 6 9 9 8.18 1.73 3.00
14. Collaborative Learning 0 0 0 0 0 1 3 4 8 8 10 8.47 1.35 1.83
15. Living Labs 0 0 0 0 0 3 3 7 3 16 2 7.94 1.46 2.12
16. Managing Social Responsibility 0 0 0 0 1 2 2 10 7 6 6 7.82 1.57 2.45
17. Teaching Social Responsibility 0 0 1 0 0 2 5 9 3 9 5 7.65 1.81 3.27
18. Researching Social Responsibility 0 0 1 0 0 2 2 8 10 8 3 7.71 1.62 2.64
19. Diversity and Inclusion 0 0 1 0 0 4 3 4 10 7 5 7.53 2.15 4.62
20. Comfort, Safety and Security 0 0 0 1 0 2 4 4 8 5 10 8.03 1.83 3.36
21. Social Interactions 0 0 1 0 0 4 2 5 8 8 6 7.79 1.87 3.50
22. Communication 0 0 0 0 0 3 3 7 2 10 9 8.15 1.64 2.67
23. Connection 0 0 0 0 0 2 6 3 6 7 10 8.18 1.66 2.76
24. Involvement 0 0 0 0 0 3 5 5 6 8 7 7.94 1.63 2.66
25. Sustainable Lifestyle 0 0 0 0 0 3 3 7 7 9 5 7.91 1.50 2.26
26. Responsible Suppliers 0 1 0 0 0 2 6 7 9 6 3 7.41 1.78 3.16
27. Eco-friendly Systems 0 0 0 0 0 4 4 9 5 7 5 7.65 1.59 2.54
28. Energy Transformation Culture 0 0 0 1 0 3 8 4 5 9 4 7.50 1.78 3.17
29. Energy Efficient Systems 0 0 0 0 0 4 4 4 6 9 7 7.97 1.68 2.82
30. Energy Best Practices 0 0 0 0 0 3 4 3 10 6 8 8.06 1.59 2.54
31. Sustainable Policy 0 0 0 0 1 3 4 3 7 8 8 7.97 1.75 3.06
32. Social Equity 0 0 0 0 1 3 5 4 8 10 3 7.68 1.61 2.59
33. Partnerships for Sustainability 0 0 0 0 0 3 3 6 6 8 8 8.09 1.60 2.57
34. Waste Reduction Programs 0 0 0 1 1 2 6 4 7 6 7 7.68 1.89 3.56
35. Recycling Program 0 0 0 1 0 4 4 7 5 7 6 7.59 1.84 3.40
36. Incentive Programs 0 0 0 0 1 5 7 5 4 10 2 7.29 1.70 2.88
37. Overseeing Committee 0 0 0 1 0 3 8 5 5 8 4 7.41 1.78 3.16
38. Policy and Regulations 0 0 0 1 0 3 5 5 4 9 7 7.82 1.85 3.42
39. Monitoring Programs 0 0 0 1 0 3 3 5 1 13 8 8.12 1.85 3.44
40. Data Governance Policies 0 0 0 0 1 3 2 6 2 11 9 8.15 1.78 3.16
41. Data Governance Processes 0 0 0 0 0 4 3 4 4 9 10 8.15 1.78 3.16
42. Management Structures 0 0 0 0 2 3 2 4 6 9 8 8.00 1.84 3.39
43. Consultation and Collaboration 0 0 0 0 1 3 2 10 4 10 4 7.65 1.72 2.96
44. Communication Practices 0 0 0 0 0 4 3 6 8 9 4 7.79 1.53 2.35
45. Monitoring and Evaluation 0 0 0 0 2 1 4 3 5 13 6 8.09 1.71 2.93
46. Cataloguing and Design 0 0 0 1 0 4 3 6 7 9 4 7.68 1.74 3.01
47. Service Delivery Efficiency 0 0 0 1 0 3 2 5 9 8 6 8.18 1.45 2.09
48. Resource Allocation 0 0 0 0 1 4 5 2 5 9 8 7.91 1.86 3.48
Cronbach’s Alpha = 1
Table 7. Round 2 dimension data.
Dimension (N = 21): counts of responses at each Likert rating 0–10, followed by Mean, Std Dev, and Var
1. Smart Economy 0 0 1 0 0 1 0 1 6 9 3 8.24 1.81 3.29
2. Smart Society 0 0 0 0 0 0 2 2 4 8 5 8.71 1.10 1.21
3. Smart Environment 0 0 0 0 0 0 3 1 6 7 4 8.52 1.17 1.36
4. Smart Governance 0 0 0 0 0 0 1 1 5 9 5 8.90 0.83 0.69
Cronbach’s Alpha = 0.76
Table 8. Round 2 category data.
Category (N = 21): counts of responses at each Likert rating 0–10, followed by Mean, Std Dev, and Var
1. Business Services 0 0 1 0 0 1 3 1 7 6 2 7.71 1.87 3.51
2. Business Efficiency 1 1 0 0 0 0 2 5 6 3 3 7.29 2.55 6.51
3. Utility Cost Saving 0 0 1 1 0 0 2 3 5 4 5 7.48 2.64 6.96
4. Innovation Ecosystem 0 1 0 0 0 0 0 0 4 9 7 8.76 1.92 3.69
5. Versatile Learning and Research 0 0 0 0 2 0 0 2 4 6 7 8.48 1.78 3.16
6. University Social Responsibility 0 0 1 1 0 1 0 6 4 5 3 7.57 2.11 4.46
7. Quality of Campus Life 0 0 1 1 1 0 1 3 1 5 8 8.33 1.62 2.63
8. Campus Community Engagement 0 0 0 0 0 2 0 4 6 1 8 8.33 1.62 2.63
9. Environmentally Friendly Services 0 0 0 0 0 1 2 2 6 7 3 8.19 1.36 1.86
10. Renewable Energy 0 0 1 0 0 0 3 1 6 4 6 8.14 1.96 3.83
11. Sustainable Development 1 0 0 0 0 1 1 2 4 6 6 8.14 2.33 5.43
12. Zero Waste 0 0 1 0 0 1 2 4 1 6 6 8.05 2.09 4.35
13. Cybersecurity 0 0 0 1 1 0 0 1 2 6 10 8.76 1.95 3.79
14. Data Governance 0 1 0 0 0 0 1 1 3 4 11 8.76 2.12 4.49
15. Decision Making 0 0 0 0 0 0 0 1 0 12 8 9.29 0.72 0.51
16. Service Management 0 0 0 0 1 1 0 0 3 8 8 8.86 1.59 2.53
Cronbach’s Alpha = 0.94
Table 9. Round 2 indicator data.
Indicator (N = 21): counts of responses at each Likert rating 0–10, followed by Mean, Std Dev, and Var
1. Retail Services 0 0 0 1 0 1 1 4 5 7 2 7.86 1.68 2.83
2. Recreation Services 0 0 0 1 0 0 0 1 8 6 4 8.29 1.59 2.51
3. Art Services 0 0 0 0 0 1 2 2 6 7 2 7.86 1.71 2.93
4. Lean Principles 1 0 0 0 0 0 0 5 5 6 3 7.90 2.10 4.39
5. Automation Systems 0 1 0 0 0 0 0 3 6 5 5 8.25 2.00 3.99
6. Workforce Productivity 0 1 0 0 0 0 1 3 2 11 2 7.95 2.13 4.55
7. Energy Saving 0 0 0 0 0 1 1 4 2 6 6 8.52 1.50 2.26
8. Smart Monitoring 0 0 0 1 0 0 0 4 4 5 6 8.48 1.69 2.86
9. Maintenance 0 0 0 0 0 0 0 5 2 7 6 8.76 1.18 1.39
10. R&D Support 0 0 0 0 1 0 0 3 5 4 7 8.52 1.50 2.26
11. Innovation Hubs 0 0 0 1 0 0 0 0 6 6 7 8.81 1.57 2.46
12. Industry Engagement 0 0 1 0 0 0 0 0 3 9 7 8.86 1.71 2.93
13. Responsive Curriculum 0 0 0 0 1 0 1 1 4 7 6 8.48 1.60 2.56
14. Collaborative Learning 0 0 0 0 0 0 0 1 5 6 8 8.95 1.02 1.05
15. Living Labs 0 0 0 0 0 0 0 2 6 8 4 8.71 0.90 0.81
16. Managing Social Responsibility 0 0 0 0 0 2 0 3 6 7 2 7.90 1.61 2.59
17. Teaching Social Responsibility 0 0 0 0 0 1 1 3 4 9 2 7.95 1.86 3.45
18. Researching Social Responsibility 0 0 0 0 0 1 1 3 5 8 2 7.90 1.84 3.39
19. Diversity and Inclusion 0 0 0 1 0 0 1 0 7 7 4 8.10 2.10 4.39
20. Comfort, Safety and Security 0 0 0 0 0 0 1 0 5 7 7 8.67 1.65 2.73
21. Social Interactions 0 0 0 0 0 0 0 2 6 6 6 8.48 1.78 3.16
22. Communication 0 0 0 0 0 0 1 3 4 8 4 8.52 1.12 1.26
23. Connection 0 0 0 0 0 0 4 1 6 5 4 8.24 1.37 1.89
24. Involvement 0 0 0 0 0 1 0 2 8 2 7 8.43 1.43 2.06
25. Sustainable Lifestyle 0 0 0 0 0 0 1 4 4 6 5 8.43 1.21 1.46
26. Responsible Suppliers 0 0 0 0 0 0 1 2 2 10 5 8.81 1.08 1.16
27. Eco-friendly Systems 0 0 0 0 0 0 2 2 4 6 6 8.48 1.40 1.96
28. Energy Transformation Culture 0 0 0 0 0 1 1 0 6 6 6 8.52 1.44 2.06
29. Energy Efficient Systems 0 1 0 0 0 0 1 1 6 8 3 8.29 1.95 3.81
30. Energy Best Practices 0 0 0 0 0 0 0 3 3 10 4 8.71 0.96 0.91
31. Sustainable Policy 0 0 0 0 1 0 1 3 3 6 6 8.33 1.65 2.73
32. Social Equity 0 0 0 0 1 0 1 1 6 6 5 8.33 1.56 2.43
33. Partnerships for Sustainability 0 0 0 0 2 0 3 2 1 9 3 8.05 1.88 3.55
34. Waste Reduction Programs 0 0 0 1 0 0 2 1 4 6 6 8.33 1.77 3.13
35. Recycling Program 0 0 0 1 0 0 1 2 2 8 6 8.48 1.72 2.96
36. Incentive Programs 0 0 0 1 0 0 0 1 2 9 7 8.90 1.58 2.49
37. Overseeing Committee 0 0 0 0 0 1 0 1 3 9 6 8.90 1.22 1.49
38. Policy and Regulations 0 0 0 0 0 1 0 0 2 9 8 9.14 1.15 1.33
39. Monitoring Programs 0 0 0 0 0 1 0 0 2 11 6 9.05 1.12 1.25
40. Data Governance Policies 0 0 0 0 0 0 0 3 1 12 4 8.90 0.94 0.89
41. Data Governance Processes 0 0 0 0 0 0 0 2 5 9 4 8.81 0.93 0.86
42. Management Structures 0 0 0 0 0 1 1 0 1 11 6 8.95 1.28 1.65
43. Consultation and Collaboration 0 0 0 1 0 1 1 3 6 5 3 8.00 1.76 3.10
44. Communication Practices 0 0 0 1 0 0 1 2 5 8 3 8.33 1.62 2.63
45. Monitoring and Evaluation 0 0 0 0 0 0 2 2 3 7 6 8.71 1.31 1.71
46. Cataloguing and Design 0 0 0 0 0 1 2 2 5 6 4 8.33 1.46 2.13
47. Service Delivery Efficiency 0 0 0 0 0 0 1 1 8 7 3 8.57 1.03 1.06
48. Resource Allocation 0 0 0 0 0 0 0 2 5 9 4 8.81 0.93 0.86
Cronbach’s Alpha = 0.98
Table 10. Analysis results.
Parameter Round 1 Round 2
SD – maximum standard deviation 2.79 2.64
SD – minimum standard deviation 1.73 0.72
% SD < 2 – share of items with standard deviation below 2 93% 85%
OA – overall agreement 99% 100%
SA – specific agreement 29% 96%
CA – Cronbach’s Alpha 0.982 0.976
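The parameters in Table 10 summarise consensus across both rounds. As a hedged illustration of how two of them can be computed: Cronbach's alpha for k items is alpha = k/(k - 1) x (1 - sum of per-item variances / variance of expert total scores), and the "% SD < 2" criterion is the share of items whose sample standard deviation falls below 2. The Python sketch below assumes these standard definitions; the study's exact formulas for overall agreement (OA) and specific agreement (SA) are not restated here, and the function names are illustrative.

```python
import numpy as np

def cronbach_alpha(ratings):
    """ratings: 2-D array, rows = experts, columns = items (0-10 scores)."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                          # number of items
    item_vars = ratings.var(axis=0, ddof=1)       # per-item sample variance
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of expert totals
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def share_sd_below(ratings, threshold=2.0):
    """Fraction of items whose sample SD is under the consensus threshold."""
    ratings = np.asarray(ratings, dtype=float)
    return float((ratings.std(axis=0, ddof=1) < threshold).mean())
```

Applied to the full expert-by-item rating matrix for each round, routines of this kind yield the summary figures of the type reported above.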