Preprint
Article

This version is not peer-reviewed.

Institutional Architecture for Credible Online and Blended Provision in African Higher Education: A University-Wide Framework for Digital Transition

Submitted: 02 April 2026
Posted: 03 April 2026


Abstract
Background: Online and blended provision has expanded rapidly in higher education, yet much of the literature still treats digital transition as a pedagogical or technological adjustment rather than an institutional transformation problem. Problem: Conventional universities, especially in African higher education, often face pressure to move online under conditions of constrained infrastructure, uneven digital access, evolving regulation, and heightened concern about academic standards. Existing scholarship is rich on course design, faculty attitudes, and learner satisfaction, but comparatively weak on the full institutional architecture required for credible transition. Objective: This article develops a university-wide framework for bringing conventional institutions online in ways that are regulatorily legitimate, academically credible, operationally resilient, socially inclusive, and financially sustainable. Research question: What institutional architecture is required to move a university from conventional face-to-face delivery to credible, quality-assured online or blended provision in African higher education? Design: A systematized integrative review combined with comparative policy analysis was conducted across peer-reviewed higher education literature and authoritative framework and regulatory documents. The synthesis drew together institutional adoption studies, quality assurance guidance, digital transformation frameworks, and policy texts, with Rwanda used as a policy-reference environment rather than a single-country case. Findings: Credible digital transition depends on the alignment of five layers: contextual boundary conditions, a steering layer of governance and policy, seven operational domains, phased implementation sequencing, and outcome-focused feedback loops. 
The review shows that digital provision fails when institutions treat the learning management system as the reform, underinvest in staff and student support, delay policy redesign, or reduce assessment integrity to surveillance alone. It succeeds when governance, curriculum, quality assurance, infrastructure, data governance, and financing are intentionally coupled. Principal contribution: The article contributes an original Institutional Architecture for Credible Digital Transition framework and a companion University Online Readiness and Transition Toolkit comprising a readiness rubric, phased roadmap, and policy checklist. Implications: The framework offers an actionable basis for institutional leaders, regulators, and scholars seeking to design, evaluate, and sequence digital transition in African higher education without reproducing techno-solutionist assumptions.
Keywords: 
Subject: Social Sciences – Education

1. Introduction

1.1. Background: Digital provision has moved from the institutional periphery to the strategic core of higher education. That shift, accelerated by the disruptions of the COVID-19 period but not reducible to them, has exposed a persistent analytical problem. Universities frequently discuss online and blended learning as though the central question were whether individual lecturers can teach differently or whether a particular platform can host courses at scale. Yet the more difficult question is institutional: how a conventional university reorganizes governance, rules, infrastructure, academic work, quality assurance, and support systems so that digital provision is not merely available but credible. The difference is decisive. A university can upload content quickly and still fail to deliver a legitimate academic experience. It can purchase an enterprise learning management system and still lack the policy, staffing, assessment design, and quality controls necessary for trustworthy online education.
1.2. Problem Statement: That problem is especially salient in African higher education. Universities across the continent operate within fast-expanding enrolment systems, sharp inequalities in devices and connectivity, uneven public financing, strong demands for employability and access, and regulatory environments that are becoming more explicit about virtual learning, internal quality assurance, and the use of artificial intelligence. Resource constraints do not negate the case for digital provision. On the contrary, they intensify the need for institutionally disciplined transition because poorly sequenced digitization can increase fragility, deepen exclusion, and erode confidence in academic standards. The relevant issue is therefore not whether African universities should remain conventional or become digital in some abstract sense. It is how they can design digital provision in ways that are educationally sound, regulatorily legitimate, financially viable, and socially inclusive.
The present article addresses a gap in the literature. Research on online and blended higher education is substantial, but much of it remains fragmented. One body of work examines student satisfaction, engagement, or self-regulation in online environments. Another focuses on faculty attitudes or professional development. A third emphasizes learning management systems, learning analytics, or digital transformation as broad organizational agendas. A fourth turns to quality assurance, accreditation, or policy. Each literature is useful, yet the prevailing tendency is analytical disaggregation. Technology is treated separately from governance, governance separately from pedagogy, pedagogy separately from regulation, and regulation separately from institutional capability. This fragmentation obscures the fact that online transition is a systems problem. Universities do not move online by changing only one layer of their operation. They do so by aligning authority structures, financial commitments, infrastructure, curricula, assessment regimes, staff capability, student support, and monitoring mechanisms across time (Ali & Georgiou, 2025; Fernández et al., 2023; Graham et al., 2013; Jisc, 2023).
This article therefore treats online transition not as a narrow modality question but as a problem of institutional architecture. The phrase institutional architecture is used here to denote the structured arrangement of governance, policy, people, processes, technologies, quality controls, and resource commitments that together make a mode of provision workable and legitimate. The argument is that credible online and blended provision emerges when these institutional components are deliberately coupled and sequenced. It does not emerge automatically from technological acquisition, from emergency teaching improvisation, or from isolated enthusiasm among academic innovators. The literature on digital transformation in higher education increasingly points in this direction, warning against models that equate transformation with digitization while neglecting organizational design, leadership, data governance, and long-term sustainability (Castro Benavides et al., 2020; Farias-Gaytan et al., 2023; Wang, 2023).
Rwanda offers a useful policy-reference environment for examining these questions. It should not be treated as the whole of African higher education, but it is analytically instructive because recent Higher Education Council guidance now addresses distance learning, virtual learning and artificial intelligence, internal quality assurance, institutional infrastructure, learning and assessment, and student support through an increasingly explicit regulatory architecture (Rwanda Higher Education Council, 2023a, 2023b, 2023c, 2024, 2025, 2026). This indicates that digital transition in African higher education is no longer merely an internal managerial preference. It is becoming part of the formal quality and legitimacy environment in which institutions operate. Figure 1 summarizes the sequence through which Rwanda’s policy architecture for digital and virtual higher education became progressively more explicit between 2023 and 2026.
1.3. Study objective and research questions: Against that background, the objective of the article is to develop a university-wide framework for bringing conventional institutions online in African higher education. The central research question is: What institutional architecture is required to move a university from conventional face-to-face delivery to credible, quality-assured online or blended provision in African higher education? Four subsidiary questions guide the analysis: which institutional components recur most consistently across the literature; how those components interact; what sequencing patterns and bottlenecks characterize successful and unsuccessful transitions; and how these insights can be translated into a practical toolkit for institutional planning.
1.4. Study Contribution: The article makes three contributions. First, it synthesizes heterogeneous evidence that is usually discussed in separate scholarly and policy conversations. Second, it proposes an original framework, the Institutional Architecture for Credible Digital Transition, that explains digital transition as the interaction of contextual conditions, steering mechanisms, operational domains, implementation stages, and feedback loops. Third, it translates that framework into a companion University Online Readiness and Transition Toolkit containing a readiness rubric, phased roadmap, and policy checklist that can support institutional leaders, regulators, and digital-learning directors. By proceeding in this way, the article seeks to contribute both to scholarship on higher education transformation and to the practical governance of online and blended provision in African contexts.

2. Literature Review and Analytical Framework

2.1 Key concepts and working definitions: Online transition is frequently described with terms that are analytically adjacent but not interchangeable. A conventional institution refers here to a university whose dominant delivery model, governance routines, quality controls, staffing assumptions, and student support systems were designed primarily for face-to-face provision. Online provision denotes programmes or courses in which teaching, learning, communication, assessment, and support are delivered predominantly through digital environments. Blended provision refers to structured combinations of face-to-face and online learning activities within an intentionally designed pedagogical and institutional model rather than an improvised mixture of modalities. Digital transformation is broader than either. It concerns the reconfiguration of processes, structures, competencies, and decision-making through digital means across the institution as a whole, not merely in the classroom (Castro Benavides et al., 2020; Fernández et al., 2023; Jisc, 2023).
Two further concepts require precision. Online readiness is not the possession of hardware or a learning management system (LMS) in isolation. It is the extent to which a university has the strategic, regulatory, infrastructural, pedagogical, financial, and organizational capabilities needed to deliver and continuously improve digital provision. Credibility, in turn, is used in a thick institutional sense. It includes regulatory compliance, academic legitimacy, dependable operations, fair and secure assessment, inclusive learner support, and the ability to withstand scrutiny from students, regulators, employers, and peer institutions. This matters because some institutions achieve digital availability without digital credibility. They can host teaching online but cannot yet demonstrate that quality, standards, and support remain intact.
2.2 Technology-led and institution-led explanations of digital change: The literature reveals a recurring tension between technology-led and institution-led accounts of change. Technology-led narratives present digital transition as the diffusion of platforms, analytics, AI tools, or communication systems into existing university structures. Institution-led accounts argue that tools matter, but only as part of broader transformations in governance, workflows, capability, and quality culture. Systematic and multivocal reviews of digital transformation in higher education consistently conclude that higher education institutions have often pursued fragmented initiatives rather than holistic redesign, with strategy, human capability, and process alignment lagging behind technological adoption (Castro Benavides et al., 2020; Farias-Gaytan et al., 2023; Fernández et al., 2023). Jisc’s framework similarly treats digital transformation as a whole-organization agenda involving leadership, investment, infrastructure, people, and process alignment rather than isolated innovation projects (Jisc, 2023; Newman et al., 2025). UNESCO has made the same point at the systems level by arguing that digital transformation concerns content, pedagogy, governance, and management simultaneously (Wang, 2023).
2.3 Adoption research and the organizational scaling of blended provision: The blended learning adoption literature offers a particularly useful bridge between classroom and institutional analysis. Graham et al. (2013, 2014) and Ali and Georgiou (2025) show that institutional adoption follows identifiable stages and depends on the coordinated development of strategy, structure, and support. Those findings remain highly relevant beyond blended learning narrowly conceived, because they illuminate the organizational conditions under which institutions move from experimentation to scaled provision. A related systematic review by Anthony et al. (2022) demonstrates that adoption is shaped not only by student and lecturer attitudes but also by administrative arrangements, policy, and implementation practice. What this literature suggests is that digital transition is cumulative and staged. Mature provision does not arise from a singular decision to go online. It emerges from the repeated alignment of institutional routines.
2.4 Institutional legitimacy and regulatory conformity: At the same time, the literature is marked by unresolved tensions. One concerns legitimacy. Universities are organizations embedded in regulatory, professional, and reputational fields. They do not innovate in a vacuum. Institutional theory is therefore highly relevant because it helps explain why universities seek conformity with accrediting expectations, peer norms, and professional standards even while attempting local innovation (DiMaggio & Powell, 1983). In online transition, coercive pressures come from regulators and quality-assurance bodies, normative pressures from academic and professional communities, and mimetic pressures from the imitation of prestigious institutions’ digital models. This helps explain why many universities adopt the language of digital transformation, but it also warns against the uncritical importation of models developed in high-resource contexts. Isomorphic pressures can produce legitimacy, yet they can also encourage superficial mimicry when underlying capabilities are weak.
2.5 Sociotechnical systems and the joint design of work and technology: A second tension concerns the relationship between technical and social systems. Sociotechnical perspectives are useful here because they reject the assumption that technology can be analyzed independently of work design, decision rights, support processes, and human capability. Digital environments change not only what tools are used but how academic and administrative work is organized, how information moves, how decisions are made, and how staff and students interact. Govers and van Amelsvoort (2023) argue that digital transformation requires the joint optimization of social and technical elements rather than the dominance of one over the other. This insight is especially salient for universities, where technology choices affect curriculum workflows, assessment practice, record systems, helpdesk design, library access, and data governance. An LMS can therefore not be understood as a neutral platform. It is part of a sociotechnical configuration that either supports or destabilizes institutional coherence.
2.6 Capability, change, and professional support: A third tension concerns capability and change. Organizational change scholarship suggests that transition capacity depends on leadership commitment, middle-management translation, incentives, resources, and routinized learning rather than declarative strategy alone. In higher education, this is visible in evidence on professional development and staff adoption. Gao et al. (2022) show that online faculty professional development has expanded, but the field remains uneven, and technical training alone is insufficient. Sanders and Mukhari (2024) similarly show that lecturers’ willingness to sustain blended provision depends on managerial support, reliable technology, time, and professional development. Resistance in this context is often not hostility to innovation but a rational response to unsupported workload transfer.
2.7 Student experience, support, and equity: The literature on student experience adds further caution. Martin and Bolliger’s (2022) review indicates that learner satisfaction depends on multiple factors, yet programme quality, assessment, and learner support remain underexamined. García-Machado et al. (2024) show that the support students receive in online learning environments influences academic performance, reinforcing the point that digital transition cannot be judged by course content delivery alone. In resource-constrained contexts, the importance of support is even more evident. Tulinayo et al. (2018) found that access, awareness, capacity, and lecturer characteristics shaped students’ acceptance of digital technologies in a resource-constrained higher education setting. Such findings challenge celebratory narratives that equate access to platforms with equitable participation. Rwanda-specific evidence likewise suggests a mixed readiness profile: device access and basic digital familiarity may be substantial, yet trust, accreditation concerns, and willingness to participate remain uneven, reinforcing the need to treat learner support and legitimacy as institutional conditions rather than mere technical add-ons (Sangwa et al., 2020).
2.8 Quality assurance, learning analytics, and AI governance: Quality assurance and data governance introduce additional complexity. ENQA (2018) argues that e-learning should be assessed through existing quality standards, but with mode-specific attention to design, support, staffing, information provision, and monitoring. Learning analytics can strengthen monitoring and improvement, yet they also raise questions of privacy, autonomy, consent, and institutional power (Gašević et al., 2022; Jones, 2019). AI governance intensifies this dilemma. UNESCO’s guidance on generative AI and more recent work on rights-based AI governance both emphasize human-centred regulation, transparency, data protection, and institutional preparedness rather than unrestrained deployment (UNESCO, 2023, 2025a). The implication is that digital transition produces new governance objects: data flows, algorithmic tools, authorship disputes, and analytics dashboards that require formal oversight if academic legitimacy is to be preserved.
2.9 Composite analytical framework and guiding propositions: These debates support a composite analytical framework built around three mutually reinforcing lenses. Institutional theory explains why digital provision must secure legitimacy in regulatory and professional fields. Sociotechnical systems theory explains why technical tools must be aligned with work processes, support structures, and human roles. Capability- and change-oriented perspectives explain why sequencing, investment, and routinization determine whether institutional ambitions become sustainable practice. Taken together, these lenses suggest that credible digital transition depends on three propositions. First, legitimacy must be designed, not assumed; governance, policy, and quality assurance are constitutive rather than auxiliary. Second, digital provision is a joint social and technical accomplishment, not a platform feature. Third, transition is staged; preparation and consolidation are as important as launch. These propositions guide the analysis that follows and ground the Institutional Architecture for Credible Digital Transition framework developed in Section 4.9.

3. Methodology

3.1 Design: This study employed a systematized integrative review combined with comparative policy analysis. The design was chosen because the research question is explanatory and framework-building rather than effect-size oriented. The relevant evidence base is heterogeneous: empirical studies, systematic reviews, theoretical texts, organizational frameworks, and official quality-assurance and regulatory documents all bear on the question of how universities move credibly online. A narrow systematic review restricted to one study type would therefore have obscured essential dimensions of the problem. The integrative review logic made it possible to synthesize diverse forms of evidence, while the comparative policy component ensured that institutional architecture was examined not only as an organizational matter but also as a question of public regulation and legitimacy (Whittemore & Knafl, 2005).
3.2 Search scope, sources, and temporal boundaries: The search and selection process was deliberately transparent but pragmatic. Searches were undertaken between January and March 2026, with the main publication window set from 2013 to February 2026 in order to capture the post-MOOC, post-pandemic, and AI-affected phases of digital higher education. Seminal earlier methodological and theoretical works were retained where necessary. Academic retrieval drew on Google Scholar and on publisher platforms accessible through web indexing, including ScienceDirect, SpringerLink, Nature Portfolio, Frontiers, MDPI, and PubMed/PMC. Policy and framework retrieval targeted UNESCO, Jisc, ENQA, and official Rwandan higher education sources, especially the Higher Education Council and the Ministry of Education. This approach was appropriate for a systematized review whose goal was analytical saturation across institutional domains rather than exhaustive enumeration of every course-level study.
3.3 Search strings and retrieval logic: Search strings combined higher education, digital provision, and institutional architecture terms. Typical combinations included: “higher education” AND (“online learning” OR “blended learning” OR “distance learning” OR “digital transformation”) AND (institution* OR governance OR strategy OR policy OR quality assurance OR assessment OR infrastructure OR faculty development OR student support OR learning analytics OR AI governance OR Africa). Additional targeted strings were used for Rwanda and regulation, such as “Rwanda higher education virtual learning guidelines”, “Rwanda internal quality assurance higher education”, and “distance learning accreditation Rwanda”. Backward and forward citation checking was used selectively to strengthen conceptual coverage.
3.4 Eligibility criteria: Eligibility criteria reflected the study objective. Included sources addressed higher education and spoke either directly to institution- or system-level digital transition or to a domain that becomes decisive at scale, such as assessment integrity, faculty development, learner support, or data governance. Peer-reviewed studies, systematic reviews, major organizational frameworks, and official policy or regulatory documents were included. Sources had to offer analytical, empirical, or regulatory relevance to online or blended provision. Excluded sources were K-12 focused studies, vendor marketing materials, opinion pieces without analytical substance, and narrowly course-specific papers whose findings did not travel meaningfully to institutional design. Emergency remote teaching accounts were included only when they yielded durable institutional lessons rather than descriptive crisis narratives.
3.5 Screening, corpus construction, and appraisal logic: Screening proceeded in three stages: title and abstract or webpage review, full-text or extended abstract review where available, and final relevance assessment against the study’s institutional architecture focus. Duplicates and near-duplicates were removed. The final corpus comprised 37 sources: peer-reviewed empirical studies and reviews, methodological and theoretical texts, international framework documents, and official policy and regulatory guidance. Rather than assigning spurious precision to heterogeneous evidence, the study used a reasoned appraisal strategy. Peer-reviewed studies were read with MMAT-informed attention to design clarity, coherence between question and method, transparency of evidence, and relevance to institutional transition (Hong et al., 2018). Policy and framework documents were appraised for authority, currency, specificity, implementation relevance, and connection to recognized quality-assurance or governance mandates. Reporting logic was informed by methodological guidance on integrative and scoping reviews, particularly the importance of explicit eligibility criteria, transparent search logic, and clear synthesis procedures (Arksey & O’Malley, 2005; Levac et al., 2010; Tricco et al., 2018).
3.6 Data extraction and analytical synthesis: Data extraction was guided by an analytical matrix built around six fields: source type; geographical or policy context; principal institutional domain addressed; level of analysis (macro, meso, or micro); identified dependencies or interactions with other domains; and implications for sequencing, credibility, or failure. Thematic coding was then followed by relational synthesis. In practice, this meant moving from identification of recurring institutional components to analysis of how those components interact and under what conditions they support or undermine credible digital provision. A final round of abductive synthesis was used to derive the proposed framework, testing whether the emerging model could account for both positive institutional conditions and recurrent failure points.
3.7 Limitations of the review design: This design has limitations. The corpus was restricted to English-language materials and to sources accessible through open web or official repositories. The study was systematized rather than fully exhaustive and did not attempt meta-analysis. It also did not generate primary data from African universities. Nevertheless, these limits are compatible with the study’s objective. The purpose was not to rank interventions statistically but to construct a defensible institutional architecture from the best available conceptual, empirical, and regulatory evidence. That objective requires breadth across domains, transparent synthesis, and caution against overclaiming, all of which shaped the present analysis. Table 1 summarizes the review’s search dimensions, typical search terms, and eligibility emphasis.

4. Findings and Discussion

The synthesis shows broad convergence on the proposition that credible digital transition is institutional rather than merely pedagogical. The subsections that follow integrate findings and discussion across the principal domains that recur in the literature and policy corpus. Rather than treating these domains as separable checklists, the analysis emphasizes their interaction, their sequencing, and the recurrent implementation bottlenecks that arise when institutions strengthen one domain while neglecting others.

4.1. Institutional Governance and Leadership Architecture

The literature converges on a foundational point: credible digital transition begins as a governance question before it becomes a delivery question. Institutions that scale online and blended provision more successfully tend to place digital strategy under formal senior leadership authority, link it to academic planning and budget processes, and establish cross-functional structures that bridge academic affairs, ICT, quality assurance, registry, finance, and student services (Ali & Georgiou, 2025; Graham et al., 2013; Jisc, 2023). Where that architecture is absent, digital provision is often relegated to an e-learning unit with limited authority, producing what may be termed pilotization without institutionalization. Courses appear online, but governance routines, accountability lines, and budget frameworks remain face-to-face by default.
This point is more than managerial. Institutional theory suggests that universities must demonstrate that online provision is governed through recognizable and legitimate structures if it is to be trusted by regulators, professional bodies, and employers (DiMaggio & Powell, 1983). In that sense, senate approval processes, board-level oversight, academic regulations, and formal risk ownership are not bureaucratic afterthoughts. They are part of the credibility infrastructure of digital provision. Rwanda’s policy environment illustrates the growing explicitness of this expectation. HEC now publishes separate guidance on distance learning, virtual learning and AI, internal quality assurance, infrastructure standards, learning and assessment, and student support, indicating that digital provision is regulated across multiple institutional functions rather than treated as a purely pedagogical matter (Rwanda Higher Education Council, 2023a, 2023b, 2023c, 2024, 2025, 2026).
Leadership architecture also shapes whether institutions adopt sustainable financing models. Many digital initiatives begin with project funds or donor-supported procurement but fail when recurring costs become visible. A credible transition requires budget lines for platform licensing or maintenance, connectivity support, instructional design, staff development, accessibility, technical support, cybersecurity, and periodic review. The recurring theme in the literature is that underfunded transition externalizes costs to faculty through unpaid redesign labour and to students through device, data, or access burdens. Financial sustainability is therefore not ancillary to governance; it is one of its most concrete tests.

4.2. Digital Infrastructure and LMS Ecosystem Requirements

The literature strongly rejects any reduction of digital transition to LMS acquisition. An institution can possess a stable platform and still fail institutionally because the LMS is only one component in a wider ecosystem. Credible provision depends on interoperable infrastructure linking course environments, student information systems, digital identity and authentication, library access, content storage, communication tools, technical support, analytics, backup procedures, cybersecurity, and, in many contexts, power and connectivity resilience (Fernández et al., 2023; Jisc, 2023). Where these elements are weakly connected, students experience discontinuity, staff duplicate work across systems, and quality monitoring becomes unreliable.
Evidence from resource-constrained contexts makes these dependencies particularly visible. Tulinayo et al. (2018) found that student acceptance and usability were shaped not only by technology availability but also by awareness, capacity, access, and lecturer characteristics. Mabidi’s (2024) review of South African higher education likewise identifies poor infrastructure, inadequate funding, and persistent inequalities as structural barriers to digital transition. These findings challenge universalized models of digital maturity derived from high-bandwidth environments. In many African institutions, infrastructure strategy must prioritize bandwidth sensitivity, mobile compatibility, asynchronous functionality, local caching or offline access where possible, and service models that assume intermittent connectivity rather than ideal continuous access.
A further issue is technological dependency. Platform decisions can produce vendor lock-in, fragmented data ownership, or procurement obligations that exceed institutional bargaining power. For universities in low-resource and policy-evolving contexts, technological dependency is also a governance issue because it affects continuity, sovereignty over student data, and the total cost of ownership over time. Infrastructure strategy must therefore be linked to procurement policy, contract review, interoperability standards, exit planning, and data governance. Rwanda’s infrastructure standards are useful in this regard because they anchor digital provision in broader institutional adequacy rather than treating technical procurement as self-validating (Rwanda Higher Education Council, 2023a).

4.3. Curriculum Redesign and Pedagogical Adaptation

The literature is clear that credible online or blended provision does not result from uploading lecture notes or recording face-to-face classes with minimal redesign. Curriculum transition is a process of academic re-specification. It requires programme-level reflection on learning outcomes, contact patterns, activity design, sequencing, media choice, interaction structures, workload, and assessment coherence (Anthony et al., 2022; Graham et al., 2014). The significance of this point is often underestimated because technology debates can obscure the amount of academic labour required to redesign programmes rather than merely migrate content.
The strongest adoption studies suggest that mature institutions move from course-by-course improvisation to programme-level design routines. Those routines usually include instructional design support, templates or quality standards, review processes, and explicit decisions about the balance between synchronous and asynchronous learning (Ali & Georgiou, 2025; Graham et al., 2013). This transition matters because programme coherence is difficult to sustain when each course is independently digitized according to lecturer preference. A credible online programme must allow students to navigate a recognizable architecture of expectations, communication patterns, deadlines, learning activities, and support pathways across modules.
The literature also complicates simplistic narratives of flexibility. Flexible delivery can widen access, but only if curriculum design acknowledges diverse student circumstances and digital realities. In African contexts this often means designing for low-bandwidth participation, allowing asynchronous engagement where appropriate, ensuring mobile access, and curating learning resources in ways that do not presume constant access to large files or synchronous meetings. Flexibility without structure can easily become abandonment. Thus the relevant design principle is not flexibility in the abstract, but structured flexibility: pathways that widen participation while preserving academic challenge, feedback, and progression.

4.4. Assessment Integrity and Academic Standards

Assessment is one of the most contested domains in digital transition because it sits at the intersection of standards, trust, technology, and student rights. The literature shows that online assessment debates are often framed too narrowly around cheating detection and remote proctoring. Holden et al. (2021) and Butler-Henderson and Crawford (2020) both show that integrity problems are broader and that sustainable responses require institutional rather than merely technological solutions. Authentic task design, staged submissions, oral components, assessment variety, secure item management, clear misconduct procedures, staff training, and student induction all matter. Heavy surveillance technologies may address one integrity risk while creating others related to privacy, equity, and mistrust.
This is where academic standards and rights-based governance meet. A university that moves provision online without revising assessment regulations risks two symmetrical failures. It can become permissive in ways that weaken the meaning of grades and awards, or excessively punitive in ways that compromise fairness and student dignity. Recent AI developments intensify this dilemma by introducing new questions about authorship, disclosure, permissible assistance, and evidence of learning. UNESCO’s AI guidance argues that institutions need explicit, human-centred governance rather than ad hoc reactions to emerging tools (UNESCO, 2023). Rwanda’s recent virtual learning and AI guidance and its national learning, teaching and assessment policy indicate the growing regulatory recognition that digital assessment requires updated rules, not merely new software (Rwanda Higher Education Council, 2023b, 2025).
A critical implication follows. Assessment integrity should be treated as an institutional design problem, not a surveillance procurement problem. Where universities frame integrity primarily as invigilation, they risk conflating credibility with control. The literature reviewed here suggests a more balanced model: integrity through assessment redesign, identity assurance, policy clarity, due process, academic support, and proportionate use of technology.

4.5. Faculty Development and Change Management

No institutional architecture for digital transition can succeed without sustained faculty capability building. Yet the literature also shows that faculty development is frequently misconstrued as a short course in platform usage. Gao et al. (2022) demonstrate that professional development for online teaching has grown, but the field remains uneven and often under-conceptualized. What universities require is not episodic training but a structured development system combining digital pedagogy, assessment design, accessibility, feedback practice, learner support, AI literacy, and data ethics.
The change-management dimension is equally important. Sanders and Mukhari (2024) found that lecturers in a South African institution associated successful blended learning with management support, time, improved professional development, and reliable technology. This resonates strongly with the broader adoption literature. Faculty members do not experience digital transition only as pedagogical innovation. They also experience it as altered workload, changed communication expectations, new forms of visibility, and, at times, managerial intensification. Institutions that overlook this social reality often misdiagnose resistance. What appears as resistance to change may actually be resistance to unfunded redesign work, unstable infrastructure, or unrealistic implementation timelines.
Effective faculty architecture therefore includes more than training. It includes workload recognition, access to instructional design expertise, communities of practice, peer mentoring, revised promotion and recognition criteria, and responsive help channels during live teaching periods. In mature models, digital capability becomes part of academic professionalism rather than an optional specialization. This institutionalization matters because credible provision depends not on heroic innovators but on routinized, distributed competence across departments.

4.6. Student Support, Inclusion, and Digital Equity

Student support is often treated as a service adjunct to online learning, but the literature suggests it is one of the clearest markers of institutional maturity. Martin and Bolliger’s (2022) review found that learner support, programme quality, and assessment were comparatively less examined in satisfaction research even though they are central to student success. García-Machado et al. (2024) reinforce the point by showing that the support students receive in online environments influences academic performance. The implication is that digital transition cannot be evaluated on teaching design alone. It must also be judged on whether students can successfully navigate admission, orientation, advising, communication, technical problems, library access, wellbeing needs, and academic difficulties in a digital environment.
In African higher education, the equity dimension is especially acute. Device access, data costs, disability support, home study conditions, language, and geographical location can all shape participation. Tulinayo et al. (2018) show that usability and acceptance depend partly on access and student capacity, while UNESCO’s broader digital and AI work warns repeatedly that technology can deepen inequality when connectivity and rights protections are uneven (UNESCO, 2025a). Consequently, support architecture must move beyond generic helpdesks. It should include onboarding into online study, digital study skills, academic advising, counselling or wellbeing referral routes, accessibility compliance, multi-channel communication, and targeted support for students facing connectivity or device constraints.
Rwanda’s National Student Support and Guidance Policy is instructive because it locates support within formal institutional responsibilities rather than discretionary student services (Rwanda Higher Education Council, 2023c). That principle is broadly transferable. Online inclusion should be treated as a core quality condition of digital provision, not as a charitable supplement. Institutions become credible when they can show that students are not merely admitted into digital environments but are supported to progress within them.

4.7. Quality Assurance, Data Governance, and Continuous Improvement

Quality assurance is the domain that most clearly distinguishes emergency digitization from credible institutional transition. ENQA (2018) argues that e-learning does not require a separate concept of quality so much as mode-sensitive application of existing standards regarding design, delivery, staffing, information, and review. That position is valuable because it resists the false choice between exceptionalizing online learning and ignoring its specific demands. For universities, the practical implication is that internal quality assurance systems must adapt their approval, monitoring, and review mechanisms to digital provision. This includes evidence on course design standards, platform reliability, student participation, complaints, progression, feedback timeliness, assessment quality, and support responsiveness.
Data governance is now inseparable from this agenda. Learning analytics promises better monitoring and earlier intervention, yet it also expands the university’s ability to observe and categorize student behaviour. Jones (2019) argues that informed consent and student autonomy should not be treated as peripheral concerns. Gašević et al. (2022) similarly show that stakeholders, strategy, and scale are central challenges in learning analytics adoption. A university-wide architecture for digital provision therefore requires clear rules on data collection, purpose limitation, retention, access, human oversight, and appeal. It must also clarify how analytics data feeds improvement rather than becoming a detached surveillance layer.
AI governance broadens the same concerns. UNESCO’s 2023 guidance and 2025 survey of higher education institutions both suggest that universities are increasingly recognizing the need for institution-level AI frameworks, yet uneven confidence and ethical uncertainty persist (UNESCO, 2023, 2025b). The challenge is not merely whether AI tools are allowed. It is how they are governed in relation to academic integrity, student rights, staff work, environmental cost, and epistemic quality. Rwanda’s recent HEC guidance on virtual learning and AI shows that regulators are beginning to integrate AI into the quality architecture of higher education itself (Rwanda Higher Education Council, 2025). Taken together, these findings suggest that quality assurance, data governance, and AI governance form a trust infrastructure. Without them, digital provision may scale operationally while losing legitimacy.

4.8. Sequencing Models for Transition from Conventional to Blended and Online Provision

One of the clearest conclusions from the reviewed literature is that sequencing matters. Institutions do not move successfully from conventional delivery to credible online or blended provision by trying to optimize every domain simultaneously. Rather, they tend to progress through distinguishable phases, albeit unevenly. The stage models proposed in blended learning adoption studies remain useful here, especially when extended by more recent digital transformation frameworks (Ali & Georgiou, 2025; Graham et al., 2013; Newman et al., 2025).
The first stage may be described as preparation. Its defining tasks are strategic and diagnostic: clarifying institutional purpose, mapping regulatory conditions, auditing readiness, identifying programme priorities, and establishing minimum infrastructure and policy baselines. Institutions that skip this stage often conflate digital aspiration with capability. The second stage is transition, in which carefully selected pilots, course and programme redesign, staff development, support protocols, and revised assessment arrangements are implemented under heightened monitoring. The critical challenge here is to avoid treating pilots as self-sufficient successes. The point is to generate institutional learning, not isolated exemplars.
The third stage is consolidation. At this point the institution formalizes policies, stabilizes support services, improves interoperability, routinizes quality review, and aligns budgeting with actual delivery demands. The shift is from innovation projects to institution-wide operating models. Only then does a fourth stage of optimization become meaningful. Optimization includes analytics-informed improvement, refined AI governance, differentiated student support, partnerships, micro-credentials, or more advanced blended and virtual mobility models. Attempting optimization without consolidation is a common error because it produces a sophisticated discourse on top of unstable foundations.
The literature also reveals recurrent failure points. First, institutions confuse emergency remote teaching with online strategy. Second, they underprice transition by ignoring staff redesign labour and student affordability. Third, they delay policy revision, leaving assessment, workload, or data practices governed by face-to-face assumptions. Fourth, they rely excessively on surveillance-based integrity mechanisms instead of redesigning assessment. Fifth, they overlook student support and digital equity. Sixth, they allow fragmented systems and vendor lock-in to undermine coherence. These failures are not random. They arise when institutions treat digital transition as an additive project rather than a reconfiguration of institutional architecture.

4.9. Toward a University-Wide Framework for Bringing Conventional Institutions Online

The evidence synthesized above supports a five-layer framework termed the Institutional Architecture for Credible Digital Transition. The framework is intended to explain not simply what factors matter, but how they relate. Figure 2 presents this five-layer framework schematically, showing the relationship among contextual boundary conditions, steering architecture, operational architecture, temporal sequencing, and outcomes with feedback loops.
The first layer consists of the contextual boundary conditions within which any transition unfolds: the regulatory environment, infrastructure and connectivity realities, student access and affordability, and institutional resource constraints. These conditions do not determine outcomes, but they bound what is feasible and therefore shape how the remaining layers should be configured.
The second layer is the steering architecture. This includes governing bodies, senior leadership, digital strategy, finance and procurement authority, risk management, and the formal policy framework governing learning, assessment, data, student support, staff development, and quality assurance. This layer confers legitimacy and direction. It determines whether digital provision is authorized, resourced, and reviewable.
The third layer is the operational architecture, consisting of seven interdependent domains: infrastructure and LMS ecosystem; curriculum redesign; assessment integrity; faculty capability; student support and inclusion; quality assurance; and data and AI governance. These domains should not be imagined as parallel silos. They are coupled. Weakness in one can destabilize the others. For example, strong curriculum redesign without student support weakens retention; strong infrastructure without policy redesign weakens legitimacy; strong analytics without governance weakens trust.
The fourth layer is temporal sequencing. The framework proposes four stages: preparation, transition, consolidation, and optimization. This temporal layer matters because institutions rarely possess full readiness at the outset. Credibility emerges progressively as the steering and operational layers are aligned, reviewed, and stabilized over time. Sequencing therefore functions as a causal mechanism, not merely a project-management convenience.
The fifth layer is the outcomes layer. The desired outcomes are credible provision, academic quality, resilience, inclusion, and sustainability. These outcomes are mediated by feedback loops. Continuous improvement data, quality review findings, learner experience evidence, financial monitoring, and policy review all feed back into the steering and operational layers. Without these loops, institutions cannot learn from implementation or correct drift.
The framework’s central claim is that digital transition becomes credible when legitimacy, capability, and sociotechnical integration are jointly produced. Legitimacy is secured through governance, policy, and quality assurance. Capability is secured through investment in infrastructure, staff, support, and finance. Sociotechnical integration is secured by designing technical systems and academic-administrative processes together. The framework also has boundary conditions. It does not assume that every institution should move immediately to fully online provision. For some universities, credible transition may culminate in robust blended models rather than predominantly online programmes. Nor does it assume that digital transformation is inherently progressive. Poorly governed transition can expand access numerically while degrading quality or intensifying exclusion. The framework is therefore descriptive, explanatory, and cautionary at once. Table 2 provides a compact analytical summary of the framework’s layers, core functions, and typical institutional expressions.

4.10. Companion output: University Online Readiness and Transition Toolkit

To translate the framework into operational guidance, this study proposes a companion University Online Readiness and Transition Toolkit. The toolkit is not a substitute for institutional judgment. Rather, it offers structured prompts that can support planning, self-audit, and regulatory dialogue. An extended version of the toolkit, together with a Preparatory Alignment Memo, is available as supplementary material in the study’s Open Science Framework project (DOI: 10.17605/OSF.IO/Z7KQP). Figure 3 shows the toolkit’s overall structure and its three linked instruments: the readiness rubric, the phased implementation roadmap, and the policy and governance checklist.
Table 3. University online-readiness rubric.
Domain | Ad hoc | Emerging | Structured | Assured
Leadership and governance | Pilots lack formal authority | Committee exists but with limited mandate | Digital transition owned by senior leadership and academic governance | Board/senate oversight, clear accountability, recurring review
Policy and regulation | Rules absent or face-to-face only | Provisional guidance in place | Approved policies for delivery, assessment, QA, and student support | Policies reviewed routinely and aligned with regulation
Infrastructure and LMS ecosystem | Standalone platform, unstable support | Basic LMS and communication tools | Interoperable systems, helpdesk, identity, library, backup | Resilient ecosystem with service standards and continuity planning
Curriculum redesign | Content upload dominates | Selected modules redesigned | Programme-level redesign with standards and templates | Continuous evidence-led redesign across programmes
Assessment integrity | Replication of invigilated exams | Mixed online methods with partial safeguards | Authentic design plus identity and misconduct procedures | AI-aware, privacy-conscious, standards-aligned assessment regime
Faculty capability | Voluntary technical training only | Structured workshops and basic support | Instructional design support, workload recognition, communities of practice | Capability embedded in CPD, promotion, and QA
Learner support and inclusion | Reactive technical help only | Orientation and limited advising | Integrated academic, technical, library, and wellbeing support | Targeted equity measures and accessible multi-channel support
QA, data, and AI governance | Fragmented reporting and weak oversight | Basic indicators and local practice | Integrated QA review, analytics rules, data responsibilities | Transparent, rights-based governance with improvement loops
Finance and sustainability | Project-funded and uncertain | Partial budgeting for key systems | Recurring budget and cost model established | Sustainable financing and periodic value review
Note. Institutions may score each domain from 1 (ad hoc) to 4 (assured). Readiness for scale is strongest when no critical domain remains at level 1.
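The rubric's scoring rule is simple enough to operationalize for institutional self-audit. The following is a minimal illustrative sketch only, not part of the published toolkit; the function name, domain labels, and summary fields are hypothetical conveniences. It applies the rule stated in the note above: each domain is rated from 1 (ad hoc) to 4 (assured), and readiness for scale is flagged only when no domain remains at level 1.

```python
# Illustrative self-audit sketch for the online-readiness rubric (Table 3).
# Hypothetical helper: domain names and field names are examples, not a
# prescribed instrument.

RUBRIC_LEVELS = {1: "Ad hoc", 2: "Emerging", 3: "Structured", 4: "Assured"}

def audit_readiness(scores):
    """Summarize a rubric self-audit.

    `scores` maps each rubric domain to a level from 1 to 4.
    Returns the mean level, the weakest domains, and whether the
    'no critical domain at level 1' condition for scale is met.
    """
    for domain, level in scores.items():
        if level not in RUBRIC_LEVELS:
            raise ValueError(f"{domain}: level must be 1-4, got {level}")
    weakest = min(scores.values())
    return {
        "mean_level": sum(scores.values()) / len(scores),
        "weakest_domains": sorted(d for d, s in scores.items() if s == weakest),
        "ready_for_scale": weakest > 1,  # no domain still rated 'ad hoc'
    }

example = {
    "Leadership and governance": 3,
    "Policy and regulation": 2,
    "Infrastructure and LMS ecosystem": 2,
    "Assessment integrity": 1,
}
result = audit_readiness(example)
print(result["ready_for_scale"])  # False: one domain remains at level 1
```

A deliberate design choice in this sketch is that the mean level is reported but never used as the readiness criterion: averaging can mask a single critical weakness, whereas the rubric's own rule treats any level-1 domain as disqualifying.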
The second instrument is a phased implementation roadmap. The roadmap distinguishes preparation, transition, consolidation, and optimization. Each phase specifies its strategic purpose, essential actions, expected outputs, and main risk if bypassed. This responds directly to a recurring weakness in digital transition literature and practice: the treatment of scaling as an act of will rather than a staged institutional process. Table 4 presents the phased institutional transition roadmap, including the strategic purpose, essential actions, and principal risk associated with bypassing each stage.
The third instrument is a policy and governance checklist for credible online or blended provision. The checklist identifies the minimum policy architecture required before scale can be claimed with confidence. These policies include digital learning strategy, academic regulations for online and AI-affected assessment, internal quality assurance procedures, data and analytics governance, accessibility and student support protocols, workload and professional development provisions, and technology procurement and continuity rules. The checklist is intentionally modest in form but significant in implication. It asks not whether an institution has online courses, but whether it has created the rule system capable of governing them credibly. Table 5 translates this third component into a minimum policy and governance checklist for credible online or blended provision.

5. Conclusion

5.1 Reasserting the central argument: The central argument of this article is that bringing a conventional university online is not primarily a question of adopting a new instructional modality. It is a question of constructing an institutional architecture capable of making digital provision credible. The review showed that such credibility depends on the deliberate coupling of governance, policy, infrastructure, curriculum redesign, assessment integrity, faculty capability, student support, quality assurance, data governance, and sustainable financing. Online and blended provision becomes trustworthy not when it is technologically possible, but when it is institutionally organized.
5.2 Answering the research question through the proposed framework: The study answered the research question by proposing the Institutional Architecture for Credible Digital Transition framework. The framework explains digital transition through five interacting layers: contextual boundary conditions, a steering layer of governance and policy, seven operational domains, phased sequencing, and outcome-oriented feedback loops. This model makes visible what fragmented literatures often obscure: that regulation, academic standards, platform design, staffing, learner inclusion, and data governance are interdependent rather than optional add-ons. In African higher education, where digital transition often unfolds under low-resource conditions and evolving quality regimes, this systems perspective is particularly important.
5.3 Theoretical, practical, and policy contributions: The article contributes theoretically by integrating institutional, sociotechnical, and capability-oriented perspectives into a single architecture for university-wide digital transition. It contributes practically by offering the University Online Readiness and Transition Toolkit, which translates the framework into a readiness rubric, a phased roadmap, and a policy checklist. For institutional leaders, these tools can support sequencing and self-audit. For regulators and quality-assurance bodies, they provide a way to evaluate online or blended provision beyond narrow platform or content indicators. For scholars, the framework offers a basis for comparative work on how universities negotiate legitimacy, inclusion, and sustainability in the digital transition.
5.4 Limitations: Several limitations should be recognized. The study is based on secondary evidence and open-access or official sources, not primary fieldwork across African universities. It is systematized and transparent, but not exhaustive in the strictest systematic-review sense. It also uses Rwanda as a policy-reference environment rather than presenting a full country case. These choices were appropriate to the framework-building objective, but they limit claims about institutional variation on the ground.
5.5 Future research directions: The most important next step is empirical validation. Future research should test the framework across different African higher education systems, examine how institutions cost and govern digital transition over time, and explore how AI policies, cross-border recognition, and student-support models reshape credibility in online and blended provision. The deeper implication, however, is already clear. Universities will not move online credibly by digitizing isolated teaching episodes. They will do so by redesigning themselves as institutions.

Funding

This research received no external funding. The study was completed through the authors’ own scholarly effort, with only routine in-kind institutional support from their affiliated institution(s).

Ethical Approval

This study was based exclusively on secondary sources, including peer-reviewed literature, institutional frameworks, and official policy and regulatory documents available in the public domain. It did not involve human participants, interviews, surveys, experiments, or access to identifiable personal data. Formal ethical approval and informed consent were therefore not required.

Data Availability

The materials supporting this study are available from publicly accessible sources cited throughout the manuscript. Supplementary materials associated with the study, including the Preparatory Alignment Memo and the University Online Readiness and Transition Toolkit, are archived in the Open Science Framework project associated with this article (DOI: https://doi.org/10.17605/OSF.IO/Z7KQP). All files are released under a Creative Commons Attribution 4.0 licence. No separate raw dataset was generated because the study is based on secondary literature, institutional frameworks, and official policy and regulatory documents. The study materials can therefore be traced through the reference list, the methodology section, and the archived supplementary files.

Conflicts of Interest

The authors declare no financial, institutional, professional, or personal relationships that could reasonably be understood as having influenced the design, synthesis, interpretation, or writing of this manuscript.

Use of AI Tools

During manuscript preparation, the authors used OpenAI’s GPT-5.4 Thinking model for limited research-support and editorial assistance, including support with source discovery, organizational structuring, phrasing refinement, and language polishing. All substantive decisions concerning argumentation, source selection, verification, interpretation, and final revision were made by the authors, who accept full responsibility for the content of the manuscript.

References

  1. Ali, R.; Georgiou, H. A process for institutional adoption and diffusion of blended learning in higher education. High. Educ. Policy 2025, 38, 523–544. [Google Scholar] [CrossRef]
  2. Anthony, B.; Kamaludin, A.; Romli, A.; Mat Raffei, A.F.; Phon, D.N.A.L.E.; Abdullah, A.; Ming, G.L. Blended learning adoption and implementation in higher education: A theoretical and systematic review. Technol. Knowl. Learn. 2022, 27, 531–578. [Google Scholar] [CrossRef]
  3. Arksey, H.; O’Malley, L. Scoping studies: Towards a methodological framework. Int. J. Soc. Res. Methodol. 2005, 8, 19–32. [Google Scholar] [CrossRef]
  4. Butler-Henderson, K.; Crawford, J. A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Comput. Educ. 2020, 159, 104024. [Google Scholar] [CrossRef]
  5. Castro Benavides, L.M.; Tamayo Arias, J.A.; Arango Serna, M.D.; Branch Bedoya, J.W.; Burgos, D. Digital transformation in higher education institutions: A systematic literature review. Sensors 2020, 20, 3291. [Google Scholar] [CrossRef] [PubMed]
  6. DiMaggio, P.J.; Powell, W.W. The iron cage revisited: Institutional isomorphism and collective rationality in organizational fields. Am. Sociol. Rev. 1983, 48, 147–160. [Google Scholar] [CrossRef]
  7. ENQA. Considerations for QA of e-Learning Provision. 2018. Available online: https://www.enqa.eu/publications/considerations-for-qa-of-e-learning-provision/.
  8. Farias-Gaytan, S.; Aguaded, I.; Ramirez-Montoya, M.-S. Digital transformation and digital literacy in the context of complexity within higher education institutions: A systematic literature review. Humanit. Soc. Sci. Commun. 2023, 10, 386. [Google Scholar] [CrossRef]
9. Fernández, A.; Gómez, B.; Binjaku, K.; Kajo Meçe, E. Digital transformation initiatives in higher education institutions: A multivocal literature review. Educ. Inf. Technol. 2023, 28, 12351–12382.
10. Gašević, D.; Tsai, Y.-S.; Drachsler, H. Learning analytics in higher education: Stakeholders, strategy and scale. Internet High. Educ. 2022, 52, 100833.
11. Gao, Y.; Wong, S.L.; Md Khambari, M.N.; Noordin, N. A bibliometric analysis of online faculty professional development in higher education. Res. Pract. Technol. Enhanc. Learn. 2022, 17, 17.
12. García-Machado, J.J.; Martínez Ávila, M.; Dospinescu, N.; Dospinescu, O. How the support that students receive during online learning influences their academic performance. Educ. Inf. Technol. 2024, 29, 20005–20029.
13. Govers, M.; van Amelsvoort, P. A theoretical essay on socio-technical systems design thinking in the era of digital transformation. Gr. Interakt. Organ. Z. Angew. Organ. 2023, 54, 27–40.
14. Graham, C.R.; Woodfield, W.; Harrison, J.B. A framework for institutional adoption and implementation of blended learning in higher education. Internet High. Educ. 2013, 18, 4–14.
15. Graham, C.R.; Woodfield, W.; Harrison, J.B. Blended learning in higher education: Institutional adoption and implementation. Comput. Educ. 2014, 75, 185–195.
16. Holden, O.L.; Norris, M.E.; Kuhlmeier, V.A. Academic integrity in online assessment: A research review. Front. Educ. 2021, 6, 639814.
17. Hong, Q.N.; Fàbregues, S.; Bartlett, G.; Boardman, F.; Cargo, M.; Dagenais, P.; Gagnon, M.-P.; Griffiths, F.; Nicolau, B.; O’Cathain, A.; et al. The Mixed Methods Appraisal Tool (MMAT) version 2018 for information professionals and researchers. Educ. Inf. 2018, 34, 285–291.
18. Jisc. Framework for Digital Transformation in Higher Education. 2023. Available online: https://www.jisc.ac.uk/guides/framework-for-digital-transformation-in-higher-education.
19. Jones, K.M.L. Learning analytics and higher education: A proposed model for establishing informed consent mechanisms to promote student privacy and autonomy. Int. J. Educ. Technol. High. Educ. 2019, 16, 24.
20. Levac, D.; Colquhoun, H.; O’Brien, K.K. Scoping studies: Advancing the methodology. Implement. Sci. 2010, 5, 69.
21. Mabidi, N. A systematic review of the transformative impact of the digital revolution on higher education in South Africa. S. Afr. J. High. Educ. 2024, 38, 97–113.
22. Martin, F.; Bolliger, D.U. Developing an online learner satisfaction framework in higher education through a systematic review of research. Int. J. Educ. Technol. High. Educ. 2022, 19, 50.
23. Newman, T.; McGill, L.; Knight, S. How to Approach Digital Transformation in Higher Education: Report and Case Studies. JISC. 2025. Available online: https://www.jisc.ac.uk/reports/how-to-approach-digital-transformation-in-higher-education.
24. Rwanda Higher Education Council. Higher Education Institutional Infrastructure and Academic Standards. 2023a. Available online: https://www.hec.gov.rw/publications/guidelines.
25. Rwanda Higher Education Council. National Learning, Teaching and Assessment Policy. 2023b. Available online: https://www.hec.gov.rw/publications/policies.
26. Rwanda Higher Education Council. National Student Support and Guidance Policy. 2023c. Available online: https://www.hec.gov.rw/publications/policies.
27. Rwanda Higher Education Council. Guidelines for Internal Quality Assurance (IQA) Mechanisms for Higher Education. 2024. Available online: https://www.hec.gov.rw/publications/guidelines.
28. Rwanda Higher Education Council. Guidelines and Assessment Tools for the Virtual Learning and the Use of Artificial Intelligence in Rwanda’s Higher Learning Institutions. 2025. Available online: https://www.hec.gov.rw/publications/guidelines.
29. Rwanda Higher Education Council. Guidelines and Assessment Tools for Distance Learning. 2026. Available online: https://www.hec.gov.rw/publications/guidelines.
30. Sanders, D.A.; Mukhari, S.S. The perceptions of lecturers about blended learning at a particular higher institution in South Africa. Educ. Inf. Technol. 2024, 29, 11517–11532.
31. Sangwa, S.; Manirakiza, R.; Mutabazi, P. Assessing students’ readiness for online and distance learning in Rwanda. Res. J. Educ. 2020, 8, 1–9.
32. Tricco, A.C.; Lillie, E.; Zarin, W.; O’Brien, K.K.; Colquhoun, H.; Levac, D.; Moher, D.; Peters, M.D.J.; Horsley, T.; Weeks, L.; et al. PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and explanation. Ann. Intern. Med. 2018, 169, 467–473.
33. Tulinayo, F.; Ssentume, P.; Najjuma, R. Digital technologies in resource constrained higher institutions of learning: A study on students’ acceptance and usability. Int. J. Educ. Technol. High. Educ. 2018, 15, 36.
34. UNESCO. Guidance for Generative AI in Education and Research. 2023. Available online: https://doi.org/10.54675/EWZM9535.
35. UNESCO. AI and Education: Protecting the Rights of Learners. 2025a. Available online: https://www.unesco.org/en/articles/ai-and-education-protecting-rights-learners.
36. UNESCO. UNESCO Survey: Two-Thirds of Higher Education Institutions Have or are Developing Guidance on AI Use. 2025b. Available online: https://www.unesco.org/en/articles/unesco-survey-two-thirds-higher-education-institutions-have-or-are-developing-guidance-ai-use.
37. Wang, L. Putting Digital Transformation at the Heart of HE Systems. UNESCO. 2023. Available online: https://www.unesco.org/en/articles/putting-digital-transformation-heart-he-systems.
38. Whittemore, R.; Knafl, K. The integrative review: Updated methodology. J. Adv. Nurs. 2005, 52, 546–553.
Figure 1. Rwanda’s emerging policy architecture for digital and virtual higher education, 2023–2026. The timeline summarizes the sequence of Higher Education Council policy and guidance documents cited in the manuscript and shows the increasingly explicit regulatory environment surrounding infrastructure, teaching and assessment, student support, internal quality assurance, virtual learning, artificial intelligence, and distance learning.
Figure 2. Institutional Architecture for Credible Digital Transition. The framework models credible digital transition as the interaction of five layers: contextual boundary conditions, steering architecture, operational architecture, temporal sequencing, and outcomes with feedback loops. The operational layer comprises seven interdependent domains: infrastructure and LMS ecosystem, curriculum redesign, assessment integrity, faculty capability, student support and inclusion, quality assurance, and data and AI governance. The outer layer comprises contextual boundary conditions. These include the regulatory environment, national quality frameworks, public financing conditions, connectivity and device realities, institutional mission, labour-market demands, and the university’s social contract. These conditions define the room for manoeuvre within which transition occurs. In African higher education they often include stronger constraints on bandwidth, affordability, and infrastructural reliability than are assumed in much Northern literature, which means that credible digital transition must be context-responsive rather than model-copying.
Figure 3. Structure of the University Online Readiness and Transition Toolkit. The toolkit translates the proposed framework into three operational instruments: a readiness rubric, a phased implementation roadmap, and a policy and governance checklist. Together, these instruments support institutional planning, self-audit, regulatory dialogue, and staged implementation. The first instrument is a readiness rubric. The rubric organizes readiness across leadership and governance, policy and regulation, infrastructure, curriculum, assessment, faculty capability, learner support, quality assurance and data governance, and financial sustainability. Each domain is assessed across four maturity positions: ad hoc, emerging, structured, and assured. The purpose is not to produce a simplistic score but to force institutions to confront uneven development across domains. A university may be technically advanced yet policy-poor, or governance-strong yet learner-support weak. The rubric is therefore diagnostic rather than celebratory. Table 3 operationalizes this readiness rubric across the principal institutional domains and four maturity levels, from ad hoc to assured.
Table 1. Search and eligibility logic.
Search Dimension | Focus | Typical Terms | Eligibility Emphasis
Sector and modality | Higher education and forms of digital provision | higher education; online learning; blended learning; distance learning; digital transformation | University-level relevance
Institutional architecture | Governance and operating model | governance; strategy; policy; regulation; accreditation; quality assurance | Institution- or system-level explanatory value
Operational domains | Capabilities required at scale | LMS; infrastructure; curriculum redesign; assessment integrity; faculty development; student support; data governance; AI governance | Transferable implications for scaled provision
Context filter | African and policy-reference relevance | Africa; African higher education; Rwanda | Contextual transferability and regulatory significance
Note. The review was systematized and framework-building; the table summarizes the analytical search logic rather than a database-specific search log.
Table 2. Institutional Architecture for Credible Digital Transition.
Layer | Core Function | Typical Institutional Expressions
Contextual boundary conditions | Define opportunities and constraints | Regulation, mission, financing environment, connectivity realities, qualifications frameworks, labour-market demands
Steering architecture | Confers legitimacy, direction, and resources | Council/senate oversight, digital strategy, policy architecture, budget model, procurement and risk governance
Operational architecture | Builds delivery capability | Infrastructure ecosystem, curriculum redesign, assessment integrity, faculty capability, learner support, QA, data and AI governance
Temporal sequencing | Orders implementation and learning | Preparation, transition, consolidation, optimization
Outcomes and feedback loops | Stabilize credibility and improvement | Quality monitoring, learner experience evidence, analytics with safeguards, financial review, policy revision
Note. The framework explains digital transition as an interaction among boundary conditions, steering arrangements, operational domains, temporal sequencing, and feedback loops.
Table 4. Phased institutional transition roadmap.
Stage | Strategic Purpose | Essential Actions | Main Risk if Bypassed
Preparation | Establish mandate and minimum conditions | Readiness audit; regulatory mapping; governance assignment; baseline policies; infrastructure minimums; programme prioritization | Launch without capability or legitimacy
Transition | Pilot and learn under controlled conditions | Curriculum redesign; staff development; learner onboarding; assessment redesign; intensified QA monitoring | Isolated pilots mistaken for a scalable model
Consolidation | Move from projects to an operating model | Policy formalization; interoperability improvements; support stabilization; budget alignment; routine quality review | Persistent fragmentation and quality drift
Optimization | Deepen improvement and innovation | Analytics with safeguards; AI governance refinement; differentiated support; external partnerships; advanced blended or online models | Sophisticated rhetoric built on unstable foundations
Table 5. Policy and governance checklist for credible online or blended provision.
Policy Area | Minimum Provision | Lead Office(s)
Digital learning strategy | Institutional purpose, scope, target modes, resourcing principles, review cycle | Senior leadership; academic affairs; planning
Academic regulations | Rules for online attendance, participation, progression, records, appeals, and equivalence | Senate; registry; academic affairs
Assessment integrity and AI use | Permissible AI use, disclosure, authorship, misconduct, identity assurance, proportional safeguards | Academic affairs; QA; legal/ethics
Internal quality assurance | Approval, monitoring, review, learner feedback, complaint handling, enhancement cycle | QA unit; faculties; senate committees
Data and learning analytics governance | Purpose limitation, access rights, retention, consent/notice, human oversight, security | ICT; data protection/legal; QA
Student support and accessibility | Orientation, advising, disability support, library access, wellbeing referral, communication standards | Student affairs; library; ICT; faculties
Staff workload and development | Workload recognition, training expectations, support roles, incentives, review | HR; academic affairs; faculties
Procurement, cybersecurity, and continuity | Vendor due diligence, interoperability, backup, incident response, business continuity | ICT; procurement; finance; legal/risk
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.