Digital Developmental Advising Systems for Engineering Students Based on ABET Student Outcomes Evaluations


Submitted: 03 June 2024
Posted: 04 June 2024


Abstract
The purpose of this research is to examine the benefits and limitations of implementing novel digital academic advising systems that use automated collection and reporting of ABET student outcomes data, grounded in the principles of authentic OBE, for effective developmental advising. We examine the digital developmental advising models of undergraduate engineering programs at two universities that employ customized features of the web-based software EvalTools® Advising Module. The underlying assessment methodology incorporates the Faculty Course Assessment Report, Performance Indicators, and hybrid rubrics classified according to the affective, cognitive, and psychomotor domains of Bloom’s learning model. A case study approach over a six-year period is adopted for this research. The two case studies present samples of developmental advising activity that employ sequential explanatory mixed methods models combining quantitative and qualitative analyses of (a) detailed student outcomes and Performance Indicators information and (b) student self-evaluations of professional development and lifelong learning skills. The findings of this study show that digital advising systems employing the Faculty Course Assessment Report with Performance Indicators and hybrid rubrics can provide comprehensive and realistic outcomes data that help both developmental advisors and students easily identify the specific causes of performance failures, implement practical recommendations for remedial action, and track improvements. Inherent strengths can also be identified in academically weak students by observing patterns or trends of relatively better-performing outcomes, reinforcing a natural affinity for learning specialized competencies and guiding students toward related, successful career paths.
Subject: Social Sciences - Education

1. Introduction

Outcome Based Education (OBE) is an educational theory that bases every component of an educational system around essential outcomes. At the conclusion of the educational experience, every student should have achieved the essential or culminating outcomes. Classes, learning activities, assessments, evaluations, feedback, and advising should all help students attain the targeted outcomes [1,2,3,4,5]. OBE models have been adopted in educational systems at many levels around the world today [5,6,7,8]. However, the tight race for rankings and accreditation has driven many institutions to pursue only the minimum requirements for fulfilling accreditation standards [9,10,11]. As a direct result, several aspects of established educational processes in many institutions may not truly reflect the paradigm and principles of authentic OBE [5,6,9,12,13,14,15,17,18,19].
Academic advising forms a fundamental aspect of OBE systems and is driven by outcomes information. Established engineering institutions have implemented advanced academic advising systems to guide students in curricular or career matters. However, an extensive web search of research literature on outcomes-based developmental advising in online databases of popular advising and engineering education journals produced no tangible information on advising systems based on skills data for individual students. Partly relevant literature that covered some form of assessment of advising either found a dearth of advising systems based on outcomes or presented samples of advising assessment that do not incorporate outcomes data collected from direct assessments [5,20,21,22,23,24,25,26,27,28,29,30,31,32]. Quality assurance agencies mention the importance of improving outcomes through academic advising but do not list assessment of individual student skills as a requirement for accreditation. This is due to the staggering amount of work and resources that manual assessment, collection, and reporting of outcomes data for each student would otherwise impose on institutions. Most academic advising today is not based on accurate and realistic outcomes data providing qualitative and quantitative analyses of every student’s skills, but rather on summative transcript scores and abstract derivations of student-advisor communications. In this research, we present a case study of two engineering campuses implementing digital developmental advising systems based on a sequential explanatory mixed methods approach for the evaluation of accurate outcomes data collected for every individual student by employing the Faculty Course Assessment Report (FCAR) embedded assessments methodology and specific Performance Indicators (PIs) with hybrid rubrics classified per Bloom’s 3 domains and learning levels.

2. Purpose of Study

The driving force behind this research is to demonstrate the benefits of applying the essential theory of the authentic OBE model to implement a holistic and comprehensive educational process that maximizes opportunities for successful student learning. The objective is to study the implementation of a state-of-the-art academic advising system that employs best assessment practices such as the FCAR, specific PIs, and hybrid rubrics using digital technology to tap the full potential and benefits of the authentic OBE model and overcome the limitations of contemporary advising mechanisms.
In particular, the researchers sought to answer the following research questions:
1. To what extent should engineering programs shift from program-centered to student-centered models that incorporate learning outcomes for the evaluation of individual student performance in addition to program evaluations for accreditation requirements?
2. To what extent can manual assessment processes collect, store, and utilize detailed outcomes data to provide effective developmental academic advising to every student on an engineering campus where several hundred students are enrolled?
3. To what extent can the assessment process be automated using digital technology so that detailed outcomes information for every student on campus can be effectively utilized for developmental advising?
4. What specific benefits can digital automated advising systems provide to developmental advisors and their students?

3. Research Framework

3.1. Methodology

This research involves a case study approach over a six-year period from 2014 to 2020 of two engineering campuses at the Islamic University (IU) and Gannon University (GU). A qualitative analysis of developmental advising based on outcomes direct assessment data was obtained using a selective literature review covering academic advising topics, with a focus on outcomes assessment, in online databases of popular advising and engineering education journals. An extensive web search of the last 20 years of research literature was conducted using the keywords ‘outcomes’, ‘assessment’, ‘engineering’ and ‘advising’ in online databases of popular advising and engineering education journals such as the National Academic Advising Association (NACADA) Journal, The Mentor, the American Society for Engineering Education’s (ASEE) Journal of Engineering Education (JEE), and Institute of Electrical and Electronics Engineers (IEEE) Transactions. Partly relevant literature that covered some form of assessment of advising either drew on national or regional studies and found a dearth of advising systems based on outcomes, or presented samples of advising assessment systems that do not incorporate outcomes data collected from direct assessments [5,20,21,22,23,24,25,26,27,28,29,30,31,32].
An in-depth description is provided of the theoretical, conceptual, and practical frameworks that helped establish authentic OBE pedagogy at the two engineering campuses of IU and GU over a period of six years, supporting the implementation of state-of-the-art digital developmental advising systems based on valid and reliable outcomes direct assessment data. Essential elements of an authentic OBE assessment methodology utilizing a digital platform, the web-based software EvalTools®, employing the FCAR, specific/generic PIs, and corresponding rubrics classified per Bloom’s 3 domains and their learning levels to ensure the quality of direct assessment outcomes data at the two campuses are discussed.
Finally, we present a sequential explanatory mixed methods approach for the examination of direct assessment outcomes data and self-evaluation information using electronic diagnostic tools that enable feedback for knowledge and skills improvement in developmental advising. The sequential explanatory mixed methods approach adopted at the Islamic University’s Civil, Electrical and Mechanical engineering programs first involves a quantitative analysis of every individual student’s skills based on SOs and PIs scores for a given term, conveniently presented in organized digital performance reports. The quantitative analysis is then followed by a qualitative semantic analysis of the language of SOs and specific PIs statements, course names, and assessment types to accurately identify the specific cause of failures or exceptional performance. Any observed trend or pattern of failures or exceptional performance for specific or related skills is easily identified using single- or multi-term outcomes data conveniently presented by electronic diagnostic tools. In some cases, if the semantic analyses are inconclusive, advisors also qualitatively analyze course instructor feedback to zero in on and verify specific details of students’ performances. Once a core student learning deficiency or strength is identified, developmental advisors can pinpoint specific curricular learning activities to either recommend practical remedial actions for improving the required student knowledge and skills or develop a suitable plan of study targeting specific career paths.
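As a rough illustration of the qualitative step only, the short Python sketch below tallies content words across a hypothetical set of failing outcome statements to hint at a recurring theme. In practice this semantic analysis is performed by the advisor reading the SO/PI statements; the sample statements (paraphrasing ABET SOs ‘h’, ‘i’ and ‘j’), the stop-word list, and the function name are illustrative assumptions rather than EvalTools® functionality.

```python
from collections import Counter

# Hypothetical failing outcome statements for one student (paraphrased ABET SOs).
FAILING_STATEMENTS = [
    "Broad education necessary to understand the impact of engineering solutions "
    "in a global, economic, environmental and societal context",
    "A recognition of the need for, and an ability to engage in, lifelong learning",
    "A knowledge of contemporary issues",
]

# Small illustrative stop-word list; a real analysis would use a fuller one.
STOPWORDS = {"a", "an", "and", "the", "of", "in", "to", "for"}

def recurring_terms(statements, top_n=5):
    """Tally content words across failing SO/PI statements to surface a shared theme."""
    counts = Counter()
    for s in statements:
        counts.update(w.strip(",.").lower() for w in s.split()
                      if w.strip(",.").lower() not in STOPWORDS)
    return counts.most_common(top_n)

print(recurring_terms(FAILING_STATEMENTS))
```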
The sequential explanatory mixed methods approach adopted by the Electrical Engineering program at Gannon University first involves a quantitative analysis of every individual student’s skill performance in a given term, followed by a qualitative analysis of survey responses for each student’s self-evaluation of lifelong learning skills. Advisors then apply rubrics to score individual students’ survey responses corresponding to the various PIs used for assessing lifelong learning skills. A customized advanced feature of EvalTools®, called ‘SOs Evaluation by Alternatives’, was developed with the help of Makteam Inc. (EvalTools® by Makteam Inc.) to aggregate the various PI results corresponding to students’ lifelong learning skills and compute the program-level SO value for a given term [33]. Finally, the single- or multi-term quantitative SO results are reviewed to estimate the attainment of program-level performance in students’ lifelong learning skills.
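As a minimal sketch of this aggregation step, the code below averages hypothetical rubric scores assigned to students’ survey responses for lifelong-learning PIs into a single program-level SO value for a term. The PI identifiers, scores, and the plain averaging rule are assumptions; the actual ‘SOs Evaluation by Alternatives’ computation in EvalTools® is not documented here.

```python
from statistics import mean

# Hypothetical advisor-assigned rubric scores (0-100) per lifelong-learning PI.
survey_pi_scores = {
    "student_001": {"PI_i_1": 82.0, "PI_i_2": 76.5, "PI_i_3": 91.0},
    "student_002": {"PI_i_1": 58.0, "PI_i_2": 64.0, "PI_i_3": 70.0},
}

def program_so_value(all_scores):
    """Aggregate rubric-scored survey PIs into one program-level SO value for the term.
    A plain unweighted mean is assumed for this sketch."""
    per_student = [mean(pis.values()) for pis in all_scores.values()]
    return mean(per_student)

print(f"Program-level lifelong learning SO: {program_so_value(survey_pi_scores):.1f}%")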
The findings of this research highlight essential elements of an authentic OBE assessment methodology that need to be incorporated into educational practice to ensure the collection of accurate student learning outcomes data for establishing effective developmental advising based on outcomes information. The study specifically highlights novel mixed methods approaches in digital developmental advising systems that systematically examine accurate student knowledge and skills information, gathered by streamlining the sequential collection and reporting of course outcomes data by instructors from direct assessments for all enrolled students using the FCAR + specific PIs embedded assessment methodology. The process flow charts provided for the qualitative and quantitative analyses of digital outcomes data give developmental advisors clear guidelines on how to conduct quick and accurate evaluations to identify patterns of failure for remedial action or patterns of strength for aligning education with successful future career paths. The process flow charts are provided in Figure 7 and Figure 13 and are explained in detail in the Results section of this paper. The case study presents samples of novel investigative models exhibited by advisors’ ingenious use of electronic reporting features to easily track the achievement and progress of student learning outcomes. The models can therefore act as viable prospects for the design and use of practical and effective technology-based pedagogical solutions for accurate evaluations and comprehensive advisor feedback to attain holistic developmental academic advising. The scope of this research, as outlined by the research questions, involves detailed analyses of empirical data related to sample cases of developmental advising at the two campuses, since each sample requires careful examination of an individual student’s single- and multi-term outcomes data and corresponding advising actions. As per authentic OBE principles, the overall impact of the developmental advising models is directly measured by evaluating the attainment of SOs based on multi-year cumulative results of SOs summary and trend analyses reports. If an engineering program achieves positive trend analysis results for a majority of the SOs, a ‘Meeting Expectations’ decision is attained for the implemented digital developmental advising models. As a summary of findings, we present a detailed qualitative comparison of digital developmental and prevalent traditional advising models using 22 pedagogical aspects in 6 broad areas of education extracted from the literature review, the theoretical, conceptual and practical frameworks, and the results of this study.
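The decision rule just described can be sketched as follows, assuming a simple least-squares slope as the trend test and hypothetical multi-year SO averages; the actual trend-analysis reports in EvalTools® may use different statistics.

```python
def slope(values):
    """Least-squares slope of yearly SO averages, with x = 0, 1, 2, ..."""
    n = len(values)
    xs = range(n)
    x_bar = sum(xs) / n
    y_bar = sum(values) / n
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

# Hypothetical multi-year cumulative SO averages (%) for a program.
so_trends = {
    "SO_1": [71.0, 74.5, 78.0],
    "SO_2": [80.0, 79.0, 82.5],
    "SO_3": [65.0, 63.5, 62.0],
}

improving = [so for so, years in so_trends.items() if slope(years) > 0]
meets_expectations = len(improving) > len(so_trends) / 2  # majority of SOs improving
print(improving, "Meeting Expectations" if meets_expectations else "Needs attention")
```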

3.2. Participants

In this sequential mixed methods research conducted from 2014 to 2020, we present samples of digital academic advising systems that employ the developmental advising model based on authentic OBE theory at the Faculty of Engineering, Islamic University (IU) and the Electrical Engineering program, Gannon University (GU). The engineering programs in the two institutions were selected for this study since programs in both institutions employ the Advising Module of EvalTools®. The research conducted at the Faculty of Engineering, IU’s Departments of Civil, Electrical and Mechanical Engineering involved 43 faculty members and 823 students from multiple cohorts of the 4-year bachelor of science programs. The study implemented developmental advising based on accurate diagnostics and mechanisms of failure analysis extracted from observations of specific trends or patterns of deficient ABET SOs/PIs performances to enhance overall teaching effectiveness. Two random samples of developmental advising with varying academic performance were considered for the study at the Faculty of Engineering, IU. One sample consisted of a student with an above average academic performance (3.5 < GPA < 5.0) and the other was an underperforming student with a GPA less than 3.0. The research study conducted at the Electrical Engineering bachelor of science program, Gannon University involved 8 faculty members and 272 students from multiple cohorts. The EvalTools® Advising Module implemented at the Electrical Engineering program, GU involved the use of an additional feature called ‘SOs Evaluation by Alternatives’. This research implemented self-evaluation mechanisms in advising systems that empower students with the capability of measuring their own performance in program-level learning outcomes using digital reporting features, corroborated by advisor inputs, to enhance lifelong learning skills. For this case, one random student advising sample with an above average academic performance (3.5 < GPA < 5.0) was considered.

3.3. Developmental Advising and Its Assessment - A Qualitative Review

3.3.1. Developmental Advising

Since several approaches to advising are practiced worldwide, each with its own purposes and goals, assessment of current advising systems or programs is a complicated affair. There are two popular approaches: traditional and developmental advising. The traditional or prescriptive approach is highly structured, with the advisor assuming an authoritative role and controlling the amount of information given and the way it is presented. Jeschke, Johnson, and Williams (2001) described traditional, or prescriptive, advising as a “quick and efficient” method in which the advisor explains the sequence of courses that the advisee should take and makes sure the student understands the course registration process [34]. As cited by Kadar (2001), Raushi defined developmental advising as “a process that enhances student growth by providing information and an orientation that views students through a human development framework [35].” Gordon (2019) explained that the developmental approach to advising, which focuses on the individual student's concerns, needs, and aspirations, is accepted as an ideal by many writers and practitioners in the field of advising [26]. Developmental advisors consider the advisee as a student who is maturing throughout the educational career. The developmental advisor, while assisting the student to choose appropriate course plans, also attempts to address the needs of the transitioning student by using student development theory and providing the required information about the academic environment. The advisor is therefore required to evaluate the student’s current developmental stage and use this information to work with the student and concerned instructors to design an appropriate plan of study. This approach can significantly enhance the effectiveness of the teaching process.
According to Appleby (2002, 2008), “Well-delivered developmental advising helps students understand why they are required to take certain classes, why they should take their classes in a certain sequence . . . what knowledge and skills they can develop in each of their classes . . . and the connection between student learning outcomes of their department’s curriculum and the knowledge and skills they will be required to demonstrate in graduate school and/or their future careers” [21,22].
Banta, Hansen, Black, and Jackson (2002) summarize the two main schools of thought on the proper administration of student advising: prescriptive advising, in which the most important aspect is ensuring that students register for the correct courses, and the developmental approach, in which knowledge, skills, the academic environment, and other aspects of students’ lives must be considered [36]. These two approaches have different goals and may require different approaches to assessment.
According to Campbell & Nutt (2008), assessment of advising based on a learning-centered paradigm that focuses on outcomes must be used to understand whether student learning outcomes have been achieved [25]. Campbell (2005a, 2005b) clearly states that advising programs and administrators need systematically gathered and specific outcomes assessment data for achieving academic improvement [23,24]. Since our focus in this research is on the OBE model of education, we will concentrate on the evaluation of learning outcomes in developmental advising for the enhancement of student knowledge, skills, and effectiveness of teaching.

3.3.2. National Academic Advising Association and ABET Standards

The National Academic Advising Association (NACADA, 2023) guidelines for academic advising also state that each institution must develop its own set of student learning outcomes and the methods to assess them [37]. NACADA states that student learning outcomes for academic advising are “an articulation of the knowledge and skills expected of students as well as the values they should appreciate as a result of their involvement in the academic advising experience.” These learning outcomes answer the question, “What do we want students to learn as a result of participating in academic advising?” Assessment of student learning should be an integral part of every advising program [37].
ABET criterion 1 for accreditation specifically states: “Student performance must be evaluated. Student progress must be monitored to foster success in attaining Student Outcomes (SOs), thereby enabling graduates to attain program educational objectives. Students must be advised regarding curriculum and career matters” (ABET, 2023) [12]. Individual student skills data or results would therefore be both a fundamental requirement and a pivotal basis for the entire academic advising process to begin and continue successfully. In fact, the ongoing and continual assessment of individual student skills would be the litmus test of a successful academic advising process.

3.3.3. Assessing Advising

Unfortunately, even though the importance of student performance is touted far and wide throughout academia, student learning outcomes information is rarely implemented and evaluated in academic advising systems. An extensive web search of the last 20 years of research literature, using the keywords ‘outcomes’, ‘assessment’, ‘engineering’ and ‘advising’ in online databases of popular advising and engineering education journals such as the National Academic Advising Association (NACADA) Journal, The Mentor, the American Society for Engineering Education’s (ASEE) Journal of Engineering Education (JEE), and Institute of Electrical and Electronics Engineers (IEEE) Transactions on Education, produced no tangible information on advising systems based on skills data for individual students. Partly relevant literature that covered some form of assessment of advising either drew on national or regional studies and found a dearth of advising systems based on outcomes, or presented samples of advising assessment systems that do not incorporate outcomes data collected from direct assessments.
In a scholarly work on advising assessment, Lynch (2000) observed that one might expect academic advising to be evaluated with somewhat the same regularity and thoroughness as classroom instruction; such is not the case [30]. In its fifth national survey of academic advising based on eleven criteria [27], American College Testing (ACT) found that the evaluation of advising programs and academic advisors received the ninth and tenth lowest effectiveness ratings. Lynch (2000) concluded that designing comprehensive assessments for advising is a complex affair because advising systems can differ based on various theoretical models and are also implemented at various levels of complexity [30]. Swing (2001), in his work for the Policy Center on the First Year of College, noted that only 63% of academic advising programs are regularly evaluated [32]. Aiken-Wisniewski, Smith, and Troxel (2010) suggested that the literature lacks evidence of advisor access to information related to program assessment and evaluation, specifically citing a need for advising units to design curricula with intentionality, i.e., curricula planned for the assessment of SOs [20].
Recently, Powers, Carlstrom, and Hughey (2014) stated that best practices in academic advising assessment involve the identification of student learning outcomes, the development and use of multiple measures of student learning, and sound professional judgment to understand the information gathered and to improve student learning [31]. In their exhaustive national study, 499 individuals were invited from US NACADA regions (NACADA, 2019) and data were collected from 291 people, a 58% response rate. Of this number, 230 (46% of the invited participants) provided complete data. The highest percentage of participants by institution type came from public and private, nonprofit, doctoral degree–granting institutions (37.8%, n = 87). Of the participants, 53.0% (n = 122) reported job responsibilities associated with institution-wide undergraduate advising. Collected demographic data indicated that most held the title of advising director/coordinator (45.7%, n = 105), and 21.7% (n = 50) said they work as an academic advisor. Assistant/associate dean described 9.6% (n = 22) of the respondents, while 5.2% (n = 12) identified themselves as dean. The fewest self-reported being a faculty advisor (1.7%, n = 4). Of the participants, 87% (n = 200) indicated having some direct advising responsibilities, with 32.6% (n = 75) representing situations exclusive to professional advisors and 20.0% (n = 46) from situations in which only faculty advisors were employed [31]. Reporting on this national survey of academic advising assessment, the researchers noted that assessment results often come from minimal, narrow, and inconsistent evaluation practices, frequently based on student satisfaction surveys. To generate a better picture of the current state of assessment, they surveyed those conducting or deemed responsible for academic advising assessment [31]. Although 80% of survey participants identified academic advising student learning outcomes, only one half assessed the achievement of those outcomes, with most using student surveys [31].
It is evident from the research literature that most advising systems use student surveys and do not use actual knowledge and skills information collected from direct assessments to verify the progress of academic advising. He and Hutson (2017) suggested that advisors need to incorporate direct assessment into advising to demonstrate value to the institution and to contribute to the scholarship of advising [28]. A recent research study by Kraft-Terry and Cheri (2019) attempts to fill the gap in advising integrated with assessment of SOs [29]. Unfortunately, they defined seven generic SOs targeting student knowledge of university admission standards and graduation policies, the development of academic plans, and the identification and utilization of appropriate campus resources to achieve academic success. They state the importance of student-centered advising systems but refer to GPA as the benchmark for identifying academic success. In summary, their work emphasizes the importance of integrating SOs into the curriculum with a mechanism called backward design, but the SOs they propose were not related to curricular content and therefore could not be integrated into the direct assessments that constitute curriculum delivery. Their work did not provide a detailed mechanism or institutional resource to identify performance failures related to SOs data aligned with curricular assessments [29]. At best, most attempts to integrate SOs assessment and evaluation with academic advising did not apply the essential principles of OBE, overlooking the fact that SOs should align tightly with teaching and curriculum. This oversight is often deliberate, given the lack of information in the literature on digital advising systems that are based on authentic OBE methodology and target attainment of SOs fully integrated and aligned with the delivery and assessment of curricular course content.
Accreditation assessment models exacerbate the situation by suggesting manual processes for data collection of SOs information using generic and vague performance criteria and rubrics, which are based on the selection of small samples of students in select courses [12,13,17,18,38,39,40,41,42]. It is generally observed, as in the Gloria Rogers (GR) model [5,12,43,44,45], that most program evaluation models do not incorporate comprehensive and accurate assessment of all students using specific performance criteria and corresponding rubrics. Contrary to OBE systems, the GR program evaluation model is program-centered, collects ‘relevant’ pieces of information supposedly sufficient for evaluating a program, and does not implement comprehensive assessment of all students’ performance. This renders the collected SOs information inaccurate and insufficient to evaluate individual student performance for academic advising [5,43,44,45]. OBE-based developmental academic advising systems should employ student learning outcomes information collected sequentially for all enrolled students using direct assessments in various phases of the educational process to evaluate the progression of advising. To resolve this dilemma, engineering institutions, programs and quality assurance agencies should promote comprehensive assessment models that employ specific performance criteria and corresponding rubrics to implement authentic OBE using web-based digital technology [5,13,38,39,40,41,42,43,44,45,46].
In this research, we examine digital advising models that are based on authentic OBE frameworks and employ the FCAR embedded assessments methodology and specific PIs classified per Bloom’s 3 domains and learning levels to track student knowledge and skills for effective developmental advising. We elaborate on several theoretical, conceptual and practical frameworks that have been established over a 6-year period with intensive team efforts to support holistic pedagogy and multi-dimensional benefits for curriculum design and delivery, strategies of teaching and learning, and assessment and evaluation (Hussain et al., 2020) [47]. The frameworks will show how the digital developmental advising models presented in this research specifically address major deficiencies of prevalent advising by utilizing pedagogical solutions that: (a) support automated collection and reporting of valid and reliable outcomes data for every individual enrolled student; (b) collect accurate outcomes data using specific PIs and hybrid rubrics that are accurately aligned with intended course topics and their learning activities; (c) provide high-precision qualification of student attainment of holistic learning by assessing specific PIs classified per Bloom’s 3 domains and their learning levels; (d) enable novel mixed methods approaches for the quick and accurate evaluation of student failure and/or strength based on detailed objective assessment data for achieving effective developmental advising; and (e) enable students to easily access detailed multi-term outcomes data, reinforce remediation efforts through close collaboration and follow-up with advisors, and use outcomes-based self-evaluations to enhance their metacognition and lifelong learning skills.

4. Theoretical, Conceptual and Practical Frameworks

The philosophy, paradigm, premises and principles of authentic OBE form the basis for theoretical frameworks that lead to the development of crucial models which act as the foundation of the Integrated Quality Management Systems implemented at the Faculty of Engineering. Several essential concepts are then induced from OBE theory, assessment best practices and ABET criterion 4 (CR4) on continuous improvement [12]. Essential techniques and methods based on this conceptual framework are then constructed as a practical framework of automation tools, modules and digital features of the state-of-the-art web-based software EvalTools® [47].

4.1. Theoretical Framework

4.1.1. OBE Model

Educational institutions following the OBE model should ensure that all learning activities, assessments, evaluations, feedback, and advising help students attain the targeted outcomes. International and regional Quality Assurance (QA) agencies and academic advising organizations strongly recommend that educational institutions implement academic advising based on learning outcomes. However, many engineering programs’ advising systems follow the traditional or prescriptive approach based on summative transcript scores and do not utilize individual students’ learning outcomes information for enhancing knowledge and skills. Most developmental advising systems implement a scaled-down, limited model that does not employ the Accreditation Board for Engineering and Technology (ABET) learning outcomes and detailed performance indicators information for every enrolled student [12]. To better understand the scope of this research and the limitations of current advising systems for outcomes-based approaches, we begin with a brief introduction to some essential elements of OBE developed by the High Success Network [3,4,6,48].
The keys to having an outcomes-based system are:
  • Developing a clear set of learning outcomes around which all of the educational system’s components can be focused; and
  • Establishing the conditions and opportunities within the educational system that enable and encourage all students to achieve those essential outcomes.
OBE’s two key purposes that reflect its “Success for all students and staff” philosophy are:
  • Ensuring that all students are equipped with the knowledge, competence, and qualities needed to be successful after they exit the educational system; and
  • Structuring and operating schools so that those outcomes can be achieved and maximized for all students.
In this research, we specifically concentrate on two major aspects advocated by an authentic OBE model:
  • All components of the education system including academic advising should be based on, achieve and maximize a clear and detailed set of learning outcomes for each student; and
  • All students should be provided with detailed real-time and historic records of their performances based on learning outcomes for making informed decisions for improvement actions.
Therefore, all components of educational systems that implement an OBE model should focus on aiding all students to successfully attain the targeted outcomes for achieving the intended learning aimed by the curriculum.

4.2. Conceptual Framework

4.2.1. FCAR + Specific/Generic Performance Indicators Assessment Model

Figure 1 shows a comprehensive Continual Quality Improvement (CQI) process flow for an FCAR + specific/generic Performance Indicators (PIs) model classified per Bloom’s 3 domains and a 3-Levels Skills Grouping Methodology [35,51,53,54] adopted by the institutions implementing the developmental advising model in this research. ABET criteria for CQI [12] have been implemented in the assessment model, which requires course faculty and academic advisors to make decisions using assessment data collected from students and other program constituencies, ensuring a comprehensive CQI process. This requires the development of quantitative/qualitative measures to ensure that students have satisfied the Course Outcomes (COs), which are measured using a set of specific or generic PIs/assessments, and consequently the program-level ABET SOs [12,15,40,41,44,47]. Course faculty are directly involved in the teaching and learning process, using detailed outcomes results to interact closely with advisors and providing students with on-time feedback for performance improvement. On the other hand, models that involve assessment teams not directly involved with the students will not support real-time comprehensive CQI, which is an essential element of an authentic OBE system [1,3,4,5,48,49,50]. Such CQI processes do not involve on-time course faculty and advisor interactions based on real-time, relevant and detailed outcomes information for improving performance failures, and they severely limit comprehensive quality improvement efforts. An ideal CQI cycle would therefore include the course faculty in most levels of its process to generate and execute action items that can directly target real-time improvement in student performance for ongoing courses. The noteworthy aspect of this model is that course faculty work closely with academic advisors and are involved directly with students in most CQI processes, whether at the course or program level.
A “design down” [3,5,6,48,49,50] mapping model was developed as shown in Figure 2 exhibiting authentic OBE design-down flow from goals, Program Educational Objectives (PEOs), SOs, course objectives, COs to PIs. This figure illustrates trends in levels of breadth, depth, specificity and details of technical language related to the development and measurement of the various components of a typical OBE ‘design down’ process [3,5,6,48,49,50]. Goals and objectives are futuristic in tense and use generic language for broad application. The term ‘w/o’ (without) in the figure highlights essential characteristics of goals and objectives. Goals and objectives do not contain operational action verbs, field specific nominal subject content, or performance scales. Student and course outcomes do not contain performance scales. Performance scales should be implemented with the required descriptors in rubrics [62,63]. The FCAR + PIs model uses the Excellent, Adequate, Minimal and Unsatisfactory (EAMU) performance levels in rubrics.
The reliability and validity of outcomes assessment are ensured using an elaborate set of generic and specific PIs and their corresponding hybrid rubrics [44]. The hybrid rubric is a combination of the holistic and analytic rubrics, developed to address issues of validity (precision and accuracy of assessment alignment with outcomes and PIs) and of inter- and intra-rater reliability (the specificity of acceptable student performances) when dealing with the assessment of complex and very specialized engineering activities. The hybrid rubric is an analytic rubric embedded with a holistic rubric to cater to the assessment of several descriptors that represent all the required major steps of the specific student learning activity for each PI/dimension listed [44].
The research reported in [44] provides procedures for developing and implementing hybrid rubrics for the accurate assessment of PIs related to each CO. These rubrics developed by groups of course specialists in each program are stored in a digital database and provide both faculty and students clear and accurate details of expected performances in various student learning activities based on the ‘high expectations’ principle of authentic OBE [3,5,6,47,48,49]. Figure 3 shows a sample portion of the database listing the hybrid rubrics, EAMU scales, descriptors and percentages of score allocations for Electrical Engineering program’s PIs 55, 56 and 57 associated with ABET SO ‘e’ (SO_5): ‘An ability to identify, formulate and solve engineering problems.’ Performance criteria as defined by instructors in the descriptors for the various scales of the hybrid rubrics override the general performance criteria shown in Table 1 to provide academic freedom in assessment.
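For illustration, a hybrid-rubric entry can be pictured as a small record holding a PI statement together with EAMU scale descriptors and score bands, as in the sketch below. The field names, the PI statement, and the descriptors are hypothetical; only the 90/75/60 thresholds mirror the generic EAMU criteria used elsewhere in the assessment model, and the actual EvalTools® database schema is not public.

```python
# Illustrative structure for one hybrid-rubric entry (not the EvalTools schema).
hybrid_rubric_pi_55 = {
    "so": "SO_5 (ABET 'e')",
    "pi": "PI_5_55",  # hypothetical identifier
    "statement": ("Identify and formulate an engineering problem from a "
                  "real-world description"),
    # EAMU scales with topic-specific descriptors and score bands.
    "scales": {
        "Excellent": {
            "min_score": 90,
            "descriptor": "All required steps of the formulation are complete and justified",
        },
        "Adequate": {
            "min_score": 75,
            "descriptor": "Formulation complete with minor omissions in justification",
        },
        "Minimal": {
            "min_score": 60,
            "descriptor": "Formulation attempted but key assumptions are missing",
        },
        "Unsatisfactory": {
            "min_score": 0,
            "descriptor": "Problem not correctly identified or formulated",
        },
    },
}

def eamu_scale(rubric, score):
    """Return the EAMU scale whose minimum score the given score meets."""
    for name, band in sorted(rubric["scales"].items(),
                             key=lambda kv: kv[1]["min_score"], reverse=True):
        if score >= band["min_score"]:
            return name

print(eamu_scale(hybrid_rubric_pi_55, 78))  # -> Adequate
```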

4.3. Practical Framework – Digital Platform EVALTOOLS ®

EvalTools® (Information on EvalTools®) was chosen as the platform for outcomes assessment over Blackboard® (http://www.blackboard.com) [64] since it is the only tool that employs the Faculty Course Assessment Report (FCAR) and EAMU performance vector methodology [43,45,46,53,54,55,56,57,58]. This embedded assessments methodology employing specific PIs facilitates the effective use of routine course assessments for outcomes measurement to achieve a high level of automation of the data collection process. The EvalTools® FCAR module provides summative/formative options and consists of the following components: course description; COs indirect assessment; grade distribution; course reflections; old and new action items; COs direct assessment; PIs assessment; SOs assessment; assignments list; and learning domains and skills levels assessment distribution [45,46,56,57,58]. The FCAR uses a performance vector conceptually based on a performance assessment scoring rubric developed by Miller and Olds (1999) [59]. Course instructors collect PIs data from a set of course assignments, which is presented in the form of an “EAMU performance vector” categorizing aggregate student performance. The EAMU performance vector counts the number of students that passed the course whose proficiency for that outcome was rated Excellent, Adequate, Minimal, or Unsatisfactory, as defined by: Excellent: scores >= 90%; Adequate: scores >= 75% and < 90%; Minimal: scores >= 60% and < 75%; and Unsatisfactory: scores < 60%. The EAMU performance vector constitutes a direct measure of aggregate student performance that neatly encapsulates information into categories which can then be quickly reviewed for indicators of non-standard performance. In addition to the performance vector, the instructor reports details regarding the assignments used for acquiring the data along with any relevant observations. Heuristic rules and indicator levels for the EAMU performance vectors have been explained in research work related to the FCAR [45,46,56,57,58]. To support the study of this methodology in actual course examples, the scales, indicator levels for the EAMU, and heuristic rules for the performance vector are listed in Table 1 below. As mentioned earlier, the descriptors for the EAMU scales shown in Table 1 are generic and applied to all PIs unless instructors opt to apply the topic-specific descriptors of hybrid rubrics for assessing certain PIs of interest.
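A minimal sketch of the EAMU categorization follows, using only the thresholds quoted above; the sample class scores are hypothetical and the heuristic rules for flagging non-standard vectors are not reproduced here.

```python
from collections import Counter

def eamu_category(score):
    """Map a percentage score to its EAMU band using the thresholds quoted above."""
    if score >= 90:
        return "Excellent"
    if score >= 75:
        return "Adequate"
    if score >= 60:
        return "Minimal"
    return "Unsatisfactory"

def eamu_vector(scores):
    """Count how many students fall in each EAMU band for one outcome or PI."""
    counts = Counter(eamu_category(s) for s in scores)
    return [counts[c] for c in ("Excellent", "Adequate", "Minimal", "Unsatisfactory")]

# Hypothetical per-student scores for one CO in an 11-student class.
print(eamu_vector([95, 88, 91, 73, 66, 59, 84, 77, 92, 68, 80]))  # -> [3, 4, 3, 1]
```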
In Figure 4, we see the performance vector for a mechanical engineering course, THERMODYNAMICS 1, showing the performances of 11 students for several Course Outcomes (COs). In this clipped portion of the entire table generated by EvalTools®, we see COs 1, 2 and 3 assessed for all 11 students in the class using multiple assessments. Aggregation of different types of assessments aligned to a specific learning outcome at the course level is achieved using a scientific weighted averaging scheme. This scheme gives priority to certain types of assessments over others based on their coverage of learning domains, their percentage of the course grading scale, and the maturity of student learning at the time of assessment. Hussain, Mak & Addas (2016) provided details of this weighted averaging approach at the 123rd annual conference and exposition of the American Society for Engineering Education (ASEE), Columbus, Ohio, in 2016 [43]. CO1, ‘Explain fundamental concepts of thermodynamics and Analyze systems that use pressure measurement devices,’ is assessed for every student in the class using relevant multiple assignments such as homework 1 (HW_1), quiz 1 (QZ_1) and midterm-1 question 1 (Mid Term-1 Q-1), which are aligned to specific performance indicators and aggregated together using this scientific weighted averaging scheme [43]. The performance vector provides details of each student’s performance in multiple assessments aligned to performance indicators that correspond to all the COs in the course. EvalTools®, employing the FCAR assessment model, facilitates electronic storage of the outcomes and assessment information for each student collected from several courses in every term. The FCARs from each course are further processed into a Performance Vector Table (PVT) for each SO (Information on EvalTools®) [33].
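The aggregation of several assessments into a per-student CO score can be sketched as a weighted average, as below. The weight values and assignment names are placeholders for this sketch; the actual EvalTools® scheme derives its weights from assessment type, learning-domain coverage, grading share, and the timing of the assessment [43].

```python
# Placeholder weights for assessments aligned to CO1 (illustrative only).
ASSESSMENT_WEIGHTS = {"HW_1": 1.0, "QZ_1": 1.5, "MidTerm-1_Q-1": 2.0}

def co_score(student_scores, weights=ASSESSMENT_WEIGHTS):
    """Weighted average of one student's assessment scores aligned to a single CO."""
    total_w = sum(weights[a] for a in student_scores)
    return sum(score * weights[a] for a, score in student_scores.items()) / total_w

# Hypothetical per-student scores (%) on the CO1-aligned assignments.
print(round(co_score({"HW_1": 85, "QZ_1": 70, "MidTerm-1_Q-1": 62}), 1))  # -> 69.8
```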
Figure 5 shows the PVT with information on all assessed PIs for ABET SO ‘h’ (SO8): ‘Broad education necessary to understand the impact of engineering solutions in a global, economic, environmental and societal context.’ The PIs assessed for each student corresponding to a specific SO are then averaged with weights based on a 3-Levels Skills Grouping methodology [5,43,53,54,55]. This aggregation methodology ensures that PIs information corresponding to various skills levels, collected from multiple course levels for a specific SO, is averaged with weights that consider both the level of the course (Mastery, Reinforced or Introductory) and the level of the skill (Elementary, Intermediate or Advanced). This gives the highest precedence to an advanced skill measured in a mastery-level course. The EvalTools® FCAR methodology deployed for the assessment of student learning outcomes has facilitated the effective integration of outcomes data in advising systems to help individual students fulfill the expected knowledge and skills requirements for their plan of study. The methodology implemented by EvalTools® supports the developmental advising process based on learning outcomes information. A YouTube video also presents some details of the features of this module and how individual student skills data are collected using specific PIs and course assessments and then integrated by faculty into academic advising [51]. A digital database of essential and accurate outcomes information for all students provided by the EvalTools® Advising Module effectively supports developmental advising based on the principles of authentic OBE.
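A simplified sketch of the 3-Levels Skills Grouping idea follows: each PI score is weighted by the level of the course it was measured in and the level of the skill, so an advanced skill in a mastery-level course carries the most weight. The numeric weights and sample records are assumptions, not the published EvalTools® values.

```python
# Illustrative weights; the real 3-Levels Skills Grouping weights are not reproduced here.
COURSE_LEVEL_W = {"Introductory": 1.0, "Reinforced": 2.0, "Mastery": 3.0}
SKILL_LEVEL_W = {"Elementary": 1.0, "Intermediate": 2.0, "Advanced": 3.0}

def so_score(pi_records):
    """Aggregate one student's PI scores for an SO, weighting each PI by the
    level of the course it was measured in and the level of the skill."""
    weighted = [(r["score"],
                 COURSE_LEVEL_W[r["course_level"]] * SKILL_LEVEL_W[r["skill_level"]])
                for r in pi_records]
    return sum(score * w for score, w in weighted) / sum(w for _, w in weighted)

pi_records = [  # hypothetical PI measurements contributing to one SO
    {"score": 72.0, "course_level": "Introductory", "skill_level": "Elementary"},
    {"score": 65.0, "course_level": "Reinforced", "skill_level": "Intermediate"},
    {"score": 58.0, "course_level": "Mastery", "skill_level": "Advanced"},
]
print(round(so_score(pi_records), 1))  # -> 61.0, dominated by the mastery/advanced PI
```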

4.4. Practical Framework – Summary of Digital Technology and Assessment Methodology

In summary, several essential elements were implemented by the institutions involved in this research to ensure the outcomes data collected for every student represents realistic and accurate information for academic advising:
  • Measurement of outcomes information in all course levels of a program curriculum: introductory (100/200 level course), reinforced (300 level course) and mastery (400 level course). Engineering fundamentals and concepts are introduced in 100/200 level courses; then they are reinforced in 300 level courses through application and analysis problems; and finally in 400 level courses students attain mastery in skills with activity such as synthesis and evaluation [43,53,54,55].
  • The Faculty Course Assessment Report (FCAR) utilizing the EAMU (Excellent, Adequate, Minimal, Unsatisfactory) performance vector methodology [45,56,57,58].
  • Well-defined performance criteria for course and program levels.
  • A digital database of specific PIs [43,44,47,53,54,55] and their hybrid rubrics classified as per Bloom’s revised 3 domains of learning and their associated levels (according to the 3-Level Skills Grouping Methodology).
  • Unique Assessment mapping to one specific PI [43,44,53,54,55].
  • Scientific Constructive Alignment for designing assessments to obtain realistic outcomes data representing information for one specific PI per assessment [1,2,4,5,43,47,53,54,55,60,61].
  • Integration of direct, indirect, formative, and summative outcomes assessments for course and program evaluations.
  • Calculation of program and course level ABET SOs, COs data based upon weights assigned to type of assessments, PIs and course levels [43,53].
  • Program as well as student performance evaluations considering their respective measured ABET SOs and associated PIs as a relevant indicator scheme.
  • 6 comprehensive Plan Do Check Act (PDCA) quality cycles to ensure quality standards, monitoring, and control of education process, instruction, assessment, evaluation, CQI, and data collection and reporting [47].
  • Customized web-based software EvalTools® facilitating all of the above (Information on EvalTools®) [33].

5. Student Evaluations

Transcript grades are composite performance results derived from an aggregation of several hundred specific student learning activities in any given discipline. It is impossible to extract student performance relating to specific curricular content, skill levels, or learning domains from composite transcript grades. As shown in another YouTube video presentation [52], digital transcripts with a detailed list of specific student performances corresponding to ABET SOs would provide an excellent source of information for academic advising, career counseling, and recruitment. Digital transcripts would help academic advisors easily identify deficient skills in students with excellent GPAs or patterns of high-performing activity in academically weak students. In both cases, focused advising in specific areas would significantly help students identify their weaknesses or strengths for appropriate on-time corrective action or career path selection. The EvalTools® Advising Module’s diagnostic tools are used by developmental advisors for a qualitative and quantitative review of detailed reports of single- and multi-term ABET SOs, PIs and assessments information. In this study, the researchers establish the attainment of effective developmental advising by employing a sequential mixed methods approach involving a combination of quantitative and qualitative analyses of advising information such as individual students’ SOs, PIs, assessments, instructor feedback and students’ self-evaluation data. Quantitative analyses involve identifying failing performances for SOs/PIs by observing red and yellow flags, the corresponding scores, and any patterns or trends of serial failures or exceptional performance. Qualitative analyses of specific performances for SOs/PIs include semantic analysis of the language of failing SOs/PIs statements, the types of assessments, instructor feedback and/or responses from students’ self-evaluations of their lifelong learning skills. Figure 6 provides a detailed process flow for the mixed methods approach adopted by the Faculty of Engineering programs for implementing effective developmental advising. The objective of the evaluation is to identify patterns or trends of failure or exceptional performance so that developmental advisors can target either the development of remedial action or career path plans, respectively. As mentioned earlier, authentic OBE advising models contribute to providing holistic education for students by either helping them achieve mastery by identifying and overcoming deficiencies or promoting exceptional talent by identifying and developing inherent skills for successful career paths. Therefore, locating patterns of exceptional performance helps advisors identify core learning strengths, and recognizing patterns of deficient performance helps them identify core student learning deficiencies.
For the benefit of effective training in the field of developmental advising, we consider two types of students based on varying academic performance, one with an above average GPA (greater than 3.5 on a 5.0 scale) and another with a below average GPA (less than 3.0 on a 5.0 scale). The approach for failure analysis involves three levels of evaluation depending upon the nature and complexity of student performances. Level 1 is straightforward, involving quantitative and qualitative analyses of SOs data. Level 3 is more complex and involves mixed methods evaluation of SOs, PIs, assessments and FCAR instructor reflections. All three levels involve locating patterns or trends of failure or exceptional performance for identifying a core learning deficiency or strength, subsequently leading to the development of precision plans for remedial action or career path selection:
  • Level 1: Quantitative review of single or multi-term SOs data followed by a qualitative semantic analysis of language of SOs statements coupled with qualitative review of curriculum and course delivery information.
  • Level 2: Quantitative review of single or multi-term SOs and PIs data followed by a qualitative semantic analysis of language of SOs and PIs statements, course titles and assessment types coupled with qualitative review of curriculum and course delivery information.
  • Level 3: Quantitative review of single or multi-term SOs and PIs data followed by a qualitative semantic analysis of language of SOs and PIs statements, course titles and assessment types coupled with qualitative review of curriculum, course delivery and FCAR instructor reflections information.
After identifying the core learning deficiencies, advisors qualitatively review the curriculum and course delivery information to map the learning deficiency to key learning activities that students can target in specific phases of the curriculum or course delivery for precision remedial action and subsequent improvement. If core learning strengths are identified, experienced advisors use this information to recommend a focused plan of study for suitable career paths aligned with the student’s inherent skills.
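A minimal sketch of the quantitative screening step is shown below, assuming that red flags mark SO averages below 60% and that yellow flags mark the 60-75% band (an assumption based on the EAMU ‘Minimal’ range); the term labels and scores are hypothetical. The recurring failures it returns are what the advisor then examines semantically in the qualitative step.

```python
RED, YELLOW = 60.0, 75.0  # red: below 60%; yellow band 60-75% is an assumption

def flag(score):
    """Return 'red', 'yellow', or None for a single SO average."""
    return "red" if score < RED else "yellow" if score < YELLOW else None

def recurring_failures(multi_term_sos, min_terms=2):
    """Return SOs flagged in at least `min_terms` terms, i.e. the quantitative
    pattern that the advisor follows up with semantic analysis."""
    hits = {}
    for term, sos in multi_term_sos.items():
        for so, score in sos.items():
            if flag(score):
                hits.setdefault(so, []).append((term, score, flag(score)))
    return {so: terms for so, terms in hits.items() if len(terms) >= min_terms}

# Hypothetical three-term SO averages (%) for one student.
data = {
    "351": {"SO_h": 55.0, "SO_i": 58.5, "SO_j": 52.0, "SO_a": 81.0},
    "352": {"SO_h": 59.0, "SO_i": 66.0, "SO_j": 57.5, "SO_a": 84.0},
    "361": {"SO_h": 62.0, "SO_i": 61.0, "SO_j": 59.0, "SO_a": 86.0},
}
print(recurring_failures(data))  # SO_h, SO_i and SO_j recur across terms
```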

5.1. Automated ABET SOs Data for Every Enrolled Student

Specific performance indicators information collected for every enrolled student using digital technology and appropriate methodology such as the FCAR PVT, if scientifically applied to academic advising and adequately popularized in academia and industry, would revolutionize how institutions can provide learning improvement and career opportunities. A much broader spectrum of learning improvement and career opportunities can then be generated for all students with any range of academic performance. To substantiate this statement, let us look at an Electrical Engineering (EE) student evaluation in Figure 7.
A consolidated view of ABET SOs information calculated from PIs measurements is shown for 3 consecutive terms. The student’s SOs data is realistic and corresponds closely with actual student performance since the 10 essential elements of the assessment model [43,53,54,55] have been implemented to ensure that the outcomes data is as accurate as possible.

5.2. Quantitative and Qualitative Analyses of Each Student’s ABET SOs Data

As discussed earlier, academically high-performing students with above average transcript grades may have failing or underperforming skills. The EvalTools® Advising Module provides detailed skills information, as shown in Figure 8, for such a case of an above average student in the EE program. A logical, structured, and deductive sequential mixed methods approach for failure analysis provides quick and accurate results due to the utilization of elaborate web-based quantitative analytical tools and scientific diagnostics based on outcomes information that seamlessly streamline the feedback and performance improvement processes [44,53]. Developmental advisors first review quantitative single- or multi-term SOs data for a student of interest. Identification of failures is a straightforward process which involves locating red flags in the SOs results. The red flags indicate EAMU-based aggregate averages below 60%, as mentioned in the performance criteria listed in Figure 3. Once advisors identify single or multiple SOs failures, they are required to qualitatively review the semantics of the failing outcomes statements to deduce any patterns of failure that are based on a common and core deficiency in student learning. For the case shown in Figure 8, the red flags clearly highlight a pattern of failures related to ABET SOs ‘h,’ ‘i,’ and ‘j,’ corresponding to the study of the impact of engineering solutions, lifelong learning, and contemporary issues, respectively. Based on a qualitative semantic analysis of the language of the SOs statements, the pattern of failure observed in this case points to a core deficiency in developing good comprehension of issues related to contemporary engineering solutions. The last step, dealing with the development of remedial actions, involves mapping the core deficiency to student learning activity in a specific phase of the curriculum. Obviously, developmental advisors require comprehensive knowledge of the curriculum and the mechanism of course delivery to complete this step and create accurate remedial actions for targeted improvement. As mentioned earlier, if needed, advisors can also view course instructor reflections on failures by using an advanced feature of the Advising Module called FCAR Activation Mode. Returning to the case under discussion, developmental advisors can accurately target certain skills for improving the deduced core student learning deficiency by prescribing specific student learning activities that are based on self-motivated research coupled with an ability to elaborate and compare the benefits and limitations of contemporary engineering solutions based on their impact on societal, environmental and economic aspects. To implement effective CQI for achieving holistic learning, it is necessary for advising systems to provide easy access to detailed outcomes information for every individual student and to represent data in a convenient format that enables quick identification of failures. Such advising systems will promote early identification of areas of weakness in performance for otherwise “successful” students to better prepare them for the challenges of leading career roles.

5.3. Quantitative and Qualitative Analyses of Each Student’s PIs and Assessment Data

Let us proceed further and review the EvalTools® Advising Module’s color-coded representation of PIs in the comprehensive evaluation of each SO for an individual student. Figure 8 shows, for an individual EE student, a detailed list of PIs, assessments, weighting factors and course information utilized for the multi-term quantitative evaluation of ABET SO ‘a’ (SO_1): ‘An ability to apply the knowledge of mathematics, science and engineering.’ It also indicates some aspects of the assessment model which directly contribute to the high level of accuracy required for the aggregation of SOs and PIs data for academic advising. The outcomes information is computed using weighting factors based on the 3-Levels Skills Methodology for the scientific aggregation of multiple skills measured using various types of assessments and multiple raters in several courses over a period of multiple terms [43,53,54,55]. Advisors can use the comprehensive SO evaluations represented in a scientific color-coded format to easily identify patterns or trends in failures related to specific types of skills. Through a detailed examination of an academically weak student’s performance, it is also possible to identify certain areas of strength in learning which are due to the student’s natural affinity for and interest in certain topics of the curriculum.
In Figure 9, the quantitative multi-term evaluation of ABET SO ‘a’ (SO_1) for a typical underperforming EE student shows certain areas of comparatively better learning, highlighted in green. We observe that PI_1_12: “Employ basic electrical power formulations and quantities, such as complex vectors, delta star transformation, network flow matrices (network topology and incidence matrices) and symmetrical components,” PI_1_41: “Convert a given number from one system to an equivalent number in another system,” and PI_1_45: “Explain basic semiconductors theory concepts such as applied electrical field, junction capacitance, drift/diffusion currents, semiconductor conductivity, doping, electron, hole concentrations, N-type, P-type semiconductors,” show better performance, in stark contrast to the majority of the other PIs measured over the two terms 351 and 352. Based on a qualitative semantic analysis of the language of the PI statements and the types of course assessments, one significant observation is that these three PIs measure elementary math skills and engineering concepts and cover relatively easier topics such as Boolean algebra. The other, failing PIs, which deal with topics such as the operating principles of various electronic devices and components, the application of Gauss’s Law, Maxwell’s equations, etc., require several advanced engineering concepts coupled with a basic understanding of differential and integral calculus.
Upon further analysis of the FCAR instructor reflections, it was confirmed that many students exhibited only a minimal understanding of differential and integral calculus. The core learning deficiency for this advising case was therefore identified as poor understanding and application of differential and integral calculus. A deficiency in fundamental knowledge such as this would require students to refresh their calculus basics by electing additional math coursework or tutorial sessions. The failing PIs information also strongly suggests that students initiated learning with the required level of interest but, at later stages of the course, needed other mechanisms of course delivery, such as active learning, to retain focus and interest. On the other hand, developmental advisors can use performance information related to the student’s core learning strength in digital electronics and the application of Boolean algebra to circuit problems to suggest concentrating on specific learning activities in core and elective courses, such as Microprocessors and Digital System Design, to enhance career prospects as a digital design or test engineer. Student advising based on such a mixed methods approach helps faculty identify potential areas of strength or weakness in student performance by observing patterns of relatively high or low scores for certain ABET SOs and their corresponding PIs.
Specific and generic PIs corresponding to the Engineering Accreditation Commission (EAC) ABET SOs of a program, targeting a variety of skill sets and measured with specific assessments in multiple courses for each student, form the main source of diagnostic information on which effective failure analyses depend. Remediation based on the identification of strong or deficient performance is a vast and complex topic that deserves a dedicated research article elaborating the detailed steps academic advisors implement for CQI. In general, advisors trained in the degree plan, course, and PI requirements propose to their advisees specific areas of requisite knowledge, learning strategies, and relevant course activities for improvement. Advisors also communicate with concerned faculty members to provide performance information on students and to highlight course content for concentration and preferable teaching strategies. The EvalTools® Advising Module maintains a digital repository of notes containing advisor and student meeting information. Advisors electronically report specific information related to deficiencies in outcomes and suggestions for improvement for each student, and concerned faculty and program administrators regularly access the recorded digital advising information for follow-up and implementation of student-specific remediation efforts. Advisors also provide effective career counseling by aligning students’ intended career paths with specializations that match their top-performing skills. Institutions or programs that do not employ appropriate digital technology and assessment methodology to implement automation and the principles of authentic OBE have no option for outcomes-based advising and must rely on traditional mechanisms based on transcript grades. A minimal sketch of the kind of advising record such a repository could hold is given below.
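The following minimal sketch illustrates, under stated assumptions, what a digital advising note of the kind described above might look like as a data record. The class and field names are hypothetical and do not represent the EvalTools® schema.

# Hypothetical record structure for an advisor's electronic note (not the EvalTools® schema).
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AdvisingNote:
    student_id: str
    term: str
    deficient_outcomes: list          # e.g. ["SO_8"]
    observations: str                 # advisor's qualitative findings
    remedial_actions: list            # suggested activities, courses, or tutoring
    viewed_by_student: bool = False   # student acknowledgment that the note was read
    created_on: date = field(default_factory=date.today)

# Example record for a student with a consistent SO_8 weakness:
note = AdvisingNote(
    student_id="EE-0001",             # hypothetical identifier
    term="371",
    deficient_outcomes=["SO_8"],
    observations="Consistent weakness in evaluating the impact of engineering solutions.",
    remedial_actions=["Targeted learning activity in the capstone design courses"],
)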

5.4. An Outcomes-Based Advising Example

The initial format for academic advising based on outcomes data at the Faculty of Engineering was implemented in the spring term of 2017. The first iteration of the outcomes advising format required advisor input on the advisee’s consolidated summary of results for the 11 SOs. A sample of summarized 11-SO evaluation data over three terms for a typical EE student is shown in Figure 7. The summary of SOs results was categorized in the notes to the advisees as Excellent, Adequate, Minimal, or Unsatisfactory, according to the performance criteria presented in Figure 4. The advisor would then focus specifically on the SOs marked as Unsatisfactory to provide guidance on areas of improvement and corrective actions. Figure 10 below shows a typical outcomes-based advising sample for an EE student with consistent failure in SO_8, which relates to understanding the impact of engineering solutions on economic, societal, and environmental aspects. In this case, the advisor identified learning activity in the capstone courses to target improvement actions specific to SO_8. Students are required to periodically review the notes documented by the advisor for improvement actions; the green check mark in the top left-hand corner of Figure 10 indicates that the advisee viewed the advisor’s electronic notes. Based on positive feedback received from advisees regarding performance enhancement due to the initial format of outcomes-based advising, the Faculty of Engineering continued to employ this advising format to improve student performance through the close of the academic year 2019. The second planned iteration would expand the advising format to cover a review of performance indicators and assessment information, producing specific guidance for advisees that focuses on course areas and involves the concerned faculty members.

5.5. Students as Active Participants

Student empowerment is an integral component of achieving successful learning in authentic OBE systems. The Electrical Engineering program at Gannon University used state-of-the-art digital reporting features provided by EvalTools® to implement student self-evaluation processes related to the ABET EAC student learning outcomes. These student self-reviews are then corroborated by academic advisor input to further guide advisees toward effective approaches to self-motivated improvement for successful learning. As a case study, the Electrical Engineering program carefully selected soft skills mandated by the student outcomes that are not easily measurable with assignments given in course activities, such as lifelong learning, and factored them into the advising process. By doing so, academic advisors attempted to achieve both goals of empowering students as active participants in their learning and measuring the soft skills critical to student outcomes. For SO9, recognition of the need for, and an ability to engage in, lifelong learning, the three corresponding PIs that the Electrical Engineering program targeted are:
(1) PI_9_1: demonstrate self-managing ability to articulate the student’s own learning goals
(2) PI_9_2: demonstrate self-monitoring ability to assess the student’s own achievements; and
(3) PI_9_3: demonstrate self-modifying ability to make mid-course corrections.

5.5.1. Process for Measuring Soft Skills in Student Advising Activities

For each of the PIs to be assessed, Figure 11 shows the questionnaires and instructions that help students address the relevant issues. The questionnaires are based on the principles of metacognition: they guide students to examine their own learning progress and achievements toward the intended performance skills and student outcomes. The advisees first identify areas of strength and weakness in specific skills and knowledge areas. They are then required to track their performance and check whether they are meeting the required standards of the program’s student outcomes.
Finally, the advisees develop a plan of remedial action for overall improvement of performance related to the deficient outcomes. Students are asked to address each of the questionnaires before advising day and submit their responses electronically to EvalTools® for review by their advisors.

5.6. Quantitative and Qualitative Analyses of Student Responses and Overall SOs Results

Figure 12 provides a detailed process flow for the mixed methods approach adopted by the Electrical Engineering program at GU for implementing effective developmental advising. The objective of this approach is to achieve both goals of empowering students as active participants in their learning and measuring soft skills such as lifelong learning, corresponding to EAC ABET SO ‘i’ or SO ‘9’ [12]. Advisees are required to provide responses every term to questionnaires that target the lifelong learning skills measured by ABET SO ‘9’. As mentioned in the earlier section, advisees make a qualitative self-evaluation based on the previous term’s SOs results and submit responses addressing various lifelong learning aspects such as self-management, self-monitoring, and self-modification.
Advisors then qualitatively evaluate student responses to PI_9_1, PI_9_2, and PI_9_3 for validity and provide supplemental feedback to corroborate consistent student observations or to recommend refinements of remedial actions for improvement. Developmental advisors also use scoring rubrics to assess student responses for PI_9_1, PI_9_2, and PI_9_3 and categorize them as E, A, M, or U performances. The SO ‘9’ results for the complete set of cohorts are then aggregated for a given term to qualitatively evaluate the overall program-level attainment of lifelong learning skills. Finally, developmental advisors quantitatively review detailed multi-term ABET SOs (a-k) data and provide comments and feedback to advisees regarding crucial observations and remedial actions for improvement.
Figure 13 illustrates a sample of a student’s overall attainment measured against all the SOs from fall 2016 to fall 2017. The questionnaires were developed in the spring of 2017, and students were required to submit their responses from the fall of 2017 onwards. Based on the EAMU performance criteria shown in Figure 4, the red or ‘U’: Unsatisfactory flags (aggregate SO values < 60%) and the yellow or ‘M’: Minimal flags (aggregate SO values between 60% and 75%) indicate areas of concern. The advisor qualitatively examined this student’s responses regarding his results for spring 2017 and fall 2017. In this case, the student provided comments related to self-evaluation of his overall performance in the program SOs, identified areas of strength and weakness, and suggested possible remedial actions for improvement. In addition, the three key aspects of the questionnaire, monitor, manage, and modify, help students significantly focus on and improve the SO9 lifelong learning skills.
Figure 14 shows sample responses from this student. The student noted areas of concern for spring 2017 in four SOs: SO3, SO4, SO7, and SO9; and in SO4 and SO11 for his fall 2017 entries. He identified and articulated areas for improvement and suggested visiting the tutoring center to improve his math skills. For his spring 2018 entries, he noted his performance in these four areas and identified other knowledge and/or skills for improvement. On advising day, the faculty advisor electronically documented the necessary observations after reviewing the student’s submission. Since the SO9 lifelong learning skills are based on how students manage, monitor, and modify their own learning progress, advisors are required first to qualitatively assess the student’s self-assessment, then to make the student aware of specific areas of improvement, especially where the student’s self-evaluation is consistent with the advisor’s observations, and finally to make general comments regarding the overall status of SOs attainment as measured by direct quantitative assessment of student performance.
In this developmental advising case, Figure 13 shows an overall improvement in the “concerned” SOs from fall 2016 through fall 2017, except for the obvious red flag for SO4, which is related to teamwork. At first glance, the red flag appears to indicate student failure in SO4. However, upon further investigation, based on detailed diagnostics using the drill-down menu shown in Figure 15, which lists the key assignments contributing to the final SO performance, the advisor quickly identified the specific cause of the failing SO4 score. This student received a “0” score for not submitting a specific key assignment in the ECE327 Senior Design course. Therefore, the SO4 final aggregate value resulted in a low score of 43.50%, below 60%, and was indicated by a red flag. This was not based on a ‘real’ performance failure. Not turning in the assignment was also noted by the student in his own self-assessment.
Since this advisee’s other key assignments pertaining to the measurement of SO4 had acceptable scores, his developmental advisor concluded that the student had achieved the required performance levels for the SO4 skill. Besides the failing ‘U’: Unsatisfactory red flag specific to SO4 in Figure 13, the yellow flags indicate SOs with ‘M’: Minimal performances, with aggregate values ranging from 60% to 75%. However, upon close observation, the final-year (fall 2017) results showed an overall improvement across all SOs. Figure 16 illustrates the observations and recommendations reported by this student’s developmental advisor in the digital advising records for both the spring 2017 and fall 2017 terms. Based on a comprehensive evaluation of overall student performance, the developmental advisor therefore concluded that this student adequately met the required performance standards for the concerned SOs. The diagnostic reasoning for the SO4 case is sketched below.
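The sketch below illustrates the kind of drill-down check described for the SO4 case. The assignment names and scores are made up, chosen only to mirror the 43.50% aggregate discussed above, and the function is an assumption rather than part of the EvalTools® drill-down interface.

# Minimal sketch (hypothetical data, not EvalTools® code): separate "real" performance
# failures from artifacts such as a single unsubmitted key assignment scored as zero.

def diagnose_so(key_assignments):
    """key_assignments: list of dicts with 'name' and 'score' (percent) for one SO."""
    scores = [a["score"] for a in key_assignments]
    missing = [a["name"] for a in key_assignments if a["score"] == 0]
    submitted = [s for s in scores if s > 0]
    aggregate_all = sum(scores) / len(scores)
    # Recompute over submitted work only, mirroring the advisor's reasoning that the
    # remaining key assignments had acceptable scores.
    aggregate_submitted = sum(submitted) / len(submitted) if submitted else 0.0
    return {
        "aggregate": round(aggregate_all, 2),
        "missing": missing,
        "real_failure": aggregate_submitted < 60.0,
    }

# Hypothetical key assignments for SO4 in a capstone course:
so4 = [
    {"name": "Team design report", "score": 87.0},
    {"name": "Peer evaluation", "score": 0.0},   # not submitted, scored as zero
]
print(diagnose_so(so4))
# {'aggregate': 43.5, 'missing': ['Peer evaluation'], 'real_failure': False}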
On advising day, developmental advisors not only deliver a comprehensive evaluation of student progress/performance related to the program’s student outcomes, but also directly assess the student’s lifelong learning skills as reported by SO9. This advising process repeats each semester so that both advisors and advisees can monitor performances for the program SOs, specifically those related to SO9 for lifelong learning skills. To reiterate, one of the key components of OBE is to establish the conditions and opportunities within the educational system that enable and encourage all students to achieve those essential outcomes. In this case, students are given multiple opportunities to achieve the essential outcomes throughout the curriculum in different courses and are also made aware of their progress in meeting them.

5.7. Added Advantage for Evaluating Advising at the Program Level

In addition to providing feedback on student responses for the SO9 performance skills, faculty advisors also directly measure each student’s attainment of the PIs for SO9 by rating them with a score-based EAMU (Excellent, Adequate, Minimal, Unsatisfactory) rubric performance vector, as shown in Figure 17.
The individual SO9 results are then automatically rolled up, as shown in Figure 18, indicating advising effectiveness for a given term and the achievement of SO9. Let us examine ABET_PI_9_1 in Figure 18. Forty-four students were assessed for PI_9_1: 19 were rated E, 20 A, 4 M, and 1 U.
Hence, the U category has a proportion of 2.27%, and the overall average is 3.83 out of a rating factor of 5, interpreted as an attainment of 76.6% out of 100% (this roll-up arithmetic is sketched at the end of this subsection). The corresponding PI_9_1 is therefore color-coded with no flag (a “white” flag) to indicate that attainment is met; refer to Figure 4 for the detailed classification of the EAMU performance criteria. These roll-up data are reviewed along with other SOs data for program evaluation. Although program evaluation is not the focus of this paper, the direct involvement of students in self-evaluation as part of their advising activities has provided a constructive means to gauge the effectiveness of advising at the program level as well. The trend observed in our advising systems shows an overall enhancement of student learning in achieving the desired student outcomes. We believe this is due to improved metacognition, especially once students are aware of the specific causes of their failures and align their remedial actions with their advisors’ recommendations targeting specific knowledge and skill areas for improvement. In general, our experience with advising systems that directly involve advisees in self-evaluation of their attainment of SOs has been profoundly beneficial, resulting in significant enhancement of students’ lifelong learning skills.
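The roll-up arithmetic for PI_9_1 can be reproduced with the minimal sketch below. The function name and structure are assumptions, the flagging thresholds follow the EAMU criteria cited in this paper, and the per-student rubric scoring that yields the average rating of 3.83 is not reproduced here.

# Minimal sketch (not EvalTools® code): program-level roll-up of one PI.

def rollup(counts, average_rating, rating_scale=5.0):
    """counts: dict of EAMU category -> number of students; average_rating: mean rubric score."""
    total = sum(counts.values())
    u_proportion = 100.0 * counts.get("U", 0) / total
    attainment = 100.0 * average_rating / rating_scale
    # Flag per the EAMU performance criteria cited in this paper.
    if attainment < 60.0:
        flag = "red"       # Unsatisfactory
    elif attainment < 75.0:
        flag = "yellow"    # Minimal
    else:
        flag = "white"     # attainment met, no flag
    return {"students": total, "U_percent": round(u_proportion, 2),
            "attainment_percent": round(attainment, 1), "flag": flag}

# PI_9_1 example from Figure 18: 19 E, 20 A, 4 M, and 1 U, with an average rating of 3.83/5.
print(rollup({"E": 19, "A": 20, "M": 4, "U": 1}, 3.83))
# {'students': 44, 'U_percent': 2.27, 'attainment_percent': 76.6, 'flag': 'white'}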

5.8. Quality Standards of Digital Developmental Advising Systems

OBE models advocate student-centered impact evaluations to qualify education systems; these primarily include the monitoring of overall improvement in SOs performance over time [5,6,47]. The multi-term ABET SOs (a-k) summary and detailed trend analysis reports (2014–18) for the engineering programs at both institutions indicated positive trends for the majority of the SOs, thereby providing objective evidence to substantiate the attainment of holistic student learning resulting directly from a comprehensive education process that successfully integrates curriculum delivery and developmental advising systems. From the OBE perspective, the positive trends in multi-term SOs results indicate sustainable systems that promote the successful collaboration of both students and staff to achieve CQI [5,6,47]. The multi-term SOs results for the programs at both institutions are lengthy reports that were also reviewed by an External Advisory Committee or Industrial Advisory Board, as an institutional and accreditation requirement, for the approval of major components of education delivery, CQI processes, and subsequent improvements; they are adequately reported in recent publications [47].
Additionally, the engineering programs at the GU and IU campuses attained six full years of ABET accreditation in 2018 and 2020, respectively, with exceptional results, fulfilling the nine mandatory ABET EAC criteria, with auditors reporting several program strengths and no documented weakness or concern [47]. The digital academic systems of the programs at GU and IU were qualified through the fulfillment of ABET Criterion 1 on Students, which covers accredited programs’ student enrollment, training, academic advising, and graduation details. Positive and credible internal and external reviews of, and feedback on, the digital advising systems implemented at both institutions confirm compliance with international quality standards by comprehensively including student and staff perspectives related to the sustainability of the advising systems, their attainment of academic goals, and overall quality improvement.

5.9. Qualitative Comparison of Digital Developmental Advising with Prevalent Traditional Advising

Most engineering programs use vague and generic outcomes language that does not follow a consistent format based on authentic OBE frameworks. Usually, generic and holistic rubrics without detailed topic-specific descriptors are applied to assess the program outcomes. Manual assessment models do not assess all students but rather use sampling methods to fulfill minimal accreditation requirements. Generally, advisors do not have access to outcomes assessment or evaluation information. The quality of outcomes direct assessment data and their availability for each individual student are therefore two key factors that drive the initiative for developmental advising based on outcomes. The literature review of this study clearly exhibits a dearth of advising systems based on valid and reliable outcomes data collected from direct assessments. Digital developmental advising models employing the FCAR + specific PIs methodology offer solutions to the major deficiencies observed in prevalent advising. Table 5 summarizes several important pedagogical aspects extracted from the literature review, frameworks, and results of this study, used as key quality criteria for qualitatively comparing the benefits of digital developmental advising over prevalent traditional advising systems. The 22 pedagogical aspects act as overarching multi-dimensional quality standards in six broad areas of education: authentic OBE and conceptual frameworks, assessment practices, quality of outcomes data, staff, students, and process.

6. Discussion

The purpose of this study was to present a state-of-the-art academic advising system that employs best assessment practices and digital technology to tap the maximum potential and benefits of the authentic OBE model and overcome the limitations of contemporary advising mechanisms.

6.1. Research Question 1: To What Extent Should Engineering Programs Shift From Program to Student Centered Models That Incorporate Learning Outcomes for Evaluation of Individual Student Performances Besides Program Evaluations for Accreditation Requirements?

Based on the literature review of this research, engineering programs following the OBE model should implement its student-centered approach and provide all students with accurate and detailed outcomes evaluation information to achieve the two major aspects advocated by an authentic OBE model, as referenced in Section IV.A of this paper.

6.2. Research Question 2: To What Extent Can Manual Assessment Processes Collect, Store And Utilize Detailed Outcomes Data For Providing Effective Developmental Academic Advising To Every Student In An Engineering Campus Where Several Hundred Are Enrolled?

As per the numerous citations in Section 3 and Section 4 of this paper, it is practically impossible for manual assessment processes to collect, store, and utilize the staggering amounts of outcomes data required for effective advising of several hundred students on any engineering campus.

6.3. Research Question 3: To What Extent Can The Assessment Process be Automated Using Digital Technology so that Detailed Outcomes Information for Every Student on Campus Can be Effectively Utilized for Developmental Advising?

After conducting an exhaustive study of the research material related to assessment and evaluation referenced in Section 3 and Section 4, the authors of this paper have concluded that digital systems implementing an assessment methodology such as FCAR + specific/generic PIs with embedded assessments and the PVT can collect, store, and utilize detailed and accurate outcomes information for developmental advising regardless of the size of the enrolled student population.

6.4. Research Question 4: What Specific Benefits Can Digital Automated Advising Systems Provide to Developmental Advisors and their Students?

The advantages of digital advising systems with guided student self-evaluation in meeting outcomes are summarized below:
Detailed and accurate digital advising records for advisors and advisees showing trends and summaries of the results of performances related to SOs, PIs and their corresponding assessments.
Advisors can employ mixed methods approaches to achieve a consistent and structured mechanism for assessing student performance in meeting outcomes and can focus more on specific, relevant, and constructive advice for quality improvement.
Students’ metacognitive skills are boosted with accurate indications of their strong/weak skills and/or knowledge areas to support corrective actions in meeting student outcomes each semester.
The student’s self-directed remedial actions can align with the developmental advisor’s recommendations to reinforce overall performance improvement.

7. Limitations

Popular LMS tools such as Blackboard®, Moodle®, etc. do not offer embedded assessments technology or the FCAR with PIs classified per Bloom’s Taxonomy. The EvalTools® Advising Module is therefore an option for schools interested in implementing automated advising systems based on outcomes. Such advising modules cannot operate independently and have to integrate with outcomes assessment systems. The efficacy of an outcomes-based Advising Module depends on the accurate alignment of course assessments with specific learning outcomes and PIs information. Advising based on any form of unreliable or inaccurate outcomes data would be counterproductive, if not damaging. Engineering programs cannot rely solely on minimal accreditation standards to ensure the quality of their outcomes assessment processes. Therefore, an apparent limitation of implementing digital advising systems such as the EvalTools® Advising Module is that several measures, such as the ten essential elements for establishing quality in assessment and evaluation processes mentioned in Section IV.B of this paper, would have to be mandated by schools to ensure accurate and reliable outcomes data for advising. Lastly, since EvalTools® maintains student data in a Google cloud-based environment, schools requiring local storage of advising and outcomes information would have to provide additional resources, technical support, and the required technology to manage student data on their own local servers.

8. Conclusion

The demand for higher education is ever increasing, with student achievement and accountability posing the biggest challenges to improving the quality of higher education. To meet these challenges, an OBE model for student learning, along with several quality standards in higher education, has been adopted by accreditation agencies and educational institutions over the past two decades. With thousands of institutions and programs in a tight race for rank and accreditation, the prevalent understanding and implementation of authentic OBE and CQI need clarification. Referring to the paradigm, purpose, premises, and principles of authentic OBE, every component of an educational process must be based on achieving essential outcomes. Academic advising is a core component of the educational process. It helps students properly align with degree and curriculum requirements and provides them with the necessary guidance for achieving outcomes, graduation, and successful career prospects. Therefore, in the OBE model, advising should be based on and driven by outcomes information. NACADA has also clearly stated the importance of student outcomes in defining and implementing academic advising [37]. Academic advising is a major criterion for the fulfillment of regional and international accreditation standards.
With our vast experience in teaching and accreditation, we have yet to come across academic advising systems that are based on accurate and detailed outcomes, PIs, and assessment information collected for every individual student. Despite NACADA’s stated emphasis on student learning outcomes for academic advising [37], several institutions, both within the US and abroad, that have adopted manual assessment models have been unable to provide advising systems based on accurate and detailed outcomes, PIs, and assessment information for each student. Education systems that rely on deficient manual assessment processes produce misinformed decisions for students due to delays in accessibility or the lack of accurate and detailed learning outcomes information. Consequently, wrong choices of field of study or professional career path can lead to a wide spectrum of academic or career-related failures.
This research presents an in-depth description of the theoretical, conceptual, and practical frameworks that helped establish authentic OBE pedagogy at the two engineering campuses of IU and GU. The pedagogical models support the implementation of state-of-the-art digital developmental advising systems based on valid and reliable outcomes direct assessment data. The accuracy of the direct assessment outcomes data is ensured by applying the essential elements of an authentic OBE assessment methodology on a digital platform, the web-based software EvalTools®, employing the FCAR, specific/generic PIs, and corresponding rubrics classified per Bloom’s 3 domains and their learning levels. The EvalTools® digital platform, coupled with the FCAR + specific PIs methodology, streamlines pedagogical processes for the effective collection and evaluation of detailed outcomes and assessment information for every student in a higher education institution with thousands of enrolled students. The findings of this study indicate that digital developmental advising models based on authentic OBE frameworks specifically address the major deficiencies of prevalent advising, since they utilize pedagogical solutions that: (a) support automated collection and reporting of valid and reliable outcomes data for every individual enrolled student; (b) collect accurate outcomes data using specific PIs and hybrid rubrics accurately aligned with intended course topics and their learning activity; (c) provide high-precision qualification of student attainment of holistic learning by assessing specific PIs classified per Bloom’s 3 domains and their learning levels; (d) enable novel mixed methods approaches for the quick and accurate evaluation of student failures and/or strengths based on detailed objective assessment data for achieving effective developmental advising; and (e) enable students to easily access detailed multi-term outcomes data, reinforce remediation efforts through close collaboration and follow-up with advisors, and use outcomes-based self-evaluations to enhance their metacognition and lifelong learning skills.
The novel mixed methods investigative models, using analytical tools that facilitate comprehensive diagnostics as shown in several examples in this paper, enable the accurate and early identification of learning deficiencies for prompt remediation. Likewise, the early recognition of strong skills in specific engineering activities, through the observation of distinct patterns in diagnostic reports of student performance, can be followed by precise academic guidance to build knowledge and skills in associated areas for the attainment of holistic expertise. This approach results in the timely, precise developmental advising students need to make informed decisions when selecting relevant areas of specialization in education, research and training, or industry, helping them evolve into outstanding performers in their respective fields who employ the highest standards to better shape the future of the world we live in today.

Author Contributions

Conceptualization, Wajid Hussain, William G. Spady and Mak Fong; methodology, Wajid Hussain, William G. Spady and Mak Fong; software, Wajid Hussain and Mak Fong; validation, Wajid Hussain and Mak Fong; formal analysis, Wajid Hussain, William G. Spady and Mak Fong; investigation, Wajid Hussain and Mak Fong; resources, Wajid Hussain and Mak Fong; data curation, Wajid Hussain and Mak Fong; writing—original draft preparation, Wajid Hussain and Mak Fong; writing—review and editing, Wajid Hussain, William G. Spady and Mak Fong; supervision, William G. Spady; project administration, Wajid Hussain and Mak Fong. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

There is no publicly available data.

Acknowledgments

This research is based on the results of a rigorous 5-year program for the implementation of a comprehensive Outcomes Based Education model involving the curriculum, teaching, learning, advising and other academic and quality assurance processes for the CE, ME and EE engineering departments at the Faculty of Engineering, Islamic University in Madinah. The program efforts were directly led by the corresponding author and three of the co-authors who at the time were the Director of the Quality and Accreditation Office at the Faculty of Engineering, Islamic University and ABET coordinators for the CE, ME and EE Departments, respectively. The authors thank the faculty members for their co-operation and support in completing the necessary quality assurance and academic teaching processes that enabled the collection of the necessary results.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Killen, R. (2007). Teaching strategies for outcome based education (second edn.). Cape Town, South Africa: Juta, & Co.
  2. Moon, J. (2000). Linking levels, learning outcomes and assessment criteria. Bologna Process.–European Higher Education Area. http://aic.lv/ace/ace_disk/Bologna/Bol_semin/Edinburgh/J_Moon_backgrP.
  3. Spady, W. (1994a). Choosing outcomes of significance. Educational Leadership, 51(5), 18–23.
  4. Spady, W. (1994b). Outcome-based education: Critical issues and answers. Arlington, VA: American Association of School Administrators.
  5. Spady, W., Hussain, W., Largo, J., & Uy, F. (2018, February). Beyond Outcomes Accreditation. Rex Publishers, Manila, Philippines. https://www.rexestore.com/home/1880-beyond-outcomes-accredidationpaper-bound.html.
  6. Spady, W. (2020). Outcome-Based Education's Empowering Essence. Mason Works Press, Boulder, Colorado. http://williamspady.com/index.php/products/.
  7. Harden, R. M. (2002). Developments in outcome-based education. Medical Teacher, 24(2), 117–120. [CrossRef]
  8. Harden, R. M. (2007). Outcome-based education: The future is today. Medical Teacher, 29(7), 625–629. [CrossRef]
  9. Adelman, C. (2015). To imagine a verb: The language and syntax of learning outcomes statements. National Institute of Learning Outcomes Assessment (NILOA). http://learningoutcomesassessment.org/documents/Occasional_Paper_24.pdf.
  10. Provezis, S.
  11. Gannon-Slater, N. , Ikenberry, S., Jankowski, N., & Kuh, G. (2014). Institutional assessment practices across accreditation regions. Urbana, IL, National Institute of Learning Outcomes Assessment (NILOA). www.learningoutcomeassessment.org/documents/Accreditation%20report.pdf.
  12. Accreditation Board of Engineering & Technology (ABET), USA 2023, accreditation criteria, www.abet.org http://www.abet.org/accreditation/accreditation-criteria/.
  13. Dew, S. K., Lavoie, M., & Snelgrove, A. (2011, June). An engineering accreditation management system. Proceedings of the Canadian Engineering Education Association. Paper presented at the 2nd Conference of the Canadian Engineering Education Association, St. John’s, Newfoundland, Canada. [CrossRef]
  14. Essa, E. , Dittrich, A., Dascalu, S., & Harris, F. C., Jr. (2010). ACAT: A web-based software tool to facilitate course assessment for ABET accreditation. Department of Computer Science and Engineering University of Nevada. http://www.cse.unr.edu/~fredh/papers/conf/092-aawbsttfcafaa/paper.pdf.
  15. Kalaani, Y. , Haddad, R. J. (2014). Continuous improvement in the assessment process of engineering programs. Proceedings of the 2014 ASEE South East Section Conference. 30 March. American Society for Engineering Education.
  16. International Engineering Alliance (IEA), Washington Accord signatories (2023). https://www.ieagreements.org/accords/washington/signatories/.
  17. Middle States Commission of Higher Education (2023). Standards for accreditation, PA, USA. https://www.msche.org/.
  18. Mohammad, A. W. , & Zaharim, A. (2012). Programme outcomes assessment models in engineering faculties. Asian Social Science, 8(16). [CrossRef]
  19. Wergin, J. F. (2005). Higher education: Waking up to the importance of accreditation. Change, 37(3), 35–41.
  20. Aiken-Wisniewski, S. A., Smith, J. S., & Troxel, W. G. (2010). Expanding Research in Academic Advising: Methodological Strategies to Engage Advisors in Research. NACADA Journal, 30(1), 4–13. [CrossRef]
  21. Appleby, D. C. (2002). The teaching-advising connection. In S. F. Davis & W. Buskist (Eds.), The teaching of psychology: Essays in honor of Wilbert J. McKeachie and Charles L. Braver. Mahwah, NJ: Lawrence Erlbaum Associates.
  22. Appleby, D. C. (2008). Advising as teaching and learning. In V. N. Gordon, W. R. Habley & T. J. Grites (Eds.), Academic Advising: A comprehensive handbook (second edn.) (pp. 85–102). San Francisco, CA: Jossey-Bass.
  23. Campbell, S. (2005a). Why do assessment of academic advising? Part I. Academic Advising Today, 28(3), 1, 8.
  24. Campbell, S. (2005b). Why do assessment of academic advising? Part II. Academic Advising Today, 28(4), 13–14.
  25. Campbell, S. M. , & Nutt, C. L. (2008). Academic advising in the new global century: Supporting student engagement and learning outcomes achievement. Peer Review, 10(1), 4–7.
  26. Gordon, V. N. (2019). Developmental Advising: The Elusive Ideal. NACADA Journal, 39(2), 72–76. [CrossRef]
  27. Habley, W. R., & Morales, R. H. (1998). Advising Models: Goal Achievement and Program Effectiveness. NACADA Journal, 18(1), 35–41. [CrossRef]
  28. He, Y. , & Hutson, B. (2017). Assessment for faculty advising: Beyond the service component. NACADA Journal, 37(2), 66–75. [CrossRef]
  29. Kraft-Terry, S. , & Cheri, K. (2019). Direct Measure Assessment of Learning Outcome–Driven Proactive Advising for Academically At-Risk Students. NACADA Journal (2019) 39 (1): 60–76. [CrossRef]
  30. Lynch, M. (2000). Assessing the effectiveness of the advising program. In V. N. Gordon & W. R. Habley (Eds.), Academic advising: A comprehensive handbook. San Francisco, CA: Jossey-Bass.
  31. Powers, K. L. , Carlstrom, A. H., & Hughey, K. F. (2014). Academic advising assessment practices: Results of a national study. NACADA Journal, 34(1), 64–77. [CrossRef]
  32. Swing, R. L. (Ed.). (2001). Proving and improving: Strategies for assessing the first year of college (Monograph Series No. 33). Columbia, SC: University of South Carolina, National Resource Center for The First-Year Experience and Students in Transition.
  33. Information on EvalTools®. http://www.makteam.com.
  34. Jeschke, M. P. , Johnson, K. E., & Williams, J. R. (2001). A comparison of intrusive and prescriptive advising of psychology majors at an urban comprehensive university. NACADA Journal, 21(1–2), 46–58. [CrossRef]
  35. Kadar, R. S. (2001). A counseling liaison model of academic advising. Journal of College Counseling, 4(2), 174–178. [CrossRef]
  36. Banta, T. W. , Hansen, M. J., Black, K. E., & Jackson, J. E. (2002). Assessing advising outcomes. NACADA Journal: spring, 22(1), 5–14. [CrossRef]
  37. National Academic Advising Association (NACADA) 2023. Kansas State University. KS, USA. https://nacada.ksu.
  38. Ibrahim, W. , Atif, Y., Shuaib, K., Sampson, D. (2015). A Web-Based Course Assessment Tool with Direct Mapping to Student Outcomes. Educational Technology & Society, 18 (2), 46–59.
  39. Kumaran, V. S., & Lindquist, T. E. (2007). Web-based course information system supporting accreditation. Proceedings of the 2007 Frontiers in Education Conference. http://fieconference.org/fie2007/papers/1621.
  40. McGourty, J., Sebastian, C., & Swart, W. (1997). Performance measurement and continuous improvement of undergraduate engineering education systems. Proceedings of the 1997 Frontiers in Education Conference, Pittsburgh, PA, 5–8 November (pp. 1294–1301). IEEE Catalog no. 97CH36099.
  41. McGourty, J. , Sebastian, C. ( 87(4), 355–361. [CrossRef]
  42. Pallapu, S. K. (2005). Automating outcomes based assessment. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.199.4160&rep=rep1&type=pdf. 4160. [Google Scholar]
  43. Hussain, W., Mak, F., & Addas, M. F. (2016). ‘Engineering Program Evaluations Based on Automated Measurement of Performance Indicators Data Classified into Cognitive, Affective, and Psychomotor Learning Domains of the Revised Bloom’s Taxonomy,’ ASEE 123rd Annual Conference and Exposition, 26–29 June, New Orleans, LA. https://peer.asee.
  44. Hussain, W., & Spady, W. (2017). ‘Specific, Generic Performance Indicators and Their Rubrics for the Comprehensive Measurement of ABET Student Outcomes,’ ASEE 124th Annual Conference and Exposition, 25–28 June, Columbus, OH.
  45. Mak, F., & Sundaram, R. (2016). ‘Integrated FCAR Model with Traditional Rubric-Based Model to Enhance Automation of Student Outcomes Evaluation Process,’ ASEE 123rd Annual Conference and Exposition, 26–29 June, New Orleans, LA.
  46. Eltayeb, M., Mak, F., & Soysal, O. (2013). Work in progress: Engaging faculty for program improvement via EvalTools®: A new software model. 2013 Frontiers in Education Conference (FIE) (pp. 1–6). [CrossRef]
  47. W. Hussain, W. G. Spady, M. T. Naqash, S. Z. Khan, B. A. Khawaja and L. Conner, "ABET Accreditation During and After COVID19 - Navigating the Digital Age," in IEEE Access, vol. 8, pp. 218997–219046, 2020. [CrossRef]
  48. Spady, W., & Marshall, K. J. (1991). Beyond traditional outcome-based education. Educational Leadership, 49(2).
  49. Spady, W. (1988). Organizing for results: The basis of authentic restructuring and reform. Educational Leadership, 46(2).
  50. Spady, W. (summer 1992).
  51. Hussain, W. (2016) Automated engineering program evaluations—Learning domain evaluations—CQI. https://www.youtube.com/watch?v=VR4fsD97KD0.
  52. Hussain, W. (2017). Specific performance indicators. https://www.youtube.com/watch?v=T9aKfJcJkNk.
  53. Hussain, W. , Addas, M. F., & Mak, F. (2016, October). Quality improvement with automated engineering program evaluations using performance indicators based on Bloom’s 3 domains. In Frontiers in education conference (FIE), 2016 IEEE, pp. 1–9. IEEE.
  54. Hussain, W. , & Addas, M. F. (2016, April). Digitally automated assessment of outcomes classified per Bloom’s Three Domains and based on frequency and types of assessments. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). http://www.learningoutcomesassessment.org/documents/Hussain_Addas_Assessment_in_Practice.pdf.
  55. Hussain, W., & Addas, M. F. (2015). “A Digital Integrated Quality Management System for Automated Assessment of QIYAS Standardized Learning Outcomes”, 2nd International Conference on Outcomes Assessment (ICA), 2015, QIYAS, Riyadh, KSA.
  56. Estell, J. K., Yoder, J-D. S., Morrison, B. B., & Mak, F. K. (2012). Improving upon best practices: FCAR 2.0. ASEE Annual Conference and Exposition, 2012. [Google Scholar]
  57. Liu, C. , & Chen, L. (2012). 29 March 2012; 12. [Google Scholar]
  58. Mak, F., & Kelly, J. (2010). Systematic means for identifying and justifying key assignments for effective rules-based program evaluation. 40th ASEE/IEEE Frontiers in Education Conference, 27–30 October, Washington, DC.
  59. Miller, R. L. , & Olds, B. M. (1999). Performance assessment of EC-2000 student outcomes in the unit operations laboratory. ASEE Annual Conference Proceedings, 1999.
  60. Mead, P. F. , & Bennet, M. M. (2009). Practical framework for Bloom’s based teaching and assessment of engineering outcomes. Education and training in optics and photonics 2009. Optical Society of America, paper ETB3. [CrossRef]
  61. Mead, P. F., Turnquest, T. T., & Wallace, S. D. (2006). Work in progress: Practical framework for engineering outcomes-based teaching assessment—A catalyst for the creation of faculty learning communities. 36th Annual Frontiers in Education Conference (pp. 19–20). [CrossRef]
  62. Gosselin, K. R., & Okamoto, N. (2018). Improving Instruction and Assessment via Bloom’s Taxonomy and Descriptive Rubrics. ASEE 125th Annual Conference and Exposition, 25–28 June, Salt Lake City, UT.
  63. Jonsson, A. , & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2(2), 130–144. http://www.sciencedirect.com/science/article/pii/S1747938X07000188. [CrossRef]
  64. Information on Blackboard®. https://www.blackboard.com/teaching-learning/learning-management/blackboard-learn.
Figure 1. FCAR + PIs assessment model process flow indicating course faculty involvement in almost all phases of CQI cycles.
Figure 2. OBE design down mapping from goals, PEOS, SOs, COs to PIs [26].
Figure 3. Portion of lists in a digital database showing PIs and their corresponding hybrid rubrics for ABET SO ‘e’ on problem solving.
Figure 4. Clipped portion of performance vector for course ME_262 THERMODYNAMICS 1.
Figure 5. SO8-PIs, Performance Vector Table (PVT) for term 372 mechanical engineering program.
Figure 6. Mixed methods approach to developmental advising at Faculty of Engineering, IU.
Figure 7. EE program consolidated student evaluation for terms 361, 362 and 371 for an above average student, showing a pattern of weakness in skills related to SOs ‘h,’ ‘i,’ and ‘j’.
Figure 8. SO_1 (‘a’): an individual underperforming student’s skills data measured by multiple raters using several PIs across multiple courses, assessment types, and terms, with weighting factors (WF) applied.
Figure 9. Patterns of comparatively better learning for a typical underperforming EE student observed in a two-term student evaluation report.
Figure 10. A typical outcomes-based advising sample for an EE student showing consistent failure in SO_8 related to understanding the impact of engineering solutions on economic, societal and environmental aspects.
Figure 11. Questionnaires and instructions for students’ self-assessment of SO9.
Figure 12. Mixed methods approach to developmental advising at Electrical Engineering program, GU.
Figure 13. Case study –a sample student’s overall SOs attainment.
Figure 14. A sample student’s self-assessment of his learning status against the student outcomes.
Figure 15. A drill-down menu showing key assignments for SO4.
Figure 16. Advisor’s input to student progress in attaining SOs.
Figure 17. EAMU rubric performance vector for PI_9_1.
Figure 18. Effectiveness of advising through assessing SO9.
Table 1. Heuristic rules for performance criteria.
Table 5. Qualitative Comparison of Digital Developmental and Prevalent Traditional Advising.
Area Pedagogical Aspects Digital Developmental Advising Prevalent Traditional Advising Sectional References
Authentic OBE and Conceptual Frameworks Based on Authentic OBE Frameworks Maximum fulfillment of authentic OBE frameworks Partial or Minimal fulfillment of authentic OBE frameworks I, III.C, IV.A
Standards of Language of Outcomes Maximum fulfillment of consistent OBE frameworks Partial or Minimal fulfillment and lack of any consistent frameworks I, III.C, IV.A, IV.B
Assess students All students assessed Random or Select Sampling I, III.C, IV.A, IV.B
Specificity of Outcomes Mostly specific resulting in valid and reliable outcomes data Mostly generic resulting in vague and inaccurate results I, III.C, IV.A, IV.B
Coverage of Bloom’s 3 Learning Domains and Learning Levels Specific PIs that are classified according to Bloom’s 3 learning domains and their learning levels Generic PIs that have no classification I, IV.B
‘Design Down’ Implementation OBE power principle design down is fully implemented with specific PIs used to assess the course outcomes OBE power principle design down is partially implemented with generic PIs used to assess the program outcomes IV.B
Assessment Practices Description of Rubrics Hybrid rubrics that are a combination of analytic and holistic, topic specific, provide detailed steps, scoring information and descriptors Mostly holistic generic rubrics, some could be analytic, rarely topic specific or provide detailed steps, without scoring information and detailed descriptors IV.B
Application of Rubrics Applied to most course learning activities with tight alignment Applied to just major learning activities at the program level with minimal alignment IV.B
Embedded Assessments The course outcomes and PIs follow consistent frameworks and are designed to enable embedded assessments methodology The course outcomes and PIs do not follow consistent frameworks and are not designed to enable embedded assessments methodology IV.B, IV.C
Quality of Outcomes Data Validity & Reliability of Outcomes Data Specific outcomes and PIs, consistent frameworks and hybrid rubrics produce comprehensive and accurate assessment data for all students. Therefore, outcomes data can be used for advising purposes. Generic outcomes and PIs, lack of consistent frameworks, and generic rubrics produce vague and inaccurate assessment data for small samples of students. Therefore, outcomes data cannot be used for advising purposes. I, III.C, IV.B, IV.C, V.A, V.B, V.C, V.D, V.E, V.F, V.G, V.H
Statistical Power Heterogeneous and accurate data. All students, All courses and All major assessments sampled Random or selective sampling of students, courses and assessments III.C, IV.A, IV.B, IV.C
Quality of Multi-term SOs Data Valid and reliable data: all data are collected from direct assessments by implementing several essential elements of a comprehensive assessment methodology, specific PIs, and wide application of hybrid rubrics, while strictly following stringent QA processes and monitoring to ensure tight alignment with student learning Usually not available and unreliable, due to the lack of a comprehensive assessment process, specific PIs, wide usage of rubrics, stringent QA processes, and appropriate technology I, III.C, IV.A, IV.B, IV.C
Staff Access to Students Skills and Knowledge Information Advisors can easily access student outcomes, assessment, and objective evidence besides academic transcripts information Advisors cannot access student outcomes, assessment, and objective evidence. Advising is fully based on academic transcripts information I, III.C, IV.B, IV.C, V.A, V.B, V.C, V.D, V.E, V.F, V.G, V.H
Advisor Interactions Advisors have full access to detailed student past and present course performances thereby providing accurate informational resources to facilitate productive advisor-course instructor dialog Advisors do not have access to any detail related to student past or present course performances thereby lacking any information resources for productive advisor-course instructor dialog I, III.C, V, V.B, V.C
Performance Criteria Advisors apply detailed performance criteria and heuristic rules based on a scientific color-coded flagging scheme to evaluate attainment of student outcomes Advisors do not refer to or apply any such performance criteria or heuristic rules due to lack of detailed direct assessments data and associated digital reporting technology IV.C, V.A, V.B, V.C, V.D, V.E, V.F, V.G
Access to Multi-term SOs Data Advisors can easily access and use multi-term SOs data reports and identify performance trends and patterns for accurate developmental feedback Advisors cannot access any type of multi-term SOs data reports and identify performance trends and patterns for accurate developmental feedback V.A, V.B, V.C, V.D, V.E, V.F, V.G
Mixed Methods Approaches to Investigation Advisors can easily apply mixed methods approaches to investigation and feedback for effective developmental advising due to availability of accurate outcomes data, specific PIs and assessment information presented in organized formats using state of the art digital diagnostic reports Advisors cannot apply mixed methods approaches to investigation and feedback for effective developmental advising due to lack of availability of accurate outcomes data, specific PIs and assessment information presented in organized formats using state of the art digital diagnostic reports V.A, V.B, V.C, V.D, V.E, V.F, V.G
Students Student Accessibility of Outcomes Data All students can review their detailed outcomes-based performances and assessments information for multiple terms and examine trends in improvement or any failures Students cannot review any form of outcomes-based performances or assessments information for multiple terms and cannot examine trends in improvement or any failures V.A, V.B, V.C, V.D, V.E, V.F, V.G
Student Follow Up Actions for Improvement Both students and advisors can track outcomes-based performances and systematically follow up on recommended remedial actions using digital reporting features Neither students nor advisors can track outcomes-based performances, and therefore they cannot systematically follow up on any recommended remedial actions V.A, V.B, V.C, V.D, V.E, V.F, V.G
Student Attainment of Lifelong Learning Skills Students can use self-evaluation forms and reinforce their remediation efforts with guidance from advisors to enhance metacognition capabilities and eventually attain lifelong learning skills Students do not have any access to outcomes data, cannot conduct any form of self-evaluation of outcomes performances, and therefore cannot collaborate with any advisor’s guidance on outcomes V.E, V.F, V.G
Process Integration with Digital Technology Pedagogy and assessment methodology fully support integration with digital technology that employs embedded assessments Language of outcomes, alignment issues, lack of rubrics make it difficult to integrate with digital technology employing embedded assessments I, IV.C, V.A, V.B, V.C, V.D, V.E, V.F, V.G
PDCA Quality Processes 6 comprehensive PDCA quality cycles for stringent quality standards, monitoring, and control of the education process Lack of well-organized and stringent QA cycles, measures, and technology for implementing the education process I, III.C, IV.C
Impact Evaluation of Advising Credible impact evaluations of developmental advising can be conducted by applying qualifying rubrics to multi-year SOs direct assessment trend analyses information. There is no need for control or focus groups, and no credibility issues related to student survey feedback arise. Impact evaluations are usually based on indirect assessments collected using student surveys. Several issues related to the use of control or focus groups and the credibility of feedback have to be dealt with accordingly. V.H
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permit the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.