Educational programs at Sam Houston State University (SHSU) regularly identify expected outcomes, assess the extent to which they achieve those outcomes, and provide evidence of seeking improvement based on analysis of the results. This narrative features specific examples of the outcomes assessment process utilized by educational units, including online and hybrid programs, at SHSU. These initial examples are expanded upon in more extensive, college-specific documents highlighting completed unit-level assessment plans from the past three assessment cycles (i.e., 2014-2015, 2015-2016, and 2016-2017). Reviewers may access all archived academic unit assessment plans for these cycles within SHSU’s reaffirmation website. Instructions for accessing this repository are available on the instruction sheet. Furthermore, this narrative will highlight additional steps taken by SHSU to ensure compliance with the guidelines, recommendations, and requirements of the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) regarding Standard 8.2.a.
Institutional assessment at SHSU is overseen by the Office of Academic Planning and Assessment (OAPA) [1]. Within OAPA, the Director of Assessment oversees unit entries into the campus’s online assessment management system. Through the 2014-2015 assessment cycle, SHSU used a locally developed system, the Online Assessment Tracking Database (OATDB), to document assessment plan entries. Beginning with the 2015-2016 cycle, SHSU transitioned all assessment plan documentation into the CampusLabs software [2]. Furthermore, OAPA staff members provide regular training, resources, and support to all units across campus conducting annual assessment [3].
SHSU utilizes a 13-month annual assessment cycle for all units, running from September to October of the following year. This cycle roughly aligns with the University’s academic calendar and gives units the flexibility to collect data from all academic semesters (i.e., fall, spring, and summer), while still allowing units that rely on end-of-fiscal-year data time to analyze their results and develop actions for improvement. OAPA staff members monitor unit entries throughout the year to ensure participation in the ongoing assessment cycle.
Annual Assessment Plan Elements
Units at SHSU utilize CampusLabs: Planning (and formerly the OATDB) to document their ongoing assessment plans and reports and are asked to provide the following plan elements.
Examples of Annual Outcomes Assessment
Specific examples from each academic college for the most recently completed assessment cycle (2016-2017) are highlighted below, in detail, to demonstrate how educational units at SHSU define outcomes, assess the extent to which they achieve these outcomes, and utilize the analyzed data to identify actions for improvement.
College of Business Administration – Marketing BBA (2016-2017 cycle) [4]
The Marketing BBA program identified five student learning objectives for the 2016-2017 assessment cycle. To assess student attainment of these objectives, faculty within the Marketing BBA program used a variety of embedded assessments, ranging from multiple-choice and short-answer questions embedded in exams to in-class assignments, presentations, and writing assignments. The program then used the data gathered from these assessments to identify specific actions for improving student learning.
For example, one learning objective identified by the program was that “Students Will be Able to Summarize and Explain Consumer Behavior Concepts” [5]. Student attainment of this objective was assessed using questions embedded in exams and assignments in MKTG 3320 – Consumer Behavior. Findings from these assessments revealed various student strengths and weaknesses regarding this objective [6].
The program identified some areas in which students excelled, including students’ understanding of two key types of consumer value: utilitarian and hedonic. Student success in this area was ascribed to a hands-on classroom activity. However, the program also identified an area for improvement for this objective. The program noted that “some students had difficulty conceptualizing consumer behavior outside of buying behavior, such as recognizing that voting behavior is a form of consumer behavior.” Additionally, the program noted that students also needed to “better understand the concept of socialization.”
To address these student weaknesses, the program will develop and implement new in-class examples for use when students are being taught these concepts. For example, the program noted in their actions that “new examples will be created to more explicitly link voting to consumer behavior, such as examples of voting issues linked to the consumer decision-making process and situational influences on decision making.” [7]
College of Criminal Justice – Victim Services Management MS (2016-2017 cycle) [8]
For the 2016-2017 cycle, the Victim Services Management MS program identified two learning objectives. To assess these objectives, the program employed a variety of assessment measures, including rubrics to evaluate embedded assignments, such as mock policy manuals and mock funding proposals. The program’s second learning objective focused specifically on students’ abilities to “identify meaningful outcomes for the purpose of writing grants as reflected in a mock application for foundation funding” [9].
Students enrolled in CRIJ 5385 – Non-Profit Management and Grant Writing were asked to develop funding proposals for mock family violence programs, to be submitted for a fictitious foundation grant. These mock program proposals were then evaluated with a rubric to gauge student attainment of the objective. In particular, students were evaluated on their abilities to (a) identify at least one measurable outcome for their program, (b) identify at least one attainable outcome for their program, and (c) identify at least one outcome that was directly attributable to the services of their program [10]. Eighty-one percent of students were able to identify at least one hypothetical outcome that was attainable, measurable, and attributable to their hypothetical program’s services, exceeding the baseline of 70%. However, the program was still able to identify an action for improvement for this objective. The program noted that some students were confusing outcomes with outputs. Therefore, the program resolved to increase instructional emphasis on the differences between program outputs and program outcomes. Additionally, as the criterion for this objective was met for the 2016-2017 year, the program also determined the need to raise the criterion for success for this objective from 70% to 85%.
College of Education – Interdisciplinary Studies BA/BS (Elementary EC-6) (2016-2017 cycle) [11]
The Interdisciplinary Studies BA/BS (Elementary EC-6) program identified two learning objectives for the 2016-2017 cycle. The second of these objectives centered on students being able to “Demonstrate Mastery of the State Mandated Standards for the Pedagogy and Professional Responsibilities (PPR) Certification Exam.” Student mastery of these skills and standards was assessed through the PPR Exam, which has four separate learning domains covering at least forty different areas of knowledge and skills:
Student attainment of this objective was assessed by the percentage of first-time passage rates of the PPR Exam, with the expectation that 90% or more of students would pass the PPR Exam.
For the 2016-2017 cycle, 95% of students (250 out of 262) taking the PPR Exam passed the exam on their first attempt, exceeding the 90% expectation. However, by examining the disaggregated data for each of the four PPR domains, the program was able to identify specific areas where students struggled. In particular, the program faculty identified Domains 1 and 3 as areas of underperformance, with students answering only 77% and 76% of questions correctly, respectively. In response, the faculty within the program will make further efforts to align course content and assessments with the standards measured by the PPR Exam, with particular attention paid to those within Domains 1 and 3. Additionally, the faculty recognized that students needed more opportunities to practice before taking the PPR Exam. Therefore, the program also determined that all teacher education candidates will be required to take a practice certification exam before being allowed to take the actual teacher certification exam.
College of Fine Arts and Mass Communication – Photography BFA (2016-2017 cycle) [13]
For the 2016-2017 cycle, the Photography BFA program identified four learning objectives. Student learning of these objectives was assessed using a number of measures, including rubrics to evaluate student essays and rubrics to evaluate student portfolios of work. As an example, the second learning objective for the program stated that students would be able to “create photographs that demonstrate proficiency in the use of analog darkroom processes” [14]. This learning objective was assessed using a rubric to evaluate a portfolio of student work. In particular, student work from ARTS 3365 – Photography was evaluated to determine the degree to which students demonstrated the following:
The overall expectation was that at least 60% of students would score 80% or higher in each of these areas; however, findings from the 2016-2017 cycle indicated that, although student scores had improved from the previous year, student work was still falling short of these marks. In particular, the faculty observed that students were demonstrating weaknesses in all areas of analog photography [16]. In response to these findings, the program faculty have determined the need to increase the focus placed on analog photography techniques within the photography courses. The faculty will work together to determine the specific processes that need to be reinforced and then communicate them to the faculty members teaching the analog photography courses.
College of Health Sciences – Nursing BSN (2016-2017 cycle) [17]
The Nursing BSN program identified 23 learning objectives for the 2016-2017 assessment cycle. These learning objectives were assessed using a wide variety of assessment measures, including standardized and locally developed exams, course-embedded assignments, and scores from student papers. As an example, students were expected to demonstrate their ability to “integrate knowledge from liberal arts studies and nursing science to practice professional nursing in a holistic nursing career.” This learning objective was evaluated in several ways, including through student performance on the Assessment Technologies Institute (ATI) Nurse Touch: Becoming a Professional Nurse Exam, the ATI Nutrition Exam, and the ATI Pharmacology Made Easy 3.0 exam [18].
Student performance across these three assessments varied. On the ATI Nurse Touch exam, 100% of students achieved a score of “meets expectations”; however, students did not meet the targets for the ATI Nutrition and ATI Pharmacology Made Easy exams. In reviewing the data, the nursing faculty determined that the Nutrition Exam was being administered too early in the program and thus was not serving as an effective assessment measure. Therefore, the program decided that the exam would not be administered moving forward. After reviewing the data from the Pharmacology Made Easy exam, the program determined a need to increase student exposure to the topic of pharmacology. To do so, the program determined that a curricular change was needed and that the existing Pathophysiology/Pharmacology course should be split into two separate courses, Pharmacology and Pathophysiology, starting with the fall 2017 term [19].
College of Humanities and Social Sciences – Philosophy BA (2016-2017 cycle) [20]
The Philosophy BA program identified eight learning and performance objectives for the 2016-2017 assessment cycle. One example was the objective for students to demonstrate critical thinking skills by being “able to analyze arguments and draw conclusions from available information” [21]. Student attainment of this objective was assessed through two separate indicators: student pre- to post-test performance on the Texas Assessment of Critical Thinking Skills (TACTS) Test as a whole, and pre- to post-test performance on one particular TACTS test question (Question 23) that focused on student understanding of linked probabilities.
Student results for both the TACTS test and Question 23 (Linked Probabilities) were reported for students enrolled in face-to-face courses, for students enrolled in online courses, and for the combined student populations. The expectations of success were that students would make statistically significant pre-to-post gains on the TACTS exam [22]. Furthermore, the program expected that the percentage of students correctly answering Question 23 (Linked Probabilities) would increase by at least 150% from pre- to post-test, and that the overall post-test average for Question 23 would be 50% or higher [23].
The pre- to post-test TACTS performance for face-to-face students and for the overall student population both demonstrated statistically significant increases; however, the increases for online students were not statistically significant [24]. In response, the program determined to hold a series of meetings for the faculty teaching PHIL 2303 – Critical Thinking specifically to share successful strategies for improving student learning, with a particular emphasis on improving student performance within online sections [25].
The results for Question 23 (Linked Probabilities) did not meet the program’s expectations for success. For face-to-face students, the pre- to post-test increase was only 147% (from 15.5% to 38.4% answering correctly). Furthermore, the post-test result of 38.4% fell short of the 50% target. For online students, the pre- to post-test increase was 195% (from 5.9% to 17.4%); however, the post-test result of 17.4% was well below the 50% mark. In discussions with the faculty teaching PHIL 2303 – Critical Thinking, the program determined that one faculty member had not included a teaching unit on linked probabilities in his/her face-to-face and online sections. To address this issue, the program undertook a series of program-wide meetings to ensure that all faculty were aware of the appropriate student learning outcomes for each course and that those outcomes were being adequately covered within the course curriculum [26].
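These gains are reported as relative increases rather than percentage-point changes; under that reading (an interpretation of the reported numbers, not stated explicitly in the plan), the figures can be reproduced as follows:

\[
\frac{38.4 - 15.5}{15.5} \approx 1.477 \;(\approx 147\%), \qquad \frac{17.4 - 5.9}{5.9} \approx 1.949 \;(\approx 195\%)
\]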
College of Science and Engineering Technology – Geology BS (2016-2017 cycle) [27]
For the 2016-2017 cycle, the Geology BS program identified three learning objectives. These objectives were assessed using a variety of instruments, including mineral and mapping practical exams, embedded questions, and external evaluations at a required capstone Field Camp. As an example, one of the program’s learning objectives was that “students completing the introductory geology courses will demonstrate an understanding of the basic skills required of a geology major to succeed in subsequent coursework.” These skills, in particular, included the ability to observe the mineral properties necessary for mineral identification and the ability to read maps and to make geological interpretations based on map observations. The program expected that at least 70% of students would score 70% or higher on examined assignments or embedded questions, with the remaining 30% of students scoring at least 50% [28] [29].
When the program examined the student results for this objective, it determined that students had not met the targets for success. Only 50% of students were able to determine mineral cleavage correctly 70% of the time, and 25% of students were successful less than 50% of the time. On rock and mineral practical exams, only 40% of students were able to determine the appropriate texture of igneous rocks 70% of the time, although 53% were successful at least 50% of the time. On map skills practical exams, only 26% of students were able to determine location coordinates and make correct geological interpretations based on map observations 70% of the time, and only 37% were successful at least 50% of the time [30].
The faculty within the Geology BS program have found the assessment results they have gathered to be helpful, and they plan to continue with their embedded assessment approach. In particular, the results have helped the program identify specific areas for improvement, such as igneous rock textures and geological interpretations based on map observations. These data have informed their efforts to develop improved curricular and pedagogical methods for providing students with instruction related to these skills.
Example Assessment Plans and Reports
As further demonstration of ongoing quality assessment practices at SHSU, this narrative includes documents containing example assessment plans for the last three complete assessment cycles:
Table 1. Example Assessment Plans
College of Business Administration | 2014-2015 [31] | 2015-2016 [32] | 2016-2017 [33] |
College of Criminal Justice | 2014-2015 [34] | 2015-2016 [35] | 2016-2017 [36] |
College of Education | 2014-2015 [37] | 2015-2016 [38] | 2016-2017 [39] |
College of Fine Arts and Mass Communication | 2014-2015 [40] | 2015-2016 [41] | 2016-2017 [42] |
College of Health Sciences | 2014-2015 [43] | 2015-2016 [44] | 2016-2017 [45] |
College of Humanities and Social Sciences | 2014-2015 [46] | 2015-2016 [47] | 2016-2017 [48] |
College of Science and Engineering Technology | 2014-2015 [49] | 2015-2016 [50] | 2016-2017 [51] |
The following procedure was used to select the highlighted units: For departments containing two or fewer units, one was selected for inclusion; for departments of three to four units, two were selected; for departments of five to six units, three were selected; and for departments of seven or more units, four were selected. This selection scheme provided roughly 50% of academic assessment plans from each given cycle. A complete repository for all educational program assessment plans for these cycles is included as part of the SACSCOC Reaffirmation Report Website.
Distance Education
For the purposes of programmatic assessment, distance education programs at SHSU are classified as one of two types: (a) fully online programs, in which students can earn a degree only through online or distance education formats and (b) hybrid programs, in which students can earn 50% or more of a degree through online or distance education formats but some (or all) of the degree may also be offered through traditional face-to-face modalities.
Distance education programs conduct and document their annual assessment efforts in the same manner as their traditional, face-to-face counterparts. As theory and practice regarding distance education assessment have evolved, OAPA has endeavored to create guidelines for these programs that align with generally recognized best practices. These guidelines are summarized within the Best Practices for Documenting Assessment of Online and Distance Education Programs document [52]. This document, developed during the spring 2014 semester, provides a summary of the recommendations and guidelines outlined within several SACSCOC documents: (a) Best Practices for Electronically Offered Degree and Certificate Programs [53], (b) Distance and Correspondence Education: Policy Statement [54], and (c) Guidelines for Addressing Distance and Correspondence Education: A Guide for Evaluators Charged with Reviewing Distance and Correspondence Education [55].
Programs that are exclusively available online report their assessment results and actions within CampusLabs: Planning in the same manner as other programs. Programs that employ a hybrid model, in which students can potentially complete a degree through both distance and face-to-face modalities, are encouraged to disaggregate their assessment results (where appropriate) [56] in CampusLabs for online and face-to-face students and to use the results from both groups in formulating their actions for improvement [57]. An example of how hybrid programs use disaggregated assessment results to drive improvement is provided below.
During the 2016-2017 cycle, the Marketing BBA program examined assessment results from both online and face-to-face students for the program’s learning objective that states “students will be able to describe marketing core concepts and principles.” These core marketing concepts and principles included the following:
This objective was assessed using embedded exam questions from both online and face-to-face sections of MKTG 3310 – Fundamentals of Marketing, with the expectation that 70% of students would score 70% or higher on all assessments. Student performance data were reported for both face-to-face [59] and online students [60]. The program identified three areas of weakness for both online and face-to-face students: (a) supply chain functions, (b) characteristics that distinguish goods from services, and (c) price elasticity of demand. The average scores of face-to-face students for these three areas were 58.8%, 39.7%, and 68.8%, respectively. The program determined that online students achieved the 70% mark in each of these areas, but their scores for all three were lower than their scores on the other concepts.
In response to these results, the program determined a need to make improvements for all students in these areas. It was determined that students should be given more exposure to these three principles. A first step was to articulate explicitly all 14 of the concepts within the syllabi for all sections of the course and to refer to them consistently as a way of reinforcing these concepts throughout the semester. Faculty would also devote more time to each of the problematic topics when they were initially introduced in the course, with an additional review in the following class period. Additionally, efforts would be made to better incorporate these concepts into other courses across the program’s curriculum. Concerns were also voiced that not all students were purchasing the textbook for the course, causing them to struggle. The program determined that it would pursue an opportunity to provide students with access to a free e-book. Finally, a major redesign of MKTG 3310 was carried out, to be implemented for the fall 2017 semester. As part of this redesign, the program adopted Pearson MyLab for use in the course, which the faculty anticipated would be a good resource to help students improve their comprehension of the course learning objectives [61].
Graduate Program Review
Graduate program review represents an additional component of SHSU’s efforts to ensure quality educational programs. In accordance with Texas Administrative Code, Rule 5.52, Review of Existing Degree Programs [62], all graduate programs at SHSU engage in an external review process. This graduate review process is governed by the Texas Higher Education Coordinating Board (THECB) and is overseen at SHSU by the Dean of Graduate Studies [63].
On a rotating, 7-year cycle [64], each graduate program conducts a self-study [65] that addresses aspects common to all graduate programs as well as attributes unique to each program. The self-study is one tool to guide programs in their continuous improvement efforts as they seek to serve the needs of students, the University, and external stakeholders. The graduate program self-studies provide an overview of the programs as well as a detailed examination of the curricula, graduate faculty, program resources, assessment, student success, recruitment, and marketing.
The Self-Study Process
The self-study process incorporates three stages, examples of which are provided here as evidence: (a) the creation of the self-study [66] [67] [68], (b) an external review [69] [70] [71], and (c) the development of an action plan for improvement [72] [73] [74]. The program faculty and the support staff conduct a thorough program review and produce a report with appropriate support documentation. A team of external reviewers reviews the report, visits the campus and consults with program personnel and University administrators, and subsequently provides an evaluation of the program to include program strengths and recommendations for improvement. University leaders, in coordination with faculty, develop an action plan in response to the results of the self-study and external review. The process is as transparent and inclusive as possible. The self-study, the external reviewers’ report, and the response are all submitted to the THECB.
Follow-Up Process
One year following the program review, the program director, chair, academic dean, graduate dean, and Provost meet to discuss the progress made on the action plan, which addresses the recommendations for improvement. Any outstanding issues or barriers to improvement are discussed and addressed.
Meta-Assessment
As part of University-wide efforts to promote quality assessment practices, OAPA also facilitates an annual meta-assessment process. With the evaluation of the 2016-2017 assessment cycle, OAPA began transitioning to a locally developed 4-point rubric [75] to evaluate the quality of programmatic assessment plans. This rubric was a revision of an older, locally developed 3-point rubric [76]. The focus of the meta-assessment review is not on what is being assessed by each unit, but rather on the quality of the assessment, with emphasis on the assessment practices and processes themselves.
Each academic college is requested by OAPA to devise a plan for evaluating the annual assessment plans for all academic programs in the college using the Meta-assessment Rubric [75]. The colleges are given some flexibility in how and when these reviews take place. For example, although some colleges choose to evaluate all academic program assessment plans every year, other colleges have implemented a rotational cycle in which all plans are evaluated over a multi-year period.
Feedback from the annual meta-assessment reviews is used in two ways. First, completed meta-assessment rubrics are returned to the individual units for review and use in the continuous improvement of their assessment practices. Second, data from these reviews are used by University administrators to identify areas where training and resources are needed to improve programmatic assessment efforts. Examples of completed meta-assessment rubrics are provided to highlight this process [77] [78] [79] [80] [81] [82]. In addition, some examples of college-level summary reports are provided here to highlight how meta-assessment is being used at the college and institutional levels [83] [84] [85] [86] [87] [88] [89]. Within these reports, college leaders are asked to reflect upon the strengths and weaknesses they observed in reviewing the completed meta-assessment rubrics for their units, as well as to offer strategies for addressing them. These reports, along with the completed rubrics, are then used by OAPA to enhance assessment-related training and resources.