Academic and student services programs at Sam Houston State University (SHSU) regularly identify expected outcomes, assess the extent to which they achieve those outcomes, and provide evidence of seeking improvement based on analysis of the results. This narrative features specific examples of the outcomes assessment process used by academic and student services units at SHSU. These initial examples are expanded upon in more extensive documents highlighting completed unit-level assessment plans from the past three assessment cycles (i.e., 2014-2015, 2015-2016, and 2016-2017). Reviewers may access all archived academic unit assessment plans for these cycles on SHSU’s reaffirmation website. Instructions for accessing this repository are available on the instruction sheet. Furthermore, this narrative highlights the initiatives and structures put in place by SHSU to ensure compliance with the guidelines, recommendations, and requirements of the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) regarding Standard 8.2.c.
Institutional assessment at SHSU is overseen by the Office of Academic Planning and Assessment (OAPA) [1]. OAPA staff members provide regular training, resources, and support to all units across campus conducting annual assessment [2]. Academic and student services programs document assessment information in the campus’s online assessment management system. Through the 2014-2015 assessment cycle, SHSU used the Online Assessment Tracking Database (OATDB) to document each unit’s assessment plans. Effective with the 2015-2016 cycle, SHSU transitioned all assessment plan documentation into the CampusLabs software [3].
SHSU uses a 13-month annual assessment cycle for all units, running from September to October of the following year. This cycle roughly aligns with the University’s academic calendar and gives units flexibility to collect data from all academic semesters (i.e., fall, spring, and summer). OAPA staff members monitor unit entries throughout the year to provide necessary support and ensure participation in the ongoing assessment cycle.
Annual Assessment Plan Elements
Units at SHSU use CampusLabs (which replaced the OATDB) to document their ongoing assessment plans and reports. Administrative support units are asked to provide the following assessment plan elements.
Examples of Annual Outcomes Assessment
Specific examples from each academic college for the most recently completed assessment cycle (2016-2017) are highlighted below, in detail, to demonstrate how academic and student services units at SHSU define outcomes, assess the extent to which they achieve these outcomes, and use the analyzed data to identify and carry out actions for improvement.
Division of Academic Affairs – International Programs (2016-2017 cycle) [4]
The Office of International Programs identified three goals, which are supported by six performance objectives. These goals and objectives for the 2016-2017 cycle were as follows:
As an example of how these outcomes were measured, within the goal to increase international student enrollment, two objectives centered on recruitment were identified: (1) Collaborate with SHSU Admissions to help increase international student enrollment and (2) Recruit international students at Texas community colleges. In conjunction with Admissions counselors [5], International Programs set a target of attending at least five recruitment events during the 2016-2017 academic year. They far exceeded this target, attending 17 events at various community colleges, the Mexican Consulate in Houston, and the Raindrop Turkish Houston Women’s Association. Even though this objective was met, International Programs recognized that Admissions recruiters needed additional encouragement to disseminate information. To encourage this support, International Programs planned to develop an incentive program. They also determined that an additional international recruiter was needed to increase recruiting impact, so they created an action to advocate for hiring a part-time recruiter.
As part of their efforts to recruit students from Texas community colleges, International Programs developed the International VIP Program [6], in which they collected contact information from potential international transfer students, to whom they sent periodic program updates and information on transfer advisement. They exceeded their recruitment goal for this program and expanded results reporting to include the number of these students who submitted applications to attend SHSU. Further actions included adding table events to grow VIP enrollment and collecting data on the number of VIP students who apply to and enroll at SHSU.
Division of Academic Affairs – SAM Center (2016-2017 cycle) [7]
The Student Advising and Mentoring (SAM) Center identified five goals, which are supported by one learning objective and eleven performance objectives. These goals and objectives for the 2016-2017 cycle were as follows:
As an example of how these outcomes were measured, within the goal of Academic Advising, one of the objectives identified was to provide a positive and informative advising experience. In order to measure the advising experience [8], a feedback survey [9] of student perceptions of advising services was used. The target was to increase the response rate for 2016-2017 by 5% over 2015-2016, the first year the survey was used. The results instead indicated an 89.07% decrease in the survey response rate, even though 1,507 more advising sessions were conducted in 2016-2017. While these data did not answer the original question, they did provide valuable information on a breakdown in the survey distribution process. It was found that the surveys were not distributed as they should have been, and the responsible parties were counseled on the importance of distributing these surveys. As a future action, the SAM Center will consider using an online survey format for 2017-2018.
Within the goal of First Alert, one of the objectives identified was to increase program outreach. The First Alert Program allows SHSU faculty and staff to refer students to the mentoring section of the SAM Center for academic support. Regarding increasing program outreach [10], the SAM Center established benchmark data by comparing the number of referrals received during 2015-2016 with the number received during 2016-2017. There was a 21.2% increase in student referrals from 2015-2016 (1,240) to 2016-2017 (1,503), although the number of people referring students increased only minimally (from 266 to 277). Given the rise in referrals for 2016-2017 and the implementation of a new progress report function in the University’s student management dashboard, referral volume is expected to increase substantially in the coming academic year. As a result, the SAM Center identified an action to prepare for this increase by training a new hire to help facilitate the referrals.
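For reference, the reported 21.2% figure follows directly from the standard percent-change calculation applied to the referral counts above:

\[
\frac{1{,}503 - 1{,}240}{1{,}240} = \frac{263}{1{,}240} \approx 0.212 = 21.2\%
\]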
Division of Enrollment Management – Financial Aid and Scholarships (2016-2017 cycle) [11]
The Financial Aid and Scholarships Office identified three goals, which are supported by three performance objectives. These goals and objectives for the 2016-2017 cycle were as follows:
As an example of how these outcomes were measured, within the goal of Student Service, the objective to reduce student contact during peak times through proactive outreach efforts was identified. In order to reduce the number of phone calls and office visits during the busiest times of the year, Financial Aid and Scholarships developed key performance indicators (KPIs) aimed at making paperwork submission more efficient and providing proactive outreach. The plan to improve the paperwork process [12] was to move from requiring handwritten signatures to electronic signatures via Adobe Sign. Not only would this reduce the amount of paper submitted to the office, but forms would also be immediately retrievable and processed faster by personnel. Because this was a new initiative, results were not available by the end of the assessment cycle; however, a committee was designated to work with Information Technology (IT@Sam) to research other alternatives for electronic submission and to pursue subsequent implementation.
Proactive outreach included staffing a table [13] in the Lowman Student Center (LSC) Mall to talk with passing students, as the office had found that the majority of students seen during peak times were returning students. Reinforcing the importance of financial aid application deadlines contributed to a 27% increase in Free Application for Federal Student Aid (FAFSA) forms received. Financial Aid also participated in the Student Money Management Center (SMMC) Financial Literacy Week [14] to educate current students on financial aid opportunities and deadlines. These proactive outreach efforts resulted in a 16% reduction in front-counter traffic and a 37% decrease in phone calls compared to peak times in the prior academic year. Actions for the following assessment cycle included using reports in My Success Planner (MSP; now Campus Connect) to determine why students visited or called, in order to inform customer service, and holding more workshop collaborations with SMMC.
Division of Student Affairs – Counseling Services (2016-2017 cycle) [15]
Counseling Services identified two goals, which are supported by one learning objective and two performance objectives. These goals and objectives for the 2016-2017 cycle were as follows:
As an example of how these outcomes were measured, within the goal of Development, two objectives were identified: (1) Practicum Training Effectiveness and (2) Counseling Center Productivity. Regarding practicum training [16] [17], the Counseling Center implemented an evaluation to measure counseling students’ basic clinical skills at the beginning, middle, and end of the training experience. The criterion for this objective was that students would improve by an average of two points on a 5-point scale in each of the following areas: Professionalism, Ethical Behavior, Therapy Skills, Effective Use of Supervision, and Cultural Fluency. From the beginning to mid-semester there was virtually no movement, but by the end of the semester students displayed average increases of one to three points in each area. The areas with the least growth, Ethical Behavior and Professionalism, reflected trainees’ already high ratings at the outset, which left little room for growth. The evaluation was found to be useful and will be used again.
The Counseling Center strives to keep up with clinical demand and regularly has students on a waitlist during its busiest times [18]. In addition to this direct service, staff must make time for indirect service and administrative tasks. For this assessment cycle, the Center set targets so that no more than 60% of staff time would be spent on direct service, no more than 25% on indirect service, and no more than 15% on administrative tasks. The results were as follows: direct service, 58%; indirect service, 20%; and administrative tasks, 22%. Even with a full staff, the Center had a difficult time keeping up with direct clinical demand and had about 180 students on a waitlist. The resultant action was for the leadership team to investigate other ways to deliver direct clinical service.
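Set against the stated caps, the reported distribution accounts for all staff time, and administrative tasks emerge as the only category over its target:

\[
\underbrace{58\%}_{\text{direct} \,\le\, 60\%} + \underbrace{20\%}_{\text{indirect} \,\le\, 25\%} + \underbrace{22\%}_{\text{admin} \,>\, 15\%} = 100\%
\]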
Division of Student Affairs – Student Activities (2016-2017 cycle) [19]
Student Activities identified three goals, which are supported by four performance objectives. These goals and objectives for the 2016-2017 cycle were as follows:
As an example of how these outcomes were measured, within the goal of Student Engagement and Spirit, two objectives were identified to address Cheer programs and other student engagement and spirit activities. Regarding Cheer programs [20], Student Activities distributed a 21-question survey to members of all four spirit team squads to obtain their perceptions on attainment of five of the Texas Higher Education Coordinating Board’s core competencies: critical thinking, communication, teamwork, social responsibility, and personal responsibility. The criterion was simply to obtain baseline data in order to better understand how to further prepare students for the workforce beyond graduation. There were 67 respondents, of whom at least 95% agreed or strongly agreed that each of the core competencies was further developed as a result of their participation in the Cheer programs [21]. The areas of written communication, community engagement, and personal and social responsibility received lower scores from respondents, so Student Activities plans to incorporate additional trainings and exercises to address these relative deficiencies. One example will be adding a community project to increase community engagement.
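In absolute terms, “at least 95%” of the 67 respondents corresponds to at least

\[
\lceil 0.95 \times 67 \rceil = \lceil 63.65 \rceil = 64
\]

respondents per competency.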
Student Activities also measured the perceptions of students who attended the Kat Comedy Showcase and Sammypalooza [22]. The objective was that by attending campus events such as these, students would become more engaged and, therefore, more likely to continue enrollment at SHSU. In addition to gathering information from surveys, Student Activities planned to track student retention from spring 2017 to fall 2017, although these data were not available by the close of the 2016-2017 assessment cycle. Based on survey results [23] [24], a few areas were found to need improvement: sense of belonging on campus, student involvement and its contribution to student success at SHSU, and the promotion of student activities on campus. The action put in place was to pilot a new program for fall 2017 consisting of 24 Welcome Week Leaders who would connect with incoming freshmen, encouraging them to engage with other students and to attend Welcome Week functions.
Example Assessment Plans and Reports
As a further demonstration of ongoing quality assessment practices at SHSU, this narrative includes documents containing example assessment plans from academic and student services units for the last three complete assessment cycles: 2014-2015 [25], 2015-2016 [26], and 2016-2017 [27].
The selection scheme used to highlight units covered roughly 50% of the academic and student services assessment plans from each cycle. A complete list of all academic and student services unit assessment plans for these cycles is included as part of the SACSCOC Reaffirmation Report Website.
Meta-Assessment
As part of University-wide efforts to promote quality assessment practices, OAPA also facilitates an annual meta-assessment process. With the evaluation of the 2016-2017 assessment cycle, OAPA began transitioning to a locally developed 4-point rubric [28] for evaluating the quality of programmatic assessment plans. This rubric is a revision of an older, locally developed 3-point rubric [29]. The focus of the meta-assessment review is not on what each unit assesses, but rather on the quality of the assessment itself, with emphasis on the unit’s assessment practices and processes.
Feedback from the annual meta-assessment reviews is used in two ways. First, completed meta-assessment rubrics are returned to the individual units for review and use in the continuous improvement of their assessment practices. Second, data from these reviews are used by University administrators to identify areas where training and resources are needed to improve programmatic assessment efforts. Examples of the most recently completed meta-assessment rubrics for academic and student services units are provided to highlight this process [30] [31] [32].
Administrative Program Review
In addition to the institutional annual outcomes assessment and meta-assessment processes, academic and student support services units also participate in an Administrative Program Review (APR) process. The purpose of APR is to support executives, managers, and employees of SHSU’s administrative departments in examining current operations, identifying opportunities for enhancement, implementing adjustments, and establishing plans for continuous improvement. The APR process is facilitated by the Office of Institutional Effectiveness [33] [34].
The APR Process
The APR process comprises three stages: (a) the completion of a self-study [35] [36], (b) a peer review [37] [38], and (c) the development of an action plan for improvement [39] [40].
The Self-Study Process. Under the guidance of the Assistant Director for APR, the administrative department completes a self-study document, which addresses the department’s mission statement, facilities, plan for staffing, stakeholder feedback, policies and procedures, and communication/outreach techniques. Supporting datasets and administrative documents are provided to the department head in advance of the self-study. These may include documentation of annual assessments, department budget and expense reports, institutional and divisional goals, guidelines issued by the Council for the Advancement of Standards in Higher Education (CAS), organizational charts, etc.
The Peer Review Process. A peer review committee is identified by the administrative department head. Coordinated by the Assistant Director for APR, the committee consists of, at minimum, one internal reviewer (on-campus faculty or staff from outside the administrative department), one external reviewer (professional not employed at SHSU), and one student reviewer. The role of the peer review committee is to review the department’s self-study, to meet with the department and other committee members, and to complete a rubric documenting evaluations and recommendations. Rubrics are collected by the Assistant Director for APR. Identifying information is removed, and the rubrics are given to the department head for review. The comments and recommendations given by the peer reviewers are taken into consideration as the action plan is developed.
Action Plan Development. Once the peer review process is complete, the administrative department head works with the Assistant Director for APR to create an action plan, establishing viable plans for continuous improvement.
Upon completion of the action plan, APR staff compose an executive summary [41] [42] outlining the department’s review process and action plan objectives. This document is distributed to departmental and divisional leadership.
Follow-Up Process. Two years following the program review, the Assistant Director for APR and the department head meet to discuss progress made on action plan objectives. Any outstanding issues or barriers to improvement are discussed and addressed, and adjustments to the action plan are made as necessary. The APR process is revisited once every five years.