Administrative support services at Sam Houston State University (SHSU) regularly identify expected outcomes and demonstrate the extent to which the outcomes are achieved. This narrative features specific examples of the outcomes assessment process used by administrative support units at SHSU. These initial examples are expanded upon in more extensive, division-specific documents highlighting completed unit-level assessment plans from the past three assessment cycles (i.e., 2014-2015, 2015-2016, and 2016-2017). Reviewers may access all archived administrative support unit assessment plans for these cycles within SHSU’s reaffirmation website. Instructions for accessing this repository are available on the Instruction Sheet.
Institutional assessment at SHSU is overseen by the Office of Academic Planning and Assessment (OAPA) [1]. OAPA staff members provide regular training, resources, and support to all units across campus conducting annual assessment [2]. Administrative support units document assessment information in the campus’s online assessment management system. Through the 2014-2015 assessment cycle, SHSU used the Online Assessment Tracking Database (OATDB) to document each unit’s assessment plans. Effective with the 2015-2016 cycle, SHSU transitioned all assessment plan documentation into the CampusLabs software [3].
SHSU uses a 13-month annual assessment cycle for all units, running from September to October of the following year. This cycle roughly aligns with the University’s academic calendar and gives units flexibility to collect data from all academic semesters (i.e., fall, spring, and summer). OAPA staff members monitor unit entries throughout the year to provide necessary support and ensure participation in the ongoing assessment cycle.
Annual Assessment Plan Elements
Units at SHSU use CampusLabs (formerly the OATDB) to identify expected outcomes and to demonstrate the extent to which the outcomes are achieved. Administrative support units are asked to provide the following assessment plan elements:
Examples of Annual Outcomes Assessment
Specific examples from each division for the most recently completed assessment cycle (2016-2017) are highlighted here, in detail, to demonstrate how administrative support units at SHSU define outcomes and assess the extent to which they achieve these outcomes.
Division of Academic Affairs – Academic Planning and Assessment (2016-2017 cycle) [4]
The Office of Academic Planning and Assessment has six defined goals, which are supported by twelve performance objectives. These goals and objectives for the 2016-2017 cycle were as follows:
As an example of how these outcomes were measured, within the goal to promote the scholarship of assessment, two performance objectives were identified: Assessment Mini-Grants and Scholarly Presentations and Publications. The Assessment Mini-Grant program was made available to SHSU faculty and staff through a competitive application process, under which up to ten recipients could each receive a grant of up to $1,000 for assessment-related activities. All ten awards were distributed from a pool of 29 applicants; funded projects ranged from developing assessment instruments and surveys to presenting at assessment-related conferences [5].
Regarding scholarly presentations and publications, Academic Planning and Assessment staff were expected to present at four or more state, regional, or national conferences. This objective was accomplished, as all three assessment staff members made four presentations [2] on a variety of topics:
Division of Enrollment Management – Institutional Effectiveness (2016-2017 cycle) [6]
For the 2016-2017 assessment cycle, Institutional Effectiveness defined eight performance objectives, which supported two goals:
As an example of how these outcomes were measured, Institutional Effectiveness (IE) identified five performance objectives to fulfill the goal of collecting, analyzing, and disseminating institutional data in a timely and understandable manner. To work toward this goal, the unit created electronic documents and reports, provided training, participated in training, and requested feedback to ensure client satisfaction. IE was successful in achieving the outcomes associated with building reports; for example, against a target of building four new Cognos reports, the unit built five new reports and updated two others. IE also fulfilled this outcome by sending its staff members to cross-training with other departments to build a better understanding of University operations.
For the goal to contribute materially to the University-wide process for continuous improvement, IE identified three performance objectives to assess the Administrative Program Review (APR) process. APR supports administrative departments in examining their operations and making adjustments for continuous improvement [7]. IE had a target of completing APR for five departments during 2016-2017 and achieved its objective of expanding APR into more University departments by completing the process for seven departments.
Division of Finance and Operations – Human Resources (2016-2017 cycle) [8]
For the 2016-2017 assessment cycle, Human Resources defined six performance objectives. These objectives supported two goals:
As an example of how these outcomes were measured, Human Resources (HR) was successful in promoting the goal of developing a strong employee development process through three performance objectives. HR already hosts many online training sessions through Cornerstone Talent Management [9], but it aimed to provide additional job-specific trainings for employees during 2016-2017. The result was five additional trainings through its signature “Come for Coffee” series.
In order to satisfy the goal of facilitating continuous improvement within its own department, HR set out to review each staff employee job offer and reclassification request for internal pay equity, consolidate staff position classes, and improve the hiring process. While reviewing job offers for equity and consolidating staff positions are ongoing outcomes for this department, HR did report that online Search Committee Training and resources were provided to assist search committees in the hiring process.
Division of Finance and Operations – University Police Department (UPD) and Parking (2016-2017 cycle) [10]
For the 2016-2017 assessment cycle, UPD and Parking defined two goals and two performance objectives:
Measurement of the outcomes for enhancing University crime prevention and safety was twofold: UPD analyzed parking citations, comparing the number of citations issued with the number paid in order to review trends for budgeting purposes, and UPD requested that participants who attended UPD presentations complete surveys. UPD noted that parking citation fines are not a staple of its budget; rather, the funds assist with infrastructure maintenance. The citation fine total was well above the goal necessary to have $250,000 in reserve. UPD also discovered that the new parking management system allows it to capture more granular data, so new performance indicators will be created for the next assessment cycle. In addition, UPD conducted over 30 different presentations at SHSU since January 2017, covering a variety of topics. Other events included the Kats Safety Bash, Destination Spring Break, a mock dorm room fire, and an online or in-person training called All-Hazards Awareness [11]. Attendees who completed surveys gave above-average marks for all presentations and events.
In order to assess the efficiency of operations compared to similar universities, UPD compared its officer-to-student ratio to that of similar institutions. Universities with comparable police departments typically have one officer for every 1,000 individuals in the total university population (i.e., students, faculty, and staff). Moving forward, SHSU’s UPD planned to maintain that ratio, or an even stronger staffing level of one officer per 800 individuals in the total population. During 2016-2017, UPD, along with all other SHSU departments, had to find ways to accomplish more with fewer resources. Rather than hire additional officers, the department made several changes that positively impacted operational effectiveness. One change concerned the shift structure: rather than eight- or ten-hour shifts, the department determined that twelve-hour shifts were in the best interest of all. With regard to campus safety, this meant a shift minimum of two officers at all times, whereas previously the shift minimum was one officer. UPD is currently maintaining the 1:1,000 ratio but plans to request additional personnel within the next couple of years.
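The staffing-ratio arithmetic above can be sketched briefly. Note that the population figure used here is purely hypothetical, since this narrative does not report SHSU’s actual total headcount; only the 1:1,000 and 1:800 targets come from the text.

```python
# Sketch of the officer-to-population staffing check described above.
# The population value is hypothetical (not taken from this narrative).

def officers_needed(total_population: int, persons_per_officer: int) -> int:
    """Minimum officers required to meet a 1:N staffing ratio (rounded up)."""
    return -(-total_population // persons_per_officer)  # ceiling division

population = 25_000  # hypothetical total of students, faculty, and staff

print(officers_needed(population, 1_000))  # 1:1,000 target -> 25 officers
print(officers_needed(population, 800))    # 1:800 target  -> 32 officers
```

Under these assumptions, moving from the 1:1,000 benchmark to the stronger 1:800 target would require roughly seven additional officers, which illustrates why UPD pursued scheduling changes rather than new hires during a lean budget year.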
Division of Information Technology – Information Technology (IT) Client Services (2016-2017 cycle) [12]
For the 2016-2017 assessment cycle, IT Client Services defined eight performance objectives. These objectives support the following four goals:
As an example of how these outcomes were measured, IT Client Services put in place three performance indicators to measure increases in campus efficiency of costs, operations, and communications. One indicator was to improve the inventory audit process, which was accomplished through improvements in policies and procedures to better track the movement of assets and prevent loss. Implementing sign-in sheets requiring signatures from staff, clients, and the Asset Manager greatly assisted in tracking items. The annual inventory audit revealed a small percentage of lost assets; the department plans to reduce these losses by tightening control of the physical areas where assets are kept prior to installation.
IT Client Services also employed surveys to satisfy the goal of providing high-quality support services. The target was that 95% of respondents would give a “satisfied” or better rating on their perceptions in the following areas: overall service request experience, technical qualifications of the staff member assisting with the request, and time needed to complete the service request. Out of more than 1,500 respondents, 95% were satisfied or very satisfied with the overall experience, 96% were satisfied or very satisfied with the technical qualifications, and 94% were satisfied or very satisfied with the completion time.
Division of University Advancement – Alumni Relations (2016-2017 cycle) [13]
For the 2016-2017 assessment cycle, Alumni Relations defined two goals and three performance objectives:
In order to enhance the image of the University, Alumni Relations made plans to increase the number of meetings and events (at least 360) and to increase the overall attendance (at least 28,000) at all meetings and events. They exceeded their goals by holding 395 meetings and events, with a total attendance of 28,874 through August 31, 2017.
Alumni Relations was also charged with securing private support for the University by increasing Alumni Association memberships. Communications through mass mailings and emails to non-active alumni and lapsing members were employed throughout the year, and memberships were tracked through the Raiser’s Edge software system. The department’s goal of obtaining 13,400 members by August 31, 2017 was met, as the final numbers reflected a total membership of 13,700.
Example Assessment Plans and Reports
As further demonstration of ongoing quality assessment practices at SHSU, this narrative includes documents containing example assessment plans for the last three complete assessment cycles: 2014-2015 [14], 2015-2016 [15], and 2016-2017 [16]. The units highlighted were selected to represent roughly 50% of the administrative support assessment plans from each cycle. A complete list of all administrative support unit assessment plans for these cycles is included as part of the SACSCOC Reaffirmation Report Website.
Meta-Assessment
As part of University-wide efforts to promote quality assessment practices, OAPA also facilitates an annual meta-assessment process that includes administrative unit assessment plans. With the evaluation of the 2016-2017 assessment cycle, OAPA began transitioning to a locally developed, 4-point rubric [17] to evaluate the quality of programmatic assessment plans; this rubric was a revision of an older, locally developed 3-point rubric [18]. The focus of the meta-assessment review is not on what is being assessed by each unit but on the quality of the assessment, with emphasis on the assessment practices and processes themselves.
Feedback from the annual meta-assessment reviews is used in two ways. First, completed meta-assessment rubrics are returned to the individual units for review and use in the continuous improvement of their assessment practices. Second, data from these reviews are used by University administrators to identify areas where training and resources are needed to improve programmatic assessment efforts. Examples of the most recently completed meta-assessment rubrics are provided to highlight this process [19] [20] [21].
Administrative Program Review (APR)
In addition to the institutional annual outcomes assessment and meta-assessment processes, administrative units also participate in an APR process. The purpose of APR is to support executives, managers, and employees of SHSU administrative departments in the examination of current operations, identification of opportunities for enhancement, implementation of adjustments, and establishment of plans for continuous improvement. The APR process is facilitated by the Office of Institutional Effectiveness [7] [22].
The Administrative Program Review Process
The APR process incorporates three stages: (a) the completion of a self-study, (b) the completion of a peer review, and (c) the development of an action plan for improvement.
The Self-Study Process. Under the guidance of the Assistant Director for APR, the administrative department completes a self-study document, which addresses the department’s mission statement, facilities, plan for staffing, stakeholder feedback, policies and procedures, and communication/outreach techniques. Supporting datasets and administrative documents are provided to the department head in advance of the self-study. These may include documentation of annual assessments, department budget and expense reports, institutional and divisional goals, guidelines issued by the Council for the Advancement of Standards in Higher Education (CAS), organizational charts, etc. [23] [24].
The Peer Review Process. A peer review committee is identified by the administrative department head. Coordinated by the Assistant Director for APR, the committee consists of, at minimum, one internal reviewer (on-campus faculty or staff from outside the administrative department), one external reviewer (professional not employed at SHSU), and one student reviewer. The role of the peer review committee is to review the department’s self-study, to meet with the department and other committee members, and to complete a rubric documenting evaluations and recommendations. Rubrics are collected by the Assistant Director for APR. Identifying information is removed, and the rubrics are given to the department head for review. The comments and recommendations given by the peer reviewers are taken into consideration as the action plan is developed [25] [26].
Action Plan Development. Once the peer review process is complete, the administrative department head works with the Assistant Director for APR to create an action plan, establishing viable plans for continuous improvement [27] [28].
Upon completion of the action plan, APR staff compose an executive summary [29] [30] outlining the department’s review process and action plan objectives. This document is distributed to departmental and divisional leadership.
Follow-Up Process. Two years following the program review, the Assistant Director for APR and the department head meet to discuss progress made on action plan objectives. Any outstanding issues or barriers to improvement are discussed and addressed, and adjustments to the action plan are made as necessary. The APR process is revisited once every five years.