2019 SACSCOC Reaffirmation

Compliance Narratives


7.3 - Administrative Effectiveness


The institution identifies expected outcomes of its administrative support services and demonstrates the extent to which the outcomes are achieved. (Administrative effectiveness)

Judgment of Compliance

Compliant

Narrative


Administrative support services at Sam Houston State University (SHSU) regularly identify expected outcomes and demonstrate the extent to which the outcomes are achieved. This narrative features specific examples of the outcomes assessment process used by administrative support units at SHSU. These initial examples are expanded upon in more extensive, division-specific documents highlighting completed unit-level assessment plans from the past three assessment cycles (i.e., 2014-2015, 2015-2016, and 2016-2017). Reviewers may access all archived administrative support unit assessment plans for these cycles within SHSU’s reaffirmation website. Instructions for accessing this repository are available on the Instruction Sheet.

Institutional assessment at SHSU is overseen by the Office of Academic Planning and Assessment (OAPA) [1]. OAPA staff members provide regular training, resources, and support to all units across campus conducting annual assessment [2]. Administrative support units document assessment information in the campus’s online assessment management system. Through the 2014-2015 assessment cycle, SHSU used the Online Assessment Tracking Database (OATDB) to document each unit’s assessment plans. Effective with the 2015-2016 cycle, SHSU transitioned all assessment plan documentation into the CampusLabs software [3].

SHSU uses a 13-month annual assessment cycle for all units, running from September to October of the following year. This cycle roughly aligns with the University’s academic calendar and gives units flexibility to collect data from all academic semesters (i.e., fall, spring, and summer). OAPA staff members monitor unit entries throughout the year to provide necessary support and ensure participation in the ongoing assessment cycle.

Annual Assessment Plan Elements

Units at SHSU use CampusLabs (formerly the OATDB) to identify expected outcomes and to demonstrate the extent to which the outcomes are achieved. Administrative support units are asked to provide the following assessment plan elements:

Goals
Goals are broad statements of mission or purpose that serve as guiding principles for a unit. By their nature, goals are not necessarily measurable.
Objectives
Objectives are specific statements of intent or purpose that a unit expects to achieve. Objectives are measurable and aligned with a unit’s goals. Both learning and performance objectives may be used by a unit, as appropriate.
Performance Objectives
Performance objectives are the expected attainment of non-learning tasks (e.g., satisfaction with service, attendance/participation levels, student recruitment and enrollment, general administrative functions).
Learning Objectives
Learning objectives are the expected knowledge or skills someone should gain as a result of receiving instruction or training. (Note: Administrative support units are not required to use Learning Objectives, though they may include them when appropriate.)
Key Performance Indicators (KPIs; For Performance Objectives)
KPIs are the instruments, processes, or evidence, both direct and indirect, used by a unit to assess a performance objective.
Indicators (For Learning Objectives)
Indicators are the instruments, processes, or evidence, both direct and indirect, used by a unit to assess a learning objective.
Criterion (For Learning Objectives)
Criteria are used with indicators to assess learning objectives. A criterion is the level of expected attainment or performance for an objective.
Findings/Results
Findings or Results are the data gathered from the unit’s assessment measures.

Examples of Annual Outcomes Assessment

Specific examples from each division for the most recently completed assessment cycle (2016-2017) are highlighted here, in detail, to demonstrate how administrative support units at SHSU define outcomes and assess the extent to which they achieve these outcomes.

Division of Academic Affairs – Academic Planning and Assessment (2016-2017 cycle) [4]

For the 2016-2017 assessment cycle, the Office of Academic Planning and Assessment defined six goals, supported by twelve performance objectives:

  1. Effective and Efficient Administrative Practices, supported by:
    • Create a Curriculum Plan for Academic Affairs
    • Curriculum Review and Approval Process
    • Undergraduate and Graduate Catalog
  2. Promote an Environment that Encourages Continuous Improvement of Assessment Initiatives, supported by:
    • Ensure Quality Annual Assessment Processes
    • Provide Quality Assessment Support Resources
    • Provide Quality Assessment Support Services – Brown Bag Lunch Series
  3. Promote the Scholarship of Assessment, supported by:
    • Assessment Mini-Grants
    • Scholarly Presentations and Publications
  4. Support and Facilitate the Undergraduate Program Review Process, supported by:
    • Design and Implement Quality Undergraduate Program Review Process
  5. Support the Institution’s Ongoing SACSCOC Accreditation Efforts, supported by:
    • Ensure Institutional Compliance With and Timely Submission of Required SACSCOC Documentation
    • Facilitate Completion of the SACSCOC 2019 Compliance Certification Report
  6. Support the Strategic Planning Process for the Division of Academic Affairs, supported by:
    • Provide Quality Strategic Planning Resources and Processes

As an example of how these outcomes were measured, within the goal to promote the scholarship of assessment, two performance objectives were identified: Assessment Mini-Grants and Scholarly Presentations and Publications. The Assessment Mini-Grant program was made available to SHSU faculty and staff through a competitive application process in which up to ten recipients could be chosen to receive a grant of up to $1,000 for assessment-related activities. All ten awards were distributed from a pool of 29 applicants, and the funded projects ranged from developing assessment instruments and surveys to presenting at assessment-related conferences [5].

Regarding scholarly presentations and publications, Academic Planning and Assessment staff were expected to present at four or more state, regional, or national conferences. This objective was accomplished, as the office’s three assessment staff members delivered four presentations [2] on a variety of topics:

  • General education assessment: Differences in written communication skills as a function of demographic characteristics
  • Assessment teamwork using student self-reflections: Efforts to design and pilot a locally developed instrument
  • Expanding the use of an existing course/program-level critical thinking assessment to the institutional level
  • Institutional assessment at Sam Houston State University

Division of Enrollment Management – Institutional Effectiveness (2016-2017 cycle) [6]

For the 2016-2017 assessment cycle, Institutional Effectiveness defined eight performance objectives, which supported two goals:

  1. Collect, Analyze, and Disseminate Institutional Data and Relevant Information in an Accurate, Timely, and Understandable Manner, supported by:
    • Develop and Maintain Outcome Indicators
    • Develop New and Enhance Existing Cognos Reports
    • Develop and Design Interactive Fact Book
    • Team Performance
    • Track and Evaluate User Satisfaction
  2. Contribute Materially in the University-wide Process for Continuous Improvement by Assisting Administrative Units in the Evaluation of Operations, supported by:
    • Administrative Program Review Evaluation
    • Conduct [Administrative Program Review] APR Process in Institutional Effectiveness
    • Expand Administrative Program Review

As an example of how these outcomes were measured, Institutional Effectiveness (IE) identified five performance objectives to fulfill the goal of collecting, analyzing, and disseminating institutional data in a timely and understandable manner. To work toward this goal, the office created electronic documents and reports, provided training, participated in training, and requested feedback to ensure client satisfaction. IE was successful in achieving the outcomes associated with building reports; for example, against a target of building four new Cognos reports, the office built five new reports and updated two others. IE also supported this goal by sending staff members to cross-training with other departments to build a better understanding of University operations.

For the goal to contribute materially in the University-wide process for continuous improvement, IE identified three performance objectives to assess the Administrative Program Review (APR) process. APR supports administrative departments in examining their operations and making adjustments for continuous improvement [7]. IE had a target of completing APR for five departments during 2016-2017 and exceeded it, expanding APR into more University departments by completing the process for seven.

Division of Finance and Operations – Human Resources (2016-2017 cycle) [8]

For the 2016-2017 assessment cycle, Human Resources defined six performance objectives. These objectives supported two goals:

  1. Develop a Strong Employee Development Process, supported by:
    • Additional Online Training
    • New Student Employee Onboarding Process
    • Utilize the Cornerstone Performance Management System for Additional Efficiency
  2. Facilitate Continuous Improvement Within Department, supported by:
    • Compensation Review for Staff Internal Pay Equity
    • Consolidation of Classification System
    • Improvements to Hiring Process

As an example of how these outcomes were measured, Human Resources (HR) successfully promoted the goal of developing a strong employee development process through three performance objectives. HR already hosts many online training sessions through Cornerstone Talent Management [9], but the department aimed to provide additional job-specific trainings for employees during 2016-2017. The result was five additional trainings offered through its signature “Come for Coffee” series.

In order to satisfy the goal of facilitating continuous improvement within its own department, HR set out to review each staff employee job offer and reclassification request for internal pay equity, consolidate staff position classes, and improve the hiring process. While reviewing job offers for equity and consolidating staff positions are ongoing outcomes for this department, HR did report that online Search Committee Training and resources were provided to assist search committees in the hiring process.

Division of Finance and Operations – University Police Department (UPD) and Parking (2016-2017 cycle) [10]

For the 2016-2017 assessment cycle, UPD and Parking defined two goals and two performance objectives:

  1. Continually Improve the Support Services to Internal Constituents, supported by:
    • Enhance University Crime Prevention and Safety
  2. Provide Efficient Operations in Comparison to Similar Universities, supported by:
    • Provide Efficient Operations in Comparison to Similar Universities

Measurement of the outcomes for enhancing University crime prevention and safety was twofold: UPD analyzed parking citations to compare the number of citations issued with the number paid in order to review trends for budgeting purposes, and UPD asked participants who attended UPD presentations to complete surveys. UPD noted that parking citation fines are not the main staple of its budget, but that the funds assist with infrastructure maintenance. The citation fine total was well above the goal necessary to have $250,000 in reserve. The department also discovered that the new parking management system allowed it to capture more granular data, so new performance indicators will be created for the next assessment cycle. In addition, UPD has conducted over 30 presentations at SHSU since January 2017, covering a variety of topics. Other events included the Kats Safety Bash, Destination Spring Break, a mock dorm room fire, and an online or in-person training called All-Hazards Awareness [11]. Attendees who completed surveys gave above-average marks for all presentations and events.

In order to assess the efficiency of operations compared to similar universities, UPD compared its officer-to-population ratio to that of similar institutions. Universities with similar police departments typically have one officer for every 1,000 individuals in the total university population (i.e., students, faculty, and staff). Moving forward, SHSU’s UPD planned to maintain that ratio, or reach an even stronger ratio of one officer per 800 individuals in the total population. During 2016-2017, UPD, along with all other SHSU departments, had to find ways to accomplish more with fewer resources. Rather than hire additional officers, the department made several changes that positively impacted operational effectiveness. One change concerned the shift structure: rather than having eight- or ten-hour shifts, the department determined that twelve-hour shifts were in the best interest of all. With regard to campus safety, this meant that there would always be a shift minimum of two officers, whereas previously the shift minimum was one officer. UPD is currently maintaining the one-officer-per-1,000 ratio but plans to request additional personnel within the next couple of years.

Division of Information Technology – Information Technology (IT) Client Services (2016-2017 cycle) [12]

For the 2016-2017 assessment cycle, IT Client Services defined eight performance objectives. These objectives supported the following four goals:

  1. Analyze Opportunities to Increase Campus Efficiency, supported by:
    • Evaluate Processes to Increase Campus Efficiency
  2. Provide High-Quality Support Services to Campus, supported by:
    • Service Delivery Will be Perceived as a Good Experience for the Client
    • Service Delivery Will be Perceived to be Provided by Qualified Staff
    • Service Delivery Will be Perceived to be Timely and Efficient
    • Service Delivery Will be Perceived to Have Kept the Client Informed
  3. Provide Quality Information Technology Resources, supported by:
    • Provide Reliable Core Customer Services to Campus
  4. Provide Quality Professional Development Opportunities for Staff, supported by:
    • Provide Opportunity for High Quality Professional Development that Enhances Value
    • Provide Professional Development

As an example of how these outcomes were measured, IT Client Services put in place three performance indicators to measure increases in campus efficiency in costs, operations, and communications. One indicator was to improve the inventory audit process, which was accomplished through improvements in policies and procedures to better track the movement of assets and prevent loss. Implementing sign-in sheets that required signatures from staff, clients, and the Asset Manager greatly assisted in tracking items. The annual inventory audit revealed a small percentage of lost assets, which the department plans to reduce by tightening control over the physical areas where assets are kept prior to installation.

IT Client Services also employed surveys to satisfy the goal of providing high-quality support services. The target was that 95% of respondents would give a “satisfied” or better rating on their perceptions in the following areas: overall service request experience, technical qualifications of the staff member assisting with the request, and time needed to complete the service request. Out of more than 1,500 respondents, 95% were satisfied or very satisfied with the overall experience, 96% were satisfied or very satisfied with the technical qualifications, and 94% were satisfied or very satisfied with the completion time.

Division of University Advancement – Alumni Relations (2016-2017 cycle) [13]

For the 2016-2017 assessment cycle, Alumni Relations defined two goals and three performance objectives:

  1. Enhance the Image of the University, supported by:
    • Increase the Number of Meetings, Events, and Attendance
  2. Secure Private Support for the University, supported by:
    • Increase Total Membership in the Alumni Association
    • Meet Gifts Goal (non-dues Income) From the President’s Performance Indicator Report for FY ’17

In order to enhance the image of the University, Alumni Relations set targets to hold at least 360 meetings and events and to reach an overall attendance of at least 28,000 across all meetings and events. The unit exceeded both targets, holding 395 meetings and events with a total attendance of 28,874 through August 31, 2017.

Alumni Relations was also charged with securing private support for the University by increasing Alumni Association memberships. Mass mailings and emails to non-active alumni and lapsing members were employed throughout the year, and memberships were tracked through the Raiser’s Edge software system. The department’s goal of obtaining 13,400 members by August 31, 2017 was met, as the final count reflected a total membership of 13,700.

Example Assessment Plans and Reports

As further demonstration of ongoing quality assessment practices at SHSU, this narrative includes documents containing example assessment plans for the last three complete assessment cycles: 2014-2015 [14], 2015-2016 [15], and 2016-2017 [16]. The selection scheme used to highlight units captured roughly 50% of the administrative support unit assessment plans from each cycle. A complete list of all administrative support unit assessment plans for these cycles is included as part of the SACSCOC Reaffirmation Report Website.

Meta-Assessment

As part of University-wide efforts to promote quality assessment practices, OAPA also facilitates an annual meta-assessment process that is inclusive of administrative unit assessment plans. With the evaluation of the 2016-2017 assessment cycle, OAPA began transitioning to a locally developed, 4-point rubric [17] to evaluate the quality of programmatic assessment plans. This rubric was a revision of an older, locally developed 3-point rubric [18]. The focus of the meta-assessment review is not on what is being assessed by each unit, but rather on the quality of the assessment, with emphasis on the assessment practices and processes themselves.

Feedback from the annual meta-assessment reviews is used in two ways. First, completed meta-assessment rubrics are returned to the individual units for review and use in the continuous improvement of their assessment practices. Second, data from these reviews are used by University administrators to identify areas where training and resources are needed to improve programmatic assessment efforts. Examples of the most recently completed meta-assessment rubrics are provided to highlight this process [19] [20] [21].

Administrative Program Review (APR)

In addition to the institutional annual outcomes assessment and meta-assessment processes, administrative units also participate in an APR process. The purpose of APR is to support executives, managers, and employees of SHSU administrative departments in the examination of current operations, identification of opportunities for enhancement, implementation of adjustments, and establishment of plans for continuous improvement. The APR process is facilitated by the Office of Institutional Effectiveness [7] [22].

The Administrative Program Review Process

The APR process incorporates three stages: (a) the completion of a self-study, (b) the completion of a peer review, and (c) the development of an action plan for improvement.

The Self-Study Process. Under the guidance of the Assistant Director for APR, the administrative department completes a self-study document, which addresses the department’s mission statement, facilities, plan for staffing, stakeholder feedback, policies and procedures, and communication/outreach techniques. Supporting datasets and administrative documents are provided to the department head in advance of the self-study. These may include documentation of annual assessments, department budget and expense reports, institutional and divisional goals, guidelines issued by the Council for the Advancement of Standards in Higher Education (CAS), organizational charts, etc. [23] [24].

The Peer Review Process. A peer review committee is identified by the administrative department head. Coordinated by the Assistant Director for APR, the committee consists of, at minimum, one internal reviewer (on-campus faculty or staff from outside the administrative department), one external reviewer (professional not employed at SHSU), and one student reviewer. The role of the peer review committee is to review the department’s self-study, to meet with the department and other committee members, and to complete a rubric documenting evaluations and recommendations. Rubrics are collected by the Assistant Director for APR. Identifying information is removed, and the rubrics are given to the department head for review. The comments and recommendations given by the peer reviewers are taken into consideration as the action plan is developed [25] [26].

Action Plan Development. Once the peer review process is complete, the administrative department head works with the Assistant Director for APR to create an action plan, establishing viable plans for continuous improvement [27] [28].

Upon completion of the action plan, APR staff compose an executive summary [29] [30] outlining the department’s review process and action plan objectives. This document is distributed to departmental and divisional leadership.

Follow-Up Process. Two years following the program review, the Assistant Director for APR and the department head meet to discuss progress made on action plan objectives. Any outstanding issues or barriers to improvement are discussed and addressed, and adjustments to the action plan are made as necessary. The APR process is revisited once every five years.


Supporting Documentation

Documentation Reference Document Title
[1] Office of Academic Planning and Assessment Website
[2] OAPA Assessment Resources Website
[3] CampusLabs User Guide
[4] Academic Planning and Assessment (2016-2017 cycle)
[5] Assessment Mini-Grants Website
[6] Institutional Effectiveness (2016-2017 cycle)
[7] Administrative Program Review Website
[8] Human Resources (2016-2017 cycle)
[9] Human Resources Training Website
[10] UPD (University Police Department) and Parking (2016-2017 cycle)
[11] All-Hazards Awareness Training Website
[12] IT Client Services (2016-2017 cycle)
[13] Alumni Relations (2016-2017 cycle)
[14] Administrative Effectiveness (2014-2015 cycle)
[15] Administrative Effectiveness (2015-2016 cycle)
[16] Administrative Effectiveness (2016-2017 cycle)
[17] Revised SHSU Meta-Assessment Rubric (4-point Scale)
[18] Previous SHSU Meta-Assessment Rubric (3-point Scale)
[19] IT Enterprise Services Meta-Assessment Rubric (2016-2017)
[20] Payroll Office and Tax Specialization Meta-Assessment Rubric (2016-2017)
[21] Advancement Services Meta-Assessment Rubric (2016-2017)
[22] APR Presentation
[23] Institutional Effectiveness Administrative Program Review Self-Study
[24] Visitor Services Administrative Program Review Self-Study
[25] Institutional Effectiveness Administrative Program Review Peer Review
[26] Visitor Services Administrative Program Review Peer Review
[27] Institutional Effectiveness Administrative Program Review Action Plan
[28] Visitor Services Administrative Program Review Action Plan
[29] Institutional Effectiveness Administrative Program Review Executive Summary
[30] Visitor Services Administrative Program Review Executive Summary