
Notes from “QM Program Review – What We Learned” #qmconf2015

Notes from “QM Program Review – What We Learned”, presented by Ron Legon, Executive Director of Quality Matters, at the 2015 Quality Matters (QM) conference.

Quality Matters (QM) created the Program Review rubric as a response to requests from subscribers.

Value of Program Certification

The value to QM is building on the implications of course review rubrics to look at learner support and teaching support. It also broadens the scope of QM services and provides recognition to outstanding programs. Furthermore, the program review rubric addresses increasing public, accreditor, and regulator concerns about online programs.

For institutions, marketability is an important outcome of program review. The program review also allows programs to demonstrate overall program outcomes for students. Others appreciate the opportunity to review themselves more critically and to improve their quality. It is also an opportunity to improve processes across the institution to better serve students.

The Pilot

The pilot was a testing phase to determine how the process would work in practice. For example, does the evidence QM asks for exist across a variety of institutions? Can reviewers interpret that evidence consistently?

For the review, QM used Basecamp, but proprietary tools will be ready for the full launch. QM is also developing training for program liaisons and reviewers.

The pilot was evaluated intensively throughout: staff monitored Q&A from all participants and tracked the outcomes of the reviews. Participants completed surveys afterward, and the study team approved modifications based on their feedback.

The pilot occurred in Spring 2015, with 10 programs from 7 institutions across all four areas of the review (Program Design, Teaching Support, Learner Support, and Learner Success). Eight of the ten programs met the criteria and earned certification.

Highlights of the Program Certifications

The rubric includes criteria and annotations, like the course review rubric. In addition, the program rubric includes specific directions for the types of evidence to submit. It is possible for an institution to submit a single program, or all of the programs for a department or the entire university. In each case, there are limits on the evidence required:

  • For single programs, evidence is submitted from the program overall and 5 representative courses (to extrapolate to the entire collection of courses).
  • For departments/colleges, evidence is submitted from up to 5 programs (and 5 associated courses for each).
  • For institutions, evidence is submitted from 5 representative programs from different areas of study across the institution.

Throughout the rubric, programs must have 3 years of data ready to demonstrate this commitment over time. This ensures that programs are established and that their processes have been tested and improved over time.

Note that the criteria listed below for each rubric area are incomplete and provide a high-level overview only.

Program Design

Criterion #1 – essentially, all included programs should have measurable objectives

Criterion #2 – essentially, all course objectives are consistent with program objectives, and an alignment map or table is submitted as evidence (see the sketch after these criteria)

Criterion #4 – All courses must be on a path to align with the QM Rubric, through an official review or revision process, informal reviews, or implementation of a design template that assures certification
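
The alignment map referenced in Criterion #2 could take many forms. Here is a minimal sketch of one possible structure with a simple coverage check; the objective labels are invented for illustration, and QM does not prescribe any particular format:

```python
# Hypothetical alignment map: each course objective is keyed to the
# program objective(s) it supports. All labels are invented for
# illustration; QM does not prescribe this format.
alignment_map = {
    "BIO101: Describe cell structure":   ["PO1: Explain core biological concepts"],
    "BIO101: Interpret lab data":        ["PO2: Apply scientific reasoning"],
    "BIO210: Design a controlled study": ["PO2: Apply scientific reasoning",
                                          "PO3: Communicate findings"],
}

program_objectives = {
    "PO1: Explain core biological concepts",
    "PO2: Apply scientific reasoning",
    "PO3: Communicate findings",
    "PO4: Demonstrate ethical practice",
}

# Consistency check: flag any program objective that no course
# objective aligns with (PO4 here, to show the check firing).
covered = {po for targets in alignment_map.values() for po in targets}
for gap in sorted(program_objectives - covered):
    print("No course objective aligns with:", gap)
```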

Teaching Support

Criterion #1 – Training in online teaching is required. 85% of all instructors (including faculty, part-time, and adjunct instructors) must have undergone training in online teaching prior to teaching online, or concurrently with their first online teaching assignment (a worked example of the threshold follows these criteria).

Criterion #4 – The program uses learner feedback to improve online teaching. Evidence includes the data collected, the process used, and samples of reforms made based on feedback.

Criterion #5 – The program sets expectations for instructor responsiveness. Essentially, there must be guidelines in place for faculty, along with survey data on student satisfaction with instructor availability and responsiveness.
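
To make the 85% threshold from Criterion #1 concrete, here is a minimal sketch; the function name, roster size, and numbers are all hypothetical:

```python
def meets_training_threshold(trained: int, total: int) -> bool:
    """True when at least 85% of instructors have completed online-teaching
    training. Integer arithmetic avoids floating-point edge cases."""
    return trained * 100 >= total * 85

# Hypothetical roster: 40 instructors in total (faculty, part-time, and
# adjunct instructors all count toward the threshold).
print(meets_training_threshold(trained=34, total=40))  # True  (34/40 = 85%)
print(meets_training_threshold(trained=33, total=40))  # False (33/40 = 82.5%)
```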

Learner Support

Criterion #1 – The program provides an array of online learner support, with a list of recommended services that must be in place.

Criterion #2 – Learner feedback is used to improve learner support services. Programs must submit evidence of data collection, analysis, and reforms based on learner satisfaction over the past 3 years.

Learner Success

The definition of learner success and how it is measured are up to the institution, not imposed by QM. This reflects the culture, resources, and mission of each institution. Ron Legon points out that this area has been the hardest for institutions to approach, because many of the needed structures and supports are not yet in place.

Criterion #1 – The institution must have a definition of learner success consistent with its philosophy, history, mission, and goals.

Criterion #2 – The program identifies 3-5 measures of learner success, based on data or surveys, that demonstrate the extent to which learners are succeeding.
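
As one hypothetical illustration of such a measure (neither the metric nor the numbers come from QM), a program might track course completion rates across the three years of required data:

```python
# Hypothetical cohort data: enrollments and completions per academic year.
# Completion rate is just one example; under this criterion each
# institution defines its own measures of learner success.
cohorts = {
    "2012-13": {"enrolled": 310, "completed": 262},
    "2013-14": {"enrolled": 335, "completed": 291},
    "2014-15": {"enrolled": 360, "completed": 322},
}

for year, c in cohorts.items():
    rate = c["completed"] / c["enrolled"]
    print(f"{year}: {rate:.1%} completion rate")
```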

Exemplary Program Designation

If an institution achieves all four certifications within a 2-3 year period, it is automatically recognized as an exemplary institution; no additional review or cost is involved. QM will publicize this distinction widely, and the institution should do the same, as it is a mark of high distinction and quality.

Changes Based on the Pilot

  • Clarified language of the criteria
  • Added annotations
  • Structured the process around initial and second submission of evidence
  • Developing an automated Program Review Management System (PRMS), available in January 2016
  • Creating courses for Program Liaisons and Reviewers
  • Introducing a Readiness Checklist for Program Liaisons and a Data Cover Sheet for each data submission
  • Adjusting pricing and reviewer stipends based on time demands