Three Approaches to Getting Started with an Academic Program Evaluation System (PES)

November 22, 2022

It is time to decide which academic programs to start, stop, or grow. As you evaluate your programs, both existing offerings and new opportunities, you will need to choose the best approach for your institution. You know the consequences poor decisions can create. How do you do it better?

The Old School Approach

At Gray, we often refer to this as the “throw spaghetti at the wall and hope it sticks” method. Decision-makers get input from faculty, trustees, local employers, students, and others. People contribute, but ideas are often limited by departmental perspectives, politics, and areas of expertise.

Local employers may also suggest programs because they need trained graduates. The caution here is that employers don’t fill seats; students do. Employers may provide good jobs to program graduates, which is important. However, a program won’t be successful for the institution if students aren’t interested and no one enrolls.

In this old-school approach to program selection, ideas get thrown around, and champions look for data to support them. However, calling this approach data-informed would be misleading. The evaluation criteria are often undefined, and data sources may vary from program to program. The data itself is messy, error-prone, and time-consuming to collect. The decision-making process is usually ill-defined, slow, and opaque, which raises suspicions that it is political, not fact-based. Often these suspicions prove true. Programs are usually considered one at a time, which doesn’t allow a big-picture perspective or a collaborative decision-making process. All too often, the process overlooks readily available data on what students want, leading to expensive investments in new programs that never meet enrollment goals.

Building an Academic Program Evaluation System from Scratch

The benefit of building an academic program evaluation system from scratch is ownership. However, institutions that choose this approach must be prepared to devote a few years and several hundred thousand dollars to the process.

First is the need to develop software scrapers to capture some of the data (e.g., search volumes by keyword). Purchasing additional data, such as job postings or enrollment statistics, will also be necessary. Once the data is available, it will need to be cleaned, and crosswalks developed. For example, a crosswalk from academic program CIP codes to occupational SOC codes is necessary for analyzing the employment available to graduates of each program.

With 1,500 academic programs, six degree levels, and 700 occupations, this task is not for the faint of heart. There is a shortcut: the NCES crosswalk. However, it vastly underestimates the jobs available to graduates. History majors, for example, go into over 500 occupations, not just the five that appear in the NCES crosswalk.
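To make the crosswalk idea concrete, here is a minimal sketch of a program-to-occupation lookup in Python. The CSV file and its column names are hypothetical placeholders, not a real NCES or Gray data product, and a production crosswalk would need far more nuance (degree levels, weights, and so on):

```python
import csv
from collections import defaultdict

def load_crosswalk(path):
    """Read CIP-to-SOC pairs from a CSV with columns 'cip_code' and 'soc_code'."""
    cip_to_soc = defaultdict(set)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            cip_to_soc[row["cip_code"]].add(row["soc_code"])
    return cip_to_soc

# "cip_soc_crosswalk.csv" is a hypothetical file name used for illustration.
crosswalk = load_crosswalk("cip_soc_crosswalk.csv")

# 54.0101 is the CIP code for History, General.
history_occupations = crosswalk.get("54.0101", set())
print(f"History maps to {len(history_occupations)} occupations")
```

The lookup itself is trivial; the hard part is building a mapping that reflects where graduates actually work, which is why a History program should map to hundreds of occupations rather than a handful.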

In our experience, Excel cannot handle the volume of data, so a database engineer will be needed to pull in the data and store it. Once the data is assembled, visualizations need to be developed so that constituencies can access and understand the information.
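For a sense of what “pull in the data and store it” involves, here is a minimal sketch using Python’s built-in sqlite3 module. The database file, table, and columns are hypothetical, and a real build would likely use a server-based database rather than a local file:

```python
import sqlite3

# Hypothetical local database file used for illustration.
conn = sqlite3.connect("program_evaluation.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS job_postings (
           soc_code      TEXT,     -- occupation (SOC code)
           region        TEXT,     -- labor market region
           year          INTEGER,
           posting_count INTEGER
       )"""
)

# The kind of aggregation that stalls a spreadsheet once the table holds
# millions of rows: total postings by occupation for a single year.
top_occupations = conn.execute(
    """SELECT soc_code, SUM(posting_count) AS postings
       FROM job_postings
       WHERE year = 2022
       GROUP BY soc_code
       ORDER BY postings DESC
       LIMIT 10"""
).fetchall()

conn.close()
```

The point is not the specific engine; it is that aggregations across millions of rows, which routinely bog down a spreadsheet, are routine work for a database.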

Once the new academic program evaluation software is built, it will need to be maintained. Each data source must be updated, taxonomies must be adjusted as they change, bugs must be fixed, and the data structure and interface usually need to be improved. Users will need training and support. Maintenance will be a full-time job for at least a couple of people.

Subscribing to Academic Program Evaluation Software

There are many benefits to subscribing to academic program evaluation software. Professionals can have it up and running for you in a few days. The data is cleaned by people who understand the many idiosyncrasies of educational data sources. The crosswalks reflect years of experience and hard data (from the American Community Survey). The data is updated as it becomes available. Over 1,500 programs are included.

Institutions sometimes consider subscriptions to these services expensive. Please keep in mind that vendors pass through the economies of scale they achieve by helping hundreds of institutions. For example, Gray pays more for one of our datasets than we charge a client for all of them. The cost of academic program evaluation software is insignificant compared to the hundreds of thousands of dollars that are often lost on a failed program launch. There are other benefits, too: when institutions understand the data, they can find and launch programs that generate millions of dollars in new revenue.

No matter which approach you choose, establishing a data-informed academic program evaluation system at your institution will help ensure the long-term success of your academic program portfolio. Your system will help you set clear evaluation criteria, score programs, and rank them, ensuring that the best programs get considered.

Mary Ann Romans

DIRECTOR OF MARKETING

Mary Ann creates, defines, and executes the content marketing strategy at Gray, collaborating with the entire team to support our higher education partners through effective communication and provision of critical industry information.

About Gray DI

Gray DI provides data, software and facilitated processes that power higher-education decisions. Our data and AI insights inform program choices, optimize finances, and fuel growth in a challenging market – one data-informed decision at a time.
