AUDIO START: [0:00:00]
Mark Keleher: Good afternoon everyone, and thank you for joining Gray’s monthly webcast on the most recent student and employer demand trends for Higher Education. All the trends we will reveal this afternoon are derived from Gray’s program evaluation system. This system supports ongoing academic reviews at colleges and universities across the country.
Before I introduce our partners this afternoon and pass the baton over to our CEO, Bob Atkins, a couple of quick housekeeping items. Please feel free to share any questions with us in the chat window on the left-hand side of the page. We will be sure to answer any and all questions towards the end of today’s webcast. A copy of the presentation and the recording will be emailed to all registrants. Without further ado, Bob?
Bob Atkins: Thank you very much, Mark, and welcome everyone to the July edition of our webinar. We’ll be sharing results, as Mark mentioned, through July 2018. And today I’m delighted to have on the phone with me Linda Osterlund, and Linda’s going to share some information that she has developed in assessing program sustainability at Regis. I’ve worked with them, and I think they’re on the cutting edge here in terms of looking at sustainability in a very robust way. So I’m very much looking forward to hearing her presentation, and thank you, Linda, for joining us today.
Linda Osterlund: Thank you!
Bob Atkins: Let me give you a chance to introduce yourself.
Linda Osterlund: Hi, thank you for having me. I’m currently the Academic Dean for Rueckert-Hartman College for Health Professions at Regis University. And, as Bob said, in my previous role as Vice Provost for Resource Planning, I facilitated our Program Sustainability Project for Regis in 2017. It was an ambitious project and I’m looking forward to sharing more about that with you today.
Who is Gray?
Bob Atkins: Thank you. And before we dive in, a little bit about who the devil Gray is. I will be very brief on this today. We are a strategy consulting firm focused on Higher Education, and we have differentiated ourselves by bringing together a variety of data on student demand, employment, competition and demographics, to help schools make their programmatic decisions, and Regis is one of those. Recently, I wanted to let everybody know, we’ve launched our Program Economics Model, which allows you to understand the variable contribution of your academic programs. Another way of saying that would be tuition and fees net of direct instructional expenses for your program. So, it’s kind of fun, and we’ll talk a bit more about it later. But it’s another light on the sustainability of a program.
Agenda - Demand Trends in Higher Education
Now let’s talk a little bit about inquiry volumes, so what’s happening to demand in the marketplace.
Overall Student Inquiries (All Sources)
As many of you know, we’ve been tracking inquiry volumes for years. Actually, all the way back to 2012. And the way we display that information is by month, so we can account for seasonality, and by year using the bars. So, the blue bar here is 2016 data, the green bar is 2017 data and the deep red bar is 2018 data. So, if those newer bars are below the older bars then we’re shrinking and if they exceed them we’re growing.
Now what’s been going on? As we look at April, May and June, results are better than they had been: we’re more or less at last year’s level, or slightly up, and that’s coming on the heels of three years of decline in this metric. So, it looks as though the decline in inquiry volumes is coming to an end, and that seems to be fairly consistent; it’s been either flat or slightly up for the last six months. That’s a very welcome change, and in July we’re actually up 3%, which is the second-best result so far this year.
Overall Student Inquiry Conversions (All Sources)
Now an inquiry is great, but it’s nice if it converts to something later in the process, ideally ultimately a student. Conversions are also up, about 2%. Now you might well ask, what exactly is a conversion? We count conversions in the month in which the inquiry was received, so if somebody inquired in April and then converted to a later stage in the buying process in July, they would be counted as an April conversion.
Now, what later steps are included? The catch here is that many schools report to us differently, so we have to mash it all together and call it a conversion. A conversion really is just any step beyond the inquiry in the buying process: in some schools that’s an application, in other schools that might be a new student. So, it encompasses everything up to a new student after somebody inquires. And as I said, that’s a very important metric in understanding the productivity of the inquiry data and whether people are actually converting into students or just looking.
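To make that attribution rule concrete, here is a minimal sketch of cohort-style conversion counting. This is purely illustrative, not Gray’s actual system, and the field names are made up.

```python
from collections import defaultdict

def conversion_rate_by_cohort(inquiries):
    """Count conversions against the month the inquiry was received.

    `inquiries` is a list of dicts with hypothetical fields:
    'inquiry_month' (e.g. '2018-04') and a 'converted' flag. An April
    inquiry that converts in July still counts as an April conversion.
    """
    received = defaultdict(int)
    converted = defaultdict(int)
    for inq in inquiries:
        received[inq["inquiry_month"]] += 1
        if inq["converted"]:
            converted[inq["inquiry_month"]] += 1
    return {m: converted[m] / received[m] for m in received}

sample = [
    {"inquiry_month": "2018-04", "converted": True},   # converted in July
    {"inquiry_month": "2018-04", "converted": False},
    {"inquiry_month": "2018-05", "converted": False},
]
print(conversion_rate_by_cohort(sample))  # {'2018-04': 0.5, '2018-05': 0.0}
```

Attributing the conversion back to the inquiry month is what lets a school compare cohorts fairly, even though later cohorts have had less time to convert.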
Average Price for Pay-per Inquiry
Now those inquiries come at a price, and that price was actually skyrocketing for a while there. It ran up from $45 to almost $50 per inquiry in three months, and fortunately it has started to drop off again. It dropped to $46.31 this month, which brings it back down to the level it was towards the end of last year. So, this is relatively good news; hopefully it’s not just some sort of seasonal artifact, but prices seem to be steadying here.
A word of caution though: if you’re thinking that you can get a student for $46 or so, a more reasonable assumption would be that about three to five percent of inquiries will convert to a student. So, you’d have to multiply that price by 20 or 30 to understand what it costs to get a student; think on the order of $1,000 per student on this data if you’re buying inquiries. And I should also say that while prices are down over recent months, they are still up year over year from last year; so we’re not seeing a year-over-year decline, but we are seeing a month-over-month decline.
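The arithmetic behind that caution fits in a couple of lines. The 3–5% conversion range is the rough figure Bob cites, not a measured constant.

```python
def cost_per_student(price_per_inquiry, conversion_rate):
    """Back-of-the-envelope cost to acquire one student from paid inquiries:
    price per inquiry divided by the inquiry-to-student conversion rate."""
    return price_per_inquiry / conversion_rate

# At $46.31 per inquiry, a 5% conversion rate implies a ~20x multiplier
# and a 3% rate a ~33x multiplier:
low = cost_per_student(46.31, 0.05)   # roughly $926 per student
high = cost_per_student(46.31, 0.03)  # roughly $1,544 per student
print(round(low), round(high))
```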
Inquiries for Online Programs
Now let’s see how it breaks down between online and on-ground. For those of you who are betting, yes, online is indeed up. It’s up 11%, and very much a bright spot in the demand data. And that’s been true for some time; online’s been doing quite well through the beginning of this year. With overall inquiries up only 1% or 2%, on-campus had to fall, and it did. It’s down 11%, so we’re seeing less and less interest from students in going to a more traditional on-ground school.
The Big 5 Programs: July Growth Year-over-Year
Let’s break it down a little bit by degree level and by program. If we look at the five largest programs in our data set, two of the five are up now: Healthcare Administration and Criminal Justice. The declines in a couple are very modest; Registered Nursing and Business Administration are basically flat. And Medical Assisting continues to plummet, down 24%; that program has been dropping for quite some time.
But I will also say that, if you looked at these numbers a year ago, all 5 of these big programs would have been shrinking so this is actually quite an improvement year over year, to have things be largely flat or even up for several of our major programs.
The Fast 5 Programs: The Higher Education Programs with the Fastest Inquiry Growth
So, in terms of growth, we’ve got five fast-growing programs, all of them up 15% or more, and one of them, Public Administration, up 70%. You’ve got 39% growth in Healthcare Administration, which is both big and growing. Then there are some other things we’ve seen often in this fast-growing area: Web & Digital Design and Graphic Design. And Medical Insurance Coding made the list this month. So those are the programs growing fastest in our data set.
Inquiry Volumes by Degree: Year-over-Year Change
Now, if you ask which degree levels are doing well, it’s kind of interesting. The first observation I would make here is that certificates are very volatile. There are a couple of reasons for that; one, I think, is that because they’re so short term, they can go up and down fairly quickly depending on who’s marketing what and student interest in a given area. So, you can see here post-baccalaureate certificates are up 56%; for some reason, post-master’s certificates dropped 41%; and undergraduate certificates are actually down 4% as well. So, a lot of volatility there, including some very rapid growth.
After that the story gets a little bit clearer: the higher the degree level, the better the growth. Doctoral degrees are up 21%, master’s are up 16%, bachelor’s up 5% and associate’s up 2%. So, growth rises by degree level.
Online Inquiries: Year-over-Year Inquiries by State
You might well ask where is it growing. And if you look at online inquiries in the US, it looks to me like we’ve got kind of a power alley running down through the Midwest where growth is more robust. You can see Wisconsin, they’re above Illinois, and that must be one of the Dakotas there in the middle. And Kansas as well, I think that’s Arkansas, testing my US geography, and New Mexico. So, there are a number of states there in the Midwest that are growing very rapidly, 20% or more in terms of inquiry volume. And then some good growth on the Pacific Coast as well. East Coast overall is much weaker than the other areas.
Student Inquiries: The Big 5 Cities
Let’s drill down geographically a little bit more specifically and look at individual cities. Now I’m going to switch out of online, we’re talking about our overall inquiry data. And the good news here, last year all these cities were down. And down big time, over 10% each. This year we’re starting, finally, to see growth in the major cities with Philadelphia up 8%, Chicago up 13%, New York more modest just up 4%, and then LA and Houston shot the lights out this month, both of them up over 20%.
Google Search Trends: Programs
Now let’s dig into another way of looking at demand; we’ll talk about Google for a minute. The data we were just looking at was inquiries; now we’re looking at Google search queries and trying to understand what programs people are looking for on Google. First off, we are seeing a little bit of growth here this month, and that’s a change. We haven’t seen growth for almost eight months. So, this is good news again: demand may be picking up in terms of searches for academic programs.
Google Search Trends (Brands)
We cross-checked that by looking at brand search for a sample of 75 higher education brands. You might well ask why bother? Aren’t you looking at the same thing? The answer’s not really. There are many students who pick their college first, not their program. And they won’t make up their mind until they’ve been at school for a year or two and they picked their major. So they may never show up in programmatic search. So, it’s important to keep track of searches for brands as well for all those students who pick the brand first and the program later.
These two had been pretty well synced, kind of flat to slightly down overall. This month they diverged: as I mentioned, program search is up, while brand search is off just a shade, down 2%.
And in terms of what programs are winning in Google, I’ll focus here on the ones that are actually losing. We seem to be getting hit in Teaching and Hospitality. We’ve got three teacher programs here that are down double digits, 20% or more: “Teacher Education, Multiple Levels,” “Elementary Education” and “Education, General” are the dark blue bars.
And then we’ve got two Hospitality programs that I can only say got hammered this month, both down over 40%: Hospitality Administration and Hotel/Motel Administration and Management. Those really took a strong hit, and it’s not the first month in which they’ve been down. And then there’s an assortment of other programs that also dropped: Chemistry, Criminal Justice and Administrative Assistant, all down 20% or more.
In terms of brand search, coming back and looking at growth, we’ve got a handful of schools here that are up 10% or more, starting with SNHU up 11%. Our winner again this month is UTI, which has actually had its brand search go up 18% year over year. So they’re doing some great things there in terms of stimulating their brand.
We’ve got West Coast University at 17%, Western Governors is doing very well here at 15%. And then we drop down to around 12% with University of Wyoming, Florida Career College and SNHU, all growing double digits. Harvard made the list this month, which is near and dear to my heart, up 7%, but probably on much smaller volumes than some of those others. And University of Kansas actually up 6% as well.
Program Sustainability: An Integrated View
Now, what we’ve been looking at so far is primarily market data. When you think about Program Sustainability, it’s about a lot more than just whether the program is in a good and healthy market, where student demand is growing and employer demand is growing. You also need to understand program economics, whether it fits with the institutional mission, and whether it’s supporting the academic standards of the institution, are students getting through the program? Is it rigorous enough?
We believe that a good program review actually is multi-dimensional and would include all of these aspects. What I’m going to do now is talk about a couple of them, Program Markets and Program Economics, and then I’m going to turn it over to Linda who’ll show you how Regis actually integrated data from all of these areas in its program review.
So first, when we talk about understanding the attractiveness of program markets, what do we really mean? We would suggest you look at four broad elements. First, is there Student Demand? That’s obviously what we’ve been touching on, but you need that data by program. Second, are there employment opportunities for your graduates, and do they pay a wage that will enable your graduates to pay back their loans? Third, Competitive Intensity: is this field already full? Are you able to compete against the other folks who are offering that degree? And finally, Degree Fit: what level is this degree, and is that level appropriate for the school you’re in? If you’re a community college and the field is predominantly a PhD market, it’s not much help. So you need to look at all four of those elements to assess the market.
Program Markets: Program Scorecard
Once you’ve done that, we would suggest pulling together a great deal of data. Obviously, this is something we have and do. We create these scorecards that take data on student demand in terms of inquiries, as we’ve touched on today, Google search and completions, and give you metrics on each of those things. Importantly, we find when you’re dealing with this many numbers, color coding also helps: the dark green numbers are in the top 2% of all programs in the dataset, which is about 1,400 programs.
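As a rough illustration of that kind of percentile-based color coding: the only band stated in the talk is the top-2% dark green, so the cutoff below is the one real rule and everything else is a hypothetical choice.

```python
def percentile_rank(all_values, value):
    """Fraction of programs scoring at or below `value` on a metric."""
    return sum(v <= value for v in all_values) / len(all_values)

def color_band(pct_rank):
    # Only the top-2% "dark green" rule comes from the talk; any further
    # bands would be institution-specific choices.
    return "dark green" if pct_rank >= 0.98 else "no highlight"

scores = list(range(1, 1401))  # stand-in for the ~1,400-program dataset
print(color_band(percentile_rank(scores, 1400)))  # dark green
```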
So, for example, online inquiries for Computer Science are 26,723; that is in the top 2% of all programs in terms of searches for online programs. And you can see there’s data here on competition, including how many competitors there are and some competitive intensity indexes, one from Google and several of our own. We also suggest you keep data on employment, from somebody like Burning Glass or [16:04 -- inaudible] Services, which keeps track of open job postings, as well as BLS, which gives you longitudinal data on total employment and a forecast of what it’s likely to become. And finally, wages are also reported by BLS; you can also get that data from the American Community Survey, as we do for gainful employment for your two-year programs.
And then, as a cross-check, we suggest you take a look at the BLS statistics, which will tell you what percentage of the workforce has what degree; that will help you make sure you aim your program at the right degree level. You can also see what degrees people are graduating with now using IPEDS; in this instance, 56% of all people graduating from Computer Science programs are coming out at the bachelor’s level. So that’s a little bit on the sort of data you’ll want to collect when you get into doing market analysis.
Program Sustainability: An Integrated View
Now let’s touch a little bit on Program Economics. I suspect this is somewhat of a politically charged topic. But given budget constraints at schools, it’s only being responsible to understand what your economics are. Like all the other metrics you could look at, you can’t make a decision on this alone. It’s not more important than markets and certainly not more important than Academic Standards or Mission. But it is important to know what the numbers are.
Program Economics: Methodology
And the trick is that a program is not the same as a department or a course. Programs generally span more than one department: if you’re in Nursing, about 20% or 30% of your courses are actually taken outside of Nursing. Most of your credit hours will be taken inside Nursing, but there is still a wide variety of courses, like English and Social Studies, that need to be taken outside of the major itself. So, to calculate program margins you need to take each student, calculate the tuition net of institutional grants, and assign that to the courses that student takes, so you can see what revenue is generated by each course, by student. Then you do a similar exercise with instructors: look at the salary and benefits of the instructors and assign that to courses as well. And then the students attending the courses carry the cost and revenue forward into their programs, so that the sum of all courses taken by a Nursing student contributes to the profitability of the Nursing program. As I mentioned, that includes courses both inside and outside of Nursing.
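The allocation Bob describes can be sketched roughly as follows. This is a simplified reading of the method, with made-up field names; a real model would also handle fees, aid timing, shared instructors and so on.

```python
from collections import defaultdict

def program_margins(enrollments, net_tuition, course_cost):
    """Roll course-level revenue and cost up to each student's program.

    enrollments: list of {'student', 'program', 'course', 'credits'}
    net_tuition: {student: tuition and fees net of institutional grants}
    course_cost: {course: instructor salary + benefits assigned to it}
    """
    student_hours = defaultdict(float)  # total credit hours per student
    course_hours = defaultdict(float)   # total credit hours per course
    for e in enrollments:
        student_hours[e["student"]] += e["credits"]
        course_hours[e["course"]] += e["credits"]

    margin = defaultdict(float)
    for e in enrollments:
        # Revenue: the student's net tuition, pro-rated to this course by credits.
        revenue = net_tuition[e["student"]] * e["credits"] / student_hours[e["student"]]
        # Cost: this course's instructor cost, shared by the students in it.
        cost = course_cost[e["course"]] * e["credits"] / course_hours[e["course"]]
        # Both roll up to the *student's* program, even for courses
        # taken outside the home department.
        margin[e["program"]] += revenue - cost
    return dict(margin)

enrollments = [
    {"student": "A", "program": "Nursing", "course": "NURS101", "credits": 3},
    {"student": "A", "program": "Nursing", "course": "ENGL101", "credits": 3},
    {"student": "B", "program": "English", "course": "ENGL101", "credits": 3},
]
margins = program_margins(
    enrollments,
    net_tuition={"A": 1200, "B": 600},
    course_cost={"NURS101": 400, "ENGL101": 300},
)
# The Nursing program absorbs student A's English course: revenue 1200,
# cost 400 plus half of 300, for a margin of 650.
```

The key design point is that the Nursing program’s margin includes the English course its student takes, which is exactly why a program is not the same as a department.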
Once you’ve done that, you can create a fairly simple scorecard. We like to do this both in the aggregate, as you can see here, where we keep track of total revenue, instructional costs and contribution, and by student credit hour, which is a much better way of comparing one program to another because it adjusts for overall program size. Now, we also color code here. Most of the numbers you’re looking at in this top part, per student credit hour, are at the 75th percentile. In this case that tells you this program is doing better than 75% of other programs offered at the institution in terms of revenue per credit hour, cost per credit hour and contribution per credit hour.
So, while people often think of Nursing as a high-cost program, what we’ve found is that because the classes run full, the cost per credit hour is actually quite reasonable, and actually in the top quartile of programs offered by this institution. And there is some other information here in terms of number of courses, number of students and so forth, and the total student credit hours taught to people in the Nursing program, so you can keep track of those gross volume metrics. And there’s a stack of other analytics you can do behind this, which we support. But the scorecard is the critical piece of input when you’re thinking about a program, so you have one place you can go to see the numbers and how the program compares to others offered at your school.
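The per-credit-hour normalization itself is just division; the figures below are invented for illustration only.

```python
def per_credit_hour(total_revenue, instructional_cost, student_credit_hours):
    """Normalize program economics per student credit hour (SCH) so programs
    of very different sizes can be compared on the same footing."""
    return {
        "revenue_per_sch": total_revenue / student_credit_hours,
        "cost_per_sch": instructional_cost / student_credit_hours,
        "contribution_per_sch": (total_revenue - instructional_cost) / student_credit_hours,
    }

metrics = per_credit_hour(900_000, 300_000, 1_500)
# revenue $600/SCH, cost $200/SCH, contribution $400/SCH
```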
Program Sustainability: An Integrated View
Now let me turn it over to Linda and let her describe how Regis went about creating a scorecard and a process to assess Program Sustainability across all four of our dimensions.
Linda Osterlund: Thank you, Bob. I’ll just talk a little bit about what we did last year, in 2017. We started our process by inviting Gray to talk to us about how they do their scoring in those four categories for the market data. They helped us, first, to identify what all of our programs were; that was our biggest hurdle. And then they helped us score all of our programs, so that we could create a portfolio of all our current programs in order to create a strategy for short-term and long-term planning.
What we found helpful in this process was to add additional data points to what we were looking at. What we needed to do was basically look at an additional four categories of performance criteria.
Regis University’s Program Sustainability Project
I’m going to skip over here to this slide, 35, to show you.
5 Key Performance Criteria
So the additional performance criteria that we added to the four categories of market data that we got from Gray started with Mission and Institutional Fit. This was, for us, the qualitative data that we needed, information that we were able to get from each of the programs. So, we took the quantitative data we had and asked the units to give us their mission focus and how that strategically fit into our overall university strategy.
The second thing we looked at was Assessment of Trends, where we looked at new starts, head count and credit hours. We also added Resource Efficiency, where we were looking at revenue and contribution margins. Bob mentioned that, on the economics side, they’re now able to drill down into the contribution margin at the program level. In our study we were still at the point of looking at the margin at the college level, so I think our group would find it really helpful to be able to drill down a little further and see what the margin was at the program level. We also looked at Student Success, in terms of graduation, grades, retention and number of completions.
Program Sustainability Report
So those were all pieces that we added into an overall sustainability report. This is kind of small in terms of looking at the information; we wanted it all to fit on one page. This information first went to the faculty who led the program, so that they could look at the data, and they wrote up a couple of pages. Their comments were limited in length, but we wanted their feedback on what they were seeing in the data. So, they were able not only to contribute the mission and institutional fit, but also to contribute pages of commentary on the data, to give context to what we were seeing.
The next step was that the report went to our university academic council, which is made up of faculty from all five of our colleges. Regis University has five colleges, and those representatives were able to weigh in on the report, which included the quantitative data, the qualitative data and the commentary. They then ranked the programs, placing each one into one of three categories.
Program Sustainability Decision-Making Matrix
And those categories were growth opportunity, sustaining or intervention. The group was able to take those scores and put each program into one of those three categories, in order for us to decide how best to utilize our resources. So I’m going to go over a little bit more about how we got each of those bits of data, and then I’ll come back to how we categorized the programs and what we did next.
So, here’s an example of the demand data that was incorporated into the scorecard. This is an example for Computer Science, from our College of Computer & Information Sciences.
Computer Science 3 Year Credit Hours Trend
Here’s an example of how we looked at the credit hour trends; credit hours were part of our Assessment of Trends. This is a dashboard that our Deans and Chairs can look at; they monitor it on a regular basis.
Competitive Analysis for Computer Science
And here’s the competitive analysis that we would also incorporate into how we’re looking at competition.
Sample Regis University Academic Council (RUAC) Dashboard Screenshot
This is one of the other pieces of data that went into the contribution margin; actually, this is the number of completions. What we did was put the number of completions into the perspective of all of our programs, to see which programs are graduating the most students. It helped to give perspective on which programs are contributing the most to our overall bottom line, in order to help us strategize how important each of those programs is. We could have a program that is doing really well in terms of student completions or resource efficiency, but if it’s contributing a very small amount overall to the university, it may not be a priority for resource allocation.
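That "perspective" step is a simple share-of-total calculation; the programs and counts below are made up for illustration.

```python
def completion_shares(completions):
    """Each program's share of the portfolio's total completions, so a
    program's absolute contribution can be weighed against its efficiency."""
    total = sum(completions.values())
    return {program: count / total for program, count in completions.items()}

shares = completion_shares({"Nursing": 120, "CS": 60, "Fine Arts": 20})
# Nursing graduates 60% of this (made-up) portfolio's completers, so even a
# highly efficient small program contributes far less to the bottom line.
```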
So now I’ll go back to our matrix and say what we ended up doing. As a result of this project, we were able to target programs for growth, sustaining and intervention. We took those programs to the Deans, and they were able to create strategies and priorities. They had to prioritize three programs this year, in either the growth or intervention category, and strategize around them: which programs require the least amount of resources to grow, with the highest amount of value and potential, and which programs we might look at revamping, or even closing, in the intervention category.
I did just want to speak a little bit to what we learned. One of our takeaways was that we initially named our intervene category a “stop” category. The academic council felt strongly that any program reviewed in that quadrant should not be seen as something we would intentionally stop right away. It may not even be a question of stopping a program; it may be that the program needs intentional focus on reworking its academic pieces to make it more resource efficient, or on improving its quality to make it more attractive to students. So, we changed it to intervention.
And the other big takeaway for us was to include the faculty. We felt strongly that the faculty had to be involved in this process from the very beginning. They actually helped to choose, and to give feedback on, all of the criteria we were looking at. We also took the criteria to the budget committee to get their feedback. So for those areas I was showing you on our scorecard, a lot of feedback from the whole university went into the report, and there was buy-in across the board. When we came to the point of creating the portfolio strategies, we had buy-in from all of the major stakeholders. So, I’m open to any questions that anyone might have.
Bob Atkins: Well, while you assemble your thoughts, we would welcome your questions. Let me take a moment and just summarize what we’ve discussed today. I think we’re looking at a growing market, finally. It’s not growing fast, but it’s growing slightly, and it looks as though the long decline is finally grinding to a close, which is certainly welcome. And by the way, the growth is happening online and not on the ground, so many traditional schools are still going to continue to feel the crunch.
The second part is just this: how do you evaluate a program? As Linda and I mentioned, you really have to take a multi-dimensional look at it. You need to understand whether there’s still a market for the program. For some of the programs that are nearest and dearest to me, the markets really have dropped to the point where they’re very challenging. Language programs, for example, which I studied as an undergraduate, are really challenging to offer these days.
Second, you need to understand your program economics. Very, very few schools have the kind of endowment and funding that would allow them to continue to offer programs that are losing money, unless those programs are core to the mission of the school. So, knowing which ones make money and which ones don’t is actually quite important. You may continue to offer programs that don’t make money, but you need to have others that offset those losses, and keeping that all in balance is critical to the financial sustainability of the institution. That’s an important dimension, though I always have to mention, when we talk about economics, that it ain’t the only thing, and it’s not even the most important thing. But it is one of the things you have to keep an eye on.
And finally, Institutional Mission is vital, as are Academic Standards, and those two items really go hand in hand. When we talk about Academic Standards, it’s both the rigor and quality of the curriculum and the outcomes students actually get from the program: do they make it through, and do they learn what they are supposed to learn? I think, when you put those all together, you really have a robust view of the health of an individual program. And as you aggregate that up, you can get a sense of whether the overall portfolio of programs you offer as a school is healthy as well. So those are our thoughts. I want to thank all of you for joining us, and especially thank Linda for taking the time to join us today. I’ll stop there and, again, we’re open to questions if anybody has anything they’d like to ask us.
Mark Keleher: Thanks for that, Bob and Linda. We have a couple of questions coming in here, but again, all of these slides will be available at the conclusion of this afternoon’s webcast; you’ll be able to download them on the post-meeting survey page, and a recording will be emailed to all of the registrants.
So first of all, I guess this question’s for Linda. Linda, going through the process, you mentioned it was an ambitious undertaking. I guess we’re wondering: what surprised you about the process? It sounds like you learned that including faculty in the process was key, but I’m just trying to understand if there’s something about the process that surprised you.
Linda Osterlund: I think that we were impressed; I don’t know if I’d say surprised. We were impressed by the way we were able to engage the faculty in looking at the data. Initially, many faculty may not be people who focus on data. We had faculty who were professors of English or Fine Arts, and we were impressed by the fact that they were all able to get on board and begin to understand the data they were looking at, so that they could be engaged in the process of decision making. And I think we did a pretty good job of balancing both the qualitative and the quantitative aspects of the data.
The other thing I was impressed with was just how on target they were. When those final decisions were made about growth, sustain and intervene, what the academic council decided in terms of which programs went where fit, pretty much 99.9%, with what the Deans thought it should be, and when it was presented all the way up to our board of trustees, there was a sense that we nailed it.
Mark Keleher: Perfect. Thanks for that insight, Linda. Another question, this is for Bob or Linda. On an economics aspect, did you fully load credits with institutional and school indirect costs?
Bob Atkins: At this point, I would say no. The reason for that is when you’re making programmatic decisions, you’re probably not going to be changing those indirect costs. The president’s salary is not going to change if you drop your French program, and as a result, we’ve excluded them. We’re going to be building that in so it’s an option, but if you do go down that path, I would suggest you have the ability to turn your overheads on and off in your models, so that you can see the fully loaded cost but also the direct instructional cost, which is what will really change when you make program decisions. Linda, I’m not sure what you all did on that?
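A toggle like the one Bob suggests could be as simple as this; a hypothetical sketch, not Gray’s model, with made-up figures.

```python
def program_cost(direct_instructional, allocated_overhead, include_overhead=False):
    """Report direct instructional cost by default, and optionally layer in
    allocated indirect costs to see the fully loaded figure. Indirect costs
    rarely change with program decisions, so the default leaves them out."""
    return direct_instructional + (allocated_overhead if include_overhead else 0.0)

direct_only = program_cost(250_000, 90_000)                          # 250000.0
fully_loaded = program_cost(250_000, 90_000, include_overhead=True)  # 340000.0
```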
Linda Osterlund: So, what happened for us is we loaded the credits, but we didn’t include the indirect costs. Ours was primarily just direct costs, and we were only able to drill down to the college level for this report. We’re working on getting a better picture; we’re developing a formula that will look at indirect and direct costs together to give us a better idea of what our program costs are at that level. I’m appreciating what Gray is developing, though, and seeing that it could add value to what we’re looking at: contribution margin by program.
Bob Atkins: Great.
Mark Keleher: [36:08 -- inaudible] the next question. One of the audience members was asking if we have a platform and system to provide this information on a quarterly basis to division chairs and Deans.
Bob Atkins: So let me take a crack at that. There are two kinds of information you may be referring to: market data and program economics. On the market data side, we absolutely do have a system, and we do update it quarterly. It’s what we call our program evaluation system, and it tracks all that market data. Some of the national databases we use are only updated once a year, but a lot of the demand data we get monthly and update quarterly. So yes, we absolutely can do that. On the program economics, I think quarterly is probably a little too often; if you refresh more often than that, you’re refreshing more often than your numbers are actually going to change. So for most schools I would think we’d be looking at that by semester, as opposed to by quarter.
Linda Osterlund: I can just add, in terms of how we’re using the Gray data, we are using it not only to look at our entire portfolio but every time a program or school comes forward with an idea for a new program or proposal, we make sure to go back to the database and see where that program lands. And that’s part of the criteria for having a program approved to be started. So that addresses the start quadrant of what I showed you in terms of the matrix.
Mark Keleher: Perfect, very helpful. Another question, a follow-up to the indirect costs discussion earlier: what about marketing and recruitment cost per program, since those can vary? Are they considered in the program economics evaluation?
Bob Atkins: They should be, yes. And they will be, to the extent a school can identify how they vary by program. Some schools rely almost exclusively on brand marketing, so they’re not going to be able to break that cost down by program in any real way. Other schools are almost exactly the opposite: all their marketing is by program, and it should be tracked that way. So it will all depend on the school, in terms of what’s available as well as what makes sense to allocate to an individual program. Admissions, at most schools, tends to be more of an aggregate function, which is often broken down by individual schools but rarely by program. There, I would say, you will end up having to use some sort of allocation based on student headcount. But if there are any dedicated people, say, people dedicated to Healthcare and Nursing, then I think you could absolutely split that out and count it. In many schools, though, that doesn’t exist.
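The headcount-based allocation Bob mentions is simple to state precisely. A rough sketch, with entirely hypothetical programs, headcounts, and an assumed aggregate admissions budget:

```python
# Sketch of allocating an aggregate (shared) cost across programs in
# proportion to student headcount. Programs and figures are hypothetical.

def allocate_by_headcount(total_cost, headcounts):
    """Split a shared cost across programs by each program's share of students."""
    total_students = sum(headcounts.values())
    return {program: total_cost * n / total_students
            for program, n in headcounts.items()}

shares = allocate_by_headcount(
    90_000, {"Nursing": 300, "English": 100, "Fine Arts": 50})
# Nursing carries 300 of 450 students, so it absorbs 60000 of the 90000.
```

The same function works for any shared cost pool, such as brand marketing or general admissions staff, whenever no program-level tracking exists.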
Mark Keleher: Perfect. Thanks for that, Bob. One final question while we let the others come in. I just want to say that we appreciate the questions, so if there are any additional questions, please feel free to put them in the chat window on the left-hand side of the page. Bob, what service do we use to serve up the visuals? Tableau? Excel?
Bob Atkins: We do all our stuff on something called QlikView. It’s one of the top three or four business intelligence applications. It hasn’t got the prettiest graphics, actually, but it’s very robust from a data management standpoint, and we’re moving millions of records around inside the system, so that’s important to us. What really distinguishes QlikView is that it runs in memory, that is, in RAM, which means it can work with data ten to a thousand times faster than a traditional disk search, depending on what you’re trying to do. We’re dealing with really huge data sets, so it’s a wonderful tool. If I were looking at it right now, I’d probably lean more toward a product of theirs called Qlik Sense. So if you’re in the market for big data handling tools, that one has really nice visualizations as well as a very robust data architecture.
Linda Osterlund: For the Regis visuals, we use Power BI, from Microsoft. And that’s something we can give the Deans and Chairs access to.
Bob Atkins: Yes, that’s one of the top three or four as well. It might even be number one.
Mark Keleher: Great. Well thank you all very much and thanks as well to Linda. I look forward to speaking with you all next month when we’ll share the results for August.
Linda Osterlund: Thank you.
AUDIO END: [0:40:47]