
PSOL Gap Analysis

Listed below are the 12 largest gaps among the 36 items on the PSOL for Lake Superior College in spring 2009. The gap is the difference between the importance and satisfaction ratings given by students; it represents how much the satisfaction score would need to improve to equal the importance score. Gap analysis is a way of concentrating improvement efforts where they will return the most value to students.

[Image: "Watch the gap" sign. CC photo by Joe Shlabotnik]

FY09 PSOL (Priorities Survey for Online Learners) LSC Students
Item (ranked by gap, high to low) Import. Satisf. Gap
20. The quality of online instruction is excellent. 6.56 5.68 0.88
12. There are sufficient offerings within my program of study. 6.44 5.58 0.86
09. Adequate financial aid is available. 6.46 5.66 0.80
06. Tuition paid is a worthwhile investment. 6.55 5.77 0.78
32. Layout of courses, as designed by instructors, is easy to navigate and understand. 6.60 5.83 0.77
11. Student assignments are clearly defined in the syllabus. 6.51 5.80 0.71
05. My program advisor helps me work toward career goals. 6.11 5.40 0.71
33. Instructions to students on how to meet the course learning objectives are adequate and clearly written. 6.58 5.90 0.68
07. Program requirements are clear and reasonable. 6.50 5.82 0.68
35. Instructional materials have sufficient depth in content to learn the subject. 6.54 5.91 0.63
30. Interactions I have with online instructors are useful to me in the learning process. 6.46 5.84 0.62
22. I am aware of whom to contact for questions about programs and services. 6.28 5.68 0.60
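Since the gap is just importance minus satisfaction, the ranking above can be reproduced in a few lines of Python. This is a minimal sketch using three of the items from the table; the variable names are my own:

```python
# Gap = importance - satisfaction; rank items by gap, largest first.
# Each tuple: (item number, statement, importance, satisfaction).
items = [
    (20, "The quality of online instruction is excellent.", 6.56, 5.68),
    (12, "There are sufficient offerings within my program of study.", 6.44, 5.58),
    (9,  "Adequate financial aid is available.", 6.46, 5.66),
]

gaps = sorted(
    ((num, text, imp, sat, round(imp - sat, 2)) for num, text, imp, sat in items),
    key=lambda row: row[4],
    reverse=True,
)

for num, text, imp, sat, gap in gaps:
    print(f"{num:2d}. {text}  imp={imp}  sat={sat}  gap={gap}")
```

Running this over all 36 items would reproduce the full ranking shown above.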

Many schools have gaps above 1.0, but in five years of surveying we typically have not had any gaps that large. Still, it is important for us to focus on a few areas each year for targeted improvement. To achieve an improvement in the survey scores over time, a few factors need to be present:

  • the item needs to be something you have some control over, or where you can influence whoever does hold that control
  • the item needs to be controllable in the short term (if you want to see reportable improvements in the next year or two)
  • ideally, the cost of achieving the improvement should be less than the benefit to be received

For example, item #9 is the statement that “Adequate financial aid is available.” Although we might be able to add another scholarship or two to those already offered, we have little ability to change the amount of financial aid available to students. So while it’s nice to know what students think about the availability of financial aid, that is not a gap we (as an institution) can do much about. The Feds and the State of Minnesota, however, sure could make a difference there if they wanted to help educate the populace.

On the other hand, five items in this top-12 list could be targeted with a major campaign to help faculty improve them in their courses:

  • #11 – Student assignments are clearly defined in the syllabus.
  • #30 – Interactions I have with online instructors are useful to me in the learning process.
  • #32 – Layout of courses, as designed by instructors, is easy to navigate and understand.
  • #33 – Instructions to students about how to meet the course learning objectives are adequate and clearly written.
  • #35 – Instructional materials have sufficient depth in content to learn the subject.

Our online faculty peer review process absolutely makes a difference in items 32, 33, and 35 shown above. In fact, I shudder to think how large the gap might be if we didn’t have that process in place and if we hadn’t made good progress over the past several years. Still, not all faculty participate in the voluntary process, and not all courses have been reviewed. Therefore, we have the opportunity to encourage all online faculty to consider spending some time to beef up these areas, and of course we can provide some professional development opportunities to help them do so. By doing so, we should also help make a difference in the largest gap on the list, item #20.

Added Questions – Mission Accomplished

For the Noel-Levitz PSOL, there are spaces for 10 customized questions in addition to the 26 standard questions (statements, actually) where students indicate both their level of importance and their satisfaction. Since we give the survey in conjunction with many other MnSCU schools, we all use five common statements that gather some of the data needed for the MnOnline system as a whole. That leaves five more survey slots for each college or university to use however it wants.

This year I made an effort to find five statements that the students would rank with high importance scores. I don’t fall into the camp that says that you need to keep asking the same questions each year for the sake of continuity. If a statement didn’t pan out with a high enough importance score in previous years, I want to try a new one in hopes of hitting those things that are most important to students.

I sent out a call to other MnSCU schools to find out which of their added statements from previous years had drawn the highest importance scores, and I set up a wiki where people at other schools could share that information. Some good information was gathered through the wiki, but at the last second I had a bit of an epiphany – one that I wasn’t sure would bear fruit.

For five years now we’ve had an online course design peer review process, modeled after the Quality Matters program from MarylandOnline. It occurred to me that some items in the course review rubric (sample completed rubric PDF) would probably make worthy survey statements. BINGO! I picked five statements from different sections of the rubric and added them to the PSOL as statements 32–36. I was very pleased to see these five items come back with very high importance scores. In fact, all five LSC added items ranked in the top 10 (out of 36) for importance. Our five added statements (32–36) appear in the table below.

FY09 Noel-Levitz PSOL (Priorities Survey for Online Learners) LSC Students
Item (ranked by importance, high to low; top 10 only) Import. Satisf. Gap
32. Layout of courses, as designed by instructors, is easy to navigate and understand. 6.60 5.83 0.77
33. Instructions to students on how to meet the course learning objectives are adequate and clearly written. 6.58 5.90 0.68
20. The quality of online instruction is excellent. 6.56 5.68 0.88
28. The online course delivery platform (Desire2Learn or D2L) is reliable. 6.56 6.04 0.52
34. Grading policies are easy to locate and understand in courses. 6.56 6.09 0.47
06. Tuition paid is a worthwhile investment. 6.55 5.77 0.78
18. Registration for online courses is convenient. 6.55 6.36 0.19
36. Clear standards are set in courses for instructor availability and response time. 6.55 5.99 0.56
31. Taking an online course allowed me to stay on track with my educational goals. 6.54 6.18 0.36
35. Instructional materials have sufficient depth in content to learn the subject. 6.54 5.91 0.63

In addition to our five added statements (32–36), statements #28 and #31 are among the five added statements for MnOnline. Therefore, only 3 of the top 10 items come from the 26 standard statements on the basic PSOL.
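That counting is easy to verify. Here is a quick sketch, with item numbers taken from the table above (the set names are my own):

```python
# Top-10 items by importance, in table order, and the question bank each came from.
top10 = [32, 33, 20, 28, 34, 6, 18, 36, 31, 35]
lsc_added = {32, 33, 34, 35, 36}  # LSC's five custom statements
mnonline_added = {28, 31}         # shared MnOnline statements in the top 10

# Whatever is left must come from the 26 standard PSOL statements.
standard = [n for n in top10 if n not in lsc_added | mnonline_added]
print(standard)
print(len(standard))
```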

This seems particularly reaffirming to me. One – it indicates that many of the items identified on the course design quality rubric are not just important to teachers, but they’re important to students as well (that’s not always the case). Two – because the satisfaction scores on those items are also pretty decent (all in the top 50% for satisfaction ranking), it also appears that our peer review process is making a difference that is recognized by students. Three – it’s always nice when you find out that what you thought should be important actually is important. Overall, I think all of us can be very proud of these survey results.

For the record, the five highlighted statements shown above correlate to the following standards on the LSC course design rubric:  I.2, II.2, III.2, IV.1, and V.3.

Survey Incentives for Everyone

This year for the Noel-Levitz PSOL I tried a different approach to survey incentives. Every student who completed the student satisfaction survey received a 2 GB flash drive for their trouble. Since this is the fifth time we have used the survey, I have some pretty good baselines to compare results against, and in part I was interested in what difference the more liberal incentive policy would make. Last year we gave out 40 flash drives in a random drawing and saw an increase in submissions over prior years. This year we went another step down that road.

[Chart: PSOL response rates by year]

We received 121 more responses than the previous high of 458 returns in 2008. The response rate went up only from 23% to 25%, which is a pretty modest increase. However, because our online student population was also a couple hundred students higher in 2009 than in 2008, that increase of only 2 percentage points translated into roughly 26% more responses (121 more on a 2008 base of 458).
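A quick back-of-the-envelope check of those numbers (the response counts and rates are from above; the implied headcounts are my own derivation):

```python
# Figures quoted in the post.
responses_2008 = 458
extra_2009 = 121
rate_2008, rate_2009 = 0.23, 0.25

responses_2009 = responses_2008 + extra_2009  # total returns in 2009
growth = extra_2009 / responses_2008          # relative growth in raw responses

# Implied online-student populations (responses / response rate).
population_2008 = responses_2008 / rate_2008
population_2009 = responses_2009 / rate_2009

print(f"{responses_2009} responses, {growth:.0%} growth")
print(f"population grew by ~{population_2009 - population_2008:.0f} students")
```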

Basically, we spent about $3,000 on incentives in order to get significantly more information. You’ll see in upcoming posts that the satisfaction data collected this year is very similar to the data from all the other years. One thing I was interested in seeing was whether the added incentives might change the results – would more students answer the survey without really putting in the effort to answer sincerely, for example? Apparently not (at least in my opinion).