
PSOL Aggregations

For the PSOL, Noel-Levitz creates five categories from the 26 core items that have both a satisfaction and importance score. Those five categories are:

  1. Academic services (7)
  2. Enrollment services (4)
  3. Institutional perceptions (2)
  4. Instructional services (8)
  5. Student services (5)

In each of these categories, there are two or more questions (indicated in parentheses above) that are combined to create the aggregate score.

Column chart of PSOL results

This column chart compares the LSC scores with both the peer group of 13 similar two-year schools and the national results from more than 60 schools. Our best performance is in the Enrollment Services category, and our largest opportunity for improvement is in the Student Services category.

To make real improvements in these rankings, you must look at the specific underlying survey items and attempt to improve services or otherwise raise the level of satisfaction expressed by students. These are the five questions that comprise the Student Services category score (a small computational sketch follows the list):

  • 10. This institution responds quickly when I request information. (Our score here is good!)
  • 15. Channels are available for providing timely responses to student complaints.
  • 19. Online career services are available.
  • 22. I am aware of whom to contact for questions about programs and services.
  • 26. The bookstore provides timely service to students. (NOT considered to be Student Services on our campus.)
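To make the arithmetic concrete, here is a minimal Python sketch of how a category score can be built from its items, using the five Student Services item numbers above. The simple unweighted mean and the satisfaction values are my assumptions for illustration; Noel-Levitz does not publish its exact method in these reports.

```python
# Minimal sketch: aggregate item-level satisfaction scores into a PSOL
# category score. Assumes the category score is the simple mean of the
# items' mean satisfaction scores; Noel-Levitz's weighting may differ.
from statistics import mean

STUDENT_SERVICES_ITEMS = [10, 15, 19, 22, 26]

def category_score(item_means: dict[int, float], items: list[int]) -> float:
    """Average the mean satisfaction scores of a category's items."""
    return round(mean(item_means[i] for i in items), 2)

# Made-up 7-point-scale satisfaction means for the five items:
item_means = {10: 5.9, 15: 5.2, 19: 5.4, 22: 5.1, 26: 5.6}
print(category_score(item_means, STUDENT_SERVICES_ITEMS))  # 5.44
```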

Age Differences

In the previous post I suggested that we needed to compare the demographics of the LSC PSOL students with those of the peer group to see if there were any significant differences that might make the results less than perfectly comparable. My conclusion was that the groups differ on several demographic factors, but in total those differences appear to just about even out, without a large bias for or against either group.

One possible source of difference is the larger percentage of younger online learners at LSC (46.2% vs. 34.2%). At the Noel-Levitz conference this summer I learned that their research has shown that younger students generally report significantly lower satisfaction than older students. It finally occurred to me that I have the data to look for evidence of this phenomenon, at least for the LSC students.

The LSC data does show a huge difference between the two groups of learners. Those 24 years and younger are less satisfied than those 25 years and older on all but one of the 36 items ranked for both importance and satisfaction. The only item where younger students are more satisfied is item #5: “My program advisor helps me work toward career goals.” Even then the difference was relatively small (5.21 vs. 5.10 on a 7-point scale), and this was the item with our overall lowest level of student satisfaction.

It was also interesting to note that they were not only almost universally less satisfied, but that they also rated 35 of the 36 items as less important than the older students did. Many of the differences in importance are rather large, as shown in the worksheet below, which lists the 12 items with the largest difference in importance score between the two student groups. I couldn’t get the embedded spreadsheet to work here, so I inserted a screenshot below, or go here to open it in a new window.

Screenshot: age differences in item importance
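For anyone repeating this analysis, here is a rough sketch of the age-group split described above. The record layout, field names, and values are hypothetical; a real PSOL data export will look different.

```python
# Rough sketch: split respondents at age 24/25 and compare mean ratings
# per item between the two groups.
from collections import defaultdict
from statistics import mean

def group_means(records, field):
    """(age group, item) -> mean rating for the given field."""
    buckets = defaultdict(list)
    for r in records:
        group = "24 & under" if r["age"] <= 24 else "25 & older"
        buckets[(group, r["item"])].append(r[field])
    return {key: mean(vals) for key, vals in buckets.items()}

# Hypothetical respondent records:
records = [
    {"age": 21, "item": 5, "importance": 6.0, "satisfaction": 5.2},
    {"age": 40, "item": 5, "importance": 6.5, "satisfaction": 5.1},
]
imp = group_means(records, "importance")
print(imp[("25 & older", 5)] - imp[("24 & under", 5)])  # 0.5
```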


Again, this raises the question: how do we alter the services we provide based on this information? Let me know if you have any ideas, and I’ll let you know if I ever come up with one of my own.

Peer Group Demographics

I posted recently about our 13-school peer comparison group for the 2006 PSOL and how this would be our best group for benchmarking, since they are all two-year schools. It is also necessary to look a little deeper to make sure the respondents are similar, so I compared the demographics of the two groups.

  • Ours were 81.6% female, theirs were 78.7% female. Females generally give more favorable satisfaction ratings on such surveys (unless they are being surveyed about their husbands).
  • Ours were 46.2% age 24 and under, theirs were 34.2% in that age range. Younger students tend to give lower satisfaction ratings than the older age groups.
  • 58.2% of ours indicated that they were primarily online students, compared to 59.7% of theirs. It has been my experience that those who identify themselves as mainly online tend to look more favorably upon their experience learning online than those who consider themselves to be primarily on-campus learners just taking an online course or two.
  • 61.2% of our students indicated that they are taking full-time class loads, compared to 47.1% of the peer group students. Part-time students tend to rate their satisfaction more highly than full-time students.
  • It was almost a draw for the percentage indicating that their educational goal was to complete an online degree program: 21.9% for us and 22.7% for them.
  • Even more of a draw for the percentage indicating that their educational goal was to complete an on-campus degree program: 34.7% for us and 35.0% for them.
  • For our students, 62.1% indicated that they had taken 0-3 previous online courses, while 75% of the peer group had that same level of previous experience. From a pure logic perspective (or maybe it is common sense), those with more experience with online courses tend to have higher satisfaction ratings, with the best measure of their satisfaction being their continued enrollment in more online courses.
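A quick way to sanity-check whether any one of these percentage gaps is statistically meaningful is a two-proportion z-test. This is my own back-of-the-envelope check, not part of the Noel-Levitz report, and the LSC sample size below is an assumed placeholder (only the roughly 3,900 peer responses are reported in these posts).

```python
# Two-proportion z-test for whether a gap such as 46.2% vs. 34.2%
# "age 24 and under" is statistically meaningful. n=300 for LSC is an
# assumed placeholder; ~3,900 is the reported peer-group response count.
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    """Return (z, two-sided p-value) for H0: p1 == p2."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(0.462, 300, 0.342, 3900)
print(f"z = {z:.2f}, p = {p:.4f}")  # z ≈ 4.2 -- a real difference
```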

Therefore, the question to be answered is whether our student group would tend to be more favorable than the peer student group. Here is my opinion:

  • Gender: slight bias in favor of high LSC satisfaction ratings, but not much.
  • Age: significant bias against high LSC satisfaction ratings.
  • Primarily online: no significant difference.
  • Class load: significant bias against high LSC satisfaction ratings (we have more full-time students, who tend to rate satisfaction lower).
  • Educational goal: no significant difference.
  • Previous online experience: significant bias in favor of high LSC satisfaction ratings.

The net result? I’d call it a draw, although I really think there is a slight demographic bias against favorable satisfaction ratings for LSC compared to the peer group. The fact is that the LSC student satisfaction ratings were significantly higher than the peer group ratings. I’m arguing here that those are real differences, not artifacts of respondent demographics. The first chart below shows how the LSC online students answered the questions about current plans/reasons for taking one or more online courses. The second chart is for the peer group of students.


Current Plans - LSC Students - http://zohosheet.com


Current Plans - Peer Group - http://zohosheet.com

2006 PSOL Strengths

PSOL Strengths for Lake Superior College

Our 2006 Institutional report for the Noel-Levitz PSOL identified the following strengths for LSC Online (listed in rank order beginning with the strongest item).

I’ve had a few discussions with Noel-Levitz representatives about how I think the “strengths” need to be redefined. Right now the strengths are what I would call “internal strengths,” without considering whether they are strengths in the larger context of the e-learning environment, or at least of the schools that have used this instrument.

  • 11. Student assignments are clearly defined in the syllabus.
  • 18. Registration for online courses is convenient.
  • 33. Logging-in (managing usernames and passwords) for various services across the campus is easy and consistent. (Campus item 7)
  • 6. Tuition paid is a worthwhile investment.
  • 25. Faculty are responsive to student needs.
  • 7. Program requirements are clear and reasonable.
  • 3. Instructional materials are appropriate for program content. **
  • 23. Billing and payment procedures are convenient for me.
  • 13. The frequency of student and instructor interactions is adequate.

** Note: I am not willing to consider this a strength since our satisfaction score is slightly below (-.03) the national average for this item. Strengths are defined as those items above the mid-point in importance and in the top quartile of satisfaction, for your institution only. In other words, it is possible to have something identified as a strength when it is not a very favorable score, such as falling below the national average. I have a hard time seeing that as a strength. To really consider something as a strength, I believe that it must be both an internal strength (as defined by Noel-Levitz) and an external strength which means that it exceeds the performance level of the national group or preferably it exceeds the performance of a peer group (as defined by me).
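Here is a small sketch of both definitions side by side: the Noel-Levitz “internal” rule as quoted above (interpreting the mid-point as the median), plus my stricter “external” test that the item must also beat the comparison group’s score. The data structures and scores are illustrative only.

```python
# Internal strength: above the median importance AND in the top quartile
# of satisfaction, computed from your own institution's items only.
# External strength (my addition): also beats the comparison group.
from statistics import median, quantiles

def strengths(items, comparison):
    """items: {id: (importance, satisfaction)}; comparison: {id: satisfaction}."""
    imp_midpoint = median(imp for imp, _ in items.values())
    sat_cutoff = quantiles([sat for _, sat in items.values()], n=4)[2]
    internal = {i for i, (imp, sat) in items.items()
                if imp > imp_midpoint and sat >= sat_cutoff}
    external = {i for i in internal if items[i][1] > comparison[i]}
    return internal, external

# Illustrative item scores and comparison-group satisfaction means:
items = {3: (6.2, 5.5), 11: (6.4, 5.9), 18: (6.1, 5.8), 25: (5.8, 5.4)}
print(strengths(items, {3: 5.53, 11: 5.6, 18: 5.7, 25: 5.5}))
# ({11}, {11}) -- item 11 is both an internal and an external strength
```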


PSOL Peer Group

On September 13, 2006, I received new data from Noel-Levitz regarding the PSOL (Priorities Survey for Online Learners), which we administered for the third year in a row during spring semester 2006. The data have always been useful, but this report is the most useful for benchmarking that we have ever received.

As a two-year public institution, we found our comparisons with the national results interesting but not terribly useful for benchmarking. The national group includes graduate schools, for-profit schools, schools that are completely online (not blended with both F2F and online courses, as we are), and overall just a hodge-podge of different schools that have used the survey during the past several years.

This spring we received better data because 16 of our sister institutions in MnOnline gave the same survey at the same time. That was definitely an improvement, since it gave us comparison data from a group of schools in our own system. However, it still compared us with the state universities, as well as with some two-year schools that aren’t as experienced as we are at providing online learning and services. Also, no one really wants to talk about the consortium data because the aggregated student satisfaction levels are quite low.

So, this new report from Noel-Levitz (that we paid for) compares the LSC results with a peer group of 13 two-year institutions from throughout the country. Approximately 3,900 student responses comprise the peer group and they come from a mixture of traditional community colleges as well as technical colleges, much like how LSC is a combined community and technical college.

On the 26 standard items on the PSOL, LSC students were more satisfied on 19 items and less satisfied on 7. Even more importantly, on six of the 26 items the difference in satisfaction levels was statistically significant, and on all six the LSC score was much higher than the peer group score.
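For context, the .05/.01/.001 significance flags in these reports presumably come from standard two-sample comparisons of item means. Here is a hedged sketch using Welch’s t statistic; the standard deviations and group sizes are assumed placeholders, since the posts only report mean scores.

```python
# Hedged sketch of a two-sample (Welch) t statistic computed from
# summary data. SDs and group sizes below are assumed placeholders.
from math import sqrt

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic for two independent group means."""
    return (mean1 - mean2) / sqrt(sd1**2 / n1 + sd2**2 / n2)

# Hypothetical LSC item mean vs. peer-group mean:
t = welch_t(5.84, 1.2, 300, 5.60, 1.3, 3900)
print(f"t = {t:.2f}")  # ~3.3; beyond the two-sided .001 cutoff of ~3.29
```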

Below I’ll post a couple of charts that show some of the data comparing the LSC “Primarily Online” students with the similar group from the peer group institutions. The first chart shows the three most significant differences (.001 level) from that data set.

LSC vs. Peer Group - http://www.zohosheet.com

The second chart shows the two items that were significant at the .01 level, again coming from the “Primarily Online” segment of the data pool.

FY06 Noel-Levitz PSOL: LSC vs. Peer Group - http://www.zohosheet.com

The third and final chart shows the four items that were significant at the .05 level. This data set of primarily online students had nine total differences that were significant, and all nine were positive differences for LSC.

FY06 Noel-Levitz PSOL: LSC vs. Peer Group - http://www.zohosheet.com

Charts made with ZohoSheet.


2006 Noel-Levitz Conference

In July 2006 I attended the annual Noel-Levitz conference in Denver. The first day was a client workshop on the various N-L surveys of student satisfaction and importance, such as the SSI (Student Satisfaction Inventory) and the PSOL (Priorities Survey for Online Learners). They had just updated the overall results for the PSOL. There are now just over 34,000 student survey submissions from 78 institutions, approximately double the number of records since the 2005 data was released. Here are a few of the demographics of this national group of online learners:

  • Female: 68%; Male: 32%
  • Age distribution:
    • 24 & under: 19%
    • 25-34 years: 30%
    • 35-44 years: 27%
    • 45 & older: 24%
  • Current enrollment:
    • primarily online: 82%
    • primarily on-ground: 18%
  • Class load:
    • full-time: 57%
    • part-time: 43%
  • Employment:
    • full-time: 71%
    • part-time: 16%
    • not employed: 13%
  • Educational goal:
    • Associate degree: 14%
    • Bachelor: 34%
    • Master: 26%
    • Doctorate: 22%
  • Current online enrollment:
    • 1-3 credits: 27%
    • 4-6 credits: 34%
    • 7-9 credits: 16%
    • 10-12 credits: 12%
    • 13 or more credits: 11%
  • Previous online enrollment:
    • no classes: 25%
    • 1-3 classes: 37%
    • 4-6 classes: 15%
    • 7-9 classes: 8%
    • 10 or more: 15%

Those numbers include a little less than 3,000 students from MnOnline who completed the survey during Feb-Mar of 2006.


Comparing Online to On-ground

Click the screenshots to enlarge. The two charts above show survey results from 2004 from two different Noel-Levitz surveys at LSC. The N-L Student Satisfaction Inventory (SSI) is given to a sample of students taking traditional on-ground (face-to-face, or F2F) courses. This post deals with six questions that appear in essentially the same form on the two surveys. I feel comfortable comparing the results on those six questions to gauge the differences in relative importance and satisfaction between the F2F students and the online students. There are an additional 6 or 7 questions that are reasonably comparable, but I will hold those for a later post.

The importance scores for these six items provide some useful information. Three of the six items indicate much greater importance from online learners, and the other three show no significant difference in the level of importance. The three with a large difference in importance are:

SSI #18. The quality of instruction I receive in most of my classes is excellent. Importance = 6.35
PSOL #20. The quality of online instruction is excellent. Importance = 6.59

SSI #46. Faculty provide timely feedback about student progress in a course. Importance = 6.01
PSOL #04. Faculty provide timely feedback about student progress. Importance = 6.47

SSI #62. Bookstore staff are helpful. Importance = 5.75
PSOL #26. The bookstore provides timely service to students. Importance = 6.37

For those same three items, the satisfaction scores are as follows.

SSI #18. The quality of instruction I receive in most of my classes is excellent. Satisfaction = 5.43
PSOL #20. The quality of online instruction is excellent. Satisfaction = 5.84
Difference of .41 (huge) in favor of online.

SSI #46. Faculty provide timely feedback about student progress in a course. Satisfaction = 5.06
PSOL #04. Faculty provide timely feedback about student progress. Satisfaction = 5.80
Difference of .74 (massive) in favor of online.

SSI #62. Bookstore staff are helpful. Satisfaction = 5.10
PSOL #26. The bookstore provides timely service to students. Satisfaction = 5.28
Difference of .18 (significant) in favor of online (and this was before we had an online bookstore).

For the other three items, the importance was about the same. Satisfaction scores are as follows.

SSI #14. Library resources and services are adequate. Satisfaction = 5.45
PSOL #21. Adequate online library resources are provided. Satisfaction = 5.61
Difference of .16 which is somewhat significant, but not huge.

SSI #45. This institution has a good reputation within the community. Satisfaction = 5.34
PSOL #01. This institution has a good reputation. Satisfaction = 5.70
Difference of .36 which is very significant.

SSI #50. Tutoring services are readily available. Satisfaction = 5.32
PSOL #24. Tutoring services are readily available for online courses. Satisfaction = 5.38
Difference of .06 which is inconsequential.

The last chart shows the “Gap Analysis,” which is a Noel-Levitz-recommended measure to focus attention on those items where there is a large difference between importance and satisfaction. The larger the column in the chart below, the greater the opportunity for improvement.

The largest gaps reported by on-ground students are in “Timely Feedback from Faculty” and “Excellent Quality of Instruction.” In both of those cases, online students reported a significantly smaller gap between their expectations and reality. The largest gap for online students was with bookstore service, even though the online students were more satisfied with the bookstore than the on-ground students were (they also placed a higher level of importance on bookstore service). It can be argued that the questions are not perfectly comparable, with the F2F survey asking about the “helpfulness” of employees and the online survey asking about timely service. Also, the gap for online students decreased the next year, after we started an online campus bookstore. More on that in a future post.
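Since the gap score is simply importance minus satisfaction, it is easy to recompute from the item scores listed earlier in this post. A minimal sketch using the feedback and bookstore pairs:

```python
# Gap analysis: importance minus satisfaction per item; the larger the
# gap, the bigger the improvement opportunity. Scores reuse the SSI/PSOL
# pairs quoted in this post.
items = {
    "Timely feedback (SSI #46)":  (6.01, 5.06),
    "Timely feedback (PSOL #04)": (6.47, 5.80),
    "Bookstore (SSI #62)":        (5.75, 5.10),
    "Bookstore (PSOL #26)":       (6.37, 5.28),
}
gaps = {name: round(imp - sat, 2) for name, (imp, sat) in items.items()}
for name, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{name}: gap = {gap}")
# Bookstore (PSOL #26): gap = 1.09   <- largest gap for online students
# Timely feedback (SSI #46): gap = 0.95
# Timely feedback (PSOL #04): gap = 0.67
# Bookstore (SSI #62): gap = 0.65
```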

Reasons for Using PSOL

Noel-Levitz surveys are generally of very high quality, and the PSOL fits that description. It is the only nationally normed satisfaction survey focused on online learners that I was aware of when we first used it in 2004. Using it three years in a row gave us a good baseline for measuring overall satisfaction and for monitoring changes in satisfaction over time. We had tried a survey developed internally, but found the results to be of limited usefulness since we had no comparison data.
Chart: Reasons for using the PSOL

PSOL Components

The PSOL is made up of various types of questions. A PDF of the survey is available here. Please keep in mind that this survey is the copyrighted property of Noel-Levitz.

Chart: PSOL survey components

The most useful information comes from the 36 Priorities questions. There are 26 standard questions (used for national averages) and you can add 10 more campus-specific questions. For each of these 36 questions, students indicate both (A) how important the item is to them and (B) how satisfied they are with that item.

It is this comparison of importance and satisfaction that guides planning: it points to the areas where we can make the most meaningful improvements in services or information.
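One common way to operationalize that comparison (my framing, not necessarily the exact Noel-Levitz matrix) is to split both scales at their medians and classify each item into a planning quadrant:

```python
# Classify items into planning buckets by splitting both scales at their
# medians. Thresholds and scores are illustrative assumptions.
from statistics import median

def quadrants(items):
    """items: {id: (importance, satisfaction)} -> {id: planning bucket}."""
    imp_cut = median(imp for imp, _ in items.values())
    sat_cut = median(sat for _, sat in items.values())
    def label(imp, sat):
        if imp >= imp_cut:
            return "strength" if sat >= sat_cut else "top priority"
        return "maintain" if sat >= sat_cut else "low priority"
    return {i: label(imp, sat) for i, (imp, sat) in items.items()}

print(quadrants({1: (6.5, 5.8), 4: (6.4, 5.0), 24: (5.5, 5.4)}))
# {1: 'strength', 4: 'top priority', 24: 'maintain'}
```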