Customer satisfaction

I’m a firm believer in student feedback. My institution carries out an instructor evaluation survey each semester, but as it takes a while to process the data, we rarely get the results in time to actually do something about them the following semester. This is why I always ask my students to complete a course satisfaction survey that I can use immediately. (Incidentally, I hadn’t noticed before that the names of the two surveys seem to indicate the institution primarily wants to know if the students are happy with me, whereas I’d rather know if they’re happy with the course – or maybe the institution sees us as one and the same?)

When I first started teaching my writing skills course, I used paper-based feedback forms, then moved on to Google Forms, and last semester I used the Moodle Feedback activity for the first time and was very pleased with the presentation/analysis of the results. Even though I chose not to record user names when gathering responses, I waited until the exams were over before looking at what the students had to say, as I wanted to a) not have anything prejudice the grades, and b) wrap up the course with their comments and suggestions.

The new semester started last week and a new group of students will be taking the course, so it seems to be the perfect time to indulge in a (hopefully) brief analysis of the results and bring some order to my initial impression – enthusiastic, but not particularly insightful – that there is a lot of useful information in there.

A satisfied customer about to take a nap

I included 15 questions in the feedback form. My first impulse was to include about three times as many, but I nipped that one in the bud, reasoning that this would probably result in random clicking just to get the questions over with. Besides, the students had been reflecting on each unit of the course in their portfolios as the semester progressed, so I already knew how they felt about some aspects.

I tried to strike a balance between Likert-type (7), multiple-choice (3) and open-ended questions (5). Here they are in the order they appeared in the form, each followed by a comment on the answers.

1. The required level of English in the course was too advanced for me to participate easily.

Sure, it stands to reason that I would know this after four months of reading their writing. It’s also a course requirement that students are at least B1 and they were given a placement test at the beginning of the course (which showed that the majority were B2 or even C1). So, yes, I was reasonably sure that nobody would agree completely with this statement, but as the course was entirely in English (we had Erasmus students), and quite a lot of authentic materials were included, I wanted to see hard numbers.

On a scale of 1-5, most chose 1 (to indicate they disagreed completely), but not everyone, which puts me in a bit of a dilemma. If it’s a course requirement that students are B1, surely those who are at that level shouldn’t be made to feel at a disadvantage. On the other hand, if the great majority of students are B2/C1, they won’t be challenged by B1 content.

2. The required level of technical knowledge was too advanced for me to participate easily.

Again, the majority disagreed, but this time only 50% did so completely. Most of the rest opted for “mostly disagree”, which would indicate that the technical difficulties weren’t serious enough to prevent participation, but still, (lack of) technical knowledge was apparently a greater obstacle than the required language level.

This was not entirely unanticipated, as we’d dealt with various technical issues throughout the course, so next came an open-ended question:

3. Which technical problems did you experience and could the instructor have done anything to help you overcome these?

There weren’t many answers to this, possibly also because I’d set it as optional. Most referred to a lack of familiarity with online courses – only two students said they’d taken such a course before – but problems with videos were mentioned. There were a few Present.me videos in the course, which worked fine when embedded, but I’ve heard suggestions that it’s better to upload videos directly into Moodle. So I did that, but it didn’t work very well; the videos were viewable, but apparently not without a separate set of instructions. Sorting this out is one of my priorities this semester.

Luckily, the instructor got positive reviews in the second part of this question, which I was particularly pleased with, especially after I saw students in another course were simply instructed to contact the admin in case of technical difficulties.

4. The activities in the course were very time-consuming.

The answers to this were quite diverse. A small number (15%) mostly disagreed – no one disagreed completely – 20% neither agreed nor disagreed, 40% mostly agreed and 20% agreed completely. (The numbers don’t add up to 100; I rounded them off for convenience.) This question was actually intended as a lead-in to a more specific one, namely:
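As an aside on those figures: tallying Likert responses and rounding each percentage independently is exactly what produces totals that don’t quite reach 100. A minimal sketch of that calculation, using a made-up list of responses (the real data lives in the Moodle Feedback export, not shown here):

```python
from collections import Counter

# Hypothetical Likert responses on a 1-5 scale
# (1 = disagree completely, 5 = agree completely).
# These are invented for illustration, not the actual survey data.
responses = [2, 2, 2, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5]

counts = Counter(responses)
total = len(responses)

for score in range(1, 6):
    # Rounding each percentage separately is convenient for a write-up,
    # but the rounded figures need not sum to exactly 100.
    pct = round(100 * counts[score] / total)
    print(f"{score}: {counts[score]} responses ({pct}%)")
```

Reporting raw counts alongside the rounded percentages, as above, sidesteps any confusion about totals.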

5. On average, I spent the following amount of time on the course each week:

If the class were held on campus, students would spend 3 hours in class weekly, plus any additional time on homework/portfolio assignments. As this is equivalent to a certain number of ECTS points, there wasn’t supposed to be a noticeable discrepancy between the amount of time spent on the course f2f and online. And, in fact, there apparently wasn’t. 50% of the students said they’d spent 1-3 hours working online, and 50% claimed to have spent 3 hours or more. From this perspective, I should have provided the option of saying how much more.

6. If you did not participate regularly in the course – if there was any period apart from the Christmas break when you did not log on for 3 weeks or more – why was this? Could the instructor have done anything to make you participate more actively?

Perhaps not surprisingly, those who took part in the survey had participated regularly, and so, even though I would have very much liked to hear what those who hadn’t would have given as reasons, there wasn’t much in the way of clarification here. Two answers hinted at the difficulty of getting back on track once some activities had not been done on time, but were careful to absolve the instructor of any blame. 🙂

7. The instructions and updates on the course noticeboard were clear and timely.

Happily, most students completely agreed with this (only one mostly agreed). In a previous post I mentioned that I’d written roughly 5800 words on said noticeboard, so I think I wouldn’t have taken kindly to disagreement. 🙂 On a more serious note, I spent a lot of time wording particular posts just so, trying to be as precise and detailed as possible, and predict every possible doubt and query, so it was good to see that this had apparently been achieved.

8. The course content will prove useful in the future.

Another question that I liked the answers to. 60% agreed completely and 40% mostly, although now I wish I’d included the option of saying which parts students felt might not prove so useful. The course is based on selected chapters from two course books, chosen some years ago (not by me). I think I can see which parts might not seem so relevant from the students’ point of view, but I’d like to know for sure as maybe it’s more of an issue of how the content is being presented, rather than the content itself. Then I could do something about it.

9. When I addressed the instructor directly with an issue, my questions were answered quickly and satisfactorily.

Everyone agreed completely with this statement, although purely from the point of view of questionnaire construction I overlooked the possibility that some students didn’t need to contact me directly with an issue, so this question shouldn’t have been compulsory. Anyway, I expected most to agree here because I checked my email regularly and answered queries in the evenings and at the weekends. There really weren’t many of these overall, and I used to do this when I taught f2f as well, so it wasn’t as if it required any extra effort.

10. I found the feedback (from the instructor) in the portfolio relevant and useful.

The instructor is important here because the students also commented on one another’s journal entries in the portfolio. A high rate of agreement (85% agreed completely and 15% mostly) indicates that all the time spent on various feedback modes – error correction for some assignments, and podcasts and screencasts with more general comments – was worth it. However, I felt that despite doing my best to cover as much ground as I could in the feedback, there were still aspects that I wasn’t able to touch upon because it would have meant going into too much detail. I think this could be remedied by using the Moodle gradebook and I plan to do this next semester.

11. I would recommend the course to other students.

This question offered a choice of three answers: ‘yes’, ‘no’ and ‘don’t know’. Looking back, I should have included the option of saying why. Most did opt for ‘yes’, but 20% said they didn’t know, and it would be helpful to know whether their answer would have been the same had this not been an online course. Happily, no one went for ‘no’.

12. I have noticed an improvement in my writing skills in English over the past semester.

This was another multiple-choice question, with the same answers as #11. Everyone answered ‘yes’ here, which I’m particularly happy with because I honestly wasn’t sure what to expect. With some people the improvement was obvious, but with some a little less so, especially those who had strong writing skills to begin with (those whom the placement test showed to be at C1, for instance). I always worry a little, with advanced students, that they might feel they aren’t being challenged enough, particularly as many have been learning English for years.

The last three were open-ended questions and listing all the answers in this post would make it tediously long, so I opted for a random selection which I think illustrates well what the students found important.

13. List three things that you liked about the course and say why.

Not having to attend classes on campus and being able to complete tasks in pajamas was appreciated by a number of people. Students didn’t feel pressured as they progressed through the course and found it interesting to read one another’s reflections on individual units. Interactive activities and online memo boards got a lot of positive feedback as well. Some students particularly liked the fact that the course was online – they described this as modern and innovative – and liked learning about new technologies.

14. List three things that you did not like about the course and say how they could be improved.

I always ask for suggestions as to how something that a student didn’t like could be improved upon, but I don’t often get them. I don’t know whether this is because people feel it wouldn’t be appropriate for them to suggest improvements, or because they simply don’t feel inspired.

Several people noted the course was time-consuming. Reflecting on the work done in individual units was also often seen as a little too much (yes, it is apparently fun reading other people’s reflections, but not writing your own. 🙂 ) Vying for the title of least popular were the unit on punctuation – I’m planning some serious revision of those materials – and the glossary activities. I found it particularly interesting that at least one person was annoyed at being faced with a new online tool in each unit and having to figure out how it worked. I guess I might have overdone it a little.

15. Do you have any other comments, suggestions or any other feedback? Perhaps you wanted to elaborate on an earlier (multiple choice) question?

I thought it was interesting that the majority of the answers here focused on the method of delivery rather than the content. Those who commented on the online aspect said that online learning is the future and that more courses at their institution should incorporate educational technologies to some extent. And it seems fitting to end by quoting the person who said, “It was definitely an experience!”

Do you include any of these questions when asking for feedback? If there is a question you think should be included or if you have picked up on something in the results which I didn’t address, I would love to hear from you in the comments!
