Correct me if I’m wrong

This post has been sort of brewing for a while: since the spring of 2015 if I’m honest. You may wonder how I can be so sure of the date. It’s because at the time I was using Kaizena for feedback and wanted to write about that. Only I never did.

Handwritten correction of less to fewer
Photo taken from http://flickr.com/eltpics by @sandymillin, used under a CC Attribution Non-Commercial license, https://creativecommons.org/licenses/by-nc/2.0/

Also, James Taylor had suggested at that year’s BELTA Day that instead of simply correcting student work, I could indicate the problem areas in a sentence and have the students do the correcting themselves. I found this idea very appealing and immediately put it into practice. I think we ran with it for a couple of semesters, but it turned out to be terribly time-consuming, as I had to check every submission at least twice. Some I had to check three times because not all the students managed to do what I was hoping they would: they made a stab at correcting the error but went off in the wrong direction. I wanted to write about that too, only I never did.

In the post 7 things students expect from an online writing course (see the fourth thing), I briefly wrote about how I don’t actually do that much correcting. I’m not sure this is highly popular with students, as they’ve been taught to expect the instructor to correct their work, and there’s always the nagging feeling that they think I’m not doing my job properly. At the beginning of most semesters we discuss a couple of statements about writing as a group, one of which is: “I expect the teacher/instructor to mark all the mistakes in my work.” I ask the students to mark the statements as true or false, and I don’t think I’ve ever had a student claim this particular one to be false for them.

I use this as an opportunity to explain that there are going to be three slightly longer pieces of writing throughout the semester on which they’ll be receiving detailed feedback, and that everything in these that could be seen as a mistake or that might confuse readers will be addressed, but that apart from that, I won’t be correcting their grammar. One of the reasons for this is that a lot of the writing they do on the course is read by other students, and I’ve always figured it wouldn’t exactly be productive to analyze to death something they’ve already used to communicate successfully.

I’ve recently completed this detailed correction for the first assignment of this semester and I wanted to have a kind of record of what I do these days, both in terms of the tools involved and how I go about making corrections/giving feedback.

Since I stopped using Kaizena, I’ve abandoned the idea of having students make the corrections themselves. A quick digression: I’m pretty sure I’ve come across papers on Twitter on whether student correction of their own mistakes is effective, but I haven’t bookmarked any, so please let me know if any research comes to mind. I think what I do now is fairly conventional. Students submit their work as a Word doc – or very occasionally in a different format, which I then convert to Word so I can do my thing – and I upload the corrected versions back to Moodle when I’m done.

There are two types of intervention I make in the Word doc. If something is likely to be considered a mistake in terms of conventional grammar rules, I use the track changes option to correct it. If at all possible, I will add a comment explaining that this would be considered a mistake as far as standard usage rules are concerned. I’m not sure it’s very helpful to treat absolutely everything as fine just because it is fine in some dialect or other, although I do think students should be (made) aware of dialect differences. In my case, communication science students are generally aware of this in their L1, too, so my job is easier in this respect.

If I want to make a more general point, such as suggesting that a student run a spell check on their submission, break a longish paragraph up into two or more if it seems to be addressing several ideas, or double-check the meaning of a word they’ve used, I’ll add comment bubbles. I’ve done a post on a comment bank which I had – still have – in a regular Google Doc, but I’ve since come across this post on the Control Alt Achieve blog and started building up a comment bank in Google Keep, which does feel more organized. In the spirit of Sarah’s Twitter anniversary resolution, I think it was thanks to Adi Rajan that this post came up in my feed about two years ago.

Even though my online groups are small, giving feedback and correcting student work is time-consuming enough to make me want to know if there’s some kind of uptake, even if it’s just students reading my comments. When I used to ask them to correct their own mistakes, this obviously wasn’t something I worried about, because they had to engage with my comments, even if only perfunctorily, to make the corrections. The way I currently give feedback and correct, though, gives me no indication of whether the corrected version of the document has even been downloaded. There’s something I do about this for the second and third longer pieces of writing (hopefully more on that in a future post), but for this first piece, what I do is include a reflection prompt on corrections and feedback in the portfolio section of the course. Not every student addresses this topic, but enough people do for me to feel that the work hasn’t gone to waste.

One other thing I should mention is the track changes option. There’s a tutorial in the course materials on how to view suggested changes if this option has been used. When I’ve corrected everyone’s submission, I post an announcement on the course noticeboard, pointing out that this tutorial is available should anyone want to have a look. (An indicator that they’ll want to have a look is if the suggested changes don’t show up for them automatically and they don’t know what to do about this.) Second, when each corrected submission is uploaded, the student is notified and the message says, among other things, that they should make sure to view the suggested changes – now, as I write this, I realize that I should add a link to the tutorial to that message.

The reason I mention this is that even though you think you’ve got it all covered, of course you don’t, and it is through a random comment that you realize a student was completely unaware of any changes suggested to their text apart from the comment bubbles. Panic sets in as the idea surfaces that maybe no one has ever, in any of the last couple of semesters, seen any of the corrections. You’ve been doing it all for nothing, plus the students all think you haven’t actually been doing anything! The panic gradually fades away and you do all you can do, which is post an announcement explaining once again how corrections are made, with a link to the tutorial and a screenshot illustrating how to access the review tab.

Thanks for reading and I’d love to hear how you address corrections/feedback/corrective feedback on written work, not necessarily online. Any tips?


Reflections on reflective writing

Photo taken from ELTpics by Ian James, used under a CC BY-NC 2.0 license

Often when I’m writing a blog post I realize there’s something I could go off on a tangent about and then I vaguely decide I’ll come back to that in another post, which I don’t very often do – I guess this is due to my irregular blogging habits. This is one of those other posts: when I blogged about our introductory campus sessions earlier this semester, it occurred to me that it might be a good idea to say a few words about the learning journal component of the course, or more specifically, about reflecting on learning in an online environment and possible attendant issues.

A learning journal can be very helpful in a semester-long asynchronous course. Apart from giving students an opportunity to think through and reflect critically on the material they’ve covered, it gives the instructor a different kind of insight into how everyone is coping than student assignments do. I might find out, for instance, how students feel about the time they have available to complete tasks, what they find useful about the feedback they receive, or what they think about task types that perhaps aren’t very typical of their offline courses, such as peer review (which I wrote about in more detail in this post). If the course were held on campus, I would probably be able to find much of this out in class.

The course has included this component since I first moved it online and has undergone a couple of tweaks in the meantime. Initially, the students had complete freedom re what they chose to reflect on after each unit, in that there were only some broad suggestions on the type of information they might want to add. The problem with that, it soon transpired, was that although some students clearly did not lack inspiration, there were others who found reflecting challenging, felt they didn’t have much to say or failed to see the point of the activity. Whatever the reason, students in this category wrote exceedingly brief comments whose sole purpose, I suspect, was to tick the “post a reflection in your journal” box.

At first, I tried to address this by posting questions on these students’ entries, hoping they would respond in greater detail, which met with varying degrees of success. After a couple of semesters, I added questions that could serve as writing prompts after each unit. Looking back, I have no idea why it took me so long to do this – I guess it was probably because I thought these were questions students should actually be asking themselves and they needn’t be the same for everyone. I still think so, but learning journals aren’t common practice in the Croatian education system, and given that I was aware of this from the start, I’m surprised it didn’t occur to me sooner that students might find model questions useful. The semester I introduced the questions, student journals became noticeably more focused overall.

Some time after this, I began covering reflective writing in the introductory face-to-face sessions as well, the idea being that this would help students see the learning journal as more than just an afterthought. I begin by explaining what this component entails and showing the students a sample journal from an earlier semester, to illustrate what the final product looks like. I choose one at random, although I think I’ll have to start asking ex-students for their consent, on account of GDPR. Afterwards, we take a look at one of those exceedingly brief comments from one of the early iterations of the course and discuss what seems to be lacking at first glance and how each point could be expanded on. This is followed up by a few general good-practice suggestions on reflective writing.

What I try to do in the session before this one is set aside 15 minutes for students to answer 2-3 questions of the type they will be addressing in their learning journal entries. This can be at any point during the session. I simply ask them to answer the questions however they think best, in the next 10 minutes or so. At this point I don’t want students to think about reflective writing as a genre, so there is no guidance, nor are there any constraints apart from the time they have available.

I collect these, and in the next session, after we’ve talked about how a very general comment can be made more specific, I show them a few examples of how this has been achieved in the pieces of writing they handed in in the previous session. These are anonymized, but I hope people recognize what they’ve written and that it has a motivating effect. Some terms are marked in red because they are still a little vague, and we discuss how these parts of the text could be made more specific.

This, in combination with the questions to reflect on after each unit, generally produces good results. There are still students each semester who struggle with what to write about, but after they’ve received personalized feedback on their first reflection, suggesting how they could expand on areas that may be overly general and thus possibly not so useful, their reflections generally become more detailed and specific.

One thing I’m not as happy about is the fact that since the questions were introduced, the majority of students rely on them and rarely choose other aspects to reflect on, even though the instructions always stress that the questions are only there to provide inspiration and don’t (all) *have* to be answered. This tends to make the reflections a tad predictable in structure, and to an extent in content.

Another thing I sometimes feel I could use some help with is the questions themselves. I tweak them most semesters, adding new ones and removing those which don’t seem to have been helpful or to have produced much engagement. If you know of any resources that provide suggestions on how to structure reflection questions or which aspects of learning to target, they would be much appreciated!

Leveling up

Stefanie L: cat watching tv (CC BY-NC-ND 2.0)

A couple of weeks ago I blogged about H5P and how excited I was to discover this new resource I could make use of in Moodle. In that post I described the process of setting up a drag & drop activity and adding it to my online course. I was sure I wanted to try out a number of other content types – which is what the 40-odd H5P activities are officially called – but I wanted there to be a reason for adding them, apart from novelty and the thrill of experimentation.

There are a few screencasts in the course that I thought I could use the interactive video content type with. A little bit of background on the screencasts: they started out as presentations I used when I taught the same course offline. Yes, they were PowerPoint, but I didn’t think that was reason enough to ditch them, especially as they were brief and had been designed to get the students to interact with the content. The first time I moved them online I used Present.me, which I’m not sure even exists anymore. About three years ago I re-recorded them, uploaded them to YouTube and added subtitles: a far more user-friendly experience overall.

There aren’t many – six in a four-month course – partly because they’re pretty time-consuming to make for someone who doesn’t do this on a regular basis and partly because I don’t think a writing skills course actually requires many. The longest screencast is just under ten minutes, if you don’t count the one in the revision unit, where I chat about what students can expect at the exam (a little under 15 minutes). That one is unscripted, and unscripted recordings tend to run longer anyway.

Eventually I settled on the longest screencast – the ten-minute one – to experiment with. As with the drag & drop, I first added the H5P interactive content activity to the course and selected the content type: this time around it was interactive video. You can either upload a video directly (in which case I think there’s a size restriction) or add a link to YouTube, which was the route I took. One aspect I was immediately unhappy about was the disappearance of the subtitles; apart from the fact that they’re important for accessibility, I think they can be helpful even for pretty advanced students. I got around this (sort of) by embedding the YouTube video directly below the interactive one and recommending that the students first try the interactive version, then watch the one with the subtitles if they felt they needed them.

The screencasts are based on short sets of slides that are often meant to be presented in the following way: I do a bit of talking, the students work in pairs to answer a question or discuss it as a group, and then we check their ideas on the next slide. Because of this, in the recordings I would often ask the students to pause the video and try to answer a question I’d asked – one that they would address in pairs or groups in class. I’d suggest they make a note of their responses somehow, so they could compare them with what came next in the screencast. These were great natural places to add questions to the video and I took advantage of them.

The interactive video content type lets you add a range of different question/interaction types (MCQs, T/F, drag the word – which can be used for gapfills – matching, and more). I was able to add a link to external content as well, and at the end I wrote up a brief summary of what the video was about and made it into a gapfill activity. I rather liked the option of having the students choose the best summary out of three possible ones – I gather this is a separate question type – but it seemed like it might take a while to set up and I didn’t have much time.

Another advantage of these H5P activities is that you can view the results of the interactions in the Moodle gradebook and see how well the students did on average, as well as whether there’s anyone who seems to need a little extra help. Of course, what you can’t see is whether those who did well maybe did a bit of research before answering the questions or simply knew the answers already, nor can you see whether those who did less well rushed a little/took a random stab at the answers or didn’t really understand/follow the explanations in the video. I did add a question about this to the list of questions the students might want to address in their learning journals, so we’ll see if any interesting insights emerge.

How do you feel about interactive videos: have you used them with your students? Are there any effective tools you would recommend for this besides H5P? I recently came across an article which recommended Edpuzzle, but I’m sure there are others. Thanks for reading!