Categories
Edtech Moodle online course

Type little and give extensive feedback

Photo taken from http://flickr.com/eltpics by @sandymillin, used under a CC BY-NC 2.0 license, https://creativecommons.org/licenses/by-nc/2.0/

It all started on Twitter, as these things do. I had covid and was stuck at home, so it was as good a time as any to do some marking. Then I came across Neil’s tweet.

I recommend you click through for the answers because there were quite a few suggestions, and several people mentioned text expanders, which is useful context. The answer that caught my eye was this one:

https://twitter.com/sensendev/status/1330107775440089093

I don’t use Linux, so I’m not entirely sure why I decided to try espanso out. Now that I think about it, I’m pretty sure Neil tweeted an update on how well it was working out for him. Anyway, espanso works on Windows and Macs too, although I use it on Windows most of the time.

I did need a little bit of help installing the program but I probably would’ve been able to do it myself if I’d put in a little effort. The point is, it’s pretty simple and quick. (To be fair, it was more complicated to install on a Mac.)

The idea of this post is to reflect a little on the past 6 months of using it and note down some pros and cons. 

First of all, this is what it looks like in practice. Please ignore the huge gap between the top and the bottom comment; it’s my first attempt at a gif.

User types "main idea" and this is automatically expanded to "This seems like a new main idea and might be best in a separate paragraph."
User types "meaning" and this is expanded to "I'm not sure what you mean by this (in this context), so consider the possibility that other readers may not be sure either."
Demo of how a text expander works

And it works everywhere. If I typed :main idea it would expand like in the gif regardless of whether I was commenting on a Word doc, typing in a Google doc, in the Moodle gradebook… 
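For anyone curious what this looks like behind the scenes, espanso matches live in a plain YAML file (match/base.yml in recent versions – check espanso’s documentation for the exact location on your system). A minimal sketch, with the comments from the gif:

```yaml
# espanso match file (e.g. match/base.yml)
# Starting each trigger with ":" stops it firing on ordinary typing.
matches:
  - trigger: ":main idea"
    replace: "This seems like a new main idea and might be best in a separate paragraph."
  - trigger: ":meaning"
    replace: "I'm not sure what you mean by this (in this context), so consider the possibility that other readers may not be sure either."
```

Adding a new comment is just a matter of appending another trigger/replace pair to the file.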

My initial reaction was – this is bliss! My days of spending ages on marking are over! All I need to do is add the comments which are already in my comment bank to espanso and I’m all set. 

Here’s why, in the end, it wasn’t as easy as that.

I have a huge number of comments in my comment bank. I’ve written about the comment bank I have in Google Docs in this post and in Google Keep in this one. At first I thought the only cost would be the time it took to transfer them all to espanso, but then I realized that I would have to come up with as many triggers as there are comments. (The trigger is the combination of : and the word or letter combination that gets expanded.)

It probably wouldn’t be that taxing to come up with a long list of triggers, but I eventually gave up because it became obvious I couldn’t remember them all. In my comment banks the comments are categorized by unit and activity (in Google Docs) and by aspect of writing, like punctuation or formality (in Google Keep). Categorization isn’t possible in any meaningful way in espanso, so you’re probably best off choosing a trigger that will most easily remind you of the longer comment you wanted to add (or vice versa).

What tends to work best (for me) is if I add a whole word or word sequence, like “comma splice”. Great, I hear you say, so do that. But the longer the trigger is, the more likely you are to mistype something, and then you need to delete what you’ve typed and start again (at least if you’re using Windows). Also, if you want to use “comma” as part of a trigger for anything other than comma splice comments, you can’t. Say you wanted to use “comma not needed” as a trigger. The nanosecond you type :comma, espanso expands it to your comma splice comment. You could use “unnecessary comma” as a trigger, but it’s not what I think of first when I see one – when I start typing, my brain has already categorized that as a comma-related error, and “comma” is the word that first comes to mind, not “unnecessary”. So if you’re old and forgetful, you’ll catch yourself going through the espanso bank, muttering “Why did I ever think I’d remember ‘unnecessary comma’?!” You get the idea. This is just an example, incidentally; I’m not that concerned about commas.

In order to really save time and reduce the potential for confusion, the triggers need to be short. Ideally, just a few letters. But the shorter they are, the easier they are to forget. Did I say old and forgetful? Add stressed out over a million things. Coming up with a trigger like “spe” for spelling sounds easy enough to remember… okay it is. That one is. But when I have a comment which is essentially just positive feedback on participating in a discussion in unit 4, that is quite tricky to reduce to a three-letter combo that I will remember longer than a day. Yes, you are right to wonder how I deal with PINs. 😛

What I tend to do now is work with up to 20 triggers. I always open up espanso before I start to remind myself of the triggers and attendant comments. Then I mark everyone’s work in the unit I am currently grading, where I won’t need that many different comments because the mistakes and the things done well tend to be quite similar. When I move on to the next unit, I prefer to work with the same triggers and update the expanded feedback in espanso. I won’t be needing the comments for the unit I’ve just marked until next semester anyway. Then the trigger for my positive feedback can always just be “yes” and for negative comments/suggestions for improvement it can be “no” – definitely easy to remember.  

What I’ve also decided works for me is adding as much text as possible to one single trigger. In other words, instead of thinking up three different triggers for three variations of positive comments, I add all three to the same trigger, delete the unnecessary/non-applicable comments when the text expands (and then customize further if needed).  
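In espanso’s YAML, bundling several variants under one trigger is just a multi-line replacement (a YAML block scalar). A sketch of what that might look like – the feedback wording here is invented for illustration:

```yaml
# One trigger expands to all the variants at once;
# delete the lines that don't apply, then customize.
matches:
  - trigger: ":yes"
    replace: |
      Great contribution to the discussion – you responded to several classmates.
      Well-argued post; your examples really support your point.
      Good work overall; just keep an eye on the word count next time.
```

The expansion dumps all three comments into the comment box, and pruning the surplus lines is quicker than remembering three separate triggers.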

In short, the tool isn’t as ideal as I’d initially expected it to be, but it does speed up the feedback process considerably once you’ve figured out how it can best serve you. I still use the comment banks and, of course, a large number of comments are personalized and context specific anyway, so nothing really helps there.  

What do you do to speed up the marking and feedback process? If you have any tips, either on how to use text expanders more efficiently or which other tools have been useful to you, I’d love to hear them! 

Categories
Edtech MOOC online course

Some (new) observations on peer review

I recently completed a MOOC called Elements of AI. Let me first say that I am privately (and now perhaps not as privately) thrilled to have managed this because I’m highly unlikely to commit to a MOOC if it looks like I might not have time to do it properly (whatever that means), and it often looks that way. I enjoyed the course and definitely learned a bit about AI – robots will not replace teachers anytime soon, in case anyone was wondering – but I couldn’t help noticing various aspects of course design along the way. That’s what this post is about, in particular the peer review component.

S B F Ryan: #edcmooc Cuppa Mooc (CC BY 2.0)

Most of my experience with peer review is tied up with Moodle’s workshop activity, which I have written about here, so the way it was set up in this course was a bit of a departure from what I am used to. There are 5 or 6 peer review activities in Elements of AI and they all need to be completed if you want to get the certificate at the end – obviously, I do. *rubs hands in happy anticipation*

Let’s take a look at how these are structured. To begin with, the instructions are really clear and easy to follow – and despite reading them carefully more than once, I still occasionally managed to feel, on submitting the task and reading the sample “correct” answer, that I could have paid closer attention (the “duh, they said that” feeling). The reason I note this is that it’s all too easy to forget about it when you’re the teacher. I often catch myself thinking – well, I did a really detailed job explaining X, so how did the student not get that?

Before submitting the task, you’re told in no uncertain terms that there’s no resubmitting and which language you’re meant to use (the course is offered in a range of languages). I read my submissions over a couple of times and clicked submit. In the Moodle workshop setup, which I am used to, you can then relax and wait for the assessment stage, which begins at the same time for all the course participants. Elements of AI has no restrictions in terms of when you can sign up (and submit each peer review), so I realized from the start that their setup would have to be different. 

The assessment stage starts as soon as you’ve made your submission. You first read a sample answer, then go on to assess the answers of 3 other course participants. For each of these three you can choose between two random answers you’re shown before you commit to one, and you assess it on a scale from an intensely frowning face to a radiant smile (there are 5 faces altogether). You are asked to grade the other participants on 4 points:

  1. the response stays on topic
  2. the response is complete/well-rounded
  3. the arguments provided are sound
  4. the response is easy to understand

The first time I did this, I read both random responses very carefully and chose the one that seemed more detailed. This was then quickly assessed because the 4 points are quite easy to satisfy if you’ve read the instructions at all carefully. However, I did miss having an open-ended answer box where I could justify anything less than a radiant smile. I’m guessing this was intentional, so as to prevent people from submitting overly critical comments or spamming others (or for another reason that hasn’t occurred to me), but I often felt an overwhelming urge to say, well, yes, the response was easy to understand, but you might consider improving it further by doing X. Possibly those who aren’t teachers don’t have this problem. 😛

It was also frustrating when I came across an answer that simply said “123” and another that was plagiarized – my guess is that the person who submitted it had a look at someone else’s screen after that other person had already made their submission and could access the sample answer. Or maybe someone copied the sample answer somewhere where others had access to it? The rational part of my brain said, “Who cares? They clearly don’t, so why should you? People could have a million different reasons for signing up for the course.” The teacher part of my brain said, “Jesus. Plagiarizing. Is. Not. Okay. Where do I REPORT this person? They are sadly mistaken if they think they’re getting the certificate.”

Once you’ve assessed the three responses, an interesting thing happens. You’ve completed the task and can proceed to the next one, but you still have to wait for someone to assess your work. This, you’re told, will happen regardless, but if you want to speed up the process, you can go ahead and assess some more people. The more responses you assess, the faster your response will come up for assessment. I ended up assessing 9 responses per peer review task, so clearly this incentive worked on me, though I have no idea how much longer I would’ve had to wait for my grades if I had only assessed 3 responses per task. I only know that when I next logged on, usually the following day, my work had already been assessed.

For a while I was convinced that either whoever had assessed my work had been very lenient or else all responses were automatically awarded four radiant smiles. My work hadn’t been that good, I thought. Then in the very last peer review I got a less than perfect score, so I assume there was at least one other teacher taking the course. 🙂 

In theory then, once your work has been assessed by two of your peers, you’re completely done with the task. However, at the very end of the course, you’re told that in addition to the grades you received from your peers, your work will also be graded by the teaching staff. Happily, your tasks are still marked as complete and you can get your certificate nevertheless. I suspect I’ll be waiting a while for that grade from the teaching staff, and it seems a bit irrelevant, to be honest. It would make sense for someone other than the other course participants to check the responses if this were done before course completion was officially confirmed (so those who submitted “123” wouldn’t get their certificate, for instance), but now that I think of the course as finished and my work as graded, I’m not likely to go back and check whether I received any further feedback, especially if it’s only emoji.

There were other interesting aspects of the course but I’ll stop here so as not to mess up my chances of posting this soon. In short, the course reminded me of why I like peer review (if everyone participates the way the course designers intended them to) and has given me some new ideas of how similar activities can be set up.

Have you completed any MOOCs or other online courses lately? Did they include peer review? What do you think makes a good peer review activity?

Categories
Edtech Moodle online course Tertiary teaching

Practice makes perfect?

This has been sitting in my drafts folder for about a year. If I don’t post it now, I’m not sure I ever will, so here goes. 

danna § curious tangles: words (CC BY-NC-ND 2.0)
March 2020

I recently tweeted this.

It then occurred to me that the tweet could be construed as a promise of a blog post, so I thought I should do something about that.

I’ve previously written about H5P activities in our online course here, here and here. I like the versatility of the tool and the fact that it’s available as a Moodle plugin, so I can use it free of charge. There are many content types (activity types) you can use and so far I’ve only been able to test out a relatively small number: drag and drop, interactive video, audio recorder and course presentation.

I was pretty confident H5P would be useful for tasks more obviously associated with vocabulary learning (probably primarily because I’d noticed a content type called Fill in the blanks), so at some point I thought I should look into it as a possible substitute for Textivate. 

I’ve been using Textivate for several years now and really like it, but if you want to use it in combination with your own resources – to practice specific vocabulary, for instance – you have to get the paid version. Which I completely understand and *have* renewed my subscription a couple of times, but as I’m an adjunct and get paid comparatively little, I’m always on the lookout for free versions of apps and such. 

I decided to revamp our revision unit with the help of H5P content types and made the following changes.

  1. User-defined gapfill (Textivate) → Fill in the blanks (H5P). For vocab revision. In the Textivate version, once you define the words students need to add to the text, they conveniently and automatically appear below the text and you drag them to where you think they should go. In the H5P version, you define where the missing words should go and get blank boxes where students need to type these words. It’s not incredibly flashy or exciting in Textivate either, but in H5P it’s really bland, so I decided to jazz it up a little by creating an image in Canva and adding the missing words to it. Then I added the image above the text. When you have a go at the activity in student mode, the image shows up much smaller than it is, but you can click on it to enlarge it, which I thought could be convenient for practice. If a student was doing the activity for the first time, they could click on the image and view the words, but if they wanted to try it again, they could see if they recalled any of the words without first clicking on the image. Incidentally, you could also add a video instead of an image to the activity, which wasn’t suited to my purpose but could work well in other contexts.
February 2021

So, here we are, back in the present. It’s a little more difficult now to identify the type of activity I used in Textivate because I can only make a guess based on what the activity looked like in earlier iterations of the course. I knew there was a reason I should have done this sooner. 

  2. Shuffle? Multimatch? (Textivate) → Fill in the blanks (H5P). For joining sentences. The point here was to practice joining sentences in a highly controlled way, with relatively little creativity and thus few unexpected outcomes. (No, it’s not one of my favorite activities either, but it’s useful for exam practice.) When I used Shuffle, students were instructed to rewrite the sentences in a separate document (not the happiest of solutions, and I’m doubtful whether anyone ever actually did this, especially if they revised half an hour before the exam) and then do the Shuffle activity, where they matched the two sentence halves and checked whether these corresponded (in terms of punctuation, etc.) with what they thought was correct. I’m happier with how it works with Fill in the blanks (H5P) because you can add hints to the blank boxes. The students have to type out the sentences in order to check if they joined them correctly, and if there’s anything you want them to watch out for (something that might cause them to slip up at the exam), you add it as a hint. Now, you might think, well, maybe Shuffle/Multimatch wasn’t the best choice of activity for joining sentences, and you’d be right, but as I already had a Textivate subscription I sometimes used it in ways that were not ideal.
  3. User-defined gapfill (Textivate) → Mark the words (H5P). For identifying parts of text that need alteration. This was a really convenient change because it made me break the activity up into two steps, which I think is easier to process. Again, in the Textivate version the students were instructed to type the changes they would make in another document and then drag and drop the suggested answers into the correct gap. If they skipped the first step, the activity was deceptively easy compared to what it would be like at the exam. With Mark the words (H5P), they first need to highlight the parts of the text that need to be changed (and can check if they were correct); then there is another Fill in the blanks with hints where they focus on making the actual changes.

And that’s it as far as H5P in the revision unit goes. There are also a couple of Quizlet sets and Moodle quizzes, so the H5P activities are just part of what students have the opportunity to complete if this is how they wish to practice. The unit is entirely optional, although I sometimes think perhaps it shouldn’t be, but that is material for another post.

If you teach online (synchronously or asynchronously), do you have materials that your students can access in their own time and practice, say, for an exam? What (tools) have you used for this and are these activities optional?