Photo taken from ELTpics by @eannegrenoble, used under a CC BY-NC 2.0 license
This is a brief post to report on something that I came across on Twitter last year and finally tried out at the final exam a couple of days ago.
There was a tweet – unfortunately, I forget who it was by and I didn’t bookmark it – that described an intriguing tweak to written exams. Essentially, one of the questions was left blank and students could add any question that hadn’t been asked but that they knew the answer to or had studied for. As I understood it, this was entirely optional and was an opportunity to score extra points.
The tweet seemed to garner quite a bit of attention and approval but for all I know the idea isn’t as revolutionary as all that; it might only represent a novel approach in my context, which is not exactly prone to experimentation, especially when it comes to exams. In any case, I knew at once this was something I was going to try in February.
This was the question I added to the last page of the exam paper:
Is there anything else you wish were included in the exam? Something you studied or know the answer to but that isn’t in the exam paper? Write down the question(s) and what your answer(s) would be and you may be able to score extra points. Of course, it needs to be related to this course.
Because I’d left it entirely up to the students how many bits of information to include, if any at all, I didn’t settle on how many points they’d actually be able to score. I had a vague idea that the answers might help someone pass if they were a few points short, or get a higher grade if they were close to the cut-off point. As the exam went on, for a short while I thought nobody would take up the option of answering the question, and I was sorry the exam had only been scheduled to last an hour, because there might not have been enough time for them to get to it.
Eventually, about half the group did answer. When I read the answers, I realized I’d expected them to refer to the students’ takeaways from the course and say, for instance, things like “I’ve discovered some really effective spell check tools: X and Y” or “I’m much more confident than before about where I could/should use a semi-colon,” along with a sentence to illustrate this.
Instead, they mostly referred to items they’d revised in preparation for the exam, which is not surprising given how the question was phrased. The last unit online included a screencast in which I talked about what they could expect at the exam, so they were (presumably) all aware of what to focus on during revision.
There was also a category of answers that clearly wasn’t aimed at getting a higher grade: one student included suggestions on how the instructions for one exercise could be worded more clearly, and another said they wished they’d been asked to write an essay rather than being tested on a number of discrete points (but then didn’t go on to write one, saying they doubted it would have any impact on their grade).
Overall, I was able to use the answers in which students shared what they remembered of the course content to give them a higher grade, so I am counting the experiment as a success, though for some reason my impression is that the person who shared the idea on Twitter was a lot more enthusiastic about the results than I am. I think I’ll be using the question again, although probably not on the very next exam date.
Have you used this type of question with your students? What was your experience? If you haven’t, do you think you might? Also, if you have any ideas on how the question could be improved upon (or the link to the original tweet – it might have been phrased much better there), I’d love to hear them. Thanks for reading!
2 replies on “Design your own exam question”
Hi Vedrana, this is a very interesting experiment! As always, it seems to me that language skills are much harder to assess than other subjects, so if this were biology, for example, the question as worded might well produce the answers you expected. For writing skills, it does seem quite tricky. I’ll think about it and let you know if I get more ideas. Otherwise, I don’t test my students much, except for our Czech class, which uses standardised tests. But I know how difficult it is to design sound exam questions. Thanks for an interesting read!
Now that you mention it, I can’t recall which subject was being tested in the tweet. It might not have been language skills, and you’re right, it probably would be a lot easier to test content recall in biology.
I wish I didn’t have to test my students at all; it’s the aspect of the job I dislike the most, but unfortunately you can’t avoid it in my setting. One thing I can do is have the work they do during the semester carry more weight than the actual exam, though they do have to pass it. A concern I’m always grappling with is whether I should let students who put in minimal effort during the semester (which need not correlate with their language proficiency, of course) get away with an easy exam too, so it tends to be a bit more challenging than a bunch of multiple-choice questions. However, this can be unfair on those students who worked hard all semester.
In an earlier post on blended learning I wrote about how if you have at least some classes on campus, they can be used to “force” students to recycle vocabulary, for instance, which may lead to higher scores on the exam (even if not necessarily to higher vocab retention in the long term). When everything is completely online, the onus is on the students to take care of revision. For university students this shouldn’t be unexpected, but often they don’t take prep seriously because it’s just English, they’ve had it for years and they’ve always had good grades, so how hard can this be?