Co-authored with Arjan Schakel, originally published by Active Learning in Political Science on 10 February 2021.

Students and staff are experiencing challenging times, but, as Winston Churchill famously said, “never let a good crisis go to waste”. Patrick recently led a new undergraduate course on academic research at Maastricht University (read more about the course here). Due to COVID-19, students could choose whether they preferred online or on-campus teaching, which resulted in 10 online groups and 11 on-campus groups. This presented us with an opportunity to compare the performance of students who took the very same course, but did so either on-campus or online. Our key lesson: focus particularly on online students and their learning.

In exploring this topic, we build on our previous research on the importance of attendance in problem-based learning, which suggests that students’ attendance may have an effect on students’ achievements independent from students’ characteristics (i.e. teaching and teachers matter, something that has also been suggested by other scholars). We created an anonymised dataset consisting of students’ attendance, the number of intermediate small research and writing tasks that they had handed in, students’ membership of an on-campus or online group, and, of course, their final course grade. The latter consisted of a short research proposal graded Fail, Pass or Excellent.

316 international students took the course, of whom 169 (53%) took it online and 147 (47%) on-campus. 255 submitted a research proposal, of which 75% passed. One reason why students did so well – normal passing rates are about 65% – might be that, this being a new course, the example final exam they were given was one written by the course coordinator. Bolkan and Goodboy suggest that students tend to copy examples, so providing them may not necessarily be a good thing.
Yet students had also done well in previous courses, and the cohort seemed very motivated to do well despite the circumstances. On closer inspection, however, it is telling that 31% of the online students (52 out of 169) did not receive a grade, i.e. they did not submit a research proposal. For the on-campus students this was 9.5% (14 out of 147)[1]. Perhaps this is the result of self-selection, with motivated students having opted for on-campus teaching. In any case, it is clear that online teaching affects study progress, and enhancing participation in examinations among online students needs to be prioritised by programme directors and course leaders.

We focus on students who attended at least one meeting (out of a maximum of 6) and handed in at least one assignment (out of a maximum of 7). Of these 239 students, 109 (46%) were online students and 130 (54%) on-campus. Interestingly, on average these 239 students behaved quite similarly across the online and on-campus groups: they attended on average 5 meetings (online: 4.9; on-campus: 5.3) and handed in an average of 5 to 6 tasks (online: 5.0; on-campus: 5.9).

We ran a logit model with a simple dummy variable as the dependent variable, indicating whether a student passed the course. As independent variables we included the total number of attended meetings and the total number of tasks handed in. Both variables were interacted with a dummy variable tracking whether students followed online or offline teaching, and we clustered standard errors by the 21 tutor groups. Unfortunately, we could not include control variables such as age, gender, nationality and country of pre-education. These would have helped to rule out alternative explanations and to gain more insight into what factors drive differences in performance between online and offline students.
For example, international students may have been more likely to opt for online teaching and may have been confronted with time-zone differences, language issues, or other problems.

Figure 1 displays the impact of attending class on the probability of passing the final research proposal. The predicted probabilities are calculated for an average student who handed in 5 tasks. Our first main finding is that attendance did not matter for online students, but it did for on-campus students. The differences in predicted probabilities for attending 3, 4, 5, or 6 meetings are not statistically significant (at the 95% confidence level) for online students, but they are for on-campus students. Students who attended the maximum of six on-campus meetings had a 68 percentage-point higher probability of passing than a student who attended 3 meetings (89% versus 21%) and a 52 percentage-point higher probability than a student who attended 4 meetings (89% versus 37%).

Figure 2 displays the impact of handing in tasks on the probability of passing the final research proposal. The predicted probabilities are calculated for an average student who attended 5 online or on-campus meetings. Our second main finding is that handing in tasks did not matter for on-campus students, but it did for online students. The differences in predicted probabilities for handing in 4, 5, 6, or 7 tasks are not statistically significant (at the 95% confidence level) for on-campus students, but they are for online students. Students who handed in the maximum of seven tasks had a 51 percentage-point higher probability of passing than a student who handed in four tasks (69% versus 18%) and a 16 percentage-point higher probability than a student who handed in five tasks (69% versus 53%).
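The kind of model just described can be sketched in code. The snippet below is a minimal illustration, not the authors' actual analysis: the student records are not public, so it simulates data whose pass pattern roughly mirrors the reported findings, fits a logit with attendance × online and tasks × online interactions by plain Newton-Raphson, and computes predicted probabilities for an average student who handed in 5 tasks. All coefficients in the simulation are made up for illustration, and clustered standard errors (which the real model uses) are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in data: 239 students, 3-6 meetings attended,
# 4-7 tasks handed in, and an online/on-campus dummy.
n = 239
online = rng.integers(0, 2, n)
attendance = rng.integers(3, 7, n)   # 3..6 meetings
tasks = rng.integers(4, 8, n)        # 4..7 tasks

# Invented data-generating process echoing the article's pattern:
# attendance drives passing on-campus, tasks drive passing online.
eta = (-7 + 1.3 * attendance - 1.0 * attendance * online
       + 0.3 * tasks + 0.8 * tasks * online)
passed = (rng.random(n) < 1 / (1 + np.exp(-eta))).astype(float)

# Design matrix: intercept, attendance, tasks, online, two interactions.
X = np.column_stack([np.ones(n), attendance, tasks, online,
                     attendance * online, tasks * online])

def fit_logit(X, y, iters=100):
    """Fit a logit model by Newton-Raphson (IRLS)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-np.clip(X @ beta, -30, 30)))
        W = p * (1 - p)
        H = (X * W[:, None]).T @ X + 1e-8 * np.eye(X.shape[1])
        step = np.linalg.solve(H, X.T @ (y - p))
        beta += step
        if np.max(np.abs(step)) < 1e-10:
            break
    return beta

beta = fit_logit(X, passed)

def predict(attend, task, onl):
    """Predicted pass probability for one student profile."""
    x = np.array([1, attend, task, onl, attend * onl, task * onl])
    return 1 / (1 + np.exp(-x @ beta))

# Predicted pass probability at 5 tasks, varying attendance,
# separately for on-campus (online=0) and online (online=1) students.
for a in (3, 4, 5, 6):
    print(f"attendance={a}: "
          f"on-campus {predict(a, 5, 0):.2f}, online {predict(a, 5, 1):.2f}")
```

In practice one would use a package such as statsmodels, whose `Logit` supports `fit(cov_type='cluster', ...)` for the tutor-group clustering; the hand-rolled fit above only serves to make the interaction structure explicit.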
Note that we do not think that attendance does not matter for online students, or that handing in tasks does not matter for offline students. Our dataset simply does not include enough students to detect these impacts. From our previous research we know that, in general, we can isolate the impact of various aspects of course design with data from three cohorts (around 900 students). The very fact that we find remarkably clear-cut impacts of attendance among on-campus students and of handing in tasks among online students with a relatively small number of students (fewer than 240) suggests that these impacts are strong enough to surface and become statistically significant even in a dataset as small as ours.

This is why we feel confident in advising programme directors and course leaders to focus on online students. As Alexandra Mihai also recently wrote, it is worth investing time and energy in enhancing online students’ participation in final examinations and in offering them many different small assignments to be handed in over the whole time span of the course. This is not to say that no attention should be given to on-campus students and their participation in meetings but, given limited resources and the gains to be achieved among online students, we think it would be wise to focus on online students first.

[1] The difference of 21 percentage points in no grades between online and offline students is statistically significant at the 99%-level (t = 4.78, p < 0.001, N = 314 students).
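The footnote's significance test can be checked approximately with a standard two-proportion z-test on the counts reported in the article. The statistic below differs slightly from the reported t = 4.78 (which comes from a t-test on a dummy variable with N = 314), but the conclusion – significance well beyond the 1% level – is the same.

```python
import math

# No-grade counts from the article: 52 of 169 online, 14 of 147 on-campus.
x1, n1 = 52, 169   # online students without a grade
x2, n2 = 14, 147   # on-campus students without a grade

p1, p2 = x1 / n1, x2 / n2                      # 30.8% vs 9.5%
pooled = (x1 + x2) / (n1 + n2)                 # pooled no-grade rate
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se

print(f"difference = {p1 - p2:.3f}, z = {z:.2f}")
# z is well above 2.58, the two-sided 1%-level cutoff
```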
Originally published by Active Learning in Political Science on 17 November 2020.

The ongoing Covid-19 crisis has forced us all to rethink our teaching, but not all innovation has to start from scratch. For instance, when you feel uncomfortable recording a video for your lecture, you can simply use the narrated-slides option in your presentation software. And when you want to stimulate student engagement and interaction during an online talk, existing audience response tools, such as GoSoapBox, Kahoot!, Mentimeter and Wooclap, are ready for online use. I’m a frequent user of Wooclap myself, but I also have experience using GoSoapBox and have trialled some other options too. My choice for Wooclap is partly based on its user-friendliness – though the additional perks that come with Maastricht University’s subscription are welcome too. I’ve been using Wooclap offline for quite some time already, and I’ve continued using it since we went online.

Wooclap functionality

Wooclap comes with an easy-to-use, clutter-free interface, minimising possible distraction for you and your audience. It is also easily accessible, regardless of the device that students are using. The weblink is short, plus you can generate a QR code. The different types of questions and the various ways to present results are really helpful. You can ask multiple-choice and open questions and conduct polls. You can have students fill in blanks or locate something on an image. When you want students to actively work together, you can opt for the brainstorm option. Issues can be sorted, or you can ask students to prioritise what they would like to discuss. Answers can be given in writing, but you can also ask students to make a meme and upload it. Even if you decide to use only multiple-choice and open questions, you can choose to present answers to the latter as a word cloud instead of a list of answers.
This gives a nice and useful overview, because with a big group you’ll never be able to read every answer. It is very easy to reorder questions and to integrate slides – though the latter comes with potential limitations if you are a savvy user of funky slide transitions and other moving bits and pieces. Other useful options include a timer for answering questions and allowing audience members to ‘like’ each other’s answers. One option I haven’t used yet is gamification, which allows you to rank participants – and hand out prizes – adding a fun element to your talk, though it can also create a sense of unease among your audience. What I also find particularly useful is the ease with which you can copy polls and questions; convenient when you want to re-use polls while keeping existing data. Indeed, you can also export results, so you could, for instance, look at differences between cohorts of students.

Online vs offline use
To me, the offline usefulness of Wooclap is evident. It is a really simple and fun way to involve your audience in an active way, individually and in groups. I have, for instance, used Wooclap during interactive lectures on Euroscepticism, academic skills, you name it. You can ask students to ‘define Euroscepticism’, but you can also ask them what type of resources they’ve consulted for their research paper. When I write “really simple” I do not mean that it is self-evident. You’ll still have to explain the purpose of using Wooclap. Sometimes additional instructions are needed, in particular when it comes to brainstorming – talking to each other may be easy, but how do you succinctly contribute to an online brainstorm? – but you may also want to take the time to explain your questions. This is where the integration of slides comes in handy.

Using Wooclap in an online setting requires additional planning. Two challenges are noteworthy. First, you may have to switch between several screens: sharing one screen, stopping the share, and moving on to the next. As I mentioned, the integration of slides goes a long way towards solving this challenge – but comes with its own limitations. Second, in a lecture theatre it is relatively easy to get a sense of how engaged students are with your Wooclap tasks. Yet when you cannot see your online audience, it is easy to fail to engage audience members. These limitations should, however, not stop you from considering Wooclap. A good plan for your talk is a must. I recommend having either a few short questions at the beginning of the lecture to, for instance, gauge students’ knowledge of a topic, or a few mid-way to, for instance, see whether you are getting your point across. If you plan to have several questions – I’d say anything above five – it is best to distribute them across your talk instead.
After all, while Wooclap is a fun and useful way to engage your students, you can also overdo it, leaving students asking what its purpose is.

This post was originally published by Active Learning in Political Science on 6 December 2019.

I’ve always considered myself an approachable teacher; someone students can come to with questions or worries, or just for a talk. And from what I hear, I am considered approachable. Still, I am noticing something that worries me. I have been holding open office hours for about nine years now, but fewer and fewer students have been showing up. Weeks go by when no one comes, even in periods when I am teaching and coordinating courses. I know that I am not the first to raise this issue. It is even the topic of students’ research! But I still believe that students can learn from meeting with us for input and feedback, whether this concerns a relatively simple question or my assessment of their paper. So, why does no one come and talk to me anymore?

Turnout during open office hours was again low during the first weeks of this year, when I coordinated and taught a first-year course on academic research and writing. At the end, students write a short paper. These are randomly distributed among teaching staff, myself plus 10 colleagues – together we teach 25 problem-based learning groups of about 12 students each. As soon as results are out, all students, whether they have failed or passed, are invited to meet with the person who marked their paper to discuss the assessment during scheduled open office hours. This year I asked colleagues to tell me how many students had shown up. The table below shows the data for those who failed the course. Interestingly, one colleague had to hold her open office hours via Skype; no fewer than 7 out of 9 students showed up. Yet there is some research suggesting that using technology does not make a huge difference. Why did so few students show up?
I decided to ask the students themselves some simple questions during a session in our mentor programme. The approximately 100 students who attended (out of nearly 300) might not be representative of the group of students that does not turn up at my office. But I still learned something interesting. Of the 86 students completing questions via an online survey tool, 36 had failed the course and 29 had attended the open office hours. Those who attended generally did so to get clarification regarding their paper’s assessment. Of those who did not attend, some simply stated that they had passed the course and saw no need to discuss the feedback. Others referred to having been sick, stressed and/or busy with the new courses – when asked, quite a few of these students had not written to staff to ask for another appointment. Asked why they thought others had not come, some answered that these must be lazy students or that they lacked motivation because they knew what they had done wrong.

But quite a few answers touched upon something that we might all too easily overlook, namely students’ expectations regarding feedback opportunities. These answers did not just concern not knowing what to do with feedback. For instance, one student wrote that students who did not show up might be “insecure and/or uncomfortable with getting feedback”. Another student wrote that “you have limited time with the tutors and tutors often have a lot of work and not much time for you”. Could it be that low attendance during open office hours is due to barriers to students’ engagement with feedback or, more generally, a lack of feedback literacy? This is something that I want to explore in more detail. I have already briefly discussed it with our academic writing advisor, and we may see whether we can specifically address this issue in a forthcoming curriculum review. But what about solutions for the here and now? There are many ways in which open office hours are organised, but what works best?
One colleague suggested changing times. Admittedly, my open office hours are Wednesdays from 08:30-09:30, but this never was a problem – and the feedback open office hours during the aforementioned course were scheduled in the afternoon. Elsewhere in cyberspace people have been suggesting other solutions, including a rethink of faculty office space. I’d love to squeeze in a couch, but my office is rather tiny. On Twitter someone suggested that the wording ‘open office hours’ is unclear to students and that ‘student drop-in hours’ may make more sense. So, the name plate next to my door now mentions my student drop-in hours, and so does the syllabus of an upcoming course. Let’s see what happens. I hope students will come and talk to me again. The door’s open – simply turn up at the stated time!