
12/11/2022

Oral exams in support of academic integrity: from catching dishonesty to promoting excellence

Written by Huihui Qi and Curt Schurgers

Remember the moment when, in March 2020, the COVID-19 pandemic forced universities across the country to suddenly switch to remote instruction? At the University of California San Diego, we had to pivot right in the middle of our last week of classes, before the winter finals, because we run on the quarter system. Few of us had a well-tested strategy for remote exams, and even fewer really knew what to expect. As we scrambled, a student in one of our classes, whose performance up to that point had been stellar, expressed concern that others in the class might cheat. She was afraid that cheating would break the curve. Surveys showed she was not alone in this fear. And, like most instructors, we shared her concern about maintaining a high standard of academic integrity. While we followed suggested guidelines, such as using pools of questions, we each independently took the additional step of adding oral follow-up exams to our courses. The idea was simple: because oral exams are adaptive, we would have the opportunity to ask probing questions and see how well students really understood their own answers.

We each implemented slight variations on this idea, but the broad strokes were the same. One of us only followed up with a few students whose final exam performance differed markedly from their midterm performance. The other administered a 15-minute oral exam to all 150 students, with the help of three Teaching Assistants (TAs). This approach brought additional challenges: training the TAs to ensure the oral exams were consistent and fair, as well as the sheer workload of each examiner conducting 8 to 10 hours of exams. Despite these differences in scope, both of us noticed similar trends and serendipitous outcomes.

Importantly, students reported that they believed the oral exams strengthened the academic integrity of the course. This belief is itself valuable, because it reduces the perceived need to cheat to keep up with everyone else. In terms of actual detection of violations, there were instances where students could not explain their written answers. Upon further probing, we found that some of these cases did involve academic misconduct. In other cases, however, the issue was that the students’ learning strategy had focused on mere memorization of processes and recipes, and this lack of conceptual understanding was laid bare by the probing, adaptive nature of oral exams. On the flip side, some students actually did markedly better on the oral exam than on the written one. From follow-up surveys, we learned that some students had reviewed the material in more depth because they found a one-on-one oral exam intimidating. For others, the oral format, which let them explain their thinking, simply suited them better. These observations were eye-opening and encouraged us to consider oral exams beyond this initial trial. We asked ourselves: could we use oral exams not only to support academic integrity, but also for these other pedagogical benefits that had emerged?

We decided to explore this question. However, we needed to address an issue that had also come up in our surveys: honest students felt mistrusted. Going forward, we would therefore pitch the oral exam as a learning opportunity instead of a policing strategy. We believed that this more constructive view could still serve the original academic integrity goal, while also delivering the additional pedagogical benefits we had observed.

Fortunately, other engineering faculty on campus had considered similar interventions, and together we quickly started brainstorming ideas. We would implement variations of oral exams in our future courses to study what was most effective, how we could overcome the issues of large class sizes and examiner inconsistency, and how we could replicate the benefits we found earlier. Expanding our group with pedagogical researchers from across campus, we embarked on this research, for which we received a three-year NSF grant in January 2021. 

Since starting this collaboration, our team has implemented oral exams in 32 classes, covering 9 distinct electrical and mechanical engineering courses. Working with instructional teams has been instrumental in successfully administering oral exams. From our experience, one instructor or TA per 30 students seems to be the maximum workable ratio. Throughout this process, our research evolved to consider exam variations and to explore how they help students identify their strengths and weaknesses, lower the barrier to seeking help, provide additional touch points for instructors, and encourage students to pursue conceptual understanding. In a way, we are now also contributing to integrity through a different lens: if we can better support students in their learning and give them the tools and confidence to succeed, they are less likely to feel the need to resort to dishonest means.

We have moved away from seeing oral exams as a way to catch cheaters and are exploring their use as a much richer tool in our courses. Still, the student impression we started from persists: in surveys, students report perceiving oral exams as positively contributing to academic integrity in the course. This perception has a known positive impact on actual integrity, because it reduces “having to cheat to keep up” as a driver, an effect that should not be underestimated.

Now, should you implement oral exams in your courses solely for reasons of academic integrity? We would say “not necessarily”, as they are fairly labor-intensive. Nevertheless, we are discovering that, given the other benefits we mentioned, they are a useful tool in our arsenal. And they are a tool that actively champions academic integrity, both by supporting students’ learning and by improving the perception of integrity in the course.

For more information about our research studying oral exams, please check out our website https://huqi31.wixsite.com/ucsdoralexam.

Acknowledgements:

This work was supported by the National Science Foundation under Grant No. 2044472. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. The other members of our research project team are: Dr. Nathan Delson, Dr. Marko Lubarda, Dr. Carolyn Sandoval, Dr. Saharnaz Baghdadchi, Joanna Boval, Xuan Gedney, Dr. Maziar Ghazinejad, Dr. Minju Kim, Dr. Leah Klement, Leo Liu, Dr. Mia Minnes, Dr. Alex Phan, Dr. Celeste Pilegard, Dr. Josephine Relaford-Doyle and Tony Wang.

