Considering Integrity in Assessments for Large Classes
Educate Students
As course instructors, we spend considerable time thinking about assessing our students: How will we know whether students have achieved learning outcomes? What worked or didn’t work last year? What’s manageable given the class size and available resources? Considering how much time we spend thinking about assessments, it’s easy to forget that our students often have none of this context. Students don’t automatically know your rationale for the assessments you use in your class, nor do they necessarily understand how different assessments are connected (or not) to each other, or how their knowledge in this class is intended to prepare them for another course (or the world outside academia). If we want students to ‘buy in’, to invest their time and energy in their own learning and development, we need to let them see behind the curtain.
Be clear about the purpose of each assessment.
Explaining to students why you’ve implemented an assessment helps them understand its value. For example, explain that quizzes help them stay on top of material and focus on the basic, foundational concepts and vocabulary, and that they will need a solid understanding of these concepts to succeed on the higher-stakes exam, which requires them to integrate their learning across topics, think critically, and apply knowledge. Students often don’t have the background or context to understand why we’re asking them to complete an assessment or perform a task. As instructors, it’s our responsibility to ensure they can see the path we’ve mapped for their success.
o If you’re using quizzes to help students solidify their understanding of basic concepts, consider providing automated explanatory feedback tailored to each response option. This reinforces the intention of using these quizzes as a learning tool.
Include an academic integrity module at the beginning of your class.
This might refer students to general university-wide information (e.g., the module available to all incoming Dalhousie students via Brightspace, or the resources provided by the University Secretariat), but it should also contain information specific to your class. Explain what you expect of students for each assessment: what is meant by ‘open book’, ‘discussing ideas with others’, ‘individual submissions’, etc. While upper-year students may be familiar with some of our expectations (e.g., ‘individual submissions’), for many this is uncharted territory (e.g., online, non-proctored, open-book exams). It’s important to communicate intention and expectations, especially since those expectations can vary across courses.
o If you’re concerned about students inappropriately using ‘homework’ websites like CHEGG and Course Hero (there are many, many more), explicitly explain that the use of these sites is prohibited in your course and that those caught using them will be penalized. CHEGG and other sites will reveal the identity of students if a case is presented to them by an academic integrity officer or Dean (there is precedent for this at Dalhousie).
Carefully consider the grade breakdown in your course.
High-stakes assessments worth a large percentage of the final grade put pressure on students to do well and may lead to integrity issues. A combination of transparency and a balanced grade distribution (e.g., multiple, lower-stakes assessments) will help students see the value of completing course assessments without relying on their classmates or the internet for the solutions.
Test Strategies
In large classes, often with a single instructor and relatively little TA/marker support, the need for auto-marked questions is real. Although scaffolded, written, introspective projects can deter cheating and allow for deep learning, the reality is that scaling these types of assessments for classes of 300-1000+ is unrealistic. This is especially true for many STEM survey courses, which often have hundreds of students enrolled and are responsible for teaching fundamental (read: easily “Google-able”) concepts.
Below are eight (8) concrete strategies for creating quizzes and exams that are more resistant to academic integrity violations but that can still be auto-graded (multiple-choice, multi-select, or true/false).
Quiz Settings
(These suggestions are based upon the Brightspace learning management system (LMS) but most LMS platforms would have similar capabilities.)
1. Use a question library and create pools of questions that are at a similar difficulty level and on the same topic. You can set Brightspace to randomly select questions from each pool, ensuring students don’t all receive the same set of questions (see the sketch after this list).
2. Randomize answers for multiple-choice or multi-select questions. This makes it harder for students to share answers in shorthand (e.g., 1=C, 2=A).
3. Randomize the presentation of the questions so each student gets the questions in a different order.
4. Don’t release the answers/scores until after everyone has completed the assessment.
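To make strategies 1–3 concrete, here is a minimal Python sketch, purely illustrative and separate from anything Brightspace actually exposes: the pool contents and the function name `build_quiz` are hypothetical, but the logic shows how drawing from topic pools, shuffling question order, and shuffling answer options combine so that no two students see the same quiz.

```python
import copy
import random

# Hypothetical question pools: each pool holds questions of similar
# difficulty on the same topic, mirroring a question library.
question_pools = {
    "colour_vision": [
        {"stem": "Removing the long-wavelength cones would most affect...",
         "options": ["red-green discrimination", "night vision",
                     "visual acuity", "depth perception"]},
        {"stem": "Which cone type responds most to long wavelengths?",
         "options": ["L-cones", "M-cones", "S-cones", "rods"]},
    ],
    "transduction": [
        {"stem": "Photoreceptors are to vision as ____ are to audition.",
         "options": ["hair cells", "papillae", "cilia", "rods"]},
        {"stem": "Transduction converts sensory stimuli into...",
         "options": ["neural signals", "hormones", "behaviour", "percepts"]},
    ],
}

def build_quiz(pools, per_pool=1, seed=None):
    """Draw `per_pool` questions at random from each topic pool (strategy 1),
    shuffle question order (strategy 3), and shuffle each question's
    answer options (strategy 2)."""
    rng = random.Random(seed)              # a per-student seed gives a per-student quiz
    quiz = []
    for topic, questions in pools.items():
        chosen = rng.sample(questions, k=per_pool)
        quiz.extend(copy.deepcopy(chosen)) # copy so shuffling doesn't alter the pool
    rng.shuffle(quiz)
    for question in quiz:
        rng.shuffle(question["options"])
    return quiz

# Two students get different questions, question orders, and option orders.
for student in ("student_A", "student_B"):
    print(student, [q["stem"] for q in build_quiz(question_pools, seed=student)])
```

The point of the sketch is simply that randomization happens at three independent levels (which questions, in what order, with options in what order), which is what makes wholesale answer-sharing impractical.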
Question Format
5. Create multiple-choice questions that rely on knowledge application rather than memorization. Taking a concept and modifying it, or applying it to an unfamiliar setting that wasn’t explicitly covered in lectures or readings, makes it more challenging to look up the answer and draws on a deeper level of understanding.
For example, if testing colour vision, rather than asking students to identify which type of cone receptor responds to long wavelengths, you could instead ask what might happen to colour vision if you were to remove one of the types of cones.
6. In STEM, many of us focus on research design in our classes. Giving students examples and asking them to identify variables (e.g., what is/are the dependent variable(s) in this example) is a great way to test understanding and isn’t something that can be easily ‘Googled’.
7. To increase the complexity and range of an auto-graded question, consider using a multi-select question format to ask a combination of true-false statements. This allows for more nuance than a typical multiple-choice question that only allows for one true or one false statement (e.g., “Which of the following statements is TRUE?”). Using a multi-select format allows you to ask, “Out of the five (5) statements below, select all TRUE statements”. On Brightspace, you can then choose to grade ‘all or nothing’, ‘right minus wrong’, or ‘correct answers only’ (a sketch of how these schemes score a response follows this list). To locate the correct answer on the internet or in a textbook, students would need to look up every single one of the answer options to determine which ones were true or false, rather than only searching for the stem and finding the correct option.
8. Consider using “analogy” questions to test students’ understanding of the relationship between concepts. For example, if testing the concept of transduction (converting sensory stimuli into neural signals), a question might be:
Photoreceptors are to vision, as ________ are to ________.
a) hair cells; audition
b) hair cells; olfaction
c) papilla; taste buds
d) cilia; hair cells
(Students need to understand that photoreceptors are the cells responsible for transduction in vision, and then identify which answer pairs the cells responsible for transduction with the corresponding sense.)
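The three grading schemes mentioned in strategy 7 differ only in how correct and incorrect selections are counted. The short Python sketch below illustrates that arithmetic; the function name and the exact formulas (especially the denominators) are assumptions made for illustration, not Brightspace’s documented behaviour.

```python
def score_multi_select(selected, correct, points, scheme):
    """Illustrative scoring for a multi-select question.

    `selected` and `correct` are sets of option labels; `points` is the
    question's maximum mark. The formulas are assumptions for illustration.
    """
    selected, correct = set(selected), set(correct)
    right = len(selected & correct)      # correct options the student picked
    wrong = len(selected - correct)      # incorrect options the student picked

    if scheme == "all_or_nothing":
        return points if selected == correct else 0.0
    if scheme == "correct_answers_only":
        # credit for each correct selection, no penalty for wrong ones
        return points * right / len(correct)
    if scheme == "right_minus_wrong":
        # wrong selections cancel correct ones, floored at zero
        return points * max(right - wrong, 0) / len(correct)
    raise ValueError(f"unknown scheme: {scheme}")

# Example: five statements, three of them true, question worth 2 points.
correct = {"A", "C", "E"}
picked = {"A", "C", "D"}                 # two right, one wrong
for scheme in ("all_or_nothing", "correct_answers_only", "right_minus_wrong"):
    print(scheme, round(score_multi_select(picked, correct, 2, scheme), 2))
# all_or_nothing 0.0 | correct_answers_only 1.33 | right_minus_wrong 0.67
```

As the example output suggests, ‘all or nothing’ is the strictest option, while ‘right minus wrong’ rewards partial knowledge but discourages guessing on every option.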
Conclusion
Non-proctored, online assessments pose unique challenges for instructors, especially for the large survey courses often offered in STEM. “Homework” sites like CHEGG and Course Hero will continue to be a serious threat to academic integrity and demand attention from university administrations and government. However, by implementing some of these suggestions, instructors can demonstrate to students the learning potential and value of course assessments and fortify the academic integrity of their courses.
Acknowledgements: Many of these ideas were generated through engaging and productive conversations with colleagues from across disciplines at Dalhousie’s Faculty of Science and Chad O’Brien from Dalhousie’s Centre for Learning and Teaching.
Additional Readings: