A common method for incorporating breaks and assessing student understanding of content during lectures is the use of classroom response systems (also known as student response systems, audience response technology, or clickers).
Most classroom response systems (CRS) are designed to have students use dedicated devices (clickers) or, more recently, their own technology (phone, tablet, or laptop; Hung 2017) to send responses to questions posed by the instructor during a lecture. The technology then tabulates and graphs the students’ responses, giving the instructor instantaneous feedback on student learning. Common CRS programs include iClicker, Poll Everywhere, Zoom polls, Canvas quizzes, Top Hat, PlayPosit, Qualtrics, Slido, Mentimeter, SurveyMonkey, and many others.
Depending on the program used, questions can require multiple-choice, numeric, or short text responses (e.g., word clouds). Although learning to use a CRS can be cumbersome at first (depending on the program and how it is accessed), once the instructor understands the system it usually takes only a few minutes to create questions (typically prior to class) and then only 2-3 minutes of class time per question. Some CRS programs can be linked to a course's learning management system (e.g., Canvas) to incorporate grading or tracking of responses (the university supports iClicker technology).
When incorporating a CRS into a lecture, it is important to pair the technology with a learning strategy. For example, you may pose a question and have students use Think-Pair-Share: students consider the question individually (Think), discuss it with a partner (Pair), and then submit their own responses to the class using the CRS (Share). Other instructional strategies used with CRS include sequential elimination, experiential exercises, “What would you do?” prompts, and forced choice (Muncy et al. 2012). Alternatively, some strategies have been designed specifically for use with CRS, including Technology-Enhanced Formative Assessment (TEFA), which uses question-driven instruction, discussion, and metacognition as a framework for instruction with CRS (Beatty and Gerace 2009), and ConcepTests, which provide students with higher-level, open-ended questions whose multiple-choice responses are selected and then discussed as a class (Joshi et al. 2021).
The use of CRS has been shown to increase student engagement and learning in college courses. Students report higher engagement with course content and a better ability to assess their own learning when clicker questions are used during lectures (Ragano and Paucar-Caceres 2013, Wang et al. 2014, Hung 2017, Joshi et al. 2021). Student learning improved through the collaborative exchanges prompted by clicker questions (Blasco-Arcas et al. 2013, Hung 2017, Joshi et al. 2021), and students’ perceptions of their own learning were higher in courses that incorporated CRS questions and activities (Blasco-Arcas et al. 2013, Ragano and Paucar-Caceres 2013, Joshi et al. 2021). International and ESL (English as a second language) students as well as students with learning disabilities also showed increased engagement and content retention, which was attributed to the anonymity of responding to questions through the CRS (Ragano and Paucar-Caceres 2013, Wang et al. 2014).
There are a few caveats to consider when using CRS in a course. First, depending on how the CRS is implemented, it can produce little or no improvement in student learning. For example, some studies show no improvement in student learning when comparing CRS review sessions to traditional in-class review sessions (Lasry 2008, Fike et al. 2012). Others have found that learning gains from CRS occur only when the use of clicker questions is highly structured around peer collaboration (Weiss et al. 2020). These findings indicate that CRS improve student learning most when combined with other active learning strategies. A second caution concerns the use of students’ own devices to answer clicker questions: in one study, 42% of observed students were engaged in off-task behaviors five minutes after completing a clicker question (Ma et al. 2020). Thus, it is important for instructors to think through how students will use the CRS in class to ensure it supports student learning.
Beatty, I. D. and W. J. Gerace (2009) Technology-enhanced formative assessment: a research-based pedagogy for teaching science with classroom response technology. Journal of Science Education and Technology 18:146-162.
Blasco-Arcas, L., I. Buil, B. Hernandez-Ortega, and F. J. Sese (2013) Using clickers in the class. The role of interactivity, active collaborative learning, and engagement in learning performance. Computers and Education 62:102-110.
Fike, D., R. Fike, and K. Lucio (2012) Does clicker technology improve student learning? Journal of Technology and Teacher Education 20:113-126.
Hung, H. (2017) Clickers in the flipped classroom: bring your own device (BYOD) to promote student learning. Interactive Learning Environments 25:983-995.
Joshi, N., S. Lau, M. F. Pang, and S. S. Yu Lau (2021) Clickers in the class: fostering higher cognitive thinking using ConcepTests in a large undergraduate class. Asia-Pacific Education Researcher 30:375-394.
Lasry, N. (2008) Clickers or flashcards: is there really a difference? The Physics Teacher 46:242-244.
Ma, S., D. G. Steger, P. E. Doolittle, A. H. Lee, L. E. Griffin, and A. Stewart (2020) Persistence of multitasking distraction following the use of smartphone-based clickers. International Journal of Teaching and Learning in Higher Education 32:64-72.
Ragano, R., and A. Paucar-Caceres (2013) Using systems thinking to evaluate formative feedback in UK higher education: the case of classroom response technology. Innovations in Education and Teaching International 50:94-103.
Wang, Y., C. Chaung, and L. Yang (2014) Using clickers to enhance student learning in mathematics. International Education Studies 7(10). doi:10.5539/ies.v7n10p1
Weiss, D. J., P. McGuire, W. Clouse, and R. Sandoval (2020) Clickers are not enough. National Science Teachers Association 49:58-65.