
Questions in Asynchronous Online Courses

Asking and answering questions in asynchronous online courses is important for sustaining discussion and keeping students engaged. Questions may be posed by students, student moderators (or student facilitators), or instructors. Some questions are answered and others are not. Some types of questions lead to knowledge construction and others do not. In what ways does the success of question types depend on the types of enrolled students? I've become very interested in understanding the dynamics around questions and have collected some resources here on the topic.

Papers

Bradley, M. E., Thom, L. R., Hayes, J., & Hay, C. (2008). Ask and you will receive: How question type influences quantity and quality of online discussions. British Journal of Educational Technology, 39(5), 888-900.
This paper is solely responsible for my initial interest in the topic. The study involves an instructor posing six different types of questions (based on question types by Andrews) to undergraduate students, and measuring the consequent higher-order thinking, word count, and answer completeness. Two findings stand out: the type of question impacts these three outcomes, and the best type of question for one outcome is not necessarily best for the other outcomes. For example, course link questions (which require students to integrate course information) yielded lots of higher-order thinking but low word counts and low answer completeness. In addition, students tended to avoid the course link questions, which is unfortunate because those seem to be the ones of most benefit!

Zingaro, D. (2012). Student Moderators in Asynchronous Online Discussion: A Question of Questions. Journal of Online Learning and Teaching, 8(3).
This is my own paper on the topic of questions in asynchronous online courses. Given the popularity of student moderators (student facilitators who lead the weekly discussions), I wanted to know a) what types of questions do these students naturally ask, b) to what level of depth do their peers respond, and c) do they ask the good types of questions discussed in the above paper? I found that students naturally ask only a small range of question types. In particular, they don't seem to ask course link questions! I wanted to follow up by scaffolding different question types to study whether students can successfully use the good questions, but that won't happen any time soon. (Anyone?)

Della Noce, D. J., Scheffel, D. L., & Lowry, M. (2014). Questions that Get Answered: The Construction of Instructional Conversations on Online Asynchronous Discussion Boards. Journal of Online Learning and Teaching, 10(1).
In face-to-face settings, instructor questions are usually answered. In asynchronous settings, sometimes students choose not to answer the instructor's questions. Why? Which questions do students answer in online courses? These authors offer a case study of an online undergraduate course where they examine the features of instructor questions that tend to lead to student responses. Questions that exhibit uptake, authenticity, or both, tend to garner responses; jarring questions that lack thread coherence tend to be ignored (or students impose coherence on behalf of the instructor -- check out the paper for this interesting finding!).

Christopher, M. M., Thomas, J. A., & Tallent-Runnels, M. K. (2004). Raising the bar: Encouraging high level thinking in online discussion forums. Roeper Review, 26(3), 166-171.
Like my study above, this one examines the questions asked by student moderators and the Bloom level of replies. The authors don't find strong relationships between questions and answers, however, perhaps because questions were classified by Bloom level rather than by question type (as in the studies above).

Ertmer, P. A., Sadaf, A., & Ertmer, D. J. (2011). Student-content interactions in online courses: the role of question prompts in facilitating higher-level engagement with course content. Journal of Computing in Higher Education, 23(2-3), 157-186.
This study is similar to the Bradley et al. paper above. The authors classified questions by Andrews' type and also by level of critical thinking on Bloom's taxonomy; they then coded answers using Bloom's taxonomy. These questions were sampled from ten courses (not experimentally manipulated as in the Bradley et al. paper). One general finding is that questions at high Bloom levels elicited responses at higher Bloom levels but, regardless of Bloom level or Andrews question type, critical thinking in responses was low.

Blanchette, J. (2001). Questions in the online learning environment. Journal of Distance Education, 16(2), 37-57.