Oxford-Style Debates in a Microbiology Course for Majors: A Method for Delivering Content and Engaging Critical Thinking Skills †


Dwayne W. Boucaud1,*, Michael Nabel2, Christian H. Eggers1

1 Department of Biomedical Sciences, Quinnipiac University, Hamden, CT 06518
2 Mathematics and Computer Sciences, Quinnipiac University, Hamden, CT 06518

Developing scientific expertise in the classroom involves promoting higher-order cognitive skills as well as content mastery. Effective use of constructivism can facilitate these outcomes. However, this is often difficult to accomplish when delivery of content is paramount. Utilizing many of the tenets of constructivist pedagogy, we have designed an Oxford-style debate assignment to be used in an introductory microbiology course. Two teams of students were assigned a debatable topic within microbiology. Over a five-week period, students completed an informative web page consisting of three parts: background on the topic, data-based positions for each side of the argument, and a data-based persuasive argument to support their assigned position. This was followed by an in-class presentation and debate. Analysis of student performance on knowledge-based questions shows that students retain debate-derived content acquired primarily outside of lectures significantly better than content delivered during a normal lecture. Importantly, students who performed poorly on the lecture-derived questions did as well on debate-derived questions as other students. Students also performed well on questions requiring higher-order cognitive skills and in synthesizing data-driven arguments in support of a position during the debate. Student perceptions of their knowledge base in areas covered by the debate and their skills in using scientific databases and analyzing primary literature showed a significant increase in pre- and postassignment comparisons. Our data demonstrate that an Oxford-style debate can be used effectively to deliver relevant content, increase higher-order cognitive skills, and increase self-efficacy in science-specific skills, all contributing to developing expertise in the field.


Constructivist pedagogy relies on individuals making meaning of ideas with which they come into contact by relating them to their own understanding and experiences. This type of learning requires students to become actively engaged in the subject matter, a process that involves students becoming exposed to new ideas, evidence, and data. Students are tasked with clarifying their own views on these ideas, identifying the views of others, and critically evaluating and discussing the different data relevant to the subject (38, 45). By challenging their own views in the context of data supporting or refuting the views of others, students can create new meaning through cognitive restructuring (35). Development of meaning may also take place within a group dynamic where members have the opportunity to share input and to come to agreement on the ideas discussed (35).

Constructivism as a theory of learning and teaching is sometimes misunderstood and misused (18). Active student participation is a necessary component of constructivist pedagogy; however, student participation should be placed in a rigorous context that promotes the understanding, analysis, and application of ideas and concepts (11, 18). In the sciences, active learning in the teaching laboratory has been shown to be an important method for evoking students’ higher-order cognitive skills in an applied setting (1, 22, 23). In the laboratory, successful constructivist teaching and learning methods involve students in true inquiry-based experiments in which they generate new knowledge to solve real-world problems (25, 32). Coupled with guidance by the instructor, students can develop technical laboratory skills while engaging in activities common to the application of the scientific method: searching the primary literature, formulating a hypothesis, designing a logical method to test that hypothesis, analyzing data, and presenting the findings to a wider audience (14, 17).

There is a recognized need to employ active learning methodologies and engage students fully with scientific material in the lecture, as well as in the laboratory; however, constructivist pedagogy is less well accepted in the science lecture. The presumed goal of a science curriculum is to develop scientific expertise that requires students to cultivate analytical and problem-solving skills as well as skills germane to the discipline (26, 44), talents that are well fostered by constructivism (38, 45). Constructivist teaching tools, such as case studies and problem-based learning strategies, are recognized methods for developing classroom engagement as well as critical thinking and analytical skills (5, 16, 28), and these types of exercises have been shown to increase the retention of knowledge, while also increasing social skills and student engagement (8). Interestingly, despite the description of these types of lecture-based teaching tools within the sciences (34, 43), critical evaluation of common educational practices reveals a persistent failure to develop the prerequisites for expertise (3, 44). A significant reason for this is likely the resistance to the use of these methods in the lecture (3, 24). Barriers to the application of these tools include the real or perceived difficulty of their use and the amount of class time needed to implement these strategies (13, 47). One major concern is whether the inclusion of these methods in a lecture will diminish the amount of content that can be delivered (33). In order to address these issues, alternative constructivist teaching methods that can effectively deliver content and are easy to use and adapt to a specific course are needed. In this report, we detail the use of an Oxford-style debate assignment that utilizes constructivist pedagogies to promote active, independent student learning and to deliver content outside of the classroom.

The process of debate has seen a resurgence as an active learning tool (29, 31, 39); recent literature details its use in a variety of disciplines, including sociology, business, dentistry, and education (20, 30, 37, 46). A debate allows students to become actively engaged in their own education as they acquire knowledge and place that learned content into the structure of the debate itself, analyzing whether the content supports or contradicts an argument (40). Debates also challenge student knowledge, which may enhance the learning of new concepts. Analysis, synthesis, and evaluation of ideas are promoted as students assess sources, evaluate the appropriateness of those sources, seek relevance in the data, and examine various viewpoints (19, 21, 36). These practices lead to an effective engagement of the higher-order cognitive skills as defined by Bloom’s Taxonomy (7, 12).

In the fall of 2011, an Oxford-style debate assignment was given to students in an undergraduate introductory microbiology course; the participants in this course are primarily sophomores and juniors interested in careers in the health sciences. The majority of the assignment took place outside of the classroom using on-line Wiki technology and introduced new topics which were minimally, if at all, discussed in previous lectures.

We hypothesized that students would be able to learn content effectively when actively engaged in constructing and analyzing data-driven arguments. To assess this hypothesis, we compared student knowledge of topics covered during a traditional lecture with those introduced during the debate assignment. This included analysis of student use of lower-order and higher-order cognitive skills as defined by Bloom’s taxonomy (7, 12). We also used pre- and postassignment surveys to investigate whether student attitudes toward the subjects discussed were affected positively or negatively by completing the assignment. Additionally, we assessed the students’ pre- and postassignment confidence in their ability to find and use scientific databases and in their ability to analyze primary literature in pursuit of increasing their scientific knowledge.


Class information and participants

Sixty-one students taking an introductory microbiology course during the fall 2011 semester at Quinnipiac University in Hamden, Connecticut, took part in this study. The majority of students taking the course were from the following four majors/programs: the Entry Level Master’s Physician Assistant program, Biology, Biomedical Sciences, and Health and Science Studies. Other majors represented included Microbiology and Chemistry. The majority of the students were juniors, and all students taking the class had taken one year of both introductory biology and introductory chemistry. When the assignment was given, topics already covered in the course included the history of microbiology, microscopy, prokaryotic structure, microbial growth, metabolism and its regulation, gene expression, and viruses. Lecture classes met twice a week for 50 minutes each.

Debate assignment

The assignment was composed of three main areas: 1) construction of an informative web page, 2) a 10-minute presentation of the debated topic, and 3) the debate itself. The debate assignment was introduced in a lecture shortly after midterm in a 20-minute presentation by the instructor. During this introduction students were divided into teams of five or six. Pairs of groups were then given a deliberately broad topic from which their debate position would be drawn; in the fall of 2011, the topics included “the Human Immunodeficiency Virus” (HIV), “Antibiotics in Agriculture” (Agr) and “Oral Bacteria and Bacteremia” (Bac) (Table 1). The rubric for grading each of these sections was delivered to the students during the initial presentation (Appendix 1). In the subsequent class period, a 20-minute presentation was given by the Quinnipiac University Research Librarian on the use of available scientific databases and the information they contained.

TABLE 1   Topics and debatable statements provided to the students.


The construction of the website occurred in three phases. In the first phase students were given one week to produce an introduction containing general information on their topic. Websites were constructed in Blackboard Learn, version 9.1 (Blackboard, Inc.) using the Wiki tool configured to allow only team members and the instructor access to the group’s website. After the initial week, students were sent an e-mail by the instructor asking groups to reflect on their work and reassess their introduction, taking into account the following questions:

  1. Does your web page include basic information on the topic including its relevance to cultural/social issues (such as demographic groups affected by topic), economic issues, as well as medical issues?

  2. Does your web page contain statistics supporting this information? If so, how and when was this information collected? How does this affect the relevance of the information and your choice to use this information?

Students were then given an additional two days to make any changes they deemed appropriate. Following the construction of the introduction, students were given a debatable statement regarding their topic. The statements for the fall 2011 topics are found in Table 1. The students were then tasked with constructing the second section of their website; this section was to detail both the affirmative (pro) and negative (con) arguments of the debatable statement and provide supporting data for both sides. After one week, students were prompted to reflect, review, and edit the web pages in light of the following questions:

  1. Does your section include data-supported evidence for each position?

  2. Are the data objective or subjective?

  3. How were the data collected and how does the method of data collection influence your decision to use it?

After an additional two-day period student groups were given the last phase of the web page assignment. Each group was assigned a position (pro or con) on the debatable question, and was tasked with producing the final section of their web page. This section was a persuasive argument concerning their position relative to their opponent’s. In order to maintain the balance in viewpoints of the previous section, students were no longer allowed to edit the previous two sections. After a one week period the completed web pages were opened to the class for viewing. This both allowed members of opposing teams to view the opposition’s arguments and gave the class as a whole the chance to study the material on the sites.

The presentation and debate components of the assignment were performed during the same class period. The two teams assigned the same topic were given 10 minutes each to present background information on their topic, their position (pro or con), and why the data more strongly supported their position. This initial presentation was given by two or three students from each group, depending on whether the team comprised five or six students, respectively. The remaining three students took part in the Oxford-style debate. The debate was broken up into timed components as outlined in the initial assignment handout (Table 2 and Appendix 1). One student from each group was responsible for the cross-examination of the opposition and for responding to the opposition’s cross-examination, while a second student gave the rebuttal, and the third gave the closing argument (Table 2).

TABLE 2  Outline for the format of the Oxford-style debate used in introductory microbiology.


Assessment and evaluation

Student retention of content was assessed via multiple-choice questions, while higher-order cognitive skills were assessed using essay questions. All test questions were assessed by the instructor of the two classes prior to statistical analysis.

To assess the retention of content, students were given 10 questions pertaining to material delivered via lecture and 10 questions derived from debate material; three questions were developed from each of the HIV and Agr topics, while four were derived from the Bac material (Appendix 2). These questions were of the “Knowledge and Comprehension” skill level as defined by Bloom’s taxonomy (7, 12). Statistical analyses of scores on debate- and lecture-derived questions were done using paired two-sample, one-tailed t-tests for differences of test scores (lecture vs. debate) and the generalized linear model (GLM) for the analysis of variance of individual test scores. The hypothesis being tested was whether or not test scores for debate-presented material were higher, on average, than test scores for lecture-presented material. The data were also analyzed to determine whether students in a particular debate topic group scored higher on questions related to their particular topic. A difference-of-means test was used with a one-tailed t-test using pooled estimators for the variance.
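The two tests described above can be sketched in a few lines of Python; this is an illustrative example only, using hypothetical scores rather than the study's data, and it reproduces the general procedure (paired one-tailed t-test; pooled-variance two-sample t-test) rather than the authors' actual analysis:

```python
# Illustrative sketch of the statistical procedures described above.
# All score values are hypothetical, not the study's data.
from scipy import stats

# Hypothetical percent-correct scores for the same ten students (paired).
lecture = [70, 80, 60, 90, 75, 80, 65, 70, 85, 75]
debate = [80, 90, 70, 90, 85, 90, 75, 80, 90, 85]

# Paired t-test; halve the two-sided p-value for the one-tailed
# hypothesis H1: debate scores > lecture scores.
t_paired, p_two_sided = stats.ttest_rel(debate, lecture)
p_one_tailed = p_two_sided / 2 if t_paired > 0 else 1 - p_two_sided / 2

# Difference-of-means test between two independent groups using the
# pooled variance estimator (equal_var=True pools the variances).
group_a = [85, 90, 80, 88, 92]   # e.g., students assigned the topic
group_b = [84, 87, 82, 90, 85]   # e.g., audience members
t_pooled, p_pooled = stats.ttest_ind(group_a, group_b, equal_var=True)

print(f"paired t = {t_paired:.3f}, one-tailed p = {p_one_tailed:.4f}")
print(f"pooled t = {t_pooled:.3f}, two-sided p = {p_pooled:.4f}")
```

With these hypothetical numbers the paired comparison is significant; the pooled comparison is the kind of test used to ask whether debaters outscored audience members on a topic.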

Higher-order cognitive skills were assessed by essay questions that asked students to evaluate data, explain the meaning or relevance of those data, and use the data to form a position or hypothesis (7, 12). Data given were either derived from a lecture given by the instructor or from published articles concerning the topics covered during the debate assignment. Three questions were developed from each of the debate topics and one from material discussed during a normal lecture. Scores were analyzed using paired comparisons and the t-test. A student’s lecture essay score was subtracted from her/his topic essay score to obtain a paired comparison (Δ score). Since the hypothesis being tested was that the essay topic scores would be greater than the essay lecture score, a one-tailed test was used.

Higher-order cognitive skills of the members of the debate teams also were assessed during the in-class debate by the audience members and the instructor. Students within the audience evaluated the collective performance of the debate teams in various categories by assigning a score of 1–4 (with 4 being the highest) based on defined criteria (Appendix 1). The categories to be evaluated included “Use of argument; reasons are given to support the resolution” and “Use of cross-examination and rebuttal; identification of weakness in opposing team’s arguments and ability to defend itself against attack.”

An original survey instrument, deployed pre- and postassignment through Survey Monkey (SurveyMonkey.com, LLC), was used to determine (i) student perceptions concerning their knowledge of their topic and those assigned to other groups (9, 10) and (ii) students’ comfort level in finding and using health and medicinal databases and in analyzing primary literature. The preassignment survey was given immediately after the debate topics were assigned to students, before the development of the web pages. The postassignment survey was given after the completion of the assignment, before students were tested on the material. Statistical analyses of trends in student responses were accomplished using difference-of-means tests and the z-statistic. The data from both sections were pooled, the resulting response frequencies for each question were converted to weighted averages, and the standard errors were determined. A difference-of-means analysis was then performed on the presurvey and postsurvey questions.
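The survey analysis above (pooled response frequencies converted to weighted averages, standard errors, and a difference-of-means z-statistic) can be sketched as follows; the response counts below are hypothetical stand-ins, not the survey data:

```python
# Illustrative sketch of the survey analysis described above.
# Response counts are hypothetical, not the study's survey data.
import math

# Likert-style response levels, e.g., 1 = "not knowledgeable"
# through 4 = "highly knowledgeable".
levels = [1, 2, 3, 4]

def weighted_mean_se(counts):
    """Weighted average of the Likert levels and the standard error
    of that mean, from per-level response counts."""
    n = sum(counts)
    mean = sum(l * c for l, c in zip(levels, counts)) / n
    var = sum(c * (l - mean) ** 2 for l, c in zip(levels, counts)) / (n - 1)
    return mean, math.sqrt(var / n)

pre_counts = [10, 32, 12, 3]    # hypothetical pre-assignment responses
post_counts = [1, 4, 35, 20]    # hypothetical post-assignment responses

pre_mean, pre_se = weighted_mean_se(pre_counts)
post_mean, post_se = weighted_mean_se(post_counts)

# Difference-of-means z-statistic; for large samples, |z| > 1.96
# corresponds to significance at p < 0.05 (two-tailed).
z = (post_mean - pre_mean) / math.sqrt(pre_se**2 + post_se**2)
print(f"pre = {pre_mean:.2f}, post = {post_mean:.2f}, z = {z:.2f}")
```

The design choice here is to treat the ordinal response scale as numeric weights, which is what converting frequencies to a weighted average implies; the z-statistic then compares the pre- and post-assignment means relative to their combined standard error.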


Student performance on knowledge and comprehension skill-level questions

In the fall of 2011, teams of students in two sections of an introductory microbiology course were assigned one of three topics to be the subject of an Oxford-style debate (Table 1). The intent of this assignment was both to foster constructivism in the students and to deliver content using a method other than the traditional lecture format in which students take notes while the professor lectures. To measure student comprehension of content delivered via the debate assignment, students were assessed by multiple-choice questions (see Appendix 2 for examples). The multiple-choice section of their exam consisted of 10 questions derived from the debate assignment and 10 derived from topics covered during lecture. These questions were of the “Knowledge and Comprehension” skill level as defined by Bloom’s taxonomy of cognitive domains (7, 12). The average student score was significantly higher ( p < 0.05) for questions derived from the debate than for those derived from a normal lecture format (Fig. 1). Analysis of variance indicates that whether questions were debate- or lecture-derived significantly affected scores, while the individual student’s lecture section, debate topic, and debate side (pro or con) did not (Appendix 3).



FIGURE 1  Student performance on multiple-choice questions derived either from lecture material or from an Oxford-style debate. Multiple-choice questions were developed to assess student knowledge and comprehension of content (see Appendix 2). The average scores on the multiple-choice questions derived from the individual debate topics and from the lecture material are: 88.5% (HIV), 79.8% (Agr), 87.7% (Bac), and 79.6% (lecture). The asterisk indicates a significant increase ( p < 0.05) in the percent of debate-derived questions answered correctly as compared with the percent of lecture-derived questions answered correctly. Error bars indicate the standard error of the mean (SEM).

Student performance on debate questions derived from topics that they actively debated was also compared with performance on topics for which they were only audience members. There was no statistically significant difference ( p > 0.05) between the scores for the groups, irrespective of whether students actively debated the topic or were audience members only (Fig. 2).



FIGURE 2  Comparison of the percentage of multiple-choice debate-derived questions answered correctly by students who presented the debate material for a specific topic and by those who were audience members while that topic was debated. There is no significant difference ( p > 0.05) between the percentage of questions answered correctly by the presenters or the audience members on any given topic. Error bars indicate the standard error of the mean (SEM).

The data on student performance on debate-derived questions also were parsed based on the percentage of lecture-derived questions a student answered correctly. Students who performed poorly on the lecture-derived questions (< 70%) had a combined average score on the debate-derived questions statistically similar ( p > 0.05) to that of students who scored higher on the lecture-derived questions (Table 3). This suggests that the debate format is a more effective learning tool for lower-performing students when compared to a standard lecture.

TABLE 3   Average score on debate-derived questions aggregated based on student scores on lecture-based questions.


Student performance on essay questions derived from lecture and the debate assignment was also assessed (Fig. 3). These questions fall under the category of “Analysis and Synthesis,” as defined by Bloom’s taxonomy, and were designed to test students’ abilities to interpret data and use them to build an argument or hypothesis (7, 12). When analyzed by topic, scores on the Bac and HIV essays were significantly higher than the score on the lecture essay question ( p < 0.05). There was no statistically significant difference between the Agr essay scores and the scores on the essays derived from the lecture material ( p > 0.05), suggesting that students understood the Agr material as well as they understood the lecture material, with potential added benefits as described below.



FIGURE 3  Average student scores on essay questions derived from the debate topics and a topic introduced during conventional lecture. Essay questions which required students to interpret data and/or formulate a hypothesis were used to assess students’ higher-order cognitive skills. The asterisk indicates a significant increase ( p < 0.05) in the score earned on the HIV- and Bac-derived essay questions as compared with the score earned on the lecture-derived essay question. Error bars indicate the standard error of the mean (SEM).

The debates themselves also gave students the opportunity to use higher-order cognitive skills (20, 21, 29). The ability of students to analyze and synthesize data-driven arguments was assessed by the class and the professor. Debate teams were assessed on the use of argument, cross-examination, and rebuttal in defending their position using a four-point rubric (Appendix 1). Student groups scored consistently high in these areas, whether scored by the professor or their peers (Fig. 4).



FIGURE 4   Peer and instructor assessment of critical-thinking skills of debate participants. The ability of the members of a debate team to formulate an argument, cross-examine opposing teams, and formulate a rebuttal was assessed by audience members (peers and the instructor). Scores were assigned based on a rubric from 0 to 4 with 4 being the highest score. Error bars indicate the standard error of the mean (SEM).

Students’ perceptions of knowledge and skills gained

Students’ perceptions of their knowledge of debate topics were assessed using pre- and postassignment surveys. Students were asked to rate their knowledge of their topic immediately after being assigned a particular topic and again after the assignment, but before their examination. The data demonstrate a significant increase ( p < 0.05) in students’ perceptions of their knowledge in the areas covered by their particular debate with 21% and 3.5% of students claiming to be knowledgeable and highly knowledgeable, respectively, before the assignment, and 61% and 33% reporting those knowledge levels after the assignment (Fig. 5).



FIGURE 5   Student responses to a pre- (n = 57) and post- (n = 60) assignment question concerning their perceived knowledge of their assigned topic. Students were asked, “Which of the following best describes your knowledge of the area assigned to you?”

Students also were asked to rate how knowledgeable they were concerning topics covered by other groups. A comparison of student perceptions of other topics pre- and postassignment shows a significant increase ( p < 0.05) in students who considered themselves “knowledgeable” concomitant with a significant decrease ( p < 0.05) in students who regarded themselves as only “slightly knowledgeable” in the other topics; the percentage of students rating themselves as “knowledgeable” rose from 35% to 68%, while those rating themselves as “slightly knowledgeable” fell from 61% to 20% (Fig. 6).



FIGURE 6   Student responses to a pre- (n = 57) and post- (n = 60) assignment question concerning perceived knowledge of topics assigned to other groups. Students were asked, “Which of the following best describes your knowledge of the area assigned to other groups?”

There was a significant increase ( p < 0.05) in student perceptions of their ability to use health and medicine databases, pre- and postassignment. The percentage of students rating themselves as “highly proficient” and “proficient” rose from 14% and 45% to 31% and 57%, respectively (Fig. 7). This was accompanied by a rise in students’ perceptions of their ability to analyze primary literature with significant gains ( p < 0.05) in those who reported themselves as being “highly proficient” (Fig. 8).



FIGURE 7   Student responses to a pre- (n = 60) and post- (n = 61) assignment question concerning their perception of their ability to use health and medicine databases. Students were asked, “How would you rate your proficiency in using health and medicine databases such as PubMed or Ovid?”



FIGURE 8   Student responses to a pre- (n = 60) and post- (n = 61) assignment question concerning their perception of their ability to analyze primary literature. Students were asked, “How would you rate your proficiency in analyzing primary literature, e.g. articles from scientific journals?”


Students take science courses such as microbiology as part of the training necessary to develop expertise in a scientific discipline. Generating expertise involves multiple factors that include increasing one’s quantity of knowledge (content) in a given field and basing that knowledge in meaning (2, 6). An increased knowledge base facilitates competence within a field, while basing knowledge in meaning suggests a deeper understanding of ideas and the ability to relate knowledge to multiple concepts. Metacognitive skills that allow one to develop strategies for the acquisition and use of knowledge are important in acquiring expertise (15, 41). Furthermore, developing expertise in the sciences requires an understanding of the nature and methods of science and skills of inquiry and problem-solving (26). Constructivist pedagogy fosters the development of expertise by training students to make meaning of newly acquired knowledge by relating it to their own experiences and forcing them to identify and critically evaluate how that knowledge is used or viewed by others (44). In an attempt to introduce constructivist pedagogy into the introductory microbiology class at Quinnipiac University, we developed an Oxford-style debate that would (i) deliver content in a manner that allowed students to engage in activities common to scientific inquiry, (ii) provide an active and cooperative learning experience that would deliver content without taking a significant amount of time away from the regular schedule of classes, and (iii) engage the higher-order cognitive skills of the students in a lecture-based setting. Overall, the findings of this study demonstrate that the use of an Oxford-style debate in the science classroom is an effective method of student engagement and facilitates the skills required to build expertise.
In addition to delivering content on topics not covered in classroom lectures, the debate assignment also provided multiple opportunities to develop skills used during scientific inquiry. Students were responsible for reviewing the scientific literature, presenting a summary of a topic in the form of a web page, and actively seeking data to formulate evidence-based arguments in support of or against a given hypothesis.

Analysis of student gains in knowledge and comprehension shows a significant increase in content collectively derived from the debate topics when compared to that from a typical lecture (Fig. 1), irrespective of whether students were actively engaged in the debate or were audience members (Fig. 2). This is especially noteworthy as the majority of the content was acquired independently by the students outside of the classroom setting. Importantly, our data also demonstrate that students who struggled with content delivered during a traditional lecture performed as well on the debate-derived questions as their higher-scoring classmates (Table 3). In addition to acquiring knowledge as part of the debate assignment, students also successfully engaged in higher-order thinking through analysis, synthesis, and evaluation, qualities intrinsic to a successful debate. When analyzing student performance on essay questions, we expected that students would do equally well on the debate topics and the lecture topics (Fig. 3), as improvement in critical thinking skills should be reflected regardless of topic. We found this was not the case, however, as students did better on essay questions derived from the HIV and Bac debates than on those derived from the Agr debate or from the lecture topic. This may reflect greater student understanding of the content of those two topics, an idea supported by the observation that students had mastered less of the lecture content as assessed by the multiple-choice questions (Fig. 1) and that students had lower scores on multiple-choice questions derived from the Agr debate than they did on those derived from the HIV or Bac debates (Fig. 1 legend and Fig. 2).

In addition to the data demonstrating both that students had acquired knowledge and the ability to use that knowledge, our analysis also demonstrates that students perceive an increase in their knowledge and in their ability to acquire and analyze the primary literature (Figs. 5–8). While there is a potential concern that these results are somewhat more subjective than those that stem from the assessment of student tests, there also is a growing body of data suggesting that self-efficacy, the belief or judgment of an individual that they can succeed at a task, increases problem-solving efficiency and is, therefore, an important component of building expertise (4, 6, 27, 41, 42).

The use of an Oxford-style debate assignment in the sciences should not be limited to microbiology; instructors can tailor the assignment to their discipline by choosing relevant topics for their debate. Because content acquisition largely takes place outside of the classroom, this assignment can be used either to add novel content to the curriculum or to support content given in the traditional lecture. This, however, raises a critical point: constructivist pedagogy involves student inquiry guided by the instructor (18). As students engage in knowledge acquisition, there is the opportunity for them to make improper assessments or conclusions based on their findings. Input from the instructor is necessary to ensure that proper student learning is being accomplished. Tools that periodically assess student learning and allow for teacher input are an important part of this process. Subsequent to the study presented here, we have adopted the use of small-group discussion boards that allow instructor input regarding topics covered during the debate.

In conclusion, this study demonstrates that the use of debate in the classroom is an effective method of content delivery. The process of assessing data-driven arguments promotes higher-order cognitive skills and gives students confidence in their knowledge base and in their use of scientific databases. Each of these is important in developing expertise within the discipline.


Appendix 1: Initial handout to students detailing debate project and grading rubrics

Appendix 2: Examples of knowledge and comprehension test questions based on material covered in lecture format

Appendix 3: Statistical analyses of scores on debate- and lecture-derived questions


We thank Dr. Donald Buckley for his analysis of the survey instrument on student perceptions. The authors declare that there are no conflicts of interest. The research described above complies with all federal and institutional policies concerning the use of human subjects.


1. Adams, D. J. 2009. Current trends in laboratory class teaching in university bioscience programmes. Biosci. Educ. 13. http://www.bioscience.heacademy.ac.uk/journal/vol13/beej-13-3.aspx.

2. Alexander, P. A. 2003. The development of expertise: the journey from acclimation to proficiency. Educ. Res. 32:10–14.

3. American Association for the Advancement of Science. 2009. Vision and change in undergraduate biology education: a call to action. American Association for the Advancement of Science, Washington, DC.

4. Bandura, A. 1977. Self-efficacy: toward a unifying theory of behavioral change. Psych. Rev. 84:191–215.

5. Barrows, H. S., and R. M. Tamblyn. 1980. Problem-based learning: an approach to medical education. Springer Publishing Company, New York, NY.

6. Bédard, J., and M. T. H. Chi. 1992. Expertise. Curr. Direct. Psychol. Sci. 1:135–139.

7. Bloom, B. S. 1984. Taxonomy of educational objectives: the classification of educational goals. Longman, New York, NY.

8. Bonwell, C. C., and J. A. Eison. 1991. Active learning: creating excitement in the classroom. ASHE-ERIC Higher Education Report No. 1. The George Washington University, School of Education and Human Development, Washington, DC.

9. Boucaud, D. Microbiology Post Debate Survey, Fall 2011A. http://www.surveymonkey.com/s/XR7BQ2J. Accessed 9 December 2011.

10. Boucaud, D. Microbiology Pre Debate Survey, Fall 2011A. http://www.surveymonkey.com/s/K2XRCQB. Accessed 9 December 2011.

11. Committee on Undergraduate Biology Education to Prepare Research Scientists for the 21st Century, National Research Council. 2003. BIO2010: transforming undergraduate education for future research biologists. The National Academies Press, Washington, DC. http://www.nap.edu/openbook.php?isbn=0309085357.

12. Crowe, A., C. Dirks, and M. P. Wenderoth. 2008. Biology in bloom: implementing Bloom's taxonomy to enhance student learning in biology. CBE Life Sci. Educ. 7:368–381.

13. Fairweather, J. 2008. Linking evidence and promising practices in science, technology, engineering, and mathematics (stem) undergraduate education. Paper presented at the National Research Council’s Workshop Linking Evidence to Promising Practices in STEM Undergraduate Education, October, Washington, DC. http://www7.nationalacademies.org/bose/Fairweather_CommissionedPaper.pdf.

14. Ghedotti, M. J., C. Fielitz, and D. J. Leonard. 2005. Using independent research projects to foster learning in the comparative vertebrate anatomy laboratory. Bioscene 30:3–8.

15. Gillespie, M. K. 2002. EFF research principle: an approach to teaching and learning that builds expertise. National Institute for Literacy, Washington, DC.

16. Glynn, S. M., G. Taasoobshirazi, and P. Brickman. 2007. Nonscience majors learning science: a theoretical model of motivation. J. Res. Sci. Teach. 44:1088–1107.

17. Goldstein, J., and D. F. B. Flynn. 2011. Integrating active learning & quantitative skills in undergraduate introductory biology curricula. Am. Biol. Teach. 73:454–461.

18. Gordon, M. 2009. The misuses and effective uses of constructivist teaching. Teachers & Teaching: Theory & Practice 15:737–746.

19. Green, C. S., and H. G. Klug. 1990. Teaching critical thinking and writing through debates: an experimental evaluation. Teach. Sociol. 18:462–471.

20. Gregory, M., and M. Holloway. 2005. The debate as a pedagogic tool in social policy for social work students. Soc. Work Educ.: Int. J. 24:617–637.

21. Hall, D. 2011. Debate: innovative teaching to enhance critical thinking and communication skills in healthcare professionals. Int. J. Allied Health Sci. Prac. 9:16–19. http://ijahsp.nova.edu/articles/Vol9Num3/Hall.htm.

22. Handelsman, J., et al. 2004. Scientific teaching. Science 304:521–522.

23. Hatfull, G. F., et al. 2006. Exploring the mycobacteriophage metaproteome: phage genomics as an educational platform. PLoS Genet. 2:e92.

24. Herreid, C. F. 1998. Why isn’t cooperative learning used to teach science? BioScience 48:553–559.

25. Herron, S. S. 2009. From cookbook to collaborative: transforming a university biology laboratory course. Am. Biol. Teach. 71:548–552.

26. Hodson, D. 2001. What counts as good science education? p. 7–22. In OISE Papers In STSE Education, Vol. 2. University of Toronto Press, Toronto, ON.

27. Hoffman, B., and G. Schraw. 2009. The influence of self-efficacy and working memory capacity on problem-solving efficiency. Learn. Indiv. Diff. 19:91–100.

28. Kendler, B. S., and P. A. Grove. 2004. Problem-based learning in the biology curriculum. Am. Biol. Teach. 66:348–354.

29. Kennedy, R. 2007. In-class debates: fertile ground for active learning and the cultivation of critical thinking and oral communication skills. Int. J. Teach. Learn. Higher Educ. 19:183–190.

30. Kennedy, R. R. 2009. The power of in-class debates. Active Learn. Higher Educ. 10:225–236.

31. Koklanaris, N., A. P. MacKenzie, E. M. Fino, A. A. Arslan, and D. E. Seubert. 2008. Debate preparation/participation: an active, effective learning tool. Teach. Learn. Med. 20:235–238.

32. Luciano, C. S., M. W. Young, and R. R. Patterson. 2002. Bacteriophage: a model system for active learning. Microbiol. Educ. 3:1–6.

33. Michael, J. 2007. Faculty perceptions about barriers to active learning. Coll. Teach. 45:42–47.

34. Palmer, D. 2005. A motivational view of constructivist informed teaching. Int. J. Sci. Educ. 27:1853–1881.

35. Richardson, V. 2003. Constructivist pedagogy. Teach. Coll. Rec. 105:1623–1640.

36. Roy, A., and B. Macchiette. 2005. Debating the issues: a tool for augmenting critical thinking skills of marketing students. J. Market. Educ. 27:264–276.

37. Rubin, R. W., R. J. Weyant, and C. A. Trovato. 2008. Utilizing debates as an instructional tool for dental students. J. Dental Educ. 72:282–287.

38. Saunders, W. L. 1992. The constructivist perspective: implications and teaching strategies for science. School Sci. Math. 92:136–141.

39. Scott, S. 2008. Perceptions of students’ learning critical thinking through debate in a technology classroom: a case study. J. Technol. Studies 34:39–44.

40. Snider, A., and M. Schnurer. 2006. Goals of classroom debating, p. 34–39. In Many sides: debate across the curriculum. International Debate Education Association, New York, NY.

41. Sternberg, R. J. 1998. Metacognition, abilities, and developing expertise: what makes an expert student? Instr. Sci. 26:127–140.

42. Taasoobshirazi, G., and S. M. Glynn. 2009. College students solving chemistry problems: A theoretical model of expertise. J. Res. Sci. Teach. 46:1070–1089.

43. Taylor, P. C., P. J. Gilmer, and K. Tobin, ed. 2002. Transforming undergraduate science teaching: social constructivist perspectives. Counterpoints: studies in the postmodern theory of education. Peter Lang Publishing, Inc., New York, NY.

44. Tynjala, P. 1999. Towards expert knowledge? A comparison between a constructivist and a traditional learning environment in the university. Int. J. Res. Educ. 31:357–442.

45. Tytler, R. 2002. Teaching for understanding in science: Constructivist/conceptual change teaching approaches. Austr. Sci. Teach. J. 48:30–35.

46. Vo, H. X., and R. L. Morris. 2006. Debate as a tool in teaching economics: rationale, technique and some evidence. J. Educ. Bus. 81:315–330.

47. Ward, J. D., and C. L. Lee. 2002. A review of problem-based learning. J. Fam. Cons. Sci. Educ. 20:16–26.

* Corresponding author. Mailing address: Department of Biomedical Sciences, Quinnipiac University, 275 Mount Carmel Avenue, Hamden, CT 06518. Phone: 203-582-3768. Fax: 203-582-8706. E-mail: Dwayne.boucaud@quinnipiac.edu.

Supplemental materials available at http://jmbe.asm.org


DOI: http://dx.doi.org/10.1128/jmbe.v14i1.433
Journal of Microbiology & Biology Education , May 2013
© 2013 Author(s). Published by the American Society for Microbiology. All Rights Reserved

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

ISSN: 1935-7885
