
Not all Pain is Gain: Lessons From Teaching Critical Thinking Online

Published online by Cambridge University Press:  19 November 2020

John LaForest Phillips
Austin Peay State University

COVID-19 and Emergency e-Learning in Political Science and International Relations
© The Author(s), 2020. Published by Cambridge University Press on behalf of the American Political Science Association

Those who teach political science—especially those like me who teach political theory—overwhelmingly see critical thinking (CT) as one of their priority learning outcomes (Moore 2011). Much of the conversation about stimulating CT in the virtual classroom focuses on discussion boards and interaction more broadly. Nearly everyone agrees that discussion, properly conducted, can help students develop CT (Williams and Lahman 2011). But is there any more that can be done?

Discussions can disappoint. Students do not always seem to make connections between, or inferences from, the assigned materials. There are two possible but conflicting responses to this state of affairs. Instructors can add material and assignments to stimulate CT, or they can scale back and focus student attention on a narrower range of materials and assignments.

Literature

Most conceptualizations of CT converge on the idea that it involves “an individual’s capability to […] identify central issues and assumptions in an argument, recognize important relationships, make correct inferences from the data, deduce conclusions from information or data provided, interpret whether conclusions are warranted based on given data, evaluate evidence of authority, make self-corrections, and solve problems” (Pascarella and Terenzini 2005, 156).

One commonly advanced strategy to develop student CT is known as “scaffolding.” This entails using targeted assignments to help students break down complex judgments into a series of simpler ones punctuated by guided feedback before asking them to tackle more complex types of reflection (Sharma and Hannafin 2004; van de Pol, Volman, and Beishuizen 2010).

Adding assignments online (scaffolded or otherwise) also has been shown to motivate students to complete assigned readings, increase participation in class discussion, and improve performance on exams (Brothen and Wambach 2004; Johnson and Kiviniemi 2009). In brief, more is better.

This view is not universally endorsed, however. Some advocate a “less-is-more” approach. The idea is to “shift from a broader focus on ‘coverage’ of a variety of types of document and concepts to deeper focus on a more narrow range of topics and/or assignments” (Skurat Harris, Nier-Weber, and Borgman 2016, 19).

The rationale for adopting a more minimalist approach stems from the unique characteristics of the online medium. Communication is more uncertain in online courses. Students may not choose to click on all the available course materials. The more materials there are, the higher the likelihood that something important will be missed. Furthermore, because the online environment is usually text based, if students do not read as well as they should (or professors do not write as clearly as they think they do), the potential for miscommunication may be greater. Finally, waiting for an email response to a query takes time; students may not seek clarification if they do not believe an answer will be forthcoming in a convenient time frame.

Advocates of the minimalist approach stress the need to declutter online courses, extend the period between deadlines, and focus scarce student attention on a limited quantity of materials and assignments.

Method and Data

I teach an introductory course in political theory required for political science majors at a midsize American public university.[1] For the past eight years, I have collected a dataset consisting of essays scored for their CT using a rubric adapted from the Washington State University Critical Thinking Initiative (Condon and Kelly-Riley 2004).[2]

During this time, I sometimes adopted a minimalist approach and sometimes a more scaffolded one. Scaffolded semesters were identified by the presence of specifically designed scaffolding exercises; minimalist semesters were identified by a small number of required assignments (i.e., fewer than seven). All other semesters were placed in a residual category. Although the selection criteria are simple, table 1 shows that they align well with other dimensions of the two concepts.

Table 1 Summary Statistics: Three Types of Semesters (Standard Deviation)

Notes: Means sharing the same superscript are not significantly different from one another (Tukey–Kramer honestly significant difference test; p<0.05).

* Nominal variables are analyzed using a paired one-tailed chi-square test. Means sharing the same superscript are not significantly different from one another (p<0.05).

The scaffolded semesters have more assignments, more specific scaffolding assignments, more scaffolded CT quiz questions, more quiz questions overall, a greater variety of reading assignments, more required discussion, and more assignments with individualized feedback. The semesters with a minimalist approach have less of all of these elements. The residual category usually falls between these two approaches. Space does not permit a discussion of the specific types of scaffolding exercises used, but several are discussed elsewhere (see Phillips 2018). Table 1 also shows that sections using the different approaches do not differ statistically in students’ academic qualifications.

Summary of Findings

Overall, there were no statistically significant differences in mean CT scores across the three different types of semesters (table 2). The extra work that went into scaffolding online classes yielded no aggregate dividends in terms of measured CT. Table 2 also lists secondary learning outcomes for context.[3] Overall, there were few differences between the minimalist and the scaffolded semesters. The only statistically significant difference is that students withdrew less often from the minimalist semesters. Despite the author’s best efforts, alternate methods of analyzing the data (e.g., regression analysis) did not uncover further differences.

Table 2 Dependent Variables of Interest: Three Semester Types (Standard Deviation)

Notes: Means sharing the same superscript are not significantly different from one another (Tukey–Kramer honestly significant difference test; p<0.05).

* Dummy variables are analyzed using a paired one-tailed chi-square test. Means sharing the same superscript are not significantly different from one another (p<0.05).

Conclusion

Instructors want their students to flourish online. If they follow a more minimalist approach, they may feel as if they (and their students) are not doing enough. This can lead to busy and intimidating online course designs. These courses are more work for students but also more labor intensive for the instructor in both preparing for the course and time spent assessing and giving feedback during the course. If moving the needle on CT online is difficult despite a substantial increase in effort, then—ceteris paribus—it is more efficient for everyone if the more minimalist course designs are adopted. Seeking innovation in online learning is important, but we also should acknowledge that not all pain is gain.


Footnotes

1. Students self-selected into online courses but had no advance notice of the course design.

2. Six dimensions of CT are scored: Issue Identification, Textual Interpretation, Logical Consistency, Awareness of Alternative Perspectives, Use of Evidence, and Assessing Implications. Scores are weighted equally to form an additive index.
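The equal-weight additive index described in this note amounts to averaging the six dimension scores. As a minimal illustrative sketch (the variable and function names are mine, not the author's, and the rubric's actual point scale is not specified here):

```python
# Hypothetical sketch of the additive CT index: six rubric
# dimensions, equally weighted, averaged into a single score.
DIMENSIONS = [
    "issue_identification", "textual_interpretation", "logical_consistency",
    "alternative_perspectives", "use_of_evidence", "assessing_implications",
]

def ct_index(scores):
    """Equal-weight additive index over the six CT dimensions."""
    return sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)

# One hypothetical essay, scored on each dimension.
essay = dict(zip(DIMENSIONS, [4, 3, 5, 2, 4, 3]))
print(ct_index(essay))  # 3.5
```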

3. To improve comparability, all exam scores were normalized as a percentage of the top score. Scores for writing proficiency are averages of scores for Spelling and Grammar, Introduction, Organization, and Efficiency. Exams varied in their content and format, but criteria for assessing writing were stable for the entire period.
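The normalization in this note is simple arithmetic: each raw score is expressed as a percentage of the top score on that exam, making exams with different point totals comparable. A hedged sketch (function name mine):

```python
def normalize_to_top(scores):
    """Express each raw exam score as a percentage of the
    top score on that exam, so exams with different point
    totals can be compared on a common 0-100 scale."""
    top = max(scores)
    return [100 * s / top for s in scores]

print(normalize_to_top([40, 35, 50]))  # [80.0, 70.0, 100.0]
```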

References


Brothen, Thomas, and Wambach, Catherine. 2004. “The Value of Time Limits on Internet Quizzes.” Teaching of Psychology 27:58–60.
Condon, William, and Kelly-Riley, Diane. 2004. “Assessing and Teaching What We Value: The Relationship Between College-Level Writing and Critical Thinking Abilities.” Assessing Writing 9:56–75.
Johnson, Bethany C., and Kiviniemi, Marc T. 2009. “The Effect of Online Chapter Quizzes on Exam Performance in an Undergraduate Social Psychology Course.” Teaching of Psychology 36:33–37.
Moore, Matthew J. 2011. “How (and What) Political Theorists Teach: Results of a National Survey.” Journal of Political Science Education 7 (1): 95–128.
Pascarella, Ernest T., and Terenzini, Patrick T. 2005. How College Affects Students. San Francisco: Jossey-Bass.
Phillips, John L. 2018. “Making Assignments Count: The Quest for Critical Thinking in Undergraduate Political Theory.” Journal of Political Science Education 15 (2): 142–60.
Sharma, Priya, and Hannafin, Michael J. 2004. “Scaffolding Critical Thinking in an Online Course: An Exploratory Study.” Journal of Educational Computing Research 31 (2): 181–208.
Skurat Harris, Heidi, Nier-Weber, Dani, and Borgman, Jessie C. 2016. “When the Distance Is Not Distant: Using Minimalist Design to Maximize Interaction in Online Writing Courses and Improve Faculty Professional Development.” In Applied Pedagogies: Strategies for Online Writing Instruction, ed. Daniel Ruefman and Abigail Scheg, 17–36. Boulder, CO: University Press of Colorado.
van de Pol, Janneke, Volman, Monique, and Beishuizen, Jos. 2010. “Scaffolding in Teacher–Student Interaction: A Decade of Research.” Educational Psychology Review 22:271–96.