Using an Interrupted Case Study to Engage Undergraduates’ Critical Thinking Style and Enhance Content Knowledge

Kelsey Hall, Ed.D. and Katherine Starzec, Ph.D.

Abstract

The interrupted case study is a structured way to engage students in active learning. Interruptions, or pauses for reflection and discussion scheduled within the case-study presentation, provide students with a chance to collaborate and engage in critical thinking. Critical thinking style, a measure of how one tends to think critically, provides insight into how one tackles problem solving. This article describes a pilot project that paired critical thinking style with an interrupted case study, delivered across two class periods in four college courses. The project’s goals were to assess students’ self-reported knowledge, self-reported ability, changes in thinking, and intentions to use their critical thinking style in the future. The University of Florida Critical Thinking Inventory and an end-of-session evaluation were administered online, and 110 students voluntarily responded. Results indicated that many students enjoyed the discussion-based and problem-solving structure of the interrupted case study. Results also showed increases in students’ self-reported knowledge about critical thinking style and the content covered in the case study. For teachers looking to pilot an interrupted case study with a critical thinking style component, two class sessions can have a positive effect on student learning and encourage critical thinking.

Keywords: interrupted case study, critical thinking style, community-based social marketing, local food, evaluation

Introduction

Training college students to engage in critical thinking and solve problems related to food and natural resources is essential (Quinn et al., 2009). Engaging critical thinking in the classroom promotes information discovery and higher-order thinking about complex issues (Snyder & Snyder, 2008), and the “heart of education lies … in the processes of inquiry, learning and thinking rather than in the accumulation of disjointed skills and senescent information” (Facione, 1990, p. 1). Issues related to agriculture, food, and natural resources are complex. Undergraduate students with agriculture-related career paths must be prepared to tackle these issues as they enter the workforce (Akins et al., 2019), and their preparation requires practice in critical thinking and problem solving.

Case studies promote active learning, problem solving, and decision making in a variety of disciplines (e.g., Fiester et al., 2010; Popil, 2011). Active learning engages students in more than just listening exercises; active learning requires students to discuss their new knowledge, reflect on it, and tie their learning to the real world (Zayapragassarazan & Kumar, 2012). Case studies deal with real-world issues and ask for evidence, which Herreid (2004) argues is the essence of critical thinking. In interrupted case studies, the teacher builds pauses into the case to prompt student questions, discussion, and thinking before moving to the next portion of the case study (White et al., 2009). Interrupted case studies involve discussion-based learning, which engages students in higher-order thinking (Garrett, 2020), and the use of interrupted case studies has been successfully linked to activating critical thinking (Herreid, 2004; White et al., 2009).

Critical thinking style can be measured with a simple 20-question instrument. The instrument, formally called the University of Florida’s Critical Thinking Inventory (UFCTI), categorizes respondents into one of two styles of critical thinkers: engagers or seekers. The two critical thinking styles are measured on a continuum, and discovering respondents’ critical thinking style helps explain how they process, or critically think about, information (Lamm & Irani, 2011). Understanding and practicing their own critical thinking can help college students not only in their learning process but in the workforce as well.

For educators who would like to implement one interrupted case study on a trial basis, or fill a one-week gap in the curriculum, the value of a shortened interrupted case study is currently unknown. In this study, we piloted a two-class-session activity pairing critical thinking style via the UFCTI (Day 1) with a researcher-developed interrupted case study (Day 2) about increasing food access via a community-based social marketing campaign. We wanted to measure not only students’ self-reported change in knowledge about topics covered in the interrupted case study but also their self-reported changes in thinking, motivation, and intention to apply what they learned about both the case study content and their newly discovered critical thinking style to other life scenarios.

Critical Thinking and Critical Thinking Style

Though there are several definitions of critical thinking, a 1990 Delphi report on the consensus of teaching critical thinking defines it as “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” where

The ideal critical thinker is habitually inquisitive, well-informed, trustful of reason, open-minded, flexible, fair minded in evaluation, honest in facing personal biases, prudent in making judgments, willing to reconsider, clear about issues, orderly in complex matters, diligent in seeking relevant information, reasonable in the selection of criteria, focused in inquiry, and persistent in seeking results which are as precise as the subject and the circumstances of inquiry permit. (Facione, 1990, p. 2)

Students do not simply learn critical thinking skills through difficult coursework or questions on tests specific to critical thinking; intentional and direct teaching of critical thinking is necessary to result in measurable advances in critical thinking (Bensley, 2010).

There are two distinct critical thinking styles, seeking and engaging, and after completing the UFCTI, individuals fall on a spectrum between the two. Respondents receive a score between 26 and 130 at the end of the assessment. Those with a score between 26 and 78 are considered engagers, while those with a score between 79 and 130 are considered seekers (Lamm & Irani, 2011). Most people lean toward one style, but “the ideal critical thinker would be able to operate in both styles when necessary” (Leal et al., 2017, p. 22). Seekers are motivated to find the truth at all costs, even if the truth does not line up with their expectations or beliefs. Seekers prefer to conduct deep background research rather than find and evaluate information through discussion. Engagers prefer to use their critical thinking in discussion-based atmospheres, and they are confident sharing opinions and presenting information in a group setting (Gay et al., 2015; Lamm & Irani, 2011).
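
To make the scoring bands above concrete, the short Python sketch below maps a UFCTI total score to a style label using the cutoffs reported by Lamm and Irani (2011). It is only an illustration; the function name and structure are illustrative and are not part of the official UFCTI scoring procedure.

def classify_ufcti_style(total_score: int) -> str:
    # Cutoffs reported by Lamm and Irani (2011): totals range from 26 to 130;
    # 26-78 is classified as engager, 79-130 as seeker.
    if not 26 <= total_score <= 130:
        raise ValueError("UFCTI totals range from 26 to 130")
    return "engager" if total_score <= 78 else "seeker"

print(classify_ufcti_style(72))   # engager
print(classify_ufcti_style(104))  # seeker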

The UFCTI is a versatile tool; audiences studied with it include Extension agents (Lamm, 2016), Extension volunteers (Gay et al., 2016), and college students (Akins et al., 2019). Researchers have also examined critical thinking style in relation to information seeking about genetic modification science (Wu et al., 2020), cross-cultural differences (Lu et al., 2021), water conservation behaviors (Gorham et al., 2014), and teaching approaches at the college level (Akins et al., 2019; Stedman & Adams, 2014).

Active Learning in the Classroom

Though lectures are still the most common instructional method in higher education (Lom, 2012), research shows that active learning, or actively engaging students, increases students’ critical thinking and deepens their learning (Cavanagh, 2011; Felder & Brent, 1996; Millis, 2010). Active learning includes not only actions by the student but also cooperative, team activities and holding students responsible for their learning. Studies find that students appreciate active learning activities, such as reflective writing and group discussion, because such activities make class “interesting, interactive, and enjoyable” (Lumpkin et al., 2015, p. 129). “Active learning” can be perceived in different ways (Lombardi et al., 2021), and one form of active learning is constructive learning, where students construct their own meaning, build on prior knowledge, interact with others in the learning process, and engage in activities that intentionally mimic real life (Cooperstein & Kocevar-Weidinger, 2004). Because of limited time in class sessions, constructive learning activities are carefully structured to guide students, in a series of small steps, toward answers that they come up with on their own, “gradually weaning students from reliance on support to independence” (Cooperstein & Kocevar-Weidinger, 2004, p. 143).

Interrupted Case Studies

Classroom activities such as case studies, or case-based instruction, can promote active learning, problem solving, and critical thinking among students (Popil, 2011) in real-life and interactive scenarios (Penn et al., 2016). When group discussion or collaboration is added to case-based instruction, learning is also enhanced (Mayo, 2002). Interrupted case studies are case studies with built-in pause points intended for student collaboration, deep thinking, and group problem solving based on prompts that the instructor provides. In interrupted case studies, students examine a real issue presented to them by the instructor in a stepwise manner, and students take time to consider solutions to the issue, similar to approaching a scientific problem to be solved (Herreid, 2004). Interrupting the case study with prompts and opportunities for group discussion provides the instructor with the opportunity to assess student perceptions and responses and redirect when necessary (Anderson, 2019). The interrupted approach is considered a form of problem-based learning, where students are actively engaged in finding solutions throughout the case delivery. Students who are less likely to engage in class discussion particularly benefit from the structured group work and discussion approach (Anderson, 2019; Herreid, 2011).

Interrupted case studies have been used at the college level in a variety of disciplines and over different time scales. Using a video-based interrupted case study over the course of eight weeks, graduate students in a developmental theory class advanced their critical thinking, and the interrupted format encouraged decision making and explanation building (Anderson, 2019). Using a mining and heavy metals interrupted case study over a five-week period, students built skills in areas ranging from chemistry to the oral presentation of information (Silva de Lima et al., 2023). In an attempt to move away from traditional lectures, an interrupted case study classroom exercise related to microbiology and organic farming, delivered over three to four weeks, was “an effective tool in that it has enhanced students’ ability to understand, integrate, and apply targeted genetics concepts” (Stewart et al., 2014, p. 1). Overall, interrupted case studies used as teaching tools can enhance student engagement and critical analysis at a level difficult to achieve with lectures or other traditional teaching methods (White et al., 2009).

Interrupted case study research has investigated (a) student knowledge and understanding gained (Silva de Lima et al., 2023), (b) increases in critical thinking through analyzed text submissions (Anderson, 2019), (c) student self-reported satisfaction in learning and self-perceived growth (Brooks et al., 2012), and (d) students’ ability to “critically evaluate experimental design and data interpretation” (White et al., 2009, p. 26). Examples of interrupted case study implementation in the classroom in current literature range from three weeks to an entire semester.

The Case

The case study developed for this project was framed around a real community-based social marketing project to increase access to local food among community members who are eligible for food assistance programs. In many regions and studies, food access barriers are attributed to income and class (e.g., Block et al., 2012; Breyer & Voss-Andreae, 2013). Utah State University has developed a successful program for reducing barriers to accessing local farmers markets and farm stands among Supplemental Nutrition Assistance Program (SNAP)-eligible households. The Utah State University Extension SNAP-Education program, Create Better Health, implemented a community-based social marketing campaign targeting SNAP-eligible households from 2019-2022. Campaign successes included radio ads, Facebook ads, and videos that reached more than 135,000 people; bus ads in five counties seen by more than one million riders; and more than 5,000 mailers sent to SNAP-eligible households.

Purpose and Objectives

This pilot study further investigated the effect of one interrupted case study on student learning, especially in a short delivery time frame. Pilot studies are a way to test and adjust a new idea or approach before implementing it on a large scale. Pilot studies can be useful in educational settings, providing valuable insight into, for example, whether teachers see the value in or have the capacity to implement the new approach (Regional Educational Laboratory Appalachia, 2021). More specifically, the purpose of this study was to determine the effectiveness of a two-class-session activity pairing critical thinking style and an interrupted case study about increasing food access via a community-based social marketing campaign. The two-day activity was presented to students taking agricultural communications courses at Kansas State University (K-State) and Utah State University (USU), as communications classes in agriculture can be a beneficial setting in which to engage critical thinking style (Lamm et al., 2018). The specific research objectives were the following:

  1. Identify the change in respondents’ self-perceived level of knowledge about their own critical thinking style and key concepts in the case study;
  2. Identify the change in respondents’ thinking about agricultural communications, marketing tools, and influencing behavior change; and
  3. Determine students’ motivations in learning and application as a result of the two-day activity.

Methods/Procedures

In an activity spanning two 50-minute class periods, we administered the UFCTI in the first class session and an interrupted case study in the second class session to four agricultural communications courses at K-State and USU. Students in the four courses (N = 143) came from a variety of majors, and 99% were agriculture majors. Three of the four classes were taught at K-State and completed the two-day activity in person; because the COVID-19 pandemic still restricted travel at the time, the activity was delivered synchronously through Zoom to the fourth class, which was based at USU. The K-State Institutional Review Board entered a reliance agreement with the USU Institutional Review Board and approved this study as exempt (Protocol # 10009.1).

During the first 50-minute class session, we taught students about the styles of critical thinking, allowed students time to complete the instrument and receive their score and style, and engaged students in group discussion about their critical thinking style results. Administering the UFCTI requires training and certification, including a requirement that those who administer the questionnaire give participants enough time to explore and understand the two critical thinking styles and their application in depth (University of Florida, n.d.). Across all four courses, 128 students (89.5%) completed the UFCTI questionnaire during the first class period.

During the second 50-minute class session two days later, we delivered the condensed, 50-minute interrupted case study, which we developed based on USU’s project on local food access among SNAP-eligible households. The case study introduced students to the importance of local food, audiences on food assistance who struggle to access local food, and overcoming barriers to access through tools in community-based social marketing. Prior to presenting case details via PowerPoint slides, we briefly prompted students to connect their critical thinking style to the first day’s lesson and explained that the interrupted case study would include a series of short lectures followed by short, small-group “interruptions.” There were three interruptions within the case, each lasting approximately five to six minutes, that prompted students to think critically and answer specific questions displayed on lecture slides within the larger case study presentation. A brief whole-class discussion followed each small-group discussion.

We developed a questionnaire using guides from state Cooperative Extension Services (Curtis & Ward, 2015; Taylor-Powell & Renner, 2009). At the end of the second class period, we asked students to fill out the questionnaire in Qualtrics, which included retrospective pretest Likert-scale statements measuring self-perceived changes in knowledge (7 items), self-perceived changes in thinking (3 items), statements measuring changes in perceived ability and learning (3 items), and questions related to motivation (5 items). See Table 1 for example questions in each category.

Campbell and Stanley (1963) supported using retrospective pretests (or post-then-pre designs) as an alternative to traditional self-report pre-post tests. In a retrospective pretest, individuals self-report changes in knowledge, awareness, skills, confidence, attitudes, or behaviors at the same time as their post-training assessment (Taylor-Powell & Renner, 2009). Multiple studies have empirically tested the validity and methodology of retrospective pretests (Chowning et al., 2012; Drennan & Hyde, 2008; Howard et al., 1979; Little et al., 2020; Vinoski Thomas et al., 2018). These studies aimed to improve internal validity and addressed response-shift bias, concluding that when individuals in an educational program did not have enough information to rate their initial level of knowledge and skills (i.e., they did not yet know what they did not know), the retrospective pretest provided a more accurate baseline measure (Drennan & Hyde, 2008; Howard et al., 1979; Vinoski Thomas et al., 2018). Advantages of retrospective pretests are that they take less time to administer, are less intrusive, avoid attendance concerns, and, for self-reported change, avoid the pretest sensitivity and response-shift bias that result from pretest overestimation or underestimation (Chowning et al., 2012; Howard, 1980; Howard et al., 1979; Lam & Bengo, 2003; Little et al., 2020; Pratt et al., 2000; Rockwell & Kohn, 1989). Though the retrospective pretest design has limitations, such as the accuracy of self-reporting and bias even within short time frames (Klatt & Taylor-Powell, 2005), we believed it to be a stronger assessment format than a traditional pre-post design because of possible response-shift bias (Rockwell & Kohn, 1989). Participation was optional, so the number of responses to the Day 2 questionnaire was smaller than the number of UFCTI participants because some students opted out or were absent. We received 110 usable responses to the Qualtrics questionnaire, for a 77% response rate. No incentive was provided to participate in the study.

Table 1: Example Questionnaire Statements
Example statements about self-perceived knowledge (7 total items). Prompt: Please select the appropriate answer to indicate your level of knowledge about the following topics BEFORE and AFTER completing the activities:

  • Critical thinking styles (engagers and seekers): before and after, rated on a Likert scale (1 = very low to 5 = very high)
  • Community-based social marketing: before and after, rated on a Likert scale (1 = very low to 5 = very high)

Example statements for changes in thinking (3 total items). Prompt: Please select the appropriate answer to indicate a change in your thinking about the following topics before and after completing the activities:

  • Communicating about agriculture, food and natural resource issues is important to me: before rated using “Could not judge” and a Likert scale (1 = strongly disagree to 5 = strongly agree); after rated using a Likert scale (1 = strongly disagree to 5 = strongly agree)
  • I think that community-based social marketing is a useful outreach tool: before rated using “Could not judge” and a Likert scale (1 = strongly disagree to 5 = strongly agree); after rated using a Likert scale (1 = strongly disagree to 5 = strongly agree)

Example question for perceived ability and learning: To what extent do you feel you are more able to use your critical thinking style because of this training? Rated on a Likert scale ranging from 1 = not at all to 5 = a great deal.

Example statement related to motivation: The case study about local food, SNAP benefits and community-based social marketing:

  • Motivated me to want to learn more about local food. Answer options: No, Maybe, Yes

 

Post-hoc Cronbach’s alpha was used to determine the reliability of each question set; the reliability coefficient was .80 (pretest) and .81 (posttest) for perceived knowledge, .88 for intention, and .84 for motivation. We used IBM SPSS Statistics (Version 24) to analyze the quantitative data. To compare means on the retrospective pretest questions, we ran paired samples t-tests and calculated Cohen’s d to determine whether differences in means were practically significant.
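
Because the analysis itself was run in IBM SPSS, the following Python sketch is offered only as a rough illustration of the quantities reported here: a paired samples t-test, Cohen’s d for paired data, and Cronbach’s alpha. The arrays are hypothetical placeholders, not the study’s data, and the helper function is illustrative rather than the analysis code actually used.

import numpy as np
from scipy import stats

# Hypothetical "before" and "after" ratings (1-5 scale) for one retrospective pretest item.
before = np.array([2, 3, 2, 1, 3, 2, 2, 3, 2, 2], dtype=float)
after = np.array([4, 4, 3, 4, 4, 3, 4, 5, 4, 4], dtype=float)

# Paired samples t-test comparing "after" with "before" ratings.
t_stat, p_value = stats.ttest_rel(after, before)

# Cohen's d for paired data: mean of the paired differences divided by their standard deviation.
diffs = after - before
cohens_d = diffs.mean() / diffs.std(ddof=1)

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of ratings."""
    k = ratings.shape[1]
    item_variances = ratings.var(axis=0, ddof=1).sum()
    total_variance = ratings.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Hypothetical matrix of 5 respondents by 3 items, only to show how alpha would be computed.
ratings = np.array([[4, 4, 5], [3, 4, 4], [4, 5, 5], [2, 3, 3], [4, 4, 4]], dtype=float)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}, alpha = {cronbach_alpha(ratings):.2f}")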

Altogether, 129 comments made up the subset of open-ended data targeted for qualitative analysis. We used inductive analysis to examine students’ answers to two open-ended, short-answer questions presented at the end of the questionnaire (What did you like most/least about [the] class sessions on critical thinking, local food, and community-based social marketing?). The next step in the analysis process consisted of developing categories and coding schemes (Hsieh & Shannon, 2005). The lead researcher did a preliminary scan of the responses to the open-ended questions to create a codebook. Developing a codebook helps with efficient data analysis and enables replication within qualitative methods (Creswell & Poth, 2018). The lead researcher identified four themes for what students liked most and three themes for what they liked least (Table 2). The codebook contains the themes, descriptions of the themes, and example quotes (Weber, 1990).

Table 2: Code Names, Descriptions, and Example Quotes
What students liked most

Real-world scenario: appreciated the real-world, real-life scenario
Example quote: “I loved the real-world examples provided for us as it was extremely beneficial and helped me understand the topics better.”

Critical thinking: appreciated the opportunity to learn their critical thinking style (seeker vs. engager)
Example quote: “I liked that they got my brain thinking. Rather than just going through a lecture and not thinking about the thought process it takes for me to get to an answer I was actually thinking about my critical thinking style and how I could use this to come up with answers regarding the material.”

Gaining new knowledge: went beyond one-word answers to name a specific topic they enjoyed learning about or, more generally, to note that the topics were something they had never thought about until now
Example quote: “Forced us to learn about something outside of our local community. It was a different topic I hadn’t thought about before.”
  Subtheme, general knowledge: “I liked that she brought up a topic that I had never thought about before.”
  Subtheme, local food marketing: “I never really thought about how much of an impact local food can have. I never imagined pairing it with a food assistance program; that is something I think all local markets should try!”
  Subtheme, nutrition incentive programs: “I liked learning about the SNAP program and what it has to offer.”
  Subtheme, community-based social marketing: “I really liked learning about community-based social marketing. I really enjoy marketing so getting to see that you can reach a smaller population using the community based marketing was very interesting.”

Interruptions: appreciated the structure of the class, either through small-group discussion or through breaks in the lecture to think and problem solve
Example quote: “I liked being able to interact with my peers and discuss ideas.”

What students liked least

Wanted more time: wanted to learn more or did not have enough time in class to connect ideas and think
Example quote: “I wish we would have had more time to go more in depth and discuss more about the topic.”

Connection to critical thinking style: needed more opportunity to connect Day 1’s lecture to Day 2’s interrupted case study (to apply their critical thinking style)
Example quote: “We didn’t really use our critical thinking style on the second day.”

Too much discussion during the interrupted case study: disliked the small-group discussions or the high number of discussions among students
Example quote: “so many discussions”

We used the codebook to separately code the responses in Excel and conducted consistency checks. A codebook makes interrater reliability testing easier to apply. Using Holsti’s method, the reliability was .91 for the codes on what students liked and .96 for the codes on what students disliked, which is considered high (Mao, 2017). During coding, four subthemes emerged within the “gaining new knowledge” code. We further discussed and defined those subthemes, analyzing responses multiple times to ensure all were included under the primary theme.
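
For reference, Holsti’s coefficient as conventionally defined is CR = 2M / (N1 + N2), where M is the number of coding decisions on which the two coders agreed and N1 and N2 are the numbers of coding decisions made by each coder. As a purely hypothetical illustration of the arithmetic (only the resulting coefficients of .91 and .96 are reported above), two coders who each made 100 decisions and agreed on 91 of them would obtain CR = (2 × 91) / (100 + 100) = .91.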

We followed recommendations from Lincoln and Guba (1985) to establish credibility, confirmability, dependability, and transferability of the qualitative data. Credibility is the researchers’ level of confidence in the truth of the findings. To establish credibility of the qualitative findings, we used peer debriefing to discuss and agree on theme and subtheme formation (Creswell & Poth, 2018; Lincoln & Guba, 1985). Confirmability addresses the importance of neutrality and unbiased research. We described our data collection procedures and interpretation of findings so that other researchers can confirm the findings in a similar situation. Dependability refers to the extent to which a study’s findings could be found consistently again (Lincoln & Guba, 1985). An audit trail of materials, including the responses to the open-ended questions and the Excel file of codes, can establish confirmability and dependability. Additionally, we established dependability by describing the research methods in detail. We established transferability by providing detailed quotations in the results.

Results

The UFCTI gathered data on respondents’ age, gender, race/ethnicity, and critical thinking style. Of the 128 students who responded to the UFCTI during the first class period, 95% (n = 122) were 18-24 years old; 61% (n = 78) were female and 39% (n = 50) were male; 94.5% (n = 121) were White/Non-Hispanic, 1.5% (n = 2) were African American (Black/Non-Hispanic), 2.3% (n = 3) were Hispanic, one respondent was Asian, and one respondent was multiracial. The UFCTI categorized respondents into the engager critical thinking style (56.3%; n = 72) or the seeker critical thinking style (43.7%; n = 56). The questionnaire at the end of the second class session asked respondents to indicate their major and college level (Table 3).

Table 3: Participants’ Majors and College Level (n = 110)
Characteristic n %
Major
Agricultural business 20 18.2
Agricultural communications 25 22.7
Agricultural economics 12 10.9
Agronomy 6 5.5
Animal sciences 24 21.8
Agricultural technology management 4 3.6
Bakery science 1 0.9
Dual major in animal sciences and agricultural communications 2 1.8
Feed science 2 1.8
Food science 1 0.9
Horticulture 2 1.8
Marketing 1 0.9
Milling science 4  3.6
Did not indicate 6  5.5
College level
Freshman 14 12.7
Sophomore 37 33.6
Junior 36 32.7
Senior 19 17.2
Did not indicate 4 3.6

 

Objective 1 was to identify the change in respondents’ self-reported level of knowledge about their critical thinking style and key concepts in the case study. Paired samples t-tests showed that the change in means for each of the seven knowledge statements in Table 4 was significant (p < 0.001), and Cohen’s d values showed a large effect size (1.09 to 1.65) for each change in mean. On average, students felt they were more able to use their critical thinking style after the two-day activity, and using a case study helped them learn about both critical thinking and the topics covered in the case study.

Table 4: Changes in Student Self-Reported Knowledge Based on Retrospective Pretest Statements (n = 110)
Statement  Before M  Before SD  After M  After SD  t(109)  p  Cohen’s d
Critical thinking styles (engagers and seekers) 2.19 1.05 3.98 0.62 16.9 <0.001 1.61
SNAP and food assistance programs 2.03 1.10 3.67 0.78 16.5 <0.001 1.57
Community-based social marketing 2.30 0.94 3.88 0.63 17.3 <0.001 1.65
Local food movements 2.46 0.94 3.80 0.68 15.0 <0.001 1.44
Overcoming barriers when trying to influence behavior change 2.80 0.89 3.98 0.65 13.6 <0.001 1.30
Influencing behavior change 2.72 0.89 3.75 0.64 11.4 <0.001 1.09
The value of considering diverse perspectives when thinking deeply about a topic 3.22 0.91 4.13 0.67 12.2 <0.001 1.16

Note: Real limits: 1.0–1.49 = very low; 1.5–2.49 = low; 2.5–3.49 = moderate; 3.5–4.49 = high; 4.5–5.0 = very high.
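
As a consistency check readers can apply themselves (it need not reflect the authors’ exact computational route), Cohen’s d for a paired design equals the t statistic divided by the square root of the sample size, d = t / √n. For the first row of Table 4, 16.9 / √110 ≈ 1.61, matching the reported effect size.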

 

For perceived ability and learning related to Objective 1, Table 5 shows mean responses to prompts falling into the “quite a bit” range, with the respondents reporting that the real-world case study helped them learn quite a bit about community-based social marketing (M = 4.12, SD = 0.67).

Table 5: Students’ Self-Reported Ability and Learning as a Result of the Two-Day Activity (n = 110)
Statement M SD
To what extent do you feel you are more able to use your critical thinking style because of this training? 3.66 0.73
To what extent did using a real-world case study help you learn about critical thinking? 3.86 0.88
To what extent did using this real-world case study help you learn about community-based social marketing? 4.12 0.67

Note. Real limits: 1.0–1.49 = not at all; 1.5–2.49 = very little; 2.5–3.49 = somewhat; 3.5–4.49 = quite a bit; 4.5–5.0 = a great deal.

 

Objective 2 was to identify the change in respondents’ thinking about concepts related to the case study (agricultural communications, marketing tools, and influencing behavior change), using three separate items. Paired samples t-tests showed that the change in means for each of the three “change in thinking” statements in Table 6 was significant (p < 0.001). Cohen’s d values showed medium effect sizes ranging from 0.55 to 0.77. Variations in n were due to the elimination of “could not judge” responses in the “before” ratings. Students who selected “could not judge” as their “before” response for I think that community-based social marketing is a useful outreach tool (n = 26) averaged 3.04 (agree) on their “after” response; students who selected “could not judge” as their “before” response for Influencing behavior change is something I’m interested in as part of my career (n = 17) averaged 2.47 (disagree) on their “after” response; and students who selected “could not judge” as their “before” response for Communicating about agriculture, food and natural resource issues is important to me (n = 8) averaged 3.13 (agree) on their “after” response.

Table 6: Students’ Change in Thinking Based on Retrospective Pretest Statements
Statement  Before M  Before SD  After M  After SD  t  df  p  Cohen’s d
I think that community-based social marketing is a useful outreach tool (n = 84) 2.99 0.67 3.44 0.52 6.40 83 <0.001 0.70
Influencing behavior change is something I’m interested in as part of my career (n = 93) 2.57 0.76 3.00 0.61 7.41 92 <0.001 0.77
Communicating about agriculture, food and natural resource issues is important to me (n = 102) 3.23 0.60 3.49 0.52 5.50 101 <0.001 0.55

Note: Real limits: 1.0–1.49 = strongly disagree; 1.5–2.49 = disagree; 2.5–3.49 = agree; 3.5–4.0 = strongly agree.

 

Objective 3 was to determine students’ motivations in learning and application as a result of the two-day activity. With answer choices of “no,” “maybe,” and “yes,” 78.2% (n = 86) answered “yes” to the following prompts: as a result of the critical thinking and local food activity, do you intend to (a) use my critical thinking style in the future during this class, (b) use my critical thinking style to examine problems in other classes, and (c) use my critical thinking style to examine problems outside of class. Out of 110 responses, 67.3% (n = 74) answered “yes” to the prompt “as a result of the critical thinking and local food activity, do you intend to develop skills in the critical thinking style that I am not as strong in.” Table 7 summarizes students’ motivation to learn as a result of the activities.

Table 7: Student Self-Reported Motivation to Learn as a Result of the Two-Day Activity (n = 110)
Statement  No n (%)  Maybe n (%)  Yes n (%)
Stimulated me to think  3 (2.7)  22 (20.0)  85 (77.3)
Motivated me to want to learn more about influencing behavior change  3 (2.7)  34 (30.9)  73 (66.4)
Motivated me to want to learn more about community-based social marketing  7 (6.4)  34 (30.9)  69 (62.7)
Motivated me to want to learn more about local food  7 (6.4)  42 (38.2)  61 (55.5)
Motivated me to want to learn more about food assistance programs  14 (12.7)  52 (47.3)  44 (40.0)

 

By coding the open-ended questions at the end of the questionnaire, we identified themes related to what students liked most (four themes) and liked least (three themes) about the class sessions. Gaining new knowledge was the most frequently mentioned theme for what students liked about the class sessions. We developed four subthemes to further explore how respondents discussed new knowledge: general knowledge, local food marketing, nutrition incentive programs such as SNAP and Double Up Food Bucks, and community-based social marketing. In terms of general knowledge, one student wrote, “It made me realize topics and issues that I had not deeply thought about or recognized before.” Regarding local food and nutrition incentive programs, one student wrote, “I never really thought about how much of an impact local food can have. I never imagined pairing it with a food assistance program; that is something I think all local markets should try!” Another student wrote, “Learning about community-based social marketing was great. It taught me that you can market to your community in ways I never thought of.”

Numerous students wrote about the opportunity to learn their critical thinking style. For example, one student wrote, “I liked how we identified if we were a seeker or engager and then talked about what each of them can do in the social marketing aspect.” Similarly, another student wrote, “I liked being able to talk about the issue with a seeker (I’m an engager).” Another student wrote:

I liked that they got my brain thinking. Rather than just going through a lecture and not thinking about the thought process it takes for me to get to an answer I was actually thinking about my critical thinking style and how I could use this to come up with answers regarding the material. The local food and community based social marketing sessions were very intriguing when discussing the Utah project and helped me think more in depth about everything you have to take into consideration for community based social marketing.

Interruptions within the case study prompted students to think critically and answer specific questions. Numerous students wrote about how they liked the interruptions. One student wrote, “I really enjoyed how she gave us time to really think and develop thoughts on the topics. It really helped me connect better to the topic.” Similarly, another student wrote, “I liked the fact that she had small discussions to talk with others about the topics. We were able to get more information that way.” Along those same lines, a student wrote, “I really enjoyed talking about to the case study and engaging with classmates to hear their thoughts and ideas and the problem and solution.”

Having a real-world, real-life scenario presented in the interrupted case study was the fourth theme students mentioned liking about the class sessions. One student wrote, “I enjoyed the real-world, successful example that we could analyze. It was interesting to hear their strategies and successes.” Another student wrote, “I loved the real-world examples provided for us as it was extremely beneficial and helped me understand the topics better.”

Some students wrote about aspects of the class sessions that they did not like. Students wanted more time during the class sessions to connect ideas and discuss the topics, or they wanted the instructor to cover the content more slowly and thoroughly. For example, one student wrote, “I wish we had more time to talk about all of it and could have gone more in-depth on the strategy behind it.” Another student wrote, “I think some of what she said was too quick and needed more explanation.”

While some students appreciated the interruptions to discuss the questions in small groups and as a class, other students disliked either the discussions themselves or the number of discussions. One student asked for “less group conversing and more class discussion,” while another wrote that what they liked least was “The amount of group discussions.” Another student wrote, “As a seeker, I didn’t fully love the group discussion. I wished I had my own time to think individually in class about such topics; however, I know it was good for me and helped me see different perspectives.”

A few students (n = 5) did not see how their critical thinking style connected to, or could be applied during, the interrupted case study. For example, one student wrote, “There was no connection for me between critical thinking and community based social marketing.”

Conclusions, Discussion, and Recommendations

The two-day activity pairing critical thinking style and an interrupted case study had positive, practical, and statistically significant outcomes for students. Students appreciated learning about their critical thinking style as well as the content in the interrupted case study, and many students enjoyed the discussion-based and problem-solving structure of the interrupted case study. Results show increases in students’ self-reported knowledge about content covered during both class sessions, including their critical thinking style, community-based social marketing, local food movements, SNAP and food assistance programs, the value of considering diverse perspectives, and barriers in influencing behavior change. Most respondents (78.2%) indicated that they plan to use their newly discovered critical thinking style in the future, and the two-day activity stimulated most respondents to think (77.3%), motivating many of them to learn more about influencing behavior change (66.4%), community-based social marketing (62.7%), local food (55.5%), and food assistance programs (40.0%). Results indicate that the combination of teaching critical thinking style and the real-world application of an interrupted case study in a short, two-day format can positively affect learning, motivation to learn, and intention to use critical thinking style. This study supports conclusions that the use of case studies can help students think critically about complex agricultural issues (Akins et al., 2019).

The UFCTI is a versatile tool used to assess critical thinking style across many different groups (Barrick & DiBenedetto, 2019; Leal et al., 2017; Putnam et al., 2017). Pairing the UFCTI with interrupted case studies requires at least two 50-minute class periods but could last an entire semester or longer. However, case-based instruction can eventually become monotonous for students (Anderson, 2019). We did not ask our students if they had ever participated in an interrupted case study, but it is possible that delivery of interrupted case studies on a limited basis has benefits, especially when students are not familiar with that style of instruction.

This study has implications for instructors in terms of incorporating critical thinking style into an interrupted case study’s discussions. Students learn their critical thinking style during the Day 1 lesson, and instructors could organize the small-group discussions during the interrupted case study on Day 2 to include students who represent each critical thinking style. Engagers gain information through conversations and use their reasoning ability to make a decision or share a solution to a problem, so they would appreciate the small-group and class discussions in which they can communicate how they arrived at a solution. Furthermore, interrupted case studies incorporate structured discussions, which can encourage students who are less likely to engage in unrestricted discussions (Anderson, 2019). The length of time dedicated to the interruptions could impact information seekers because they are aware of their biases and want to conduct sufficient research to gather information from a variety of viewpoints to help them come to a solution. Because the structure of interrupted case studies allows the instructor to be a facilitator rather than lecturer, the information seekers can employ research, use their personal experience, and think their response out loud to derive solutions that are not predetermined (Mayo, 2002). This type of interaction can lead to deep thinking and sharing.

Some students might resist engagement in the classroom or consider it an unfair expectation to participate, as some students have developed the mindset of being passive learners and expect traditional lectures (Garrett, 2020). To help make interrupted case studies effective for those who prefer traditional lectures, we recommend easing them into discussion with a nonthreatening topic (Wilson, 2017), providing students with a note-taking guide for the case study, and allowing them to think critically alone during one or more of the interruptions.

In summary, two class periods involving (a) critical thinking style and (b) course content delivered through an interrupted case study were effective for many of our students. Though some students did not like the small-group discussions, many students enjoyed learning about and applying their critical thinking style and collaborating with other students to think about a real-world problem. A limitation of this study was that the UFCTI results from the first day were anonymous, so we could not link them to the case study questionnaire from the second day or compare self-reported data to critical thinking style. Future research should compare students’ critical thinking style (information seeker vs. engager) to their self-reported knowledge, self-reported ability, changes in thinking, and intentions to use their critical thinking style in the future. We recommend expanding the time frame of the activity to satisfy the needs of students who wanted to dive deeper into topics. We also recommend that future research follow up with students later in their academic careers to assess their perception of critical thinking style over time and their lasting impressions of the content learned in the interrupted case study.

References

Akins, J., Lamm, A., Telg, R., Abrams, K., Meyers, C., & Raulerson, B. (2019). Seeking and engaging: Case study integration to enhance critical thinking about agricultural issues. Journal of Agricultural Education, 60(3), 97–108. https://doi.org/10.5032/jae.2019.03097

Anderson, B. (2019). Teaching developmental theory with interrupted video case studies. Journal of the Scholarship of Teaching and Learning, 19(5). https://doi.org/10.14434/josotl.v19i5.25385

Barrick, R. K., & DiBenedetto, C. A. (2019). Assessing the critical thinking styles of international faculty. Journal of Advances in Education and Philosophy, 3(6). https://doi.org/10.21276/jaep.2019.3.6.1

Bensley, D. A. (2010). A brief guide for teaching and assessing critical thinking in psychology. Observer, 23(10). https://www.psychologicalscience.org/observer/a-brief-guide-for-teaching-and-assessing-critical-thinking-in-psychology

Block, D. R., Chávez, N., Allen, E., & Ramirez, D. (2012). Food sovereignty, urban food access, and food activism: Contemplating the connections through examples from Chicago. Agriculture and Human Values, 29(2), 203–215. https://doi.org/10.1007/s10460-011-9336-8

Breyer, B., & Voss-Andreae, A. (2013). Food mirages: Geographic and economic barriers to healthful food access in Portland, Oregon. Health & Place, 24, 131–139. https://doi.org/10.1016/j.healthplace.2013.07.008

Brooks, R., Kavuturu, J., & Cetin, M. (2012). Interrupted case method for teaching ethics in transportation engineering and systems management course. ASEE Annual Conference & Exposition Proceedings, 25.836.1-25.836.14. https://doi.org/10.18260/1-2–21593

Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Houghton Mifflin Company.

Cavanagh, M. (2011). Students’ experiences of active engagement through learning activities in lecture. Active Learning in Higher Education, 12(1). https://doi.org/10.1177/1469787410387724

Chowning, J. T., Griswold, J. C., Kovarik, D. N., & Collins, L. J. (2012). Fostering critical thinking, reasoning, and argumentation skills through bioethics education. Plos One, 7(5), 1–8. https://doi.org/10.1371/journal.pone.0036791

Cooperstein, S. E., & Kocevar‐Weidinger, E. (2004). Beyond active learning: A constructivist approach to learning. Reference Services Review, 32(2), 141–148. https://doi.org/10.1108/00907320410537658

Creswell, J. W., & Poth, C. N. (2018). Qualitative inquiry & research design: Choosing among the five approaches. Sage.

Curtis, K. R., & Ward R. (2015, April). Program evaluation and survey design techniques [Conference session]. Invited presentation of the 2015 Extension Risk Management Education National Conference, Minneapolis, MN.

Drennan, J., & Hyde, A. (2008). Controlling response shift bias: The use of the retrospective pre-test design in the evaluation of a master’s programme. Assessment and Evaluation in Higher Education, 33(6), 699–709. https://doi.org/10.1080/02602930701773026

Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction: Report findings and recommendations (ED315423). American Philosophical Association.

Felder, R. M., & Brent, R. (1996). Navigating the bumpy road to student-centered instruction. College Teaching, 44(2), 43–47. https://www.jstor.org/stable/27558762

Fiester, S., Redfearn, J., Helfinstine, S., Meilander, T., & Woolverton, C. J. (2010). Lab safety and bioterrorism readiness curricula using active learning and hands-on strategies as continuing education for medical technologists. Journal of Microbiology & Biology Education, 11(1). https://doi.org/10.1128/jmbe.v11i1.131

Garrett, C. E. (2020). Three key principles for improving discussion-based learning in college classrooms. Journal of Empowering Teaching Excellence, 4(1). https://doi.org/10.15142/VNKZ-P273

Gay, K., Terry, B., & Lamm, A. J. (2015). Identifying critical thinking styles to enhance volunteer development. Journal of Extension, 53(6). https://tigerprints.clemson.edu/joe/vol53/iss6/28/

Gay, K. D., Owens, C. T., Lamm, A. J., & Rumble, J. N. (2016). Assessing public issues knowledge and needs of Extension agents in Florida. The Journal of Extension, 55(1), Article 24. https://tigerprints.clemson.edu/joe/vol55/iss1/24

Gorham, L. M., Lamm, A. J., & Rumble, J. N. (2014). The critical target audience: Communicating water conservation behaviors to critical thinking styles. Journal of Applied Communications, 98(4). https://doi.org/10.4148/1051-0834.1092

Herreid, C. F. (2004). Can case studies be used to teach critical thinking? Journal of College Science Teaching, 33(6), 12–14. https://www.jstor.org/stable/10.2307/26491296

Herreid, C. F. (2011). Case study teaching. New Directions for Teaching and Learning, 128, 31–40. https://doi.org/10.1002/tl.466

Hsieh, H. F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277–1288. https://doi.org/10.1177/1049732305276687

Howard, G. S. (1980). Response-shift bias: A problem in evaluating interventions with pre/post self-reports. Evaluation Review, 4(1), 93–106. https://doi.org/10.1177/0193841X8000400105

Howard, G. S., Ralph, K. M., Gulanick, N. A., Maxwell, S. E., Nance, D. W., & Gerber, S. K. (1979). Internal invalidity in pretest-posttest self-report evaluations and a re-evaluation of retrospective pretests. Applied Psychological Measurement, 3(1), 1–23. https://doi-org.dist.lib.usu.edu/10.1177/014662167900300

Klatt, J., & Taylor-Powell, E. (2005). Program development and evaluation: Using the retrospective post-then-pre design, quick tips #27. University of Wisconsin-Extension. http://www.uwex.edu/ces/pdande/resources/index.html

Lam, T. C. M., & Bengo, P. (2003). A comparison of three retrospective self-reporting methods of measuring change in instructional practice. American Journal of Evaluation, 24(1), 65–80. https://doi.org/10.1177/10982140030240

Lamm, A. J., & Irani, T. (2011). UFCTI manual. University of Florida.

Lamm, A. J. (2015). Integrating critical thinking into extension programming #3: Critical thinking style (No. AEC546). University of Florida. http://www.edis.ifas.ufl.edu/wc208

Lamm, A. J. (2016). Integrating critical thinking into Extension programming #4: Measuring critical thinking styles using the UFCTI (No. AEC547). University of Florida. https://edis.ifas.ufl.edu/pdf/WC/WC20900.pdf

Lamm, A., Harsh, J., Meyers, C., & Telg, R. (2018). Can they relate? Teaching undergraduate students about agricultural and natural resource issues. Journal of Agricultural Education, 59(4), 211–223. https://doi.org/10.5032/jae.2018.04211

Leal, A., Rumble, J. N., & Lamm, A. J. (2017). Using critical thinking styles to inform food safety behavior communication campaigns. Journal of Applied Communications, 101(2). https://doi.org/10.4148/1051-0834.1002

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Sage Publications.

Little, T. D., Chang, R., Gorrall, B. K., Waggenspack, L., Fukuda, E., Allen, P. J., & Noam, G. G. (2020). The retrospective pretest–posttest design redux: On its validity as an alternative to traditional pretest–posttest measurement. International Journal of Behavioral Development, 44(2), 175–183. https://doi.org/10.1177/0165025419877973

Lom, B. (2012). Classroom activities: Simple strategies to incorporate student-centered activities within undergraduate science lectures. The Journal of Undergraduate Neuroscience Education, 11(1), A64–A71. PMID: 23494568

Lombardi, D., Shipley, T. F., Bailey, J. M., Bretones, P. S., Prather, E. E., Ballen, C. J., Knight, J. K., Smith, M. K., Stowe, R. L., Cooper, M. M., Prince, M., Atit, K., Uttal, D. H., LaDue, N. D., McNeal, P. M., Ryker, K., St. John, K., van der Hoeven Kraft, K. J., & Docktor, J. L. (2021). The curious construct of active learning. Psychological Science in the Public Interest, 22(1), 8–43. https://doi.org/10.1177/1529100620973974

Lu, P., Burris, S., Baker, M., Meyers, C., & Cummins, G. (2021). Cultural differences in critical thinking style: A comparison of U.S. and Chinese undergraduate agricultural students. Journal of International Agricultural and Extension Education, 28(4). https://doi.org/10.4148/2831-5960.1003

Lumpkin, A. (2015). Student perceptions of active learning. College Student Journal, 49(1). https://www.researchgate.net/publication/312188115_Student_perceptions_of_active_learning

Mao, Y. (2017). Intercoder reliability techniques: Holsti’s Method. In M. Allen (Ed.), The SAGE encyclopedia of communication research methods (Vols. 1-4). SAGE Publications, Inc. https://doi.org/10.4135/9781483381411

Mayo, J. A. (2002). Case-based instruction: A technique for increasing conceptual application in introductory psychology. Journal of Constructivist Psychology 15(1), 65–74. https://doi.org/10.1080/107205302753305728

Millis, B. J. (Ed.). (2010). Cooperative learning in higher education: Across the disciplines, across the academy. Stylus.

Penn, M. L., Currie, C. S. M., Hoad, K. A., & O’Brien, F. A. (2016). The use of case studies in OR teaching. Higher Education Pedagogies, 1(1), 16–25. https://doi.org/10.1080/23752696.2015.1134201

Popil, I. (2011). Promotion of critical thinking by using case studies as teaching method. Nurse Education Today, 31(2), 204–207. https://doi.org/10.1016/j.nedt.2010.06.002

Pratt, C. C., McGuigan, W. M., & Katzev, A. R. (2000). Measuring program outcomes: Using retrospective pretest methodology. American Journal of Evaluation, 21, 341–349. https://doi.org/10.1016/S1098-2140(00)00089-8

Putnam, B., Lamm, A., & Lundy, L. (2017). Using critical thinking styles of opinion leaders to drive Extension communication. Journal of Agricultural Education, 58(3), 323–337. https://doi.org/10.5032/jae.2017.03323

Regional Educational Laboratory Appalachia. (2021). Learning before going to scale: An introduction to conducting pilot studies. Institute of Education Sciences. https://ies.ed.gov/ncee/edlabs/regions/appalachia/resources/pdfs/Pilot-Study-Resource_acc.pdf

Rockwell, S.K., & Kohn, H. (1989). Post-then-pre evaluation: Measuring behavior change more accurately. Journal of Extension, 27(2). http://www.joe.org/joe/1989summer/a5.html

Silva de Lima, M., Pozzer, L., & Queiroz, S. L. (2023). Use of interrupted case studies to teach scientific communication: Examples from the effects of mining on water resources in Brazil. Journal of Chemical Education, 100(2), 722–731. https://doi.org/10.1021/acs.jchemed.2c01146

Snyder, L. G., & Snyder, M. J. (2008). Teaching critical thinking and problem solving skills. The Delta Pi Epsilon Journal, 50(2), 90–99. https://eric.ed.gov/?id=EJ826495

Stedman, N. L. P., & Adams, B. L. (2014). Getting it to click: Students self-perceived critical thinking style and perceptions of critical thinking instruction in face-to-face and online course delivery. NACTA Journal, 58(3). https://www.nactateachers.org/attachments/article/2224/12.%20Stedman_NACTA%20Journal.pdf

Stewart, R., Stein, D. C., Yuan, R. T., & Smith, A. C. (2014). “The farmer’s dilemma”—An interrupted case study for learning bacterial genetics in the context of the impact of microbes on the organic food industry and biotechnology. Journal of Microbiology & Biology Education, 15(1), 36–37. https://doi.org/10.1128/jmbe.v15i1.643

Taylor-Powell, E., & Renner, M. (2009). Collecting evaluation data: End-of-session questionnaires. University of Wisconsin-Extension, Cooperative Extension. https://www.wcasa.org/wp-content/uploads/2020/03/Evaluation_Questionnaires-UW-Extension.pdf

University of Florida. (n.d.). UF critical thinking inventory: Where it all began. https://www.ufcti.com/research/

Vinoski Thomas, E., Wells, R., Baumann, S. D., Graybill, E., Roach, A., Truscott, S. D., Crenshaw, M., & Crimmins, D. (2018). Comparing traditional versus retrospective pre-/post-assessment in an interdisciplinary leadership training program. Maternal and Child Health Journal, 23(2), 191–200. https://doi.org/10.1007/s10995-018-2615-x

Weber, R. P. (1990). Basic content analysis. Sage.

White, T. K., Whitaker, P., Gonya, T., Hein, R., Kroening, D., Lee, K., Lee, L., Lukowiak, A., & Hayes, E. (2009). The use of interrupted case studies to enhance critical thinking skills in biology. Journal of Microbiology & Biology Education, 10(1), 25–31. https://doi.org/10.1128/jmbe.v10.96

Wilson, J. S. (2017). Promoting critical thinking in general biology courses: The case of the white widow spider. Journal on Empowering Teaching Excellence, 1(2). https://doi.org/10.26077/JMB7-ZH62

Wu, Y.-L., Rumble, J. N., Lamm, A. J., & Ellis, J. D. (2020). Communication of genetic modification science: Consumers’ critical thinking style, perceived transparency of information, and attitude. Journal of International Agricultural and Extension Education, 27(2), 49–61. https://doi.org/10.4148/2831-5960.1117

Zayapragassarazan, Z., & Kumar, S. (2012). Active learning methods. NTTC Bulletin, 19(1). https://files.eric.ed.gov/fulltext/ED538497.pdf

License


Journal on Empowering Teaching Excellence, Spring 2024 Copyright © by Utah State University is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
