
Combined Cognitive–Motivational Modules Delivered Via an LMS Increase Undergraduate Biology Grades

Volume 1, Issue 2. DOI: 10.1037/tmb0000020

Published on Nov 06, 2020

Abstract

Students’ success in undergraduate Science, Technology, Engineering, and Mathematics (STEM) courses requires effective studying behavior, but also the motivation to enact it. Efforts to promote students’ achievement in STEM have commonly focused on either study strategies (cognitive) or motivational interventions; we hypothesized that combinations of these would be more effective. Using a learning management system (LMS) for delivery, we iteratively developed and tested the effects of combining one of four cognition-focused intervention modules with one of three motivation-focused modules. Participants were 3,092 undergraduate introductory biology students tested in 10 studies at three universities over 4 years. They were randomly assigned to either a no-treatment control condition or one of 17 conditions involving either single or combined intervention modules delivered over an entire semester. Course grades were provided by the instructors. We used meta-analytic techniques to capture the effect of students’ access to the interventions on grades, and to test whether differences across experiments changed the effect size of the interventions. Averaging across the studies, the intervention had an effect of g = .30. All 10 moderators were significant: Cognitive + Motivational versus either one alone, timely access to the intervention, iterative development phase, type of cognitive or type of motivation module, the specific cognitive–motivation combination, university, academic year, semester, first versus second semester of biology, and course content. We conclude that combined interventions delivered via an LMS can meaningfully improve undergraduate students’ course grades (corresponding to 6.6 percentage points on final course grade), with minimal extra work for instructors. However, these effects depended on a variety of contextual factors.

Keywords: study strategies, self-efficacy, expectancy-value

Disclosures and Acknowledgments: The work was performed at University of Illinois at Urbana-Champaign, Old Dominion University, Temple University, University of Illinois at Chicago, and University of Southern Indiana.

Funding: The research reported herein was supported by the U.S. Department of Education, Institute of Education Sciences (grant no. R305A140602). The funder played no role in the study design, data collection, data analysis, interpretation of data, writing of the report, or decision to submit the article for publication. Conflicts of interest: We have no known conflicts of interest to disclose.

Data availability: All sufficient statistics are included in the manuscript in Table 4, and individual level data are available from the first author to qualified researchers upon request. The study materials other than questionnaires are available at https://www.ideals.illinois.edu/handle/2142/97878. Data analyses are described in detail on pp. 19–20.

Prior dissemination: The results reported herein were presented at the annual meeting of the American Educational Research Association in Toronto, Canada, in April 2019. A detailed analysis of Expectancy × Costs interactions in Study #9 has been published at https://doi.org/10.1016/j.lindif.2019.04.001.

Open Access License: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND). This license permits copying and redistributing the work in any medium or format for noncommercial use provided the original authors and source are credited and a link to the license is included in attribution. No derivative works are permitted under this license.

Contact Information: Correspondence concerning this article should be addressed to Jennifer G. Cromley, University of Illinois at Urbana Champaign, 1310 S. Sixth St. MC-708, Champaign, IL 61820, USA. Email: [email protected]


Undergraduate students’ success in “gateway” STEM courses has proven a persistent challenge (National Academies of Sciences, Engineering, and Medicine, 2018). In the past several decades, numerous reports have detailed the high number of students with interest and aptitude in STEM who fail to achieve satisfactory grades in these courses and leave STEM majors (National Academies of Sciences, Engineering, and Medicine, 2017; National Research Council, 2011; President’s Council of Advisors on Science and Technology, 2012). These reports have called for transforming STEM instruction in order to motivate students, enhance their learning and achievement, and contribute to their retention (National Academies of Sciences, Engineering, and Medicine, 2018). However, although desirable, such curricular changes require much effort, time, and an overhaul of infrastructure (Henderson et al., 2011). Often, the practicalities of implementation seem to pose great difficulties even to instructors who are highly motivated to improve their courses.

In the current study, we investigated an alternative approach to promoting undergraduate STEM students’ achievement that builds on combining brief intervention modules that have been found to improve students’ achievement in previous research. Specifically, hypothesizing a synergy between interventions that focus on students’ learning and on students’ motivation, we tested and compared the efficacy of pairs of validated intervention modules—one focusing on enhancing students’ cognition paired with one focusing on enhancing students’ motivation—to promote students’ achievement in undergraduate STEM courses. We delivered these interventions through a study-specific web-based instruction system (Blackboard), in ways that supplemented the instructors’ curricula, thus removing from faculty the burden of radically changing their instruction or developing large amounts of new teaching materials. In this article, we report the findings from 10 experiments at three different universities that tested and compared the effects of different combinations of cognitive–motivational modules on undergraduate students’ achievement in biology courses.

Combining Cognitive and Motivational Interventions to Promote Student Achievement

During the past couple of decades, researchers have devised different interventions to target mechanisms hypothesized to improve students’ learning and achievement. For the most part, studies testing such interventions have focused either on promoting students’ cognitive processes (e.g., learning strategies and problem-solving strategies) or on promoting students’ motivation and engagement (e.g., self-efficacy and value of the content). Cognitive interventions are based on theory and research that explain people’s memory, information processing, and problem solving. Motivation interventions are based on theories and research that explain people’s choice, investment of effort, and persistence despite challenges.1 Whereas research has demonstrated the efficacy of each type of intervention to improve students’ achievement on its own, researchers have rarely investigated the potential synergy of combining cognitive and motivational interventions. However, there is a strong theoretical basis to hypothesize synergy between cognitive and motivational interventions. In an extensive synthesis of the research, Pintrich (2000) provided a comprehensive argument for the joint operation of cognition and motivation in undergraduate students’ successful engagement and achievement—the need for both “skill and will” (Zusho et al., 2003). Students’ academic success requires both effective cognitive processes and the motivation to apply them (Boekaerts, 1996; Wolters, 1999; Zimmerman & Moylan, 2009; Zusho et al., 2003). Accordingly, we hypothesized that combining an intervention to enhance students’ cognitive processes when learning content with an intervention to enhance students’ motivation while learning the content would manifest stronger effects on achievement relative to either receiving no interventions or to receiving only one of these interventions.

This hypothesis has received initial support from a number of recent studies with young students. Combining cognitive and motivational interventions was found to have positive effects on young students’ word reading (Toste et al., 2017) and reading comprehension (e.g., Guthrie et al., 2004), as well as middle school math scores (e.g., Cleary et al., 2017). This emerging research also suggests that, in addition to an additive effect on the achievement of enhancing both cognitive and motivational processes, combining cognitive and motivational interventions may manifest positive effects of cognitive processes on motivation (COG→MOT) and vice versa (MOT→COG).

Yet, importantly, both cognitive and motivational processes are diverse and include multiple mechanisms (Pintrich, 2000). Effective cognitive processes include attention regulation, eliciting prior knowledge, and a variety of learning strategies (e.g., self-explanation, summarizing, and elaborating) and problem-solving strategies. Motivation also involves a variety of processes, including competence beliefs, goal setting, perceived relevance, interest, and autonomous pursuits. It may be that any combination of cognitive and motivational interventions would improve students’ achievement. However, it may also be that particular combinations are more effective than others in the undergraduate STEM course context. To test our hypothesis about the synergy of cognitive and motivational interventions, as well as to compare different combinations of cognitive and motivational interventions, we selected several theoretically based cognitive interventions and motivational interventions that already had evidence of effectiveness for promoting students’ achievement. We then tested the effect on achievement of students’ access to different pairs of interventions from these cognitive and motivational sets.

Cognitive Interventions for Biology Courses

Cognitive processes serve as crucial mechanisms of effective learning and achievement. Intervention studies promoting cognitive processes have often focused on the effects on achievement of one process at a time, such as activating prior knowledge relevant to the learned material, scaffolding systematic note-taking, or providing guidance in summarizing the content (e.g., Hodds et al., 2014; McNamara, 2004, 2017; O’Reilly et al., 2004). Among undergraduate students in survey courses, cognitive interventions found to enhance achievement in different STEM domains include instruction in summarizing text (Bednall & Kehoe, 2011), activating prior knowledge before learning in psychology (Gurlitt & Renkl, 2010), prompting students to sketch the learned content in biology (Cromley & Mara, 2018; Stevens & Hoskins, 2014), prompting comparing-and-contrasting for learning geology (Jee et al., 2013), teaching mnemonics for learning statistics (Mocko et al., 2017), and fostering self-explanation (McNamara, 2017).

For our study, we selected for inclusion four interventions that target different cognitive processes relevant for learning the content in introductory biology courses: (a) instructing in specific study strategies, (b) demonstrating worked examples, (c) activating students’ relevant prior knowledge, and (d) organizing the content through thematic segmentation of lecture videos.

Strategy Instruction

We implemented Pressley and Harris’s (2006) successful principles for delivering study strategies instruction. Study strategies include cognitive operations for processing information into knowledge structures, such as memorization techniques, self-quizzing, creating mental images of text-based information, summarizing information in one’s own words, and comparing-and-contrasting different pieces of information. Findings suggest that strategy instruction interventions have much larger effects when the strategies are embedded in the specific content that the students are studying, probably due to discipline effects (Weinstein et al., 2000). For example, to summarize accurately in biology, one must keep track of numerous components and their functions in a system, whereas to summarize accurately in history one must select the most important actors and their actions. The instructed strategies we selected to fit the content of introductory undergraduate biology were drawing to learn, comparing-and-contrasting within diagrams, making a concept map, using etymology to figure out the meaning of new vocabulary words, and using mnemonics to remember new course content. Our intervention involved brief videos in which a presenter introduced the study strategy and demonstrated its use on a biology-relevant example taken from the course textbook.

Worked Examples

Whereas biology exams include many questions that require the use of problem-solving strategies, biology instruction rarely includes direct and extended demonstrations of such problem solving. Building on the problem-solving literature, research has demonstrated the positive effect of providing students with “worked examples”—the explicit illustration of step-by-step correct solutions to problems of different types (Renkl, 1997; Sweller & Cooper, 1985). Effective worked-example interventions also involve pointing out to students how the current worked-out example problem is related to “big ideas” in the discipline (Perfetto et al., 1983). Worked-example interventions have been found to contribute to in-depth understanding of how to reason in a domain and to higher grades in numerous science and nonscience domains (Sweller et al., 2011). In our intervention, we provided students with brief videos in which a presenter demonstrated, step-by-step, the correct reasoning about and solution to a problem similar to those students are asked to solve on the course’s within-semester exams. The content concerned learning objectives from a single lecture (e.g., geographic isolation as a means of speciation), so the examples were key to learning in the course; they were brief enough, however, that they did not take on major, overarching themes of the course such as natural selection.

Prior Knowledge Activation

The crucial role of background knowledge in learning, and in science learning specifically, has been an integral element of cognitive learning theories for decades (Bransford et al., 2000; Fisher, 2004; Pressley & McCormick, 1995). Effective learning always involves using new information to enhance or modify existing knowledge structures. Students’ content knowledge explains between 30% and 60% of the variance in students’ achievement (Dochy et al., 1999; Pressley & McCormick, 1995). However, some students do not learn effectively because they fail to activate their existing knowledge relevant to the domain. Research suggests that interventions that activate students’ relevant background knowledge using warmups, preteaching, videos, and other techniques significantly improve students’ learning (Woloshyn et al., 1992). Once activated, background knowledge frames attention, interpretation, elaboration, and organization of the new material, and helps the retrieval of that stored information from memory (Dochy et al., 1999). Such activation reduces comprehension problems and eases cognitive processing (Alexander & Murphy, 1998). In our intervention, we provided students with brief videos that reminded them of, and activated, information that they had learned in high school and that was directly relevant to information they would encounter in the course textbook and lectures.

Segmenting Lecture Videos

Learning from lectures requires paying attention to and cognitively processing the information presented. This involves complex and rapid conscious, strategic decision making about which parts of the lecture to attend to, how to organize the information, and, if notes are taken, how to use them later on (Kiewra et al., 1991; Van Meter et al., 1994). It also requires cognitive processing such as noticing what information is most important, making connections between different pieces of information, and organizing the information into meaningful and easy-to-remember categories. Even skilled students encounter lectures whose pace or organization exceeds their ability to effectively process the important information that is delivered. Interventions found to be effective in supporting students’ ability to process lecture information include providing opportunities for repeated exposure to the information (Hidi & Klaiman, 1983) and scaffolds for the effective organization of the information (Trumpower & Goldsmith, 2004). Repeated exposure allows the correction of attention errors, in-depth processing of the important information, and more effective organization of the information (Pressley & McCormick, 1995). Although literature reviews of learning effects from lecture recordings at the university level suggest a mix of positive, null, and negative effects (Danielson et al., 2014; O’Callaghan et al., 2017), one reason for these mixed effects may be varied effects on attendance. In our intervention, we provided students with thematically segmented recordings of their lectures. We could not anticipate whether or not the module would impact students’ attendance; however, to counter any possible changes in attendance, we allowed students unlimited repeated access to view the recordings.

Motivation Interventions for Biology Courses

Research has demonstrated the wide range of motivational processes that relate to undergraduates’ STEM achievement (Richardson et al., 2012). Drawing on self-efficacy theory (Bandura, 1997) and expectancy-value theory (Wigfield & Eccles, 2000), numerous studies have pointed to students’ self-efficacy and expectations of succeeding in the academic task, and their valuing of the task, as contributing to choice of an academic major, persistence in the face of difficulties, and achievement. In addition, recent research has also supported the role of noneconomic perceived costs associated with a task (e.g., effort and psychological drawbacks) in students’ intentions to drop out of undergraduate STEM majors (Perez et al., 2014, 2019). Whereas motivational interventions have been relatively uncommon in the literature (Harackiewicz & Priniski, 2018; Rosenzweig & Wigfield, 2016), research points to the promise of interventions that focus on enhancing students’ competence beliefs, or self-efficacy (Schunk & DiBenedetto, 2016), and of those aiming to increase students’ task value through relevance writing—asking students to write brief statements about the relevance of course content to their lives (Harackiewicz & Priniski, 2018; National Academies of Sciences, Engineering, and Medicine, 2017). To date, there have been no interventions that directly target mitigating perceptions of cost; however, in light of the emerging centrality of this motivational process as complementing the other processes (Hulleman & Barron, 2016), we decided to design an intervention to address it. Thus, in our study, we selected for inclusion three motivational interventions: (a) enhancing self-efficacy for success through strategy-oriented feedback on performance, (b) promoting task value through relevance writing, and (c) alleviating perceived nonmonetary costs through persuasive messages.

Enhancing Self-Efficacy

The student’s self-efficacy—the confidence that he or she can successfully complete required assignments—is one of the best predictors of undergraduate science grades (Richardson et al., 2012). Although self-efficacy is related to prior achievement, many undergraduate STEM students with good preparation have low self-efficacy for specific gateway courses. Enhancing students’ self-efficacy has been found to increase their learning and achievement in numerous STEM and non-STEM domains (Zimmerman et al., 2017). In our intervention, we built on a successful intervention by Muis et al. (2013) in which students’ self-efficacy was enhanced by providing them with personalized emails containing feedback on course quiz performance that emphasized learning, understanding, and improvement, rather than only earning grades.

Promoting Relevance

Students’ perceptions that the learned content is relevant to their lives have long been known to contribute to motivation, engagement, and academic success (Albrecht & Karabenick, 2018). In one of the most investigated motivational interventions in the college classroom, Hulleman, Harackiewicz, and their colleagues (Harackiewicz et al., 2016; Hulleman & Harackiewicz, 2009) have targeted students’ perceived self-relevance of the content by asking the students to write essays in which they relate the content of the course to their lives. These interventions have been found to facilitate students’ higher interest and course grades. This body of research includes slightly different tactics for guiding students’ relevance writing (e.g., asking students to write an essay about the relevance of content to their own lives or asking students to write a letter to a friend about the relevance of content to the friend’s life). Research suggests that providing students with greater choice in the assignment is associated with better outcomes (Rosenzweig et al., 2020). In our intervention, we adapted the task and asked the students to write a paragraph relating a central concept in the course to their lives, while emphasizing choice among a diversity of options, including writing about any aspect related to the concept and about any aspect related to the student’s life (Hartwell & Kaplan, 2018).

Alleviating Perceived Costs

Theories of motivation consider not only what drives students to engage in an academic task but also what may drive them away from engagement. One important perception that hinders engagement is students’ perception of the nonmaterial costs involved in investing effort and trying to succeed. Such perceived costs may involve the student’s sense that the effort required is too high, that the task may require giving up other valued activities, that there may be negative emotional consequences to investment, such as in the case of failure, and that investment may be detrimental to certain social relationships. Although there were no published interventions for alleviating such perceived costs at the time of our study (see Rosenzweig et al., 2020, for a recent study), research showing such perceptions to be the strongest predictors of intention to leave a STEM major prompted us to design an intervention that targets these perceptions among the participants in the current study. We followed the practices of successful interventions targeting students’ perceived lack of belonging at the university, which employed brief relatable messaging (Walton et al., 2015). In our intervention, students received brief videos in which an upperclassman discusses overcoming the sense of perceived costs of involvement in the course.

The Current Study

Building on the small body of research on combined cognitive–motivational interventions (Cleary et al., 2017; Guthrie et al., 2004; Toste et al., 2017), we conducted 10 experiments over four academic years at three different universities in an iterative test of the following research questions: (a) Do combinations of cognitive–motivational interventions in introductory undergraduate biology courses impact students’ achievement relative to no interventions, and to only cognitive or motivational interventions? and (b) Which combinations of cognitive–motivational interventions in undergraduate biology courses are more effective in impacting students’ achievement? The findings indicated that, overall, the interventions were quite effective; however, effect sizes varied across semesters (fall and spring) and across the three institutions. Therefore, we also report on the tests of variables that might moderate the size of effects, taking into account differences between universities, course content (molecular/cellular vs. organismal biology), timing of the course, the specific cognitive and motivational modules delivered, and the fidelity of implementation.

Method

We first describe the participants and procedures, followed by a description of the intervention approach, the design of each intervention module, and the data analysis plan.

Participants and University Contexts

Participants were 3,092 undergraduate students in first- or second-semester introductory biology courses designed for STEM majors. They were recruited from one of three U.S. universities: university no. 1 was a medium-sized, minority-serving, high-research-activity university with a majority of commuter students; university no. 2 was a large, urban, very-high-research-activity university serving about 50% commuter students; and university no. 3 was a large, flagship state university in the highest research productivity category (see Table 1 for demographics of each institution). All courses were traditional lecture-based undergraduate biology classes with weekly recitation sections.

At university no. 2, students were required to pass a fall semester chemistry course before taking their first (organismal and evolutionary) biology course in the spring. At the other two universities, students could begin biology in the fall, taking molecular/cellular biology or organismal and evolutionary biology.

Procedures

In compliance with the Institutional Review Board protocols approved at each institution, all participants completed a written consent form (paper or electronic) in which they agreed to release their grades and research data, and then completed a series of cognitive and motivational pretests over the first 2 weeks of the semester. The intervention modules were then released to treatment students according to the course schedule over the next 10–11 weeks. The modules were introduced to students as resources that might help them succeed in the course. Students were sent an email when each module was made available and were instructed to access the module. Students also received a reminder about the availability of the supports during the following week. Course grades for consented participants were collected from the instructors. Course grades were based on academic performance on course exams, quizzes, homework, and written assignments—grades were not based on attendance or classroom participation. Labs were graded separately and were not included in the final course grade. We report results in the same units used by the instructor, which ranged from fractional decimals (e.g., a mean of .74) to percentages (e.g., a mean of 74) to points out of 1,000 (e.g., a mean of 740). Participants received a small amount of extra course credit for participating in the study.

Experimental Design and Procedures

For our study, we selected cognitive and motivational support modules that were found in previous research to have positive effects on STEM students’ grades and/or intention to remain in STEM (with the exception of offsetting perceived nonmonetary costs, which was a new intervention; but see Rosenzweig et al., 2020). In each of the 10 studies, we compared effects of students’ access via the learning management system (LMS) to a cognitive support module (e.g., a video demonstrating worked examples) and/or a motivational support module (e.g., a prompt for relevance writing) on undergraduate biology course grade, compared to students who did not access any support modules. All modules were reviewed for scientific accuracy by PhD-level biologists who have taught introductory biology courses to science majors.

In each study, students in the course were randomly assigned to a business-as-usual control condition in the LMS (i.e., no access to support modules) or to an experimental condition that combined one of the several cognitive support modules with one of the several motivation support modules (see Table 2). All intervention modules were stand-alone supports delivered to consented participants via a study-specific Blackboard site.2 Participants were randomly assigned by Blackboard to experimental conditions, and for each condition, we used Adaptive Release features to allow students to see only the modules to which they had been assigned (see Figure 1 for a screenshot). For example, a student in the worked examples + relevance writing condition could access the relevance writing module and the worked examples module (see details about the modules below), but other modules were not visible or accessible to them. The Blackboard site allowed us to track students’ daily access to the intervention modules. All modules also included a student feedback form.3

Figure 1. Screenshot of LMS From the Strategies Instruction + Self-Efficacy Enhancing Condition
Note. Details removed to protect site identification.

Available download: Blackboard module with intervention components
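To give a concrete picture of the assignment logic, the following is a minimal R sketch; Blackboard performed the actual randomization internally, and the student count and condition labels below (drawn from the abbreviations defined in Figure 3) are illustrative only.

# Illustrative sketch only: Blackboard performed the actual random
# assignment. Condition labels follow the Figure 3 abbreviations;
# student IDs and counts are made up.
set.seed(2020)
students   <- sprintf("student_%03d", 1:175)
conditions <- c("control", "SR", "SSE", "SC", "WC", "WSE", "WR")

assignment <- data.frame(
  student   = students,
  condition = sample(conditions, length(students), replace = TRUE)
)

table(assignment$condition)  # roughly equal cell sizes per condition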

Cognitive Support Module Design

The four cognitive support modules were study strategies instruction, worked examples (modeling self-explanation and other strategies), activating relevant prior knowledge, and segmented video recordings of class lectures, all of which were delivered as weekly video recordings. All video modules were closed-captioned. For each of the cognitive support modules except the class lecture recordings, we wrote scripts specific to each of 11 weekly biology course topics, videotaped the scripts (5 min per video), and made Blackboard-embedded video links available to participating students.4 We provide further details of each cognitive support module below.

Study Strategies Instruction

Each study strategies instruction video included a model demonstrating a cognitive strategy using that week’s course content, together with explanations of the usefulness of the strategy, attributions of success to strategy use, and an invitation for the learner to pause the video and practice the strategy on a specific segment of the textbook (Pressley & Harris, 2006); however, we were unable to record whether students paused and practiced the strategy. The actors in the study strategies instruction videos demonstrated six specific strategies in this order: comparing-and-contrasting within diagrams (four videos, one per textbook chapter as covered in the lecture), using sketching to enhance understanding (four videos), making concept maps (three videos), effective summarizing (five videos), and using mnemonics and etymology for learning new terms (one video each; see Appendix for one example).

Comparing-and-contrasting within diagrams comprised modeling the process of looking for similarities and differences within multipart diagrams. We used screen capture to create a video from a scanned textbook image on the screen, the mouse pointer controlled by the narrator, and the voice narration. Using sketching to enhance understanding comprised drawing a diagram from text information (diagram covered with adhesive notes) with pen and paper under a document camera, also with narration (Figure 2). Making concept maps comprised making verbal concept maps using words, bubbles, and arrows, also on a document camera, also with narration. Effective summarizing comprised typing summaries from textbook, PowerPoint, and personal notes, captured with screen capture and narration. Using mnemonics and etymology for learning new terms comprised brief text-based videos introducing these strategies.

Figure 2. Snapshot From an Example Drawing-to-Learn Strategy Instruction With Accompanying Script

Worked Examples

In the worked examples videos, we posed an application question similar to a medium-difficulty exam question, likewise based on that week’s course content. For example, one video posed the problem, “Fossils of Lystrosaurus, a relative of mammals from a group called Therapsids from about 250 million years ago, have been found in South Africa, Antarctica, and India. Describe how continental drift can explain why these fossils could be found so far apart.” The problem posed was followed by a demonstration of the process of reasoning through an answer and checking the completeness of the answer (i.e., the actor engaging in self-explanation of the question as well as other strategies). The worked examples were intended to show the process of applying “big ideas” from the chapter to real situations; thus, they were not new explanations of the key concepts.

A worked example question and answer was written for 26 of the 27 taught chapters by a member of the research team and was checked for accuracy by the biologists on the team. Each of the 26 worked example questions followed one of three patterns, shown with a sample question in Table 2. When we did not use biology content from the textbook (as in the Gorteria example below), we used biology content from scientific journals. The questions and answers were written in a conversational style, were consistently 0.5 page in length, and were video recorded as described above. Worked examples modeled a large number of study strategies (e.g., activating prior knowledge and summarizing), making inferences (self-explanation), checking one’s own work (e.g., occasionally making an error in reasoning such as over- or under-generalizing), and careful reading of test questions.

Activating Relevant Prior Knowledge

The scripts and videos for the activating relevant prior knowledge module were designed to remind students of information that the textbook and lecturer would assume the students already know before reading each chapter and hearing each lecture. Such knowledge came from three sources: (a) high school biology (e.g., proteins are made up of amino acids), (b) everyday life experiences (e.g., amphibians have wet skin), and (c) prior chapters from the biology course (e.g., the definition of sexual vs. asexual reproduction). The scripts were based on knowledge requirements from the entire chapter, but did not cover information proportionate to the content coverage in the chapter. Likewise, the scripts were not intended to be an outline of the important points of the chapter. We identified relevant background knowledge, which was written into a script using relatively nontechnical language, checked for accuracy by biologists, and video recorded (in two cases, we combined two chapters taught together that drew on very similar background knowledge). The 25 background knowledge scripts varied in length from 0.5 to 1.5 pages of single-spaced text.

Segmented Video Recordings of Class Lectures

For the video recordings of class lectures, we recorded full class lectures using Camtasia Relay software as they were delivered by the course instructor each week of the semester. We then edited each lecture to create thematic segments (5–9 min each) based on the course topics. For example, a lecture on Darwin’s theory of evolution was divided into segments covering the theory; definitions of adaptation, fitness, natural selection; and evidence for the theory. We posted the segmented videos from each lecture weekly for students to view.

Motivational Support Module Design

The three motivational support modules were individualized constructive feedback about exam performance to enhance self-efficacy, writing prompts about the relevance of the course topics to students’ lives, and videos with messages designed to offset perceived costs of pursuing a STEM major. All were delivered 3–4 times per semester, timed either 1 week before within-semester exams or after grades on each within-semester exam were released (see below).

Enhancing Self-Efficacy

For the self-efficacy enhancing intervention (based on Muis et al., 2013), individualized exam feedback was created by dividing participants into score bands (low, medium, or high) on each exam, and presenting a semicustomized graph to each participant showing the pattern of his/her achievement over the semester (e.g., low on Exam 1, medium on Exam 2, and medium on Exam 3), together with motivational advice tailored to the student’s pattern of scores to promote adaptive attributions to effort and strategies (e.g., “You may or may not feel disappointed in your first exam score…Research shows that students who evaluate what worked and did not work as they learn can improve their learning in the future”). Messages after the first exam referred to patterns of performance such as “you did better…keep using those study strategies” or “your scores are quite similar…consider using better study strategies.”
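To make the banding-and-feedback logic concrete, here is a minimal R sketch; the cutoff scores (70 and 85) and the message wording are illustrative assumptions, not the study’s actual values.

# Minimal sketch of the score-band feedback logic. Band cutoffs and
# message texts are illustrative assumptions, not the study's values.
band_labels <- c("low", "medium", "high")

exam_scores <- c(62, 78, 80)                       # one student's exams
bands <- findInterval(exam_scores, c(70, 85)) + 1  # 1 = low, 2 = med, 3 = high
band_labels[bands]                                 # "low" "medium" "medium"

feedback_message <- function(bands) {
  n <- length(bands)
  if (n == 1) {
    "Students who evaluate what worked and did not work as they learn can improve their learning in the future."
  } else if (bands[n] > bands[n - 1]) {
    "You did better; keep using those study strategies."
  } else {
    "Your scores are quite similar; consider using better study strategies."
  }
}

feedback_message(bands)  # message after Exam 3, given the pattern so far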

Relevance Writing

The relevance writing intervention prompts (based on studies such as those reviewed by Harackiewicz & Priniski, 2018) asked students to engage in an open-ended writing of about 300 words regarding the connection of the course content they were learning at that point in the semester to any aspect of their lives (e.g., self-knowledge, career goals, and social relationships). Each of the 3–4 prompts targeted content from a major unit that had been covered in class, and was launched one week before each exam. Students typed their responses in Blackboard.

Offsetting Perceived Costs

The offsetting perceived costs videos were brief interviews with actual science majors, who were also course alumni from the relevant institution, about overcoming various nonmonetary costs of being a STEM major (Perez et al., 2014). Upperclassmen of various genders and ethnicities who had experienced drawbacks associated with being a science major (i.e., perceived costs), yet had persisted in science despite those drawbacks, served as the interview subjects. A research team member asked the interviewees to identify a struggle associated with being a science major that they had faced (broadly, not just in biology) and how they had coped with this struggle. The interviewer guided interviewees on which topic to discuss so that perceived costs associated with time and effort, loss of valued alternatives, and psychological struggles (e.g., fear of failure) were addressed across all four videos. However, the stories were all authentic to the students’ actual experiences. The interviews were conducted remotely and were recorded using a video web-conferencing tool. The videos were then edited and converted to .mp4 format using a video converter application; each was 1–2.5 min in length.

The following videos are recorded interviews with actors narrating the words of actual students:

Dot - Biology

Ian - Biology/pre-med

Iterative Intervention Design Process

During the course of the program of research, we followed an iterative intervention development process in 2014–2016 in which we applied lessons learned from each semester to design a more effective intervention in the subsequent semester. This involved trimming ineffective modules (e.g., lecture videos in combination with costs videos appeared to be ineffective) and modifying their delivery (e.g., students were not completing the intervention modules in a timely manner, so we made extra credit for a unit conditional on completing intervention modules before that unit’s exam). The specific combinations of cognitive and/or motivational modules delivered are shown in Table 1.

Data Analysis

We used meta-analysis to analyze the pattern of results across our 10 experiments. Rather than reporting the findings from each study separately, analyzing all studies simultaneously gains statistical power (Cooper et al., 2009). Meta-analysis also has advantages over collapsing all participants into one sample: each study’s participants were in classes together, and from a statistical perspective, these groups need to be treated separately or as clustered data. Another advantage of meta-analysis is that, although effects in individual studies might be nonsignificant due to small sample sizes, the overall effects can be statistically significant while taking into account the characteristics of the particular studies (Borenstein et al., 2009). Thus, differences among studies (e.g., an experiment in the fall vs. spring) can be systematically tested as moderators of the effect of the intervention, with enhanced statistical power compared to a regression approach.

Potential Moderator Variables

Because of differences between the universities, their course sequences, and other variations in the studies across the 4 years of development, in addition to the main intervention effects, we identified several potential categorical moderators of the effect of the intervention. One was participants’ accessing both the cognitive and motivational modules versus accessing only one. This moderator was selected based on the theoretical assumption guiding the study, which predicted that students who access both types of modules should outscore students who access only one type (either cognitive only or motivational only). A second moderator was timely access to the intervention. This categorical moderator emerged from our monitoring of the fidelity of the intervention’s implementation, and contrasts students’ timely access to the modules with cramming on the modules (i.e., a semester in which >90% of participants accessed their 2nd and subsequent modules in the LMS in the last 2 weeks before the final exam; different studies clearly followed either the timely or “cramming” pattern). We hypothesized that more timely access would be associated with a stronger intervention effect. A third categorical moderator concerned the timing of the particular study in the iterative development process of the intervention. Specifically, we tested whether interventions in the early phase of our learning as a research team (Spring 2015–Fall 2016) would manifest different effects than later interventions conducted during the last two semesters (Spring–Fall 2017), when the research team had gathered more experience and developed more expertise. We hypothesized that studies during the iterative development phase would show smaller effects relative to studies in the postiterative phase.

We tested eight additional categorical moderators. Two concerned whether the type of cognitive module or the type of motivation module had specific effects on course grade; no prior theory suggests that any one module should be more effective for increasing achievement than the others. Another concerned which specific cognitive–motivation combination would manifest the strongest effect. The five remaining categorical moderators were contextual: the university, the academic year, the semester (fall vs. spring), the student’s college biology background (i.e., whether students were in their first or second semester of college biology, with either possibly happening in fall or spring), and the course content (i.e., molecular and cellular biology vs. organismal and evolutionary biology). Details of the moderators are shown in Table 3.

Meta-Analysis Software and Analytical Choices

We used Viechtbauer’s (2010) R meta-analysis package metafor, with the rma.mv function for correlated sampling errors and the Knapp–Hartung method. Effects (multiple treatment conditions compared to one business-as-usual control group) were nested within studies as the random effect. All results were evaluated at an alpha level of p < .05. The full data set is included as Table 4, and individual-level data are available by request from the first author.
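The following is a minimal sketch of these models. It assumes a data frame dat with one row per treatment-versus-control effect; the column names (yi, vi, study, es_id, timely) are illustrative, and Table 4 contains the sufficient statistics needed to build such a data set.

# Sketch of the models described above. Assumes a data frame `dat` with
# one row per effect: yi (Hedges' g), vi (sampling variance), study
# (1-10), es_id (effect ID within study), and moderator columns such as
# `timely`. Column names are illustrative.
library(metafor)

# Main-effect model: effects nested within studies as the random effect,
# with t-distribution (Knapp-Hartung type) tests via test = "t".
main_model <- rma.mv(yi, vi,
                     random = ~ 1 | study/es_id,
                     test = "t",
                     data = dat)
summary(main_model)  # overall g across the 10 studies

# Each categorical moderator was entered separately as a fixed effect,
# e.g., timely access versus cramming:
mod_model <- rma.mv(yi, vi,
                    mods = ~ factor(timely),
                    random = ~ 1 | study/es_id,
                    test = "t",
                    data = dat)
summary(mod_model)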

Results

Descriptives

Details of conditions, the sample size per condition, treatment and control group mean grades, and standard deviations of grades are shown in Table 4. Across all studies, strategy instruction was accessed by 1,519 students; worked examples by 511; background knowledge activation by 71; lecture videos by 51; offsetting perceived costs by 584; relevance writing by 743; and self-efficacy enhancing by 592; 940 students were in control conditions.

Research Question 1: Does the Series of Interventions Increase Course Grades?

The main-effect model with no moderators showed a statistically significant overall effect of Hedges’ g = .30. This finding implies that, across our studies, the average student who accessed support modules gained .30 SD on course grade compared to business-as-usual control students, a magnitude significantly different from zero. Results are shown in Figure 3, reported as Hedges’ g, which can be interpreted like a Cohen’s d effect size, that is, the relative advantage of the treatment compared to a no-treatment control condition, expressed in standard deviation units on grades; g = .30 represents approximately a 1/3 standard deviation grade advantage for a treatment over the control group (detailed statistics for each effect are shown in Table 5).

Figure 3. Effect Sizes for Conditions Versus No-Treatment Control, Ordered by Time
Note. The no-treatment control condition is symbolized by the wide horizontal line at zero, bars represent standard error of the meta-analytic point estimate, the wide vertical line represents the end of the iterative development phase, and dashed horizontal lines represent the mean weighted effect size in the iterative development and postiterative development phases. B = background knowledge; C = costs offsetting; L = lecture videos; NC = noncompliant; O = only (e.g., SO is strategy instruction only); R = relevance writing; S = strategy instruction; SE = self-efficacy enhancing; and W = worked examples.
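For concreteness, each treatment-versus-control contrast in Figure 3 can be computed from the per-condition summary statistics reported in Table 4. A minimal sketch using metafor’s escalc function follows; the means, SDs, and sample sizes below are invented for illustration.

# How each treatment-vs-control contrast becomes a Hedges' g. The
# measure = "SMD" option applies the small-sample bias correction.
# Means, SDs, and ns below are invented for illustration.
library(metafor)

es <- escalc(measure = "SMD",
             m1i = 80.0, sd1i = 12, n1i = 150,  # treatment mean, SD, n
             m2i = 76.4, sd2i = 12, n2i = 140)  # control mean, SD, n
es$yi  # Hedges' g, approximately .30
es$vi  # its sampling variance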

Of the 50 effects on course grade, 41 were significant and positive, three were nonsignificant, and six were significant and negative (see Figure 3). Among the 17 different conditions tested across the studies and shown in Table 1, the positive significant effect of combining Study Strategies and Relevance Writing (SR; pink) was replicated in five of six subsequent comparisons; of Study Strategies and Self-efficacy (SSE; forest green) in five of five; of Study Strategies and Costs (SC; blue–grey) in four of five; of Worked examples and Costs (WC; black) in three of three; of Worked examples and Self-efficacy (WSE; red) in two of three; and of Worked examples and Relevance (WR; navy blue) in three of four. We thus have robust evidence that the combined cognitive–motivational interventions can yield significant increases in biology students’ course grades.

Research Question 2: Do Moderators Significantly Affect the Intervention’s Effect on Course Grades?

This overall result from Research Question 1 averages across the 10 studies, three universities, fall and spring semesters, and other differences between the studies. We therefore conducted moderator analyses. We next report on analyses with each moderator entered separately; the moderators were relatively highly correlated with each other (see Supplementary Table 1), suggesting that multivariate moderator analyses would not have added value. All moderators were significant across the series of studies (see Supplementary Table 2).

Effects of Type-of-Module Moderators

Access to the cognitive modules showed significant effects (mostly in combination with motivation modules) of g = .22–.44. Access to the motivation modules showed significant effects (mostly in combination with cognitive modules) of g = .31–.39. The specific conditions that showed average significant effects were Study Strategies + Relevance (SR; g = .47), Study Strategies + Self-efficacy (SSE; g = .34), Study Strategies + Costs (SC; g = .30), Worked examples + Costs (WC; g = .28), and Worked examples + Self-efficacy (WSE; g = .26). Students who accessed only the study strategies instruction (the SO condition) also showed significant effects (g = .26). Access to combined cognitive–motivational conditions resulted in larger effects than access to only one cognitive or one motivation intervention (g = .34 vs. g = .17). On average, our cognitive supports (mostly delivered with motivational supports) increased grades, and our motivational supports (mostly delivered with cognitive supports) substantially increased grades. There was variability among conditions, with direct instruction of study strategies plus written reflections on relevance having a particularly large effect.

Effects were larger at the more-selective universities no. 2 (g = .33) and no. 3 (g = .35) than at the less-selective university no. 1 (g = .11).

Effects of Timing-Related Moderators

Mean effects in the (early) iterative development phase (including all of the negative and nonsignificant effects; shown with the dashed horizontal line to the left in Figure 3) were smaller than in the (later) postiterative phase (shown with the dashed horizontal line to the right in Figure 3), where there were no nonsignificant effects and no negative effects (g = .20 vs. g = .45). Effects were larger in the spring semester (g = .42) than in the fall (g = .20), regardless of whether that course was the first (g = .32) or second (g = .28) course in the student’s biology sequence. Timely student access resulted in larger effects relative to cramming on the intervention (g = .39 vs. g = .14). Effects were larger for organismal/evolutionary biology (g = .37) than for molecular/cellular biology courses (g = .14), again regardless of whether these were held in fall versus spring or were first versus second in the student’s course sequence. In sum, effects were larger after the team had engaged in iterative intervention development, settling on the more-effective components. Effects were also larger when students distributed their engagement over the course of the semester and when the course was a spring course. Effects were larger in the organismal/evolutionary biology course, which had more variety of course content and was less memorization-focused.

Discussion

Our aim in this series of studies was to determine whether combined cognitive–motivational interventions delivered via an LMS could increase biology students’ course grades.

Research Question 1: Does the Series of Interventions Increase Course Grades?

Educational intervention researchers typically focus on either motivation or cognitive interventions (e.g., Greene et al., 2020; Rosenzweig et al., 2020). The main effects of the motivation and cognitive interventions found in this study align with this prior research. In our series of studies, we found a medium-sized mean effect on grades. Access to all four cognitive modules showed significant effects, as did access to all three motivation modules. Further, this study extends prior research by demonstrating the benefits of supporting both students’ cognitive strategies and their motivation in the context of gateway biology courses. Supporting the hypothesis of synergy between cognition and motivation interventions (Pintrich, 2000), access to combined cognitive and motivational modules resulted in significantly better grades compared to the business-as-usual control groups, as well as significantly better grades compared to only one type of support—cognitive or motivational. There are clear benefits of adding cognitive supports above and beyond motivational supports, showing evidence for our proposed COG→MOT effect. This suggests that future intervention work in gateway science courses should consider both students’ learning strategies and their motivation to improve achievement in such courses.

In addition, the interventions were designed to be used outside of course time and were delivered fully online via the LMS; therefore, they did not require any instructor involvement in carrying out the learning support for students. Thus, the interventions show great practical value as a supplementary support for students in introductory biology courses.

Research Question 2: Do Moderators Significantly Affect the Intervention’s Effect on Course Grades?

All 10 moderators showed significant results: type of cognitive module, type of motivation module, cognitive–motivation combinations versus no combination, timely access to the modules, the university, the academic year, the semester, our iterative development phase versus postiterative, the student’s college biology background, and the course content. Below, we elaborate on each of these moderators.

Type of Cognitive Module

With regard to the different cognitive modules, strategy instruction was effective when paired with a motivation module, as were worked examples. Both of these modules taught students specific achievement-relevant skills in the context of biology content, but strategy instruction was slightly more consistently effective. One possible explanation is that strategy instruction is more of a direct instruction technique following a specific order of instruction, compared to worked examples, which use embedded instruction in a less systematic way. Worked examples may have been more effective than the other cognitive interventions (besides strategy instruction) because they demonstrated to students how to think deeply about solving biology problems. Thus, students may have been able to apply the same thought processes to exams, which may have helped to improve achievement.

Prior knowledge activation was less effective; feedback from the course instructor suggested that information may have been new to the least-prepared students, rather than activating what they had learned previously. Lecture videos were also less effective, perhaps because students needed to locate the lecture segments to replay. It is possible that an initial explanation in the classroom that was not understood would be no clearer to that student when replaying.

Study strategies instruction likely benefited students because we directly instructed these high-level—perhaps unfamiliar—strategies embedded in students’ own course content (Weinstein et al., 2000). We also followed a well-validated SO method based on decades of classroom studies (Pressley & Harris, 2006), one which included multiple opportunities to practice and perhaps thereby become accurate in applying the strategies. We furthermore released the strategy videos in time with the weekly course content, ensuring the highest chance for the strategies to appear useful during ongoing studying, together with an incentive system that led students to distribute their viewing of the strategy videos. The benefits of distributing access to the cognitive supports are consistent with research on distributed practice more generally (e.g., Kang & Pashler, 2012).

Activating background knowledge might have been effective, albeit modestly so, for more reasons than we hypothesized based on this well-established principle. For example, some of the literature on undergraduate learning supports suggests that “reminders” of assumed existing knowledge can constitute, in fact, new information and new principles to some learners (Bachman, 2013), and the biology instructors working on the project have confirmed that this is the case from their teaching experiences. Our assumption that students had followed a standard high school biology curriculum conforming to their state standards—for example, the coverage of natural selection or Punnett squares—might not hold for quite a few of the students. This might have made our intervention stronger than it otherwise would have been, if we were in fact providing initial instruction and not re-teaching already known material.

Type of Motivation Module

With regard to the motivation modules, the self-efficacy enhancing modules were slightly but nonsignificantly less effective, especially compared to relevance writing, which had about the same effect as the costs-offsetting videos. These findings are in line with recent research demonstrating equivalent benefits of both relevance writing interventions and cost interventions in undergraduate science courses (Rosenzweig et al., 2020). Perhaps the postexam timing of the self-efficacy enhancing modules gave participants less of a chance to improve their performance. Both the more-active relevance writing and the less-active costs-offsetting videos activities may have helped students devote more time to study and/or persevere in the face of frustration. Indeed, the relevance writing and cost-offsetting interventions are designed to enhance the overall value of the biology course content and a STEM major, respectively, which may have motivated students.

The most likely mechanisms for motivational supports are in energizing and directing studying (Pintrich, 2000); although we were not able to track actual studying behavior, the analyses comparing the low-fidelity (“cramming” on motivational supports at the end of the semester) versus the high-fidelity studies in our program of research suggest that the stronger effect of the supports occurred when engagement was spread out over the semester. This is consistent with the assumption that the periodic motivational supports encouraged studying over the course of the semester rather than at a particular narrow time at the end. Given that many studies that employ such “brief” interventions deliver them only once, and only at the beginning of the semester, we believe our results and the conclusions of Harackiewicz and Priniski’s (2018) review both support a more-frequent, distributed, intervention to more effectively increase student achievement.

Cognitive–Motivation Combinations Versus No Combination

Results strongly support the synergy of cognition and motivation (Pintrich, 2000) and the mutual influences of MOT→COG and COG→MOT. In particular, there are clear benefits of adding cognitive supports above and beyond motivational supports, consistent with a COG→MOT effect. Our combined cognitive–motivation interventions in the postiterative phase of our research produced large, statistically significant, and practically meaningful increases in students’ introductory biology course grades compared to participants who received no cognitive or motivational supports. Not only are the effect sizes large, but the grade difference corresponding to the effect size in this study is 6.6 percentage points (e.g., an increase from a grade of 80.0% of total possible points in the course to 86.6%), representing an increase of one (e.g., from a C+ to a B−) to two (e.g., from a B to an A−) grade bands, which we consider to be of a policy-relevant size. These results suggest that it is important to support both cognitive and motivational processes when intervening to promote student success in gateway science courses. Doing so may produce meaningful increases in average student achievement, which may in turn support students’ persistence in STEM disciplines. Thus, the results of this study support a holistic intervention approach and align with prior research finding that both “skill” and “will” are important factors in STEM achievement (Zusho et al., 2003).
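To make the conversion from a standardized effect to a grade difference concrete: Hedges’ g expresses the treatment–control mean difference in pooled standard deviation units, so (setting aside the small-sample correction) the raw difference in percentage points is recovered by multiplying g by the pooled standard deviation of course grades. The worked numbers below are purely illustrative; the standard deviation is a hypothetical value, not one estimated from our data.

$$ g = \frac{\bar{X}_{T} - \bar{X}_{C}}{SD_{\text{pooled}}} \quad \Longrightarrow \quad \Delta = g \times SD_{\text{pooled}} $$

For instance, under a hypothetical pooled standard deviation of 10 percentage points, an effect of g = .66 would correspond to Δ = .66 × 10 = 6.6 percentage points on the final course grade.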

Importantly, our study is informative about which particular combinations of motivation and cognition interventions are most effective. We saw particularly large, replicated effects from the combination of study strategies instruction and relevance writing—participants writing about how the content from each course unit was relevant to an aspect of their lives. We argue that combining instruction in how to study this biology material with boosting students’ own sense of why they should study has an especially strong impact on grades, perhaps via more efficient use of study time and/or more time spent studying. It is also possible that the relevance writing module led students to engage cognitively with the course content: thinking about how the content related to their own lives may have prompted students to think deeply about their understanding of the concepts. Thus, the relevance writing task may have triggered both cognitive and motivational processes, which could have benefited students when it was combined with study strategies. The exact mechanisms remain a topic for future research. Like a number of others who have investigated relevance writing (Harackiewicz & Priniski, 2018), we found no overall effect of relevance writing alone (i.e., when not combined with cognitive supports).

The benefit of combining cognitive and motivational supports is also consistent with, and leverages findings from, developmental research that documents mutual effects of earlier achievement on later motivation and of earlier motivation on later achievement (Gniewosz, 2010; Gniewosz et al., 2012; Kosovich et al., 2017; Meece et al., 1990; Perez et al., 2014; Weidinger et al., 2018). Finally, this study builds on prior research on combined cognitive and motivation interventions with younger children (Cleary et al., 2017; Guthrie et al., 2004; Toste et al., 2017).

The University

Students benefited more at the more-selective universities no. 2 and no. 3, suggesting, perhaps, that higher knowledge and/or skills allow students to benefit more from our interventions, and also suggesting some limits on the generalizability of the findings to minority-serving institutions.

The Iterative Development Phase Versus the Postiterative Phase

The contrast between studies conducted during our iterative development phase and those conducted in the last two semesters, when we were able to implement all the lessons learned from the previous studies, suggests that the combination of higher treatment fidelity, more combination conditions, and conditions that include study SO (more so than worked examples) together yields strong results for students. Our findings strongly support the benefits of taking an iterative approach to developing whole-class interventions (Powell & Diamond, 2011). This approach is common in some STEM education reform models (e.g., WIDER; National Science Foundation, 2018) and is required for intervention development research funded by the Institute of Education Sciences of the U.S. Department of Education (IES, 2017). Nevertheless, the significant moderator analyses, which emphasize the role of context in the effects, together with the role of student groups and individual differences in intervention effects found in other research (e.g., Harackiewicz & Priniski, 2018), should serve as a caution against expecting exact replication in new settings (Cronbach, 1975). We strongly recommend a careful contextual analysis for the initial selection of modules, followed by formative evaluation to identify the supports most conducive to students’ success in any new context (Kaplan et al., 2020).
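For readers who want to probe phase contrasts of this kind in their own replication data, the sketch below shows a standard subgroup (Q-between) test for a categorical moderator across study-level effect sizes. It is a minimal illustration under fixed-effect inverse-variance weighting, not a reproduction of the meta-analytic model used in this study, and the effect sizes in the example are hypothetical.

```python
import numpy as np
from scipy import stats

def subgroup_q_test(g, v, group):
    """Q-between test for a categorical moderator across study-level
    effect sizes, using fixed-effect inverse-variance weights.

    g: Hedges' g for each study
    v: sampling variance of each g
    group: moderator level for each study (e.g., development phase)
    """
    g, v = np.asarray(g, dtype=float), np.asarray(v, dtype=float)
    group = np.asarray(group)
    w = 1.0 / v                                # inverse-variance weights
    grand_mean = np.sum(w * g) / np.sum(w)     # overall weighted mean g
    q_between = 0.0
    for level in np.unique(group):
        mask = group == level
        mean_j = np.sum(w[mask] * g[mask]) / np.sum(w[mask])
        q_between += np.sum(w[mask]) * (mean_j - grand_mean) ** 2
    df = np.unique(group).size - 1
    p_value = stats.chi2.sf(q_between, df)     # test of moderation
    return q_between, df, p_value

# Hypothetical effect sizes (not the actual study data):
g = [0.15, 0.20, 0.55, 0.60]
v = [0.020, 0.030, 0.020, 0.025]
phase = ["iterative", "iterative", "postiterative", "postiterative"]
print(subgroup_q_test(g, v, phase))
```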

The Semester

Effects were about twice as large in spring courses as in fall courses. This could be because the majority-freshman participants had become accustomed to university-level work by the spring.

The Student’s College Biology Background

There were no differences in effect whether the target biology course was the student’s first or second college-level biology course. This supports our hypothesis that the differences between fall and spring effects may have been due to students getting used to college generally, more so than getting used to biology learning per se.

Timing of Access

The timing of student access, as recorded by the LMS, was critical to the effectiveness of the intervention: students who distributed their use of cognitive and/or motivational supports gained more than 2.5 times as much in course achievement as those who crammed on the intervention. Initially, we had offered extra credit in proportion to accessing the supports, regardless of the timing of access. Changing the way extra credit was awarded in later semesters—by awarding it for timely completion—was associated with noticeably less “cramming” on the intervention than in previous semesters, supporting a relation between timely access and better course grades. This finding suggests that, like distributed studying and distributed retrieval, spreading out the acquisition of study strategies and the energizing of their use is beneficial. This is sensible, as students have the opportunity to apply each new strategy to subsequent course content (e.g., sketching learned in week 7 can be applied to learning materials in weeks 8–15). Similarly, when students distributed their access to the motivational supports, those supports could energize study activity over a longer period, rather than being crammed (e.g., completing all three relevance writings in the 2 weeks before the final exam).
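As an illustration of how such access patterns can be derived from LMS timestamp records, the sketch below classifies a student’s module use as distributed versus crammed and awards extra credit only for timely completion. The function names, the 7-day window, and the 50% threshold are hypothetical choices for illustration, not the definitions used in the study.

```python
from datetime import datetime

def is_timely(accessed: datetime, released: datetime,
              window_days: int = 7) -> bool:
    """True if the module was accessed within `window_days` of its release
    (the hypothetical definition of 'timely' used in this sketch)."""
    delta = accessed - released
    return 0 <= delta.days < window_days

def classify_access(access_times, release_times, threshold=0.5):
    """Label a student 'distributed' if they were timely on at least
    `threshold` of the modules, else 'crammed'."""
    timely = sum(is_timely(a, r) for a, r in zip(access_times, release_times))
    return "distributed" if timely / len(release_times) >= threshold else "crammed"

def extra_credit(access_times, release_times, points_per_module=1.0):
    """Award extra credit only for timely completion, mirroring the
    later-semester incentive change described above."""
    return sum(points_per_module
               for a, r in zip(access_times, release_times)
               if is_timely(a, r))

# Hypothetical example: two modules released in weeks 1 and 2 of a semester
releases = [datetime(2019, 9, 2), datetime(2019, 9, 9)]
accesses = [datetime(2019, 9, 3), datetime(2019, 12, 10)]  # second is late
print(classify_access(accesses, releases))  # 'distributed' (1 of 2 timely)
print(extra_credit(accesses, releases))     # 1.0
```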

The Course Content

Students benefited more in organismal biology courses than in molecular and cellular biology courses (regardless of whether the course was their first or second college-level biology course, or whether they took it in fall or spring). Molecular/cellular biology courses appear to rely more on memorization of sequences of molecular interactions than organismal biology does; perhaps the smaller effects reflect a mismatch between prompting deeper learning and a course focus on memorization.

Limitations

The study provides persuasive evidence for the synergy of cognitive and motivational interventions on undergraduate students’ achievement in gateway biology courses. However, despite the evidence across 10 experiments, several features limit the generalizability of the findings. First, because of our focus on previously validated cognitive and motivational modules, we do not know the value of other potential modules that could increase student grades as much as, if not more than, the ones we tested. The cognitive and motivational processes involved in students’ learning and achievement are diverse, and we tested but a few.

Second, even the cognitive and motivation modules that we combined do not operate in isolation from other cognitive and motivational processes that may influence students’ achievement. Given insights from research that found interactive effects among cognitive modules (e.g., McNamara, 2004) and among motivational modules (e.g., Hulleman et al., 2017), interventions that go beyond the “one of each” approach we adopted here may prove even more effective in improving students’ achievement. Although it was outside the scope of our initial hypotheses, it is possible, for example, that combining SO modules with lecture video modules may have allowed students to apply newly learned strategies to “chunks” of content in the lecture, which may have supported their learning. Future research is needed to explore such combinations of interventions.

Third, research on both cognitive and motivational interventions suggests that these interventions may operate differently for students with different characteristics (e.g., prior interest and prior biology knowledge; McNamara, 2017; Schwartz et al., 2016). The lack of a prior biology knowledge measure in our study made it impossible to test whether such individual differences moderated the effect of our intervention. Similarly, future research should examine whether characteristics such as students’ preintervention knowledge of study strategies and their motivational beliefs (e.g., self-efficacy, expectancy-value beliefs, and mastery goals) moderate the effects of the intervention. Future research could also examine whether characteristics such as prior knowledge of effective study strategies change over time after students receive the interventions.

Finally, the diversity of factors that we found to significantly moderate the effect size of the interventions raises important questions about the role of context in how our interventions were received. The field of educational research has yet to address the issue of context in experimental research in a satisfactory manner. We hope that our findings will serve as an impetus for such theoretical and methodological developments.

Appendix: Example of Strategy Instruction Script

Step 1. Introducing the strategy

The first strategy we will show you for studying biology is looking for similarities and differences in diagrams. Your textbook shows a lot of material in diagrams, and not all of that information is in the text. Most diagrams have multiple parts, such as different forms of a molecule or different views of the same molecule, chemical reactions, different parts of an organism, or different molecules that are similar in some ways and different in other ways.

Step 2. Explaining the usefulness of the strategy

Paying close attention to the similarities and differences across complex diagrams can help you better understand the biology and will definitely help you on exams. Studying the diagrams does not just help some types of students; it can help all students.

Step 3. Demonstration of how to enact the strategy

I am going to show you how I would do comparing and contrasting in diagrams in Chapter 27. I am looking at Figure 27.3 on page 569. Gram staining (a) gram-positive bacteria and (b) gram-negative bacteria. The photo does not help much, but for gram-positive we have got cell wall, plasma membrane, and the cell wall is a peptidoglycan layer and the plasma membrane is a … plasma membrane. The gram-negative also have the cell wall and plasma membrane, but the cell wall is different: it has an outer membrane, then a thin peptidoglycan layer at the base of the cell wall. What does staining have to do with it? Too large to pass through the thick cell wall … masks the safranin dye. Then on the right, can pass through this thin cell wall … stains the cell pink or red.

Step 4. Opportunity for practice

Now you try it. Can you use comparing and contrasting in Figure 27.11 on page 573? Pause the video and see if you can use this study strategy. When you start up again, I will show you my answer.

Step 5. Feedback

Here is how I did it, yours might be slightly different. A phage infects a bacterial cell so I can see the phage on the outside of the bacterium…

Step 6. Attribution to strategy use

Doing this comparing and contrasting helped me see that every little detail in every diagram is important, and so is all of the text. The color change is pretty small, so I did not see it until I read through the whole diagram, but now that I know what changes from step 1 to step 5 I can see how the color reinforces that.

Note. Italics indicate parts of the script that are read directly from the diagram.

Copyright © the Author(s) 2020

Received January 10, 2020
Revision received July 27, 2020
Accepted July 27, 2020
