
When Online Courses Became the Student Union: Technologies for Peer Interaction and Their Association With Improved Outcomes During COVID-19

Volume 3, Issue 1: Spring 2022. Special Collection: Innovations in Remote Instruction. DOI: 10.1037/tmb0000061

Published on Jan 05, 2022

Abstract

While a variety of learning technologies are presently available to facilitate student-to-student peer interactions and collaborative learning online, recent research suggests that students’ opportunities to interact with their peers were significantly reduced following the abrupt transition to remote instruction due to coronavirus disease (COVID-19). This raises concerns because peer interaction is known to be a key ingredient in effective online learning environments, and during remote instruction, the primary connection between a student and their identity as a member of a college community would have been online courses. In this study, we investigate whether and how collaborative technologies supported peer interaction, and students’ learning, during remote instruction. Specifically, we used results from a multicampus survey of students and instructors, as well as data from our online learning management system, to explore the use of collaborative tools at a large scale and their associations with student outcomes. Findings indicate that instructors, as was typical before the pandemic, generally favored individual learning activities over collaborative activities during campus closure. But where collaborative activities were present during remote instruction, triangulation analyses indicate that their use was associated with improved outcomes, as evidenced by instructors’ survey responses, by students’ performance in their courses, and by an increased sense of belonging among students.

Keywords: educational technology, peer interaction, online discussion, computer-supported collaborative learning, learning analytics

Supplemental materials: https://doi.org/10.1037/tmb0000061.supp

Acknowledgments: We appreciate the contributions and assistance of Julie Wernert, Tonya Miles, Erica Moore, Craig Stewart, Judy Ouimet, Todd Schmitz, and Maggie Ricci.

Funding: This research was made possible by support from the Indiana University Office of the Vice President for Research and from Schmidt Futures, a philanthropic initiative cofounded by Eric and Wendy Schmidt.

Disclosures: We affirm that there are no conflicts of interest, actual or potential, related to the conduct of this study.

Open Science Disclosures:
The data are available at https://osf.io/vcg4d/
The experiment materials are available at https://osf.io/vcg4d/

Contributing editors: Fran Blumberg was the Action Editor for this article. Rachel Flynn and Fran Blumberg were the Special Collection Editors.

Please address correspondence concerning this article to Benjamin A. Motz, eLearning Research and Practice Lab, Pervasive Technology Institute, Indiana University, 1320 East 10th Street, Bloomington, IN 47405, United States. Email: [email protected]


When the coronavirus disease (COVID-19) pandemic forced colleges to transition to remote instruction, students were not only detached from their physical classroom environments. The closure of campus facilities and the suspension of student events meant that most students were also detached from typical in-person peer interactions. Online courses became the de facto bridge between students and their campus communities, and while some innovative instructors leveraged learning technologies to connect classmates with one another, preliminary findings suggest that students’ opportunities to interact with other students became a major casualty of the emergency shift to remote instruction. According to one nationally representative study (Means et al., 2020), 65% of students thought opportunities to collaborate with other students on course work were worse or much worse online during COVID. This reduction in student–student interaction, during a time when education relied exclusively on technology, raises important questions about our common educational technology toolkit, such as whether and how it supported a learning community, and students’ learning, during remote instruction. The purpose of the present study is to investigate the prevalence of instructional efforts to facilitate technology-mediated collaboration during remote instruction and its association with academic outcomes.

Generally, student–student interaction increases students’ sense of belonging in education settings (Allen et al., 2021; Gilken & Johnson, 2019; Meeuwisse et al., 2010) and is strongly associated with student achievement in online courses (Jaggars & Xu, 2016). Sense of belonging is a complex construct that can be understood as the degree to which an individual perceives support and dependencies from others within a defined social context and practice (Hoffman et al., 2002). In general, a sense of belonging between one’s self and one’s context has been identified as a promising target for psychological interventions, due to its broadly positive effects on persistence and performance, particularly in settings with diverse participants (Walton & Brady, 2017). When viewed in this way, peer interaction among students is not merely a convenient by-product of shared in-person education settings, but instead is a necessary ingredient in such spaces, one that should be prioritized and preserved in online education settings. Furthermore, past research suggests that maintaining student belonging is a critical component of academic continuity during campus closure (Day, 2015; SchWeber, 2013) and that belongingness can be improved by interactions with one’s peers in online environments (Delahunty et al., 2014).

In addition to the socioemotional benefits discussed earlier, there are numerous benefits of engaging in collaboration in online and technology-mediated learning activities. One benefit stems from learners’ construction of information into meaningful knowledge and skills through the task of communicating with other students. Through collaboration, learners might formulate and explain their ideas to their peers, thereby engaging in retrieval, structuring, and clarification of information. Similarly, peers engaging with each other’s work may be exposed to (and fill in) their own knowledge gaps, in addition to potentially exposing such issues during communication with peers (Webb, 2013). In social terms, this approach may be understood as learners coconstructing knowledge through continued, distributed engagement with resources and artifacts (Hakkarainen et al., 2013; Scardamalia & Bereiter, 1992). But regardless of the theoretical framing, collaborative technology can facilitate the active development of knowledge through shared engagement with learning activities.

Defining Technology for Collaborative Learning

Learning technologies are tools and resources intended to support student engagement in activities that achieve learning outcomes or curricular aims. As such, we define collaborative learning technologies broadly, as tools and resources intended to support peer-to-peer and peer-to-expert/instructor interactions and engagements around a shared resource or space to fulfill said outcomes or aims. We recognize this definition is sufficiently broad to encompass a wide array of tools developed for such interaction. As such, we have restricted our approach to resources that present an intentional design for collaborative learning that were used within our sampled data. We acknowledge that without scrutiny into the conduct of each individual class, any such selection will be imperfect and will likely fail to incorporate all collaborative learning technologies. This is primarily because some technologies, such as videoconferencing tools (e.g., Zoom) and web-based office tools (e.g., Google Drive, Microsoft Office), might be collaborative, or might not be collaborative, depending on instructional design. Moreover, an instructor might simply instruct students to work together and provide no support or indication for what tools they are to use. For example, students may choose to privately communicate through e-mail or chat services (e.g., Discord or Slack), which do not produce accessible evidence of their collaboration.

In consideration of these challenges, we chose to adopt a restrictive definition of collaborative tool use within the context of this study. In consultation with instructional technologists at Indiana University, we reviewed the full learning technology ecosystem and identified those web-based enterprise learning tools that, if used by a student during remote instruction, could only indicate that students were being intentionally exposed to each other, or to each other’s work. These tools were

  • Canvas Discussions. Online threaded discussions in the Canvas learning management system (LMS; https://www.instructure.com/canvas), with posts and replies made by students within the course site. In Canvas, discussions can be implemented as a graded assignment, with or without a deadline. By default, posts and replies are visible to all other students in the course site.

  • Canvas Group Assignments. An assignment that is submitted jointly in the Canvas LMS on behalf of a defined group of students. Student groups can be made manually or at random. If the assignment is graded, the instructor can automatically assign the same grade to all students in a group or can grade each student individually.

  • Canvas Peer Review Assignments. An activity in Canvas that instructs students to comment on each other’s submissions to a previous assignment. Peer review assignment settings can allow submissions to be scored according to a rubric and can either show student names or can support anonymous reviews. By default, reviews submitted by peers are not graded.

  • Piazza (https://piazza.com/). An online space where students can collaboratively answer questions posed by other students, or by the teacher. Answers have wiki-like functionality, allowing multiple students to contribute to the same response. These answers can be moderated by the instructor. Piazza also includes a traditional discussion tool.

  • InScribe (https://www.inscribeapp.com/). An online interactive community where students can ask questions and respond to questions posed by their peers, with options for moderation by the instructor. Communities can be limited to a single course or can include multiple courses across multiple semesters.

  • Hypothesis (https://hypothes.is/). A social annotation tool allowing students to collaboratively annotate and discuss information and resources on the web. Hypothesis is integrated with Canvas, enabling instructors to assign students to annotate a given webpage or document. Students can see and respond to each other’s annotations.

  • CourseNetworking (https://thecn.com/). An online social network at the class level, providing peer discussion functionality with a social media look and feel. CourseNetworking also includes a gamification element, where students can earn points for participation and award points to their classmates’ posts. Points can be passed back to the Canvas gradebook.

  • CircleIn (https://www.circleinapp.com/). An online space for students to study together remotely, answering questions and solving problems. CircleIn facilitates back-channel student discussions using group video or text chat features, searchable note sharing, and student networking.

Importantly, we believe each of these resources shares a common intention to support peer-to-peer and peer-to-instructor interactions. Specifically, these tools present a shared space or resource with which students can engage with one another to share insights and discuss challenges during the learning process. We believe such shared spaces can facilitate the benefits of collaboration described earlier. Canvas discussion forums, for example, are intended to provide a shared space where students and teachers can engage with problems in the discipline being discussed. Similarly, social annotation tools such as Hypothesis provide opportunities for students and instructors to interactively engage with a shared resource of interest, wherein problems, challenges, and insights can be discussed. Additionally, while these tools are not always used in tandem, co-occurrences can occur; for example, group assignments and Canvas discussions can co-occur when students are divided into groups and correspond with one another through the discussion. Finally, all of these resources use Canvas as a launching point, but they do not necessarily operate within the same systems. Hypothesis, for example, is an independent system that students can access after authenticating in Canvas.

Of course, as with any tool, there can be differences between a tool’s function as designed and its function in use. For example, LMS discussion forums are subject to undesired outcomes such as premature abandonment due to students’ single-pass reading practices (Hewitt, 2005) and frequently host fragmentation of topics into less central discussions on a salient resource, problem, or insight (Thomas, 2002). Such limitations are somewhat endemic to general-purpose learning technologies (which can support a wide range of instructional uses) and contrast with more recent, dynamic tools (such as Hypothesis and InScribe), where the instructional integration reinforces the tool’s intended design to promote continuous refinement and development over multiple resources, content, and activities (Chen, 2019). As such, we recognize that the simple introduction of a tool for collaborative learning does not necessarily enable productive outcomes in an online context (Chang & Hannafin, 2015).

Still, even with these differences and limitations in mind, the shared intent of collaborative learning technologies, especially across a broad range of contexts as we discuss below, presents an overarching framework for examining their use and benefits. Within this framework, we assess engagement in collaborative resources and tools that could promote productive changes in student outcomes.

Benefits of Collaborative Learning Technologies

Harmonizing the affordances of technology with instructional goals of learning activities is central in producing these observed benefits of online collaboration (Dennen & Hoadley, 2013), and numerous specific collaborative structures have been proposed and supported by prior research. For example, technology might enable learners to adopt specific collaborative roles (Strijbos & Weinberger, 2010). Such role assignment or selection can be accomplished through the use of scripts for collaboration (Fischer et al., 2013), which facilitate an instructor’s ability to orchestrate collaborative processes across the class community (Dillenbourg et al., 2009). In other words, the combination of collaborative structures with technology resources can support both student learning and instructor teaching strategies.

In addition to fostering more constructive learning environments, online collaborative learning tools can also develop learners’ ability to productively engage with peers. Productive collaboration requires that learners be able to successfully negotiate the issues of working together (e.g., assigning responsibilities and holding each other accountable) to engage in inquiry and problem-based learning (Barron, 2003; Miller & Hadwin, 2015). Further, the ways in which effective collaborative learning is accomplished depend on ways of practice and problem solving instantiated within disciplinary communities (Cornelius et al., 2013). By developing these skills, structured peer interaction should yield increases in perceived support from others within a learning environment, effectively improving sense of acceptance and belonging between one’s self and context (Saltarelli & Roseth, 2014).

As such, the application of collaborative technologies enables learners to develop social practices of group problem solving and inquiry in addition to disciplinary practices related to communication and consensus. In this manner, online collaborative resources can support both the development of disciplinary knowledge as well as practical skills for navigating social problem solving. The influence of collaboration as an essential component for constructing solutions to these problems provides additional opportunities to engender productive learning through networked tools and resources.

Taking stock, the application of tools to support collaborative learning within online contexts affords numerous opportunities to both improve and understand students’ learning. Specifically, these tools can operate across multiple levels and encompass both the social processes needed to regulate effective learning and teaching as well as the individual construction of knowledge, motivation to achieve their aims, and the development of meaningful engagement with disciplinary practices. Furthermore, these tools can support instructors’ teaching and organization of salient activities and resources. Identifying the ways in which these practices occur and are productive or unproductive for students’ learning can lead to (a) the identification of effective learning processes within particular contexts and (b) the potential implementation and assessment of these practices in other contexts or more generalized applications. That is, using these tools can support an iterative understanding and refinement of teaching and learning within networked tools.

All of these processes, however, are embedded in the design and application in any given teaching and learning context. This point is especially salient in online contexts where all forms of interaction are mediated across one or more tools or resources. Further, this point also suggests a notable barrier in the use of collaborative learning tools. Namely, the inclusion of these resources alone does not necessarily promote a deeper sense of belonging or greater performance. Rather, it is the interaction between these tools and the intentional teaching and learning activities that are supported through these tools that can afford greater community and performance within learners’ engagement in online contexts.

The potential benefits of implementing collaborative learning technologies, however, became increasingly pertinent due to the drastic change in instruction resulting from the COVID-19 pandemic. We do not focus on any one component of collaborative learning in the present study; instead, we examine the influence of collaborative tools on instructor and student perceptions and outcomes following the rapid transition to remote instruction. While we appreciate the complexities of these approaches and the nuanced differences between peer collaboration, communication, interaction, and social influence in online learning environments, the present study adopts an intentionally broad and inclusive view to accommodate analyses at scale. Our main focus is the uptake, use, and benefits of these resources during a time when students had been jettisoned from their in-person social communities. As such, the present study aims to address four research questions:

Research Question 1: How did instructors prioritize the use of collaborative learning technologies (technologies allowing students to interact with one another) relative to other learning activities during the transition to remote instruction?

Consistent with the special issue’s theme of innovations in remote instruction, this first research question aims to quantify instructors’ introduction of opportunities for peer interaction through their use of Canvas collaborative resources, as described above, when classes transitioned online, and how often such opportunities occurred in comparison with conventional, individual learning activities in Canvas.

Research Question 2: How did instructors who used collaborative learning technologies during remote instruction compare with instructors who did not?

This research question seeks to profile instructors who adopted learning technologies in support of peer interaction, according to instructors’ levels of instructional experience, levels of technology experience, and self-reported outcomes among students during remote instruction. Given that learning technology use is often negatively correlated with teaching experience (Gorder, 2008; Smerdon et al., 2000; Waugh, 2004), we examine whether this pattern holds for the adoption of collaborative learning technologies. We investigate this adoption through both the perceived degree of use and the behavioral indicators of use within the Canvas data logs.

Research Question 3: In those courses where online peer interaction was already present pre-COVID, how did activity and engagement in online discussions and collaborative tools change after the transition to remote instruction?

Beyond causing schools to transition to remote instruction, the COVID-19 pandemic massively disrupted daily life. These disruptions may have also changed how students and instructors interacted with each other in Canvas discussions. If so, the benefits of technology-mediated peer interaction during remote instruction may be uniquely specific to this moment in time. For this reason, we also see value in interrogating the structure and semantic contents of student artifacts in collaborative learning tools and drawing contrasts before and during COVID.

Research Question 4: Did students who used technologies for peer interaction demonstrate improved academic outcomes during the period of remote instruction?

Among those students who were exposed to their peers or to the work of their peers, this analysis seeks evidence that this exposure had beneficial consequences for students’ performance and sense of belonging during remote instruction. Specifically, we examine the extent to which collaborative learning technologies contributed to students’ perceived dependencies with institutional supports, their peers, or their instructors.

We investigate these questions using a multipronged approach, combining data collected from a large-scale survey of college instructors and students, from institutional enrollment and course performance records, and from student activity data logged within the LMS.

Method

The present study was exploratory, and we composed our research questions after preliminary review of broad trends observed in survey data. Deidentified raw data from this study, as well as analysis scripts for reproducing all analyses described herein, are available at https://osf.io/vcg4d/ (Motz, Quick, & Morrone, 2021).

Participants

In the present study, participants are those who responded to either of two online surveys (one for instructors and one for undergraduate students) administered to the Indiana University community in May 2020, shortly after the end of the Spring 2020 academic semester (which was disrupted by COVID-19). All faculty who had a “Teacher” role in an LMS course site across nine Indiana University campuses were invited to respond to an instructor survey via email. Additionally, all undergraduate student enrollees across these nine campuses (who were over 18, who had not placed restrictions on their student directory listings, and who were not co-enrolled high school students) were invited to respond to a student survey via email. Due to legal and compliance restrictions on data collected overseas, participants were excluded if they were not in the U.S. at the time of responding, determined jointly by IP address and by a question about the respondent’s current location. In total, 1,532 eligible instructors and 6,148 eligible students consented to participate, with representation across all nine Indiana University campuses.

For the complete student sample, the median age of student respondents was 20. Respondents were also asked to self-report their gender identity and ethnicity. The majority of respondents reported that they identified as female (71%), and male-identifying respondents made up 27% of participants. Students who indicated they identified as nonbinary made up 1% of respondents, while students who preferred not to disclose their gender identity or indicated that their gender identity was not included in the survey instrument made up less than 1% of respondents. The majority of students identified as White (72%). Hispanic, African-American, and Asian students made up 8%, 6%, and 4% of respondents, respectively. Native American and Pacific Islander students made up less than 1% of respondents. Students who identified as two or more ethnicities, or who preferred not to respond, each made up 4% of respondents. While international students were excluded from this sample, per our Institutional Review Board (IRB) policies, the sample is generally in keeping with the current demographic breakdown of the Indiana University population. As such, students who identify as White predominate, and underrepresented populations have less influence on statistical descriptions of this sample. While our focus was to understand the impact of the transition to remote instruction on the entire system of Indiana University students and teachers, we recognize that our inferences are limited with respect to how marginalized groups were impacted by the transition to remote instruction. As such, we highlight this as a limitation of the present study.

Students with Senior standing made up the largest proportion of respondents (36%), while Juniors, Sophomores, and Freshmen made up 23%, 23%, and 15%, respectively. Associate degree and non-degree-seeking students made up approximately 2% of respondents. For a breakdown of the subsamples examined for each research question, please refer to Online Supplemental: Analysis Samples.

For the complete instructor sample, the median instructor age was 48. Instructors were not asked to report their gender identity but were asked to indicate their ethnicity. The majority of instructors were White (72%). Asian, African-American, and Hispanic instructors responded in similar proportions (each approximately 4% of respondents). Native American, Native Hawaiian, and Pacific Islander instructors, and those who identified as two or more ethnicities, made up less than 1% of respondents. Two hundred thirty-eight respondents did not indicate their ethnicity (13%). Adjunct, tenured, and full-time teaching faculty made up the majority of instructor respondents (respectively, 23%, 23%, and 19%). Associate instructors (graduate students) made up 14% of respondents. Full-time clinical, pre-tenured, and additional instructor roles made up 5%, 9%, and 6% of respondents, respectively.

The LMS used at Indiana University is Canvas (Instructure, Inc.; Salt Lake City, UT). Every instructor at Indiana University has access to a Canvas page for every course that they teach and also has the opportunity to make practice Canvas sites. During remote instruction, all faculty were explicitly instructed by academic leadership to use Canvas to conduct their classes remotely.

In the instructor survey, instructors provided consent for the analysis of their survey responses as well as analysis of data in their LMS course sites in the initial consent prompt; thus, we collected data from the LMS for all participating instructors. However, due to legal and compliance restrictions on student data, we included a secondary agreement within the student survey, inviting student respondents to release their LMS activity data for analysis. In total, 4,863 student participants provided this additional release. For Research Questions 3 and 4 (which investigate students’ LMS activity), our results are limited to this subsample. Also, for our analyses of LMS data at the course level, we excluded courses where fewer than 10 students were actively enrolled in the LMS course site.

Participating instructors taught a total of 3,432 classes with active LMS sites, an average of 2.24 per instructor. Participating students who provided release to examine their coursework in Canvas had student enrollments in 7,702 unique class sites during the Spring 2020 semester, an average enrollment of 4.92 class sites per student.

Materials

The surveys used in this study were initially designed to address a broad range of practical and theoretical questions. We selected survey topics in consultation with diverse constituencies of researchers, faculty leadership, students, and instructional designers. Survey items were written specifically with the goal of understanding how the transition to remote instruction due to COVID-19 affected the faculty and student experience, in collaboration with Indiana University’s Center for Survey Research. A full analysis of these results is beyond the scope of any single research article, and unrelated aspects of the survey findings have been described elsewhere (Jaggars et al., 2021; Motz et al., 2020; Motz, Quick, Wernert, & Miles, 2021). The full survey instruments and de-identified responses to all closed-ended survey items (including those not described in the present study) are available at Motz and Quick (2020).

Records from the LMS were extracted from Indiana University’s internal Canvas data warehouse and from a learning data platform provided by the Unizin Consortium (Unizin Data Platform; Unizin, 2021). Queries for these extracts are included in this study’s project site (Motz, Quick, & Morrone, 2021).

Identifying Collaborative Tool Use

We assess the presence and extent of collaborative technology use by measuring the number of discussion posts, group assignments, and peer review assignments, and further by measuring the number of “launches” into collaborative tools (e.g., a student authenticating into Piazza from a Canvas course site). While other technologies could potentially be collaborative in practice, our restrictive definition identifies those technologies that, merely by evidence of their use alone, unambiguously indicate that students were engaged in learning activities with other students, and that the instructor made some effort to facilitate this interaction.
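To illustrate how this restrictive definition can be operationalized at scale, the following is a minimal R sketch of flagging course sites with evidence of collaborative tool use. It is not our extraction query; the data frames and column names (discussion_posts, group_assignments, tool_launches, course_id) are hypothetical stand-ins for the LMS extracts described below.

```r
library(dplyr)

# Hypothetical, simplified extracts: one row per logged event.
discussion_posts  <- tibble(course_id = c(101, 101, 102), user_id = c(1, 2, 3))
group_assignments <- tibble(course_id = 102, assignment_id = 55)
tool_launches     <- tibble(course_id = 103, tool = "Piazza", user_id = 4)

# A course site counts as "collaborative" if any of these event types is present.
collaborative_courses <- bind_rows(
  discussion_posts  %>% distinct(course_id) %>% mutate(evidence = "discussion post"),
  group_assignments %>% distinct(course_id) %>% mutate(evidence = "group assignment"),
  tool_launches     %>% distinct(course_id) %>% mutate(evidence = "tool launch")
) %>%
  distinct(course_id, evidence)

collaborative_courses  # course sites with unambiguous evidence of peer exposure
```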

Data Analysis

The overarching goal of statistical analyses in this study is to estimate differences in key quantitative metrics between conditions and individuals. For example, in addressing Research Question 1, we compare the number of assignments in a typical course site before remote instruction and during remote instruction. In addressing Research Question 2, we compare responses to survey questions between instructors who did and who did not include discussions in their course sites during remote instruction. We refer to “outcomes” loosely, and when mentioning “outcomes” we generally intend these to include objective grades, perceived performance, and other constructs relevant to student academics. Because our goal is to directly estimate differences in these metrics (rather than to measure the probability of our results under a hypothetical null model where differences do not exist), we use Bayesian estimation for all present analyses (Kruschke & Liddell, 2018).

Bayesian estimation is a statistical method for estimating parameter values in an analytical model, as well as the uncertainty of these estimates (Kruschke, 2014). In our current analyses, we measure differences in these estimates between conditions and individuals, and present the modal difference estimate (the most credible estimate of the difference between two parameter values), followed by the 95% highest density interval (HDI) surrounding these difference estimates. The 95% HDI is defined by the upper and lower bounds around 95% of the most credible estimates, analogous to (but distinct from) a 95% confidence interval. For example, in answering Research Question 2, we estimate that instructors who used discussions in their courses had 0.76 more years of experience teaching online courses compared with instructors who did not use discussions, with a corresponding 95% HDI of 0.46–1.08 years (see Table 2). Because the 95% HDI does not include zero (or values very close to zero), we can infer that this difference is credibly nonzero. The specific statistical packages we used are described within the results, and all analysis scripts are publicly accessible at Motz, Quick, and Morrone (2021).
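For readers less familiar with these quantities, the following is a minimal R sketch (not an excerpt from our posted analysis scripts) of how a modal estimate and a 95% HDI can be computed from posterior draws, in the spirit of Kruschke (2014); the draws here are simulated.

```r
# 95% HDI: the narrowest interval containing 95% of the posterior draws.
hdi_95 <- function(draws, mass = 0.95) {
  draws  <- sort(draws)
  n      <- length(draws)
  window <- floor(mass * n)
  widths <- draws[(window + 1):n] - draws[1:(n - window)]
  lo     <- which.min(widths)                 # index of the narrowest candidate interval
  c(lower = draws[lo], upper = draws[lo + window])
}

# Modal (most credible) estimate: the peak of a density estimate of the draws.
posterior_mode <- function(draws) {
  dens <- density(draws)
  dens$x[which.max(dens$y)]
}

set.seed(1)
toy_draws <- rnorm(4000, mean = 0.76, sd = 0.16)  # simulated draws for a difference parameter
posterior_mode(toy_draws)
hdi_95(toy_draws)
```

With real model output (e.g., posterior draws extracted from a fitted brms model), the same functions can be applied to any difference parameter of interest.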

Results and Discussion

Research Question 1: How did instructors prioritize the use of collaborative learning technologies relative to other learning activities during the transition to remote instruction?

From the university’s official announcement that all Spring 2020 courses would move online to the start of this new phase of remote instruction, instructors had 2 weeks to prepare. By definition, this preparation involved transitioning instructional activities that would normally be conducted in class into an online format conducive to remote learning. By measuring the kinds of learning activities present in instructors’ LMS course sites prior to, and then during, remote instruction, we can paint a coarse portrait of how instructors prioritized collaborative learning technologies during this short transition (see Table 1). While we do not examine the particularities of how instructors incorporated these technologies into their course designs, this approach has the benefit of being able to assess the presence of technologies for peer interaction at scale.

Table 1
Change in Learning Technology Utilization During Transition to Remote Instruction

Measure | Pre-remote | Remote | Difference estimate [95% HDI]
Individual | | |
 Number of assignments to submit files | 3.07 | 4.20 | 1.13 [0.90 to 1.37]*
 Number of assignments to submit online quizzes | 1.66 | 2.26 | 0.59 [0.38 to 0.84]*
 Number of assignments with other submission types | 2.02 | 1.66 | −0.36 [−0.66 to −0.07]*
Collaborative | | |
 Number of assignments to submit discussion posts | 0.81 | 0.96 | 0.16 [0.03 to 0.28]*
  Number of discussion topics | 1.22 | 1.98 | 0.76 [0.57 to 0.95]*
  Number of discussion posts per student | 1.02 | 2.40 | 1.37 [1.12 to 1.62]*
 Number of group assignments | 0.22 | 0.26 | 0.04 [−0.01 to 0.10]
 Number of peer review assignments | 0.05 | 0.07 | 0.02 [−0.00 to 0.04]
 Number of collaborative tool launches per student | 0.08 | 0.06 | −0.02 [−0.06 to 0.02]

Note. Values are the average number of each learning activity per class, among instructors participating in the study. Asterisks (*) indicate that the 95% highest density interval (HDI; a Bayesian analog of the confidence interval) of this measure’s difference estimate between pre-remote and remote, shown in brackets, does not include 0.

We categorized assignments as pre-remote or remote according to whether the assignment’s due date fell before or after the university’s official announcement of the transition to remote instruction. Coincidentally, this date was the precise midpoint of the Spring 2020 semester under analysis. Discussion topics (which might not be associated with a graded assignment or deadline) were categorized according to the median time (i.e., the date and time at which posts occurred) of student posts within that topic. Discussion posts and launches within collaborative tools were categorized according to the timestamp when they were recorded in our institution’s data warehouse.
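The following R sketch illustrates this categorization logic; the announcement date, data frames, and column names are placeholders rather than values from our data warehouse queries.

```r
library(dplyr)

announcement <- as.Date("2020-03-10")  # placeholder; not the actual announcement date

# Assignments are categorized by due date relative to the announcement.
assignments <- tibble(
  assignment_id = 1:3,
  due_at = as.Date(c("2020-02-20", "2020-03-25", "2020-04-10"))
) %>%
  mutate(period = if_else(due_at < announcement, "pre-remote", "remote"))

# Discussion topics are categorized by the median timestamp of student posts in the topic.
posts <- tibble(
  topic_id  = c(7, 7, 7, 8, 8),
  posted_at = as.Date(c("2020-02-01", "2020-02-03", "2020-04-02", "2020-03-28", "2020-04-05"))
)
topics <- posts %>%
  group_by(topic_id) %>%
  summarise(median_post = median(posted_at), .groups = "drop") %>%
  mutate(period = if_else(median_post < announcement, "pre-remote", "remote"))
```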

The variables listed in Table 1 were included as response variables in a multivariate linear model, drawing contrasts between pre-remote and remote, and we estimated parameters in this model using the brms package (Bürkner, 2017) for R, which uses Stan (Stan Development Team, 2021) to sample posterior distributions for estimating parameter values in a Bayesian estimation framework.
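For concreteness, the sketch below shows the general shape of such a multivariate brms model fit to simulated per-class counts; the variable names and simulated values are illustrative only, and the actual model specification (including priors and the full set of response variables) is available in our posted analysis scripts.

```r
library(brms)

set.seed(2)
# Simulated per-class activity counts, pre-remote vs. remote (illustrative values).
d <- data.frame(
  period       = rep(c("pre_remote", "remote"), each = 200),
  n_file       = rpois(400, lambda = rep(c(3.0, 4.2), each = 200)),
  n_quiz       = rpois(400, lambda = rep(c(1.7, 2.3), each = 200)),
  n_discussion = rpois(400, lambda = rep(c(0.8, 1.0), each = 200))
)

fit <- brm(
  mvbind(n_file, n_quiz, n_discussion) ~ period,  # one response variable per activity type
  data = d, family = gaussian(),
  chains = 2, iter = 2000, refresh = 0
)

summary(fit)  # the "periodremote" coefficients estimate the pre-remote vs. remote differences
```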

In the category of individual learning activities, on average, there were significant increases in the number of assignments for students to submit files and quizzes and a decrease in other forms of individual assignments (i.e., assignments submitted in person). Despite the decrease in the latter, there was a substantial cumulative increase in the number of individual learning activities in Canvas, from 6.75 assignments pre-remote to 8.12 assignments per class during remote instruction, a roughly 20% increase.

There was a proportionate increase in the number of collaborative assignments: from 1.08 collaborative assignments (discussion, group submission, and peer review assignments combined) per class in Canvas pre-remote instruction to 1.29 collaborative assignments during remote instruction, a roughly 19% increase driven primarily by increases in the number of discussion assignments. However, given the scant utilization of these tools before remote instruction, the overall quantity of individual assignments outweighed the quantity of collaborative activities, with over six times more graded individual assignments than collaborative assignments on average.

Even while the number of graded discussion assignments increased modestly, students made more than twice as many discussion posts following the transition to remote instruction, indicating that much of this collaborative activity was either optional, ungraded, or indirectly graded (commonly as participation points). This increase is driven exclusively by those courses where there were no discussions pre-remote instruction, considering that there was no change in the number of discussion topics or posts within courses that did have discussions pre-remote instruction (see Research Question 3).

Moreover, there was no reliable change in the number of group submission or peer review assignments, nor was there an increase in the number of times students “launched” (authenticated into) collaborative learning platforms. Considering their relative rarity in this large sample, it is possible that many instructors were unaware of these opportunities. Also, considering that many instructors were overwhelmed by the sudden transition to remote instruction (Carey, 2020), it makes sense that most instructors might default to known coursework formats with shallow learning curves, even if an increased volume of such activities might not have been beneficial for student learning (Motz, Quick, Wernert, & Miles, 2021). Next we explore these hypotheses, examining the profile of instructors who used collaborative tools during remote instruction.

Research Question 2: How did instructors who used collaborative learning technologies during remote instruction compare with instructors who did not?

For each instructor who responded to our institution-wide survey, we identified all active Spring 2020 Canvas course sites where they had “Teacher” roles and measured whether the course included an active discussion topic or the presence of learning tools for peer interaction during the period of remote instruction. For this analysis, a discussion was defined as any Canvas discussion topic (graded or ungraded) where the median time of posts within that topic fell after the transition to remote instruction. Collaborative tool use was measured by the presence of a group assignment or peer review assignment with a deadline after the transition to remote instruction, or by student launches into Piazza, InScribe, Hypothesis, CourseNetworking, or CircleIn with timestamps after the transition to remote instruction. Courses with fewer than 10 enrolled students were excluded. Of the remainder, 53% of instructors had a discussion and 25% had a collaborative learning tool in at least one of their courses during remote instruction.

We joined these instructor-level Canvas log records with instructors’ survey responses and examined survey items specifically related to instructional experience, technology experience, and self-reported course outcomes, as listed in Table 2. For each survey item, we estimated the difference in response tendency between instructors with, and without, discussions and collaborative tools in their course sites. Instructors provided a numeric response regarding years of experience teaching at the college level, and this item was analyzed using a robust hierarchical Bayesian version of the t-test (Kruschke, 2013). The remaining items had ordinal response scales, and these were analyzed using the hierarchical ordered probit model described in Liddell and Kruschke (2018).
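The sketch below illustrates the general form of an ordered-probit contrast in brms, in the spirit of Liddell and Kruschke (2018); for simplicity it omits the hierarchical structure used in the actual analysis, and the simulated Likert data and variable names are illustrative only.

```r
library(brms)

set.seed(3)
n <- 300
used_tool <- rbinom(n, 1, 0.5)                    # 1 = instructor used a collaborative tool
# Simulate 5-point Likert responses whose latent tendency shifts slightly with tool use.
latent   <- rnorm(n, mean = 0.3 * used_tool)
response <- cut(latent, breaks = c(-Inf, -1, -0.3, 0.3, 1, Inf), labels = FALSE)
d <- data.frame(used_tool = used_tool, response = response)

fit <- brm(
  response ~ used_tool,
  data = d,
  family = cumulative(link = "probit"),           # ordered-probit model for ordinal responses
  chains = 2, iter = 2000, refresh = 0
)

fixef(fit)  # the used_tool row estimates the difference in response tendency on the latent scale
```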

Table 2
Differences in Survey Responses Between Instructors Based on Their Use of Collaborative Learning Technology

Survey questions | Use of discussions in LMS | Use of collaborative tools
Instructional experience | |
 How many years of experience do you have teaching at the college level? | −1.2 [−2.2 to 0.15] | −1.25 [−2.4 to −0.03]*
 How many years of experience do you have teaching 100% online classes? | 0.76 [0.46 to 1.08]* | 0.16 [−0.04 to 0.37]
Technology experience | |
 In general, how would you describe your level of comfort in adopting new technology in your classes? | 0.02 [−0.13 to 0.16] | 0.36 [0.19 to 0.54]*
 I was familiar with technology for online teaching and learning in my discipline. | 0.37 [0.23 to 0.52]* | 0.35 [0.18 to 0.53]*
Self-reported course outcomes | |
 My students received a lower quality learning experience. | −0.21 [−0.36 to −0.08]* | −0.37 [−0.53 to −0.22]*
 I think my students will struggle in their future courses or future employment because my course had to be offered remotely. | −0.29 [−0.43 to −0.15]* | −0.31 [−0.48 to −0.14]*

Note. LMS = learning management system. Values indicate the estimated difference in response tendency for each survey question, contrasting instructors who used discussions or collaborative tools with those who did not. Positive estimates indicate tendencies for more years, comfort, or agreement (depending on the survey question) among instructors who had evidence of discussions or collaborative tools. Asterisks (*) indicate that the 95% highest density interval (HDI; a Bayesian analog of the confidence interval) of this estimate, shown in brackets, does not include 0.

Broadly, discussion and collaborative learning tool use during remote instruction were correlated with less general teaching experience at the college level (as indicated by years of teaching), but more experience teaching online (as indicated by years of teaching fully online classes). In particular, instructors who included collaborative tools had significantly less teaching experience at the college level, adding to past evidence that general teaching experience is often negatively correlated with technology use (Gorder, 2008; Smerdon et al., 2000; Waugh, 2004). Further, faculty with collaborative tools were less likely to report that teaching online conflicted with their personal identity as an instructor (n = 312; 65% in disagreement). And when instructors had more experience teaching online specifically, they were significantly more likely to include discussion activities in their courses.

Indeed, instructors who reported that they were more familiar with teaching technology were significantly more likely to include both discussion and collaborative activities in their courses during remote instruction. However, perhaps because Canvas discussions are relatively mainstream (used by more than half of instructors in this sample), there was no difference in comfort with new technology between those who did and did not have a discussion in their courses. Faculty who did not use collaborative tools were somewhat more likely to indicate that teaching online conflicted with their personal identity and values as an instructor (no collaborative tools: 223; collaborative tools: 138). Both groups, however, indicated that the chief barrier in implementing new tools or resources for teaching online was lack of time (no collaborative tools: 182; collaborative tools: 189), though faculty who did not use collaborative tools also indicated a lack of technology as a significant barrier (no collaborative tools: 104; collaborative tools: 50). As such, faculty who did not use any resource recognized as collaborative (e.g., Canvas discussions, CourseNetworking, Piazza, etc.) indicated greater resistance to adapting to online resources for teaching remotely. In contrast, faculty who used collaborative tools tended to present a more open attitude to remote instruction.

Across discussion and other collaborative technologies, instructors who used these tools in their courses reported categorically better outcomes. These instructors reported significantly less agreement that their students had a lower quality learning experience and significantly less agreement that their students would struggle in the future. Of course, the causal relationship between these variables is unclear. Instructors who promote technology-facilitated student-to-student interaction in their courses might also engage in other pedagogical moves that improve these outcomes. Or it is also possible that more talented instructors, more thoughtful instructors, or instructors who have relaxed standards for course outcomes are simply more likely to adopt collaborative tools. For these reasons, our next research questions examine peer interaction by investigating students’ activity, self-reported outcomes, and objective outcomes in courses that featured collaborative tools.

Research Question 3: In those courses where online peer interaction was already present pre-COVID, how did activity and engagement in online discussions and collaborative tools change after the transition to remote instruction?

Of the 1,532 faculty who consented to participate, 475 had used some form of collaborative technology (e.g., Canvas discussions, Piazza, Hypothesis, etc.) within their courses prior to the transition to remote instruction. For courses that already had collaborative technology in their course sites, there was minimal change in the use of collaborative tools upon the transition to remote instruction, as described in Table 3. Similarly, differences in the semantic contents of students’ discussion posts in these courses pre-to-post-transition (as measured by the Linguistic Inquiry and Word Count [LIWC]; Pennebaker et al., 2015) were generally minor. In Table 3, we list those semantic features where the differences were largest, on average, between the pre-transition and post-transition periods. While some statistically significant differences were observed, these effects were generally small. For example, there was a reduction in the use of 1st person pronouns and an increase in the frequency of words related to cognitive processes (e.g., think, know, cause), but no large or systematic shifts in the semantic contents of these posts, nor differences in the length or number of posts. Practically, then, students who used discussions prior to the transition to remote instruction exhibited little change in their learning practices post-transition, which is noteworthy considering the huge magnitude of the disruption to college due to the pandemic.

Table 3
Properties of Students’ Use of Collaborative Learning Technologies

Measure | Pre-remote | Remote | Difference estimate [95% HDI]
Overall activity and engagement | | |
 Average discussion posts per student | 12.04 | 11.57 | −0.46 [−2.94 to 2.03]
 Average collaborative tool launches per student | 0.28 | 0.12 | −0.16 [−0.40 to 0.07]
 Average length of posts (words) | 156 | 151 | −4.76 [−15.08 to 5.49]
 Average instructor posts per topic | 1.98 | 1.68 | −0.28 [−0.80 to 0.23]
 Average replies per post | 0.47 | 0.51 | 0.04 [0.00 to 0.08]
Semantic content (LIWC categories) | | |
 1st person pronouns | 4.53 | 4.22 | −0.31 [−0.55 to −0.06]*
 Sentiment | 5.15 | 5.24 | 0.10 [−0.11 to 0.31]
 2nd person pronouns | 1.36 | 1.39 | 0.03 [−0.09 to 0.15]
 Numerical | 1.92 | 1.79 | −0.14 [−0.46 to 0.18]
 Cognitive processes | 12.7 | 13.1 | 0.40 [0.02 to 0.79]*
 Insight | 3.38 | 3.47 | 0.09 [−0.05 to 0.24]
 Affiliation | 2.30 | 2.50 | 0.19 [−0.13 to 0.48]
 Power | 2.67 | 2.54 | −0.12 [−0.26 to 0.01]
 Focus past | 2.82 | 2.99 | 0.17 [0.00 to 0.36]
 Relative | 10.7 | 10.4 | −0.21 [−0.54 to 0.14]
 Social | 8.39 | 8.7 | 0.31 [−0.08 to 0.69]
 Work | 4.69 | 4.21 | −0.48 [−0.74 to −0.20]*
 Time | 3.69 | 3.45 | −0.25 [−0.45 to −0.03]*
 Space | 5.53 | 5.57 | 0.05 [−0.13 to 0.25]

Taking stock, we interpret these results as indicating a difference in experience with using collaborative tools for remote instruction. Namely, those faculty who used collaborative tools had more experience in engaging with technologies for student collaboration and were therefore less likely to change these practices as implemented, with no significant change in the quantity of activity or quality of discussions. In contrast, as examined in Research Question 2, those faculty who did not use collaborative technologies were less familiar with the technologies and did not have the time to gain this familiarity, due to the rapidity of the transition caused by the COVID-19 pandemic. Further, instructors who previously used collaborative tools continued to implement their previous enactments of these resources, even during a cataclysmic shift in delivery of education writ large, likely because they perceived little or no need to change their practices to facilitate student learning.

A deeper investigation into instructional designs and student activity would no doubt reveal more nuanced insights into the varied forms of peer interaction, but this would be difficult at the scale of the present study. Moreover, it is unclear whether generalized outcomes, which is our focus here, hinge on such complexities. In the next section, we consider the association between general practices and student outcomes.

Research Question 4: Did students who had more peer interaction demonstrate improved academic outcomes during the period of remote instruction?

Even though curricular support for peer interaction was not implemented as a randomized controlled experiment during the shift to remote instruction, there were obviously differences between courses in their support for peer interaction following the shift. As our analyses have documented thus far, there were larger increases in the number of individual assignments than collaborative assignments after the transition to remote instruction (Research Question 1), but those instructors who did facilitate peer interaction after the transition reported improved outcomes (Research Question 2), despite minimal change in the quantity and quality of these practices (Research Question 3). Presently, we examine records from the students themselves, to explore associations between opportunities for peer interaction and students’ academic outcomes during remote instruction.

The vast majority of students in our analysis were enrolled in more than one class during Spring 2020, and some classes might have had support for peer interaction while others might not. In this sample, 1,868 students enrolled in at least one undergraduate class with, and at least one undergraduate class without, some form of collaborative activity (peer review, Piazza, CourseNetworking, etc.), not including discussions.1 For this subset, we contrasted performance in classes with collaborative activities with the same students’ performance in their other enrolled classes without collaborative activities. We implemented this contrast as a linear mixed-effects model, again using the brms package, where the outcome variable was a student’s average estimated cumulative score in their Canvas courses (according to the grading categories and weights implemented by the instructor), comparing between enrollments that had, and that did not have, opportunities for peer interaction after the transition to remote instruction. This model included weights for the number of enrollments in each of these levels, included fixed effects to correct for structural properties of these courses (average enrollment, number of assignments, number of announcements,2 academic level), and also included a random effect baselining each individual student. Credible model estimates, after controlling for covariates and individual student effects, are shown as blue lines superimposed on raw data in Figure 1.
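As an illustration of this model’s structure, the sketch below fits a simplified version in brms using simulated data; it includes only a subset of the fixed effects described above, and the variable names are placeholders rather than those in our analysis scripts.

```r
library(brms)

set.seed(4)
n_students <- 150
d <- data.frame(
  student_id    = rep(seq_len(n_students), each = 2),
  has_collab    = rep(c(0, 1), times = n_students),           # enrollments without vs. with collaborative activities
  n_enrollments = sample(1:4, 2 * n_students, replace = TRUE),  # weight: enrollments at each level
  class_size    = rnorm(2 * n_students, 60, 20)                 # illustrative structural covariate
)
student_effect <- rnorm(n_students, sd = 5)
d$score <- 80 + 1.2 * d$has_collab + 0.01 * d$class_size +
  student_effect[d$student_id] + rnorm(nrow(d), sd = 8)

fit <- brm(
  score | weights(n_enrollments) ~ has_collab + class_size + (1 | student_id),
  data = d, family = gaussian(),
  chains = 2, iter = 2000, refresh = 0
)

fixef(fit)["has_collab", ]  # estimated difference in percent score, with vs. without collaborative activities
```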

Figure 1

Estimated Performance in Undergraduate Courses With Collaborative Activities and Without Collaborative Activities, Excluding Discussions

Note. Each dot represents an individual student. Semi-transparent blue lines show credible model estimates of the central tendency of students’ percent scores, after controlling for course-level covariates and individual student random effects. Estimates suggest that students earned scores 1.16 percentage points higher in classes with collaborative activities, compared with the same students’ performance in courses without collaborative activities.

We estimate that students performed 1.16 percentage points (95% HDI [0.65 to 1.66]) better in their undergraduate courses with collaborative activities, compared with the same students’ performance in undergraduate courses without collaborative activities. This contrast takes into account differences between students’ overall performance, as well as structural differences between courses. But even so, this modest difference may be driven by instructors’ grading standards or other unmeasured covariates, so we emphasize the normal caveat that these findings are correlational.

For additional triangulation, we also conducted a more in-depth analysis of the association between collaborative activity and students’ socioemotional outcomes within a focused sample of specific courses. This analysis is performed using data exclusively from the period of remote instruction. For each of the three most heavily used collaborative tools at our institution (Canvas discussions, Piazza, and CourseNetworking), we selected a sample of the five courses that had the largest amount of average activity in these tools and the highest number of enrolled students responding to our survey. Thus, we selected 15 courses for this analysis, with a total of 657 enrollments and 541 unique students in these courses. For each student, we calculated the z-score (within each respective course) of the number of discussion posts they made, or the number of launches into Piazza or CourseNetworking, depending on the course. This z-score represents the relative amount of interaction with an online collaborative learning activity during remote instruction, compared with the student’s peers during remote instruction. Using these students’ survey responses (converted to binary agreement values: 1, 0) as outcome variables, we fit the parameters of linear mixed-effects models, again in brms, measuring the association between the survey responses and the z-score, while also controlling for each student’s final score in the course as a fixed effect, and including a random-effect term as a baseline for average performance in each of the 15 courses. Survey items and estimated coefficients are shown in Table 4. With the exception of the last item in Table 4, all survey questions were preceded by “After courses transitioned to remote instruction … ”
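The sketch below illustrates the within-course z-scoring and the general form of these mixed-effects models in brms, using simulated data; the variable names are placeholders, and only a single binary survey outcome is modeled here.

```r
library(dplyr)
library(brms)

set.seed(5)
d <- tibble(
  course_id   = rep(1:15, each = 40),
  activity    = rpois(600, lambda = 5),         # posts or tool launches per student (simulated)
  final_score = rnorm(600, mean = 85, sd = 8)
) %>%
  group_by(course_id) %>%
  mutate(activity_z = as.numeric(scale(activity))) %>%   # z-score relative to classmates in the same course
  ungroup() %>%
  mutate(agree = rbinom(n(), 1, plogis(-0.2 + 0.3 * activity_z)))  # binary agreement with one survey item

fit <- brm(
  agree ~ activity_z + final_score + (1 | course_id),     # linear model on binary agreement values
  data = d, family = gaussian(),
  chains = 2, iter = 2000, refresh = 0
)

fixef(fit)["activity_z", ]  # association between relative collaborative activity and agreement
```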

Table 4
Relationship Between Students’ Survey Responses and Collaborative Activity

Survey item | Coefficient relating amount of activity in collaborative tool to agreement with survey item
I still found it easy to think of myself as a college student. | 0.28 [0.05 to 0.53]*
I became less concerned about what my classmates and instructors thought of me. | 0.13 [−0.09 to 0.36]
I felt like I lost touch with the Indiana University community. | −0.24 [−0.50 to 0.01]
My academic goals became less important to me. | −0.12 [−0.36 to 0.11]
I felt I was successful as a college student. | 0.17 [−0.05 to 0.40]
I interacted with my classmates more. | 0.39 [0.05 to 0.72]*
I earned lower grades than I expected. | −0.18 [−0.42 to 0.05]
I anticipate being behind in my academic progress upon return to the classroom. | −0.14 [−0.43 to 0.11]

Note. Values indicate the estimated change in response tendency for each survey item, associated with increasing z-scores representing students’ activity in collaborative tools. With the exception of the last item in the table above, all survey questions were prefaced with, “After courses transitioned to remote instruction …” Positive estimates indicate tendencies for more agreement among students with more activity in collaborative tools, controlling for a student’s performance and with a random effect for each course. Asterisks (*) indicate that the 95% highest density interval (HDI; a Bayesian analog of the confidence interval) of this estimate does not include 0.

The general trend is that students who reported improved socioemotional outcomes also tended to be more active in collaborative tools relative to their peers. In particular, there is a significant association between student activity in collaborative tools and agreement with the statement “I interacted with my classmates more,” providing useful confirmation that our definitions of collaborative tool use converge with students’ self-reports. Controlling for each student’s estimated final score in their courses may have attenuated the effects of collaborative tool use on students’ agreement with survey items regarding grade outcomes. Nevertheless, and notably, students who were more active in collaborative tools were significantly more likely to agree with the statement “I still found it easy to think of myself as a college student,” even when controlling for student performance. This survey item is particularly noteworthy, as it most closely measures a sense of belonging between self and context, which past research has shown to be the primary benefit of peer interaction in online learning environments (Meeuwisse et al., 2010).

Summary and Conclusion

The goal of this study was to investigate the prevalence of instructional efforts to facilitate technology-mediated collaboration during remote instruction due to COVID-19 and its association with student outcomes.

We found no change in the relative quantity of collaborative learning activities upon the transition to remote instruction, observing that instructors, overall, continued to assign over six times more graded individual assignments than collaborative assignments on average, and there was no credible change in the adoption of specialty tools for collaboration (Research Question 1). Those who did incorporate collaborative learning activities into their courses during remote instruction tended to have less experience teaching college courses in general, but more experience teaching online, and more familiarity with technology for teaching and learning (Research Question 2). For those who had collaborative tools present in their courses prior to remote instruction, there was no significant change in the amount of activity in these tools, and only minor changes in the semantic contents of students’ discussion posts (Research Question 3). When collaborative activities were present after the transition to remote instruction, there was evidence of improved academic outcomes both in instructors’ self-reports (Research Question 2) and students’ estimated course performance (Research Question 4). Notably, even when controlling for individual students’ course performance, students with more collaborative activity in their courses reported higher agreement that they were able to think of themselves as college students (Research Question 4).

Our study makes two primary contributions. The first is the observation, across a large sample of college instructors, that collaborative technologies were neither prioritized nor privileged during the rapid transition to remote instruction. When college went exclusively online, this online experience heavily favored individual learning activities, and as a consequence, students experienced a sharp reduction in opportunities to collaborate with their peers (see also Means et al., 2020). The second is that observations triangulated from instructors’ survey responses, students’ estimated grades, and students’ survey responses all suggest improved academic and socioemotional outcomes when students did have such opportunities to collaborate.

In a perfect world, teachers would have anticipated that students could benefit from increased opportunities to interact with one another during a time of campus closure. But the state of the world in March 2020 was far from perfect. Instructors in the current sample had 2 weeks to move their in-person or hybrid courses entirely online amidst a surging pandemic, while many also suddenly needed to provide additional family care. This finding is not an indictment of college instructors nor of the quality of their effort amidst a global public health crisis. Rather, it is an observation that, during a rapid emergency transition to online learning, instructors are not naturally inclined to prioritize building an online learning community, which is unfortunate, considering that “community” is precisely what was lost during remote instruction.

A further likely consequence of this rapid transition is that many of the potential benefits of introducing more collaborative tools and resources, in supporting students’ development of social and disciplinary knowledge, were not realized. This shortfall is most likely due to the limited time that students and teachers had to plan and implement effective activities and to regulate their teaching accordingly, in addition to the numerous stressors introduced by rapid social and institutional change.

During a time when in-person student services, events, and facilities were suspended, online courses constituted the primary connection between a student and their campus community. Students who interacted with other students in these courses earned slightly better grades and reported that it was easier for them to think of themselves as college students. This socioemotional outcome, a sense of belonging between oneself and one’s context, is a key component of effective learning communities (Allen et al., 2021; Meehan & Howells, 2019; Meeuwisse et al., 2010; Moeller et al., 2020). Toward this end, our study adds to existing evidence of an association between improved student outcomes and the use of collaborative learning technologies (Chen, 2019; Jaggars & Xu, 2016) and to suggestions that student belonging is an important element of academic continuity during a time of campus closure (Day, 2015; SchWeber, 2013).

Our use of large-scale survey and LMS data enabled us to cast a particularly wide analytical net when examining our research questions, but it also limited the precision of our inferences. In particular, this study describes basic utilization of collaborative tools and is blind to course subject matter and to the specific instructional designs that incorporated these tools. There is no one-size-fits-all collaborative tool (Jeong et al., 2019), and a tool’s strategic use is key to its effectiveness. Thus, we want to clearly avoid the suggestion that a teacher might simply turn on a collaborative tool, do nothing else, and observe the spontaneous emergence of a constructive social community. Beyond our confirmation that students who used these tools reported increased interaction with their peers, identifying the requisites of constructive engagement with collaborative tools requires more nuanced analysis, beyond the scope of this large-scale study.

Limitations on the generalizability of the current results may apply along at least two additional dimensions, beyond instructional design. First, how broadly should we expect the current results to generalize to other student populations? While the present study included a particularly large sample from nine different campuses, these campuses were limited to the Indiana University system, participants skewed toward being white and female, and we were unable to include international respondents due to legal and compliance restrictions during data collection. We also recognize that the college experience during remote instruction differed substantially across demographic variables, in no small part due to uneven technology access (Jaggars et al., 2021). For those concerned about generalizability to new student populations, we have provided online supplemental materials that profile the sample in each of our analyses, and we also remind readers that raw survey data (including all respondents and demographic variables) are available at Motz and Quick (2020). Second, how broadly should we expect the current results to generalize beyond the context of the COVID-19 pandemic? Given that we are hardly the first to present evidence of an association between improved student outcomes and social interaction in online learning environments (e.g., Jaggars & Xu, 2016), there is reason to believe that this pattern is robust. Looking ahead, however, we hope that our observation of the rarity of technology-mediated student-to-student interaction does not generalize into the future.

Whether confronted by another global pandemic or by more localized emergencies, this is unlikely to be the last time that a university is forced to rely on remote instruction for academic continuity. The present study suggests that, in the rush to rapidly transition online, instructors might still prioritize individual assessments at the cost of opportunities to leverage collaborative tools. Facing campus closure, institutions might now anticipate this pattern and recommend that instructors privilege collaborative learning activities instead. Considering that students’ normal in-person opportunities to interact with peers are curbed during remote instruction and that peer interaction is fundamental to effective online learning, the present study’s findings add evidence that technology for supporting collaborative learning can benefit student outcomes in such situations.

Supplemental Materials

https://doi.org/10.1037/tmb0000061.supp

