Intern. J. Comput.-Support. Collab. Learn https://doi.org/10.1007/s11412-018-9282-1

Why and how do middle school students exchange ideas during science inquiry?

Camillia Matuk¹ & Marcia C. Linn²

Received: 24 August 2017 / Accepted: 6 July 2018
© International Society of the Learning Sciences, Inc. 2018

Abstract

Science is increasingly characterized by participation in knowledge communities. To meaningfully engage in science inquiry, students must be able to evaluate diverse sources of information, articulate informed ideas, and share ideas with peers. This study explores how technology can support idea exchanges in ways that value individuals’ prior ideas, and allow students to use these ideas to benefit their own and their peers’ learning. We used the Idea Manager, a curriculum-integrated tool that enables students to collect and exchange ideas during science inquiry projects. We investigated how students exchanged ideas, how these exchanges impacted the explanations they ultimately produced, and how the tool impacted teachers’ instruction. We implemented the tool with 297 grade 7 students who were studying a web-based unit on cancer and cell division. Among other results, we found a relationship between the quality of students’ scientific explanations and both the diversity of students’ ideas and the sources of those ideas (i.e., whether they came from the students themselves or from their peers). Specifically, students who collected more unique ideas (i.e., ideas not already represented in their private idea collections), as opposed to redundant ideas (i.e., ideas that reiterated ideas already present in their private idea collections), tended to write poorer explanations; and students who generated their own redundant ideas, as opposed to choosing peers’ ideas that were redundant, tended to write better explanations. We discuss implications for formative assessment, and for the role of technology in supporting students to engage more meaningfully with peers’ ideas.

Keywords Science inquiry · Middle school · Web-based curricula · Social technologies · Knowledge repositories · Student learning

* Camillia Matuk
[email protected]

Marcia C. Linn
[email protected]

1 New York University, 2 Metrotech Center, Brooklyn, NY 11201, USA
2 University of California, 4611 Tolman Hall, Berkeley, CA 94720, USA

Matuk C., Linn M.C.

Introduction

Overview

Collaborative knowledge construction is essential for citizens’ everyday participation in society, and particularly critical to scientific progress. Scientists share and build upon others’ ideas through public presentation and publication of findings, and through discussion of work in progress, joint problem solving, and argumentation (Latour & Woolgar, 2013; Toulmin, 1958; Lemke, 1990). Engaging students in authentic scientific practices, as recommended by national standards (e.g., NGSS Lead States, 2013), might include students participating in similar knowledge building activities (Duschl & Osborne, 2002; Osborne, 2010; Scardamalia & Bereiter, 2014).

While collaborative knowledge construction has been previously cultivated in classrooms with varying degrees of success (e.g., Cohen, 1994), these efforts have also revealed the persistent challenges of encouraging students to engage productively with one another’s ideas. Teachers generally face obstacles to incorporating diverse perspectives into their teaching, including concerns over students’ abilities to make use of those ideas, and over the limited class time available to properly address diverse ideas (e.g., Silver et al., 2005). Often, thoughtfully planned, long-term professional development is required for teachers to succeed in supporting students to productively address one another’s ideas, and such programs can be challenging to implement amid teachers’ busy schedules (e.g., Simon et al., 2006).

There is a long history of research on computer support for students learning through engagement with peer ideas (Hmelo-Silver et al., 2017; Hsi & Hoadley, 1997; Vogel et al., 2017), which also shows clear areas for improvement (Wecker & Fischer, 2014). This research shows how, given opportunities to encounter and exchange diverse ideas, students can refine their own thinking even as they contribute to refining the thinking of others. It highlights that engagement in shared knowledge construction involves articulating informed ideas, being mindful of how one’s own ideas might contribute to a broader conversation, evaluating diverse sources of information, and selectively integrating that new information into one’s existing ideas. These skills are relevant to scientific discourse specifically, and to communication more generally. As it becomes increasingly possible to design and incorporate advanced knowledge construction tools into curriculum environments, it also becomes necessary to understand the opportunities that these technologies offer for collaborative science inquiry and learning.

This study explores alternative ways to support students as they exchange ideas during a classroom science inquiry project. Guided by the Knowledge Integration (KI) framework, which specifies an approach to support students in integrating new and existing ideas into a coherent understanding (Linn & Eylon, 2011), we designed curriculum materials to reveal the impacts of students’ idea exchanges on the coherence of their final scientific explanations (Matuk & King Chen, 2011; Matuk & Linn, 2015). We integrated a web-based tool called the Idea Manager into a grade 7 cell life sciences unit designed to promote KI. We studied how students using the tool exchanged ideas with their peers, and then how students used those ideas to refine their own scientific explanations. Our exploratory analyses describe patterns in the relationships between the diversity of ideas that students collect and the quality of their written science explanations. Findings have implications for the design of science instruction and curriculum materials that support meaningful social interactions, and contribute to our understanding of the roles that technology can play in collaborative learning. Before describing our study, we review research on the value of, and the challenges of, engaging with peers’ ideas for learning.


The value of engaging with peers’ ideas

There is much empirical support for the potential that engaging with peers’ ideas has for STEM learning (Hausmann et al., 2004). Learning can happen when students build upon and elaborate peers’ ideas (Hogan et al., 1999; Tao & Gunstone, 1999; van Boxtel et al., 2000). Students can also benefit from critically evaluating their peers’ ideas (Schwarz et al., 2000). The Idea Manager implemented in this research enables us to study this process.

One benefit of being exposed to peers’ different perspectives is metacognitive. In sharing ideas, students can be prompted to make comparisons between their own and others’ contributions, and thus to notice inconsistencies in their own thinking (Amigues, 1988; Hatano, 1993; Keil, 2006). More specifically, formulating ideas to share with others, and listening to others share their ideas, encourages learners to be metacognitive. When preparing to share ideas with others, students must decide upon promising ideas and formulate coherent communication about those ideas, which encourages them to clarify their reasoning (Forman et al., 1985; Whitebread et al., 2007). Through the process of explaining and justifying their thinking, students monitor the state of their understanding, and notice gaps and inconsistencies (Hatano, 1993).

Another benefit of sharing ideas is in refining, and even constructing, new conceptual understanding. This is because in presenting ideas to others, learners must re-articulate these ideas, and explain and justify their reasoning. Doing so gives students opportunities to elaborate and clarify their thinking (Forman et al., 1985; Linn & Eylon, 2011; Whitebread et al., 2007), and to find new ways to formulate, revise, elaborate, or streamline their ideas (Roscoe & Chi, 2008). Such peer interactions can prompt students to question their ideas, create new ideas to fill gaps and to reconcile inconsistencies, and deepen connections between new and prior knowledge (Bargh & Schul, 1980; Chi, 2000; Wittrock, 1990). Thus, the process of sharing ideas can lead to building new understanding (Chi et al., 1989; Chi et al., 1994; Chi et al., 2001; Webb et al., 1995).

Research in particular disciplinary areas highlights specific ways of engaging with peers’ ideas that benefit learning. In mathematics classrooms, for example, students increase their learning when they give explanations that integrate multiple ideas, and that acknowledge, reiterate, elaborate upon, and offer counterarguments to peers’ ideas (e.g., Veenman et al., 2005; Warner, 2008). Similarly, students’ participation in classroom conversations with peers about their mathematical ideas can predict their later achievement. This is especially true when students’ participation involves detailing their problem solving approaches, elaborating on others’ ideas, justifying their disagreements, and proposing alternative strategies (Webb et al., 2014).

Research also shows epistemological benefits to sharing ideas. Specifically, building on one another’s ideas can help students to view learning as a collaboration, in which ideas are improved through joint effort for the benefit of many (Bell & Linn, 2000; Hong & Chiu, 2016; Hong et al., 2016). Through their participation, students can develop their abilities as contributors to scientific knowledge, rather than simply as reviewers of it (Scardamalia & Bereiter, 1993, 2014).

Challenges for students engaging with peers’ ideas

It is clear from the research on collaborative learning that supporting the productive exchange of ideas is critical, but also challenging. For example, grade 6 students who acknowledged and actively engaged in discussion of peers’ ideas were more successful at solving problems than students who ignored or rejected others’ ideas (Barron, 2003). However, students’ ideas are often diverse, conflicting, and incomplete (diSessa, 2000), and so supporting individuals in engaging effectively with those ideas is challenging. One reason is that students may simply not realize the importance of considering and addressing alternative viewpoints in constructing persuasive arguments (Nussbaum & Kardash, 2005; O’Keefe, 1999). Considering alternative viewpoints may also increase cognitive load (Coirier, Andriessen & Chanquoy, 1999). As well, people tend to seek consistency, and to gravitate toward ideas that already agree with their own (Simon & Holyoak, 2002). It follows that, although considering arguments that counter their own can result in deeper understanding as well as stronger arguments, students tend to avoid generating counterarguments, both orally and in writing (Perkins, Farady & Bushey, 1991; Knudson, 1992; Leitão, 2003; Koschmann, 2003). They moreover have difficulty objectively evaluating views that conflict with their own (Perkins, 1987; Kuhn, 1991; Nussbaum & Kardash, 2005).

Students also struggle to formulate clear expressions of their ideas (McNeill & Krajcik, 2008; Sandoval & Millwood, 2005), particularly in using supporting evidence (Sandoval & Millwood, 2005), in incorporating newly encountered evidence into their arguments (Chinn & Brewer, 1998; Kuhn, 1989; McElhaney et al., 2012), and in coordinating evidence from diverse sources and with alternative hypotheses (Kuhn et al., 1995; Schauble, 1996). Some researchers have argued that, given students’ limited domain knowledge, it might be more efficient to simply provide them with already formulated arguments and counterarguments (e.g., Nussbaum & Kardash, 2005).
Students might then learn by critiquing and elaborating upon those arguments, although they would still require the necessary tools to properly do so. Consistent with the KI framework, we begin with the notion that effective instruction builds on students’ ideas (Linn & Eylon, 2011; Ruiz-Primo & Furtak, 2007), and we seek ways to analyze the results when students consider the ideas of others. Furthermore, we explore how technology might help teachers, learners, and instructional designers to leverage the ideas that already exist, and that will emerge, in a classroom, and to use those ideas as resources for students’ learning.

Teacher support for productive collaboration around peer ideas

Students can be supported in engaging with others’ ideas when given clarity in the norms and goals of peer interaction, and guidance in identifying and using relevant ideas (Berland & Reiser, 2009; Barron, 2003; Blatchford et al., 2003; Mercer et al., 1999). For this, teachers’ active involvement in encouraging students’ peer interactions can be critical for ensuring the benefits of collaborative learning (e.g., Webb et al., 2014). For instance, students participate more meaningfully when teachers prompt them to expand upon their own ideas (e.g., Kazemi & Stipek, 2001) and the ideas of their peers (e.g., O’Connor et al., 2015). In addition, students performed better on tests of reasoning and problem solving when their teachers actively supported discourse moves (e.g., through paraphrasing, questioning, and encouragement to explain or elaborate ideas) than did students whose teachers did not engage in these practices at all, or engaged in them less frequently (Gillies & Haynes, 2011). Experienced classroom teachers support students in developing their argumentation skills by modeling and exemplifying the formation and critique of arguments, encouraging the use of evidence to justify arguments, prompting the construction of counterarguments, and emphasizing the importance of communicating and listening to others’ ideas (Simon et al., 2006). In inquiry learning environments, teachers’ open-ended questions can also prompt students to provide evidence to justify their claims (McNeill & Pimentel, 2010).

Developing the ability to facilitate collaborative activities often requires professional development. Furthermore, teachers often report that they lack the time to adequately guide each of their students’ social interactions. Moreover, synchronous discourse in face-to-face classroom settings does not leave much time for students to properly consider others’ ideas, nor to formulate their own responses (Rowe, 1974).

Thus, the challenge of designing supports for students’ peer interactions is of great interest and relevance to researchers in computer-supported collaborative learning (Andriessen, 2006). Researchers have explored such strategies as providing students with worked examples of effective discussions (Rummel & Spada, 2005), computerized representational guidance (Bell, 1997), and scripts to guide face-to-face discussions (e.g., O’Donnell & Dansereau, 1992; Palincsar & Brown, 1984) and computer-mediated discussions (Dillenbourg, 2002; Rau, Bowman & Moore, 2017; Rummel & Spada, 2005; Schellens et al., 2007; Weinberger et al., 2005). In one study on collaborative scripts, students in an undergraduate chemistry course engaged in peer discussions of issues that an adaptive tutor recommended based on those students’ prior responses, and experienced greater learning gains than students in a traditional instructional context (Rau, Bowman & Moore, 2017). However, the intervention in that study took place after student teams may have already established routines for collaboration, so it may not be possible to attribute students’ learning entirely to the effects of the collaborative script.
In another study, high school students who collaboratively constructed argumentation maps during a science investigation using the Belvedere system successfully incorporated more counterarguments to support their views than students who did not construct maps; however, this effect did not persist into students’ later write-ups (Toth et al., 2002). These findings reveal a need for additional investigations of ways to use technology to scaffold collaboration.

The relative benefits of diverse and congruent ideas

This study investigates promising ways to build on students’ ideas to support learning, and moreover, how to take advantage of the possibilities that technology offers to do so. We focus on exploring the learning advantages of exposing students to peer ideas that are consistent with, or that diverge from, their existing ideas. Prior research across domains has distinguished between the influences of congruent and incongruent ideas on thinking and explanation, and has reached mixed conclusions about their relative benefits.

Research in favor of the influence of divergent ideas argues that alternative ideas help learners to reconsider their thinking, and enhance conceptual learning. One study of Wikipedia editors, for example, found that exposure to incongruous, as opposed to congruous, information in Wikipedia was more likely to prompt editors to revise their ideas (Moskaliuk et al., 2012), which often leads to improvement (Linn & Eylon, 2011). Other research has found evidence that divergent ideas can improve conceptual learning outcomes. In one such study, researchers grouped high school students based on their strategies for comparing decimal fractions, and found that groups with individuals holding diverse strategies ultimately performed better on both an immediate and a delayed posttest (Schwarz et al., 2000). In another study, college students who engaged in explanation building for the purpose of distinguishing among contested evolutionary perspectives experienced higher learning gains than students who engaged in explanation building for the purpose of achieving consensus (Asterhan & Schwarz, 2009). In yet another study, Matuk and Linn (2015) found that middle school students who were prompted to seek divergent ideas from their peers during their work in a science inquiry unit later showed greater learning gains on a posttest than students who were prompted to seek ideas that agreed with their own. In another example, middle school student teams were more engaged in a design task when members were assigned divergent design goals than when they were assigned shared design goals (Vitale et al., 2017). In this case, the greater coordination required to work with divergent goals may have led team members to more seriously consider one another’s ideas.

These results align with evidence that encounters with incorrect ideas have value. In one study, students who encountered both correct and incorrect ideas, and reasoned about why they might be correct or incorrect, developed sounder conceptual understanding than students who only encountered and reasoned about correct ideas (Hynd & Alvermann, 1986). Likewise, researchers who grouped middle school aged students according to their prior conceptions about the physics of motion down an incline (Howe et al., 1992) and of flotation (Howe et al., 1990) found that even when these initial ideas were incorrect, groups with diverse initial ideas made greater progress from the pretest to the posttest.

Other literature affirms the importance of encountering both incongruent and congruent ideas. For instance, information literacy standards emphasize the importance of divergent and convergent strategies when seeking and evaluating information (American Library Association, 2015). Indeed, there are learning benefits in critically evaluating peers’ (divergent) ideas, as well as in elaborating peers’ (congruent) ideas (Hausmann et al., 2004).
Convergence is often described as the ultimate goal of collaboration, as well as a measure of learners’ shared understanding (Scardamalia & Bereiter, 1994; Weinberger, Stegmann & Fischer, 2007), and a benefit to individual learning (Fischer & Mandl, 2005). Convergence that occurs early in the collaborative process can predict a group’s success (Kapur, Voiklis & Kinzer, 2008). Yet, divergence is critical to convergence: individuals who initially diverge in their thinking may, through a process of negotiation, come to convergence (Halatchliyski, Kimmerle & Cress, 2011).

The process of convergence, or building coherence, is central to learning, and divergent ideas play a role in stimulating this process. Studies have uncovered a “two wrongs make a right” phenomenon, whereby student dyads with initially incorrect ideas succeed in realizing their initial errors and resolving their differences (Glachan & Light, 1982). The conflict between ideas may lead students to pursue strategies for resolving them that they would not otherwise have pursued. In solving decimal fraction problems, for example, high school students with initially divergent conceptual bugs employed hypothesis testing and argumentation to clear up their disagreements; this was less likely within dyads in which one student was correct (Schwarz et al., 2000). Together, these studies emphasize the value of encountering diverse ideas from peers, and the importance of understanding and supporting the ways that collaborating students use those ideas to reach convergence.

Opportunities for technology to support collaborative knowledge integration

The design of technology to help students collaborate productively deserves further research. Technology offers the opportunity to closely examine those moments during which students encounter and make decisions about how to respond to their peers’ ideas. For example, how do students decide which of their own ideas to share, and which of their peers’ ideas to incorporate? How can technology capture these decisions to inform more effective instruction? We explore student actions at this level to better understand students’ challenges in engaging with peers’ ideas, as well as where and how they might benefit from support.


Toward this goal, technology can help to segment and scaffold the process by which students encounter and respond to ideas, allowing them time for greater reflection, and offering a finer-grained look at how they approach this task. Among existing collaborative learning tools that are designed with explicit grounding in learning theories, a handful stand out for their relevance to the current study. For example, Knowledge Forum guides knowledge building processes by offering an open space within which participants can post, share, organize, and comment upon individual and collaborative notes, which can include both textual and graphical information (Bereiter, 2002; Bereiter & Scardamalia, 1989, 1993, 1994, 2006; Scardamalia et al., 1989; Scardamalia, 2002). Other platforms allow learners to interact while also offering tools to seek, organize, and retrieve information from wider sets of knowledge resources (e.g., Yang & Chen, 2008). This study investigates a unique curriculum-integrated tool, called the Idea Manager, which allows students to access each other’s ideas as learning resources when refining their own understanding, and which captures students’ exchanges of ideas in order to inform teachers and researchers of better ways to support students’ collaborative learning.

This work is guided by the KI framework (Linn & Eylon, 2011), which views learners as holding multiple, often contradictory views of any one science topic. From a KI perspective, the process of learning is a deliberate activity of distinguishing and making connections between new and existing ideas (diSessa, 2000; Eylon & Linn, 1988; Linn & Hsi, 2000; Slotta et al., 1995). Instruction guided by KI thus emphasizes eliciting students’ existing ideas; helping them explore additional, often more normative ideas; and guiding them to reflect on distinguishing and sorting out alternatives as they build an integrated understanding.
In other words, the KI perspective assumes that when refining one’s understanding of a science topic, learners benefit by initially diversifying their own ideas and then converging upon a more coherent view. While collaborative learning has long been inherent in KI curriculum materials, new technological advances such as the Idea Manager have only recently made it possible to explore the role of technology in collaborative knowledge integration, and the issues it raises for science learning and curriculum design. Prior research finds evidence that technology-enhanced guidance can support students’ conceptual learning by encouraging them to collaboratively make connections between their ideas through face-to-face discussion (Rau, Bowman & Moore, 2017). But in contrast to face-to-face classroom discussions, which permit only certain ideas to be heard, and even then limit these ideas to sequential presentation and response, social technologies can allow students to share and access all of their peers’ contributions at once. This study investigates how students respond to the opportunity to view all their peers’ ideas at once.

We designed this work to understand how students exchange ideas with their peers during the process of building scientific explanations, and how the shared ideas impact their final explanations. Our specific research questions are:

1. What is the overall impact of the unit on students’ conceptual learning? This question verifies whether the unit improved students’ learning of relevant science concepts.

2. How will students exchange ideas with their peers? Because it was novel for the students to share ideas during science class in the manner facilitated by the Idea Manager, we wished to explore how they used the tool to interact with one another. For example, would they be inclined to freely contribute their own ideas to the public repository? Would they be reluctant to share their ideas? Would students still generate their own ideas toward developing their explanations, or would they rely on copying their peers’ ideas? When students copy ideas from the public repository, will they recognize and select good quality ideas? Finally, will students tend to collect peer ideas that diversify, or that reinforce, their existing ideas? Knowing the answer to this question will help us to understand how students draw on their peers to improve their own thinking, and how they judge new ideas in relation to their own. This may inform new ways to guide their engagement with peer ideas.

3. How will the relative diversity of students’ ideas impact students’ final written explanations? By idea diversity, we refer to whether students’ collections of ideas consist of many different vs. redundant ideas. By sources of ideas, we refer to whether ideas are generated by the students themselves, or copied from their peers. More specifically, we wish to know the quality of the explanations of students who earlier tended to collect ideas that simply agreed with their own; and conversely, the quality of the explanations of students who earlier tended to collect ideas that conflict with, or that otherwise diversify, their own ideas. An answer to this question may inform ways to better support students’ engagement with peer ideas.

4. What are the teachers’ experiences with technology-supported peer idea exchange during science inquiry? We sought to understand the value that teachers perceived in the Idea Manager for enhancing their students’ engagement with their peers’ ideas, and for enhancing their own approaches to classroom instruction.

Materials and methods

Participants

Participants were 297 grade 7 students taught by two science teachers, who each taught 5 class periods. The school was located in the western United States, and served a diverse student population. At the time of this study, the teachers each had between 5 and 8 years of experience collaborating with our research group as participants in classroom-based research studies and in our annually hosted professional development program on technology-enhanced science inquiry. Each of the teachers had also used previous versions of the curriculum unit (described below) in their instruction, although this was the first time they used it with the Idea Manager.

WISE and the Idea Manager

The platform on which the unit was designed and delivered was the Web-based Inquiry Science Environment (WISE, wise.berkeley.edu; Slotta & Linn, 2009). WISE is a free, open-source, and customizable online learning environment. With WISE, students work as partners during class time through self-paced investigations of a driving inquiry question. Meanwhile, their teacher circulates to offer assistance as needed, and may also introduce or review concepts with the whole class at particular times. In the unit, students collect and interpret evidence from various activities, including visualizations, virtual experiments, and videos; and use different tools to express and refine their understanding (e.g., annotations, short essays, drawings). WISE meanwhile logs students’ responses and interactions, which enables teachers to grade and give feedback, and designers and researchers to better study the impacts of technology on student learning.

The Idea Manager is a tool integrated into WISE, and designed to scaffold students in constructing explanations (Fig. 1). Following a KI pattern, the Idea Manager makes explicit the acts of gathering, distinguishing, and sorting ideas (Matuk et al., 2011, 2016). As WISE logs these actions, the Idea Manager provides teachers and researchers with a record of students’ changing ideas that can inform instruction and design. For example, prior research used the Idea Manager in a chemistry unit to identify when in the process of explaining students found specific ideas more or less difficult to apply (McElhaney et al., 2012). Such analyses with the Idea Manager indicated which students needed what kind of assistance. As such, the Idea Manager adds value over typical pre- and posttests of students’ understanding.

At the time of this study, the Idea Manager was in its early stages of design iteration (Matuk et al., 2016). It had recently incorporated a feature that extended its functionality from a private repository of ideas belonging to student partners (their Private Basket of ideas) to a common space through which all student members of a class period can exchange ideas (the Public Basket). Students first collect ideas in their Private Baskets, and then can select ideas to share anonymously with others. Once they have shared their own ideas, students can then visit the Public Basket to select ideas from their anonymous peers to add to their own Private Baskets. First piloted in classrooms as a hybrid online/face-to-face prototype (Matuk et al., 2013), the Idea Manager’s public feature makes it possible to track how ideas emerge and spread within a classroom, and how exchanging ideas affects students’ developing understanding. Matuk et al. (2016) document a fuller description of the theoretical rationale and design of the Idea Manager, which was inspired by previous and existing argumentation and explanation scaffolds (e.g., Bell, 2013; Zhang & Quintana, 2012).

Curriculum and study design

What Makes a Good Cancer Medicine? (aka, Mitosis) is a middle school unit available at wise.berkeley.edu (preview the unit at http://wise.berkeley.edu/webapp/previewproject.html?projectId=6498). In it, students assist a fictional scientist in investigating the potential of three different plant-derived chemicals in treating cancer. This sequence of activities was designed to follow the Knowledge Integration pattern by structuring activities to support eliciting, adding, distinguishing, organizing, and reflecting on ideas as students prepare to write recommendations for cancer medicines based on their observations (Matuk & Linn, 2013). As students examine animations to compare the effects of each chemical on cell division, they learn the phases of mitosis and cell division.

Throughout the unit, students are encouraged to document their ideas in their Private Baskets of the Idea Manager around specific topics, including cell division, cancer medicine, and observations students make of the animations (Fig. 1a-c). Students may also delete ideas that they no longer feel are relevant. At four different points in the unit, students are encouraged to sort their ideas in preparation to write explanations in response to different questions: (1) What happens when cells divide? (2) How might a medicine stop cancer cells from dividing? (3) Which treatment would you recommend? and (4) Why does hair fall out during cancer treatment? At these same four points, students are asked to share at least one of their private ideas to the Public Basket, and to copy at least one of the ideas from the Public Basket into their Private Baskets. After each exchange, students are asked to write justifications for their choices of peer ideas.

Students worked on the unit during 9 consecutive school days. Most worked in assigned pairs, or else individually. They progressed self-paced through the unit toward milestones set by their teachers. The teachers circulated through their classrooms to offer students help as needed, and began each class with an opener, during which they reviewed concepts covered during the previous day, foregrounded concepts students were about to cover, and addressed any other issues that arose from their review of students’ work-in-progress.

Matuk C., Linn M.C.

Fig. 1 (a) Over the course of a unit, students enter short text entries into their Idea Baskets and specify attributes (e.g., source, tags, rating). Students can choose to share any private idea to the Public Basket, from which they may also copy any idea into their Private Baskets. (b) Entries accumulate in a sortable list to which students can return and revise. (c) In the Explanation Builder, students drag ideas into author-defined categories, and refer to these as they write an explanation in response to a prompt

Intern. J. Comput.-Support. Collab. Learn

Fig. 1 (continued)

On the first and last days, students completed a 4-item, web-based pre and posttest, which included open-ended and multiple-choice questions designed to measure students’ conceptual understanding of key ideas in the unit (Figs. 2-5). The first item (Fig. 2) asked students to reorder (by dragging and dropping) images of cells pictured at different stages of division. The second item (Fig. 3) asked students to describe what was happening in a given picture of a cell, and why that process was important for cell division. The third item (Fig. 4) presented students with ideas from fictional students about the importance of spindle fibers to cell division. Students were asked to select one idea with which they most disagreed, and to explain why they disagreed with it. The final item (Fig. 5) asked students to describe how a medicine designed to treat cancer would affect cell division.

Data and analysis

Overview of data

Our data include students’ logged responses to the pre and posttests, and their work in the Mitosis unit. Among the latter, we focused specifically on students’ logged work with the Idea Manager, including the contents of their ideas, whether those ideas were generated by themselves or their peers, and the number of times those ideas were copied by peers. One researcher moved between the two teachers’ classrooms on each day that the students were working on the unit, and documented fieldnote observations about how the students were engaging with the unit and with each other, and about how the teacher facilitated and


responded to their activities. We also collected video recordings of selected student pairs as they worked on the unit throughout the week, as well as of one-on-one interviews with both teachers following the unit’s completion. We further describe these data and our analyses of them below.

RQ1: Analyzing the impact of the unit on learning

To answer our first research question, which concerns the impact of the unit on students’ conceptual learning, we analyzed the change in students’ performance on the pre and posttests. To do this, we developed rubrics to score each of the pre and posttest items (Tables 1, 2, 3). On those items that called for students to construct scientific explanations (items 2 and 4), we used Knowledge Integration (KI) rubrics, which consisted of 6-point scales (0–5) that rewarded the links students made between relevant, scientifically normative ideas (Liu et al., 2011). These KI rubrics have been refined over multiple implementations of the Mitosis unit in prior classroom-based research studies (Matuk & Linn, 2013). To achieve inter-rater reliability, two coders independently scored subsets of 20–30 student responses at a time, and in between, refined their

Fig. 2 Item 1 from the pre and posttest


Fig. 3 Item 2 from the pre and posttest

shared understanding of the rubric by discussing and resolving disagreements. We iterated this process until we had achieved Cohen’s kappa values of 0.93 and 0.95 on items 2 and 4, respectively, representing very good inter-rater agreement. One coder then scored all of students’ pre and posttest responses. By summing students’ scores

Fig. 4 Item 3 from the pre and posttest


Fig. 5 Item 4 from the pre and posttest

on each item, we obtained a single score for each student’s pre and posttest. We performed a t-test to detect significant differences between these scores.
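The pre/post comparison described above can be sketched in a few lines. The scores below are invented for illustration (they are not the study’s data); the paired test mirrors the within-student comparison of summed item scores:

```python
from scipy import stats

# Hypothetical summed pre- and posttest scores for the same eight
# students (invented for illustration; not the study's data).
pre = [3, 5, 4, 2, 6, 3, 4, 5]
post = [5, 6, 6, 4, 7, 5, 6, 7]

# A paired t-test compares each student's pretest score with their
# own posttest score.
t, p = stats.ttest_rel(pre, post)
print(f"t = {t:.2f}, p = {p:.4f}")  # negative t here reflects post > pre
```

This sketch only illustrates the shape of such an analysis; the paper reports t(219) = −12.88 on the students’ actual summed scores.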

RQ2: Analyzing how students exchanged ideas

To answer our second research question, which asks how students exchanged ideas with their peers, we used logged data on students’ uses of the Idea Manager. From these data, we extracted the total number of ideas that each student workgroup had collected by the end of the unit, and noted the proportions of this total that were generated by the students themselves versus copied from their peers. In order to determine what kinds of ideas students were selecting from their peers, we scored the quality of each of the publicly shared ideas based on how well they integrated two or more relevant concepts into a supported statement (Table 4). We grouped

Table 1 Scoring rubric for items 2.1 and 2.2 of the pre and posttest (Fig. 3)

Questions posed to students:
2.1 Describe what is happening in this cell.
2.2 Explain why the phase pictured above is an important part of the cell cycle.

Key ideas:
• The phase name, anaphase, is correctly identified.
• A description is offered of the organelles’ actions within the cell (i.e., the cell’s DNA is being pulled by the spindle fibers into equal parts).
• The importance of the organelle’s activity is articulated (i.e., each daughter cell must receive identical genetic information).

Scoring rubric:
Score 0 (No answer): (Blank)
Score 1 (Off-task or uninterpretable): “I don’t know.”
Score 2 (Irrelevant, incorrect or ambiguous ideas): “The cell is spareing apart and make a new cell. It is apart the cell cycle.”
Score 3 (Partial understanding: 1 normatively stated key idea, or 2 partially correct key ideas): “the chromosomes are separating apart to the other side of the chromosomes. The phase is an important part of the cell cycle because the chromosomes are the reason the cell is turning into two new cells.”
Score 4 (Fair understanding: 2 normatively stated key ideas): “Anaphase is happening to the cell, the centromeres split and the two chromatids are pulled apart by the spindle fibers. Anaphase is an important part of the cycle because its when the centrometers split.”
Score 5 (Complete understanding: 3 normatively stated key ideas): “This cell is going through Anaphase, the 3rd stage of Mitosis. Right now the cell is under going the centromere splitting and the chromatids are pulled apart by the spindle fibers. This stage of Mitosis gives the daughter cells the same number of chromosomes; a crucial part in the cell cycle.”

ideas according to their quality scores (low = 2 or 3, medium = 4, high = 5), and used an ANOVA to test the relationship between the quality of ideas and the number of times they were copied by students. To complement this analysis, we examined the reasons that students wrote when prompted to justify their decisions to share and copy ideas, and organized these into emergent categories, which we defined and refined through repeated viewing and discussion of students’ responses. To support our findings with qualitative insight into students’ approaches toward, and processes in working through, the unit, we video recorded the discussions of two student pairs throughout their work on the unit. These student pairs were selected based on having each returned signed consent and assent forms to participate in our research; we were also limited by the number of video cameras available, which was two. We viewed those portions of the video recordings that captured students during their work with the Idea Manager, and performed a qualitative descriptive analysis to identify the main issues students experienced during their work (Miles et al., 2014; Sandelowski, 2010). Only one student pair’s video recording of these particular moments was audible enough for analysis, and thus cannot be said to be representative of all students. However, their video served as one illustrative example of students’ process for deciding which publicly shared peer ideas to select and add to their own Idea Baskets, and therefore offered qualitative support to the rest of our data on how students exchange ideas (Derry et al., 2010).
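The grouping-and-ANOVA step described above might be sketched as follows. The idea scores and copy counts are invented for illustration; the actual analysis used the full set of publicly shared ideas:

```python
from scipy import stats

# Hypothetical (quality_score, times_copied) pairs for public ideas
# (invented for illustration; not the study's data).
ideas = [(2, 0), (3, 1), (3, 0), (4, 1), (4, 0), (4, 2), (5, 2), (5, 1), (5, 3)]

# Bin copy counts by quality: low = score 2-3, medium = 4, high = 5.
groups = {"low": [], "medium": [], "high": []}
for score, copied in ideas:
    key = "low" if score <= 3 else ("medium" if score == 4 else "high")
    groups[key].append(copied)

# One-way ANOVA comparing copy counts across the three quality groups.
f, p = stats.f_oneway(groups["low"], groups["medium"], groups["high"])
print(f"F = {f:.2f}, p = {p:.3f}")
```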

Table 2 Scoring rubric for item 3 of the pre and posttest (Fig. 4)

Question posed to students:
Why are spindle fibers important in cell division? Here’s what other students think (see Fig. 4).
1. With which of these ideas do you most disagree?
2. Explain why you disagree with that idea.

Is the most correct idea selected?
Score 0: Student B
Score 1: Student A, D or E
Score 2: Student C

What does the student critique about the idea? (categories not mutually exclusive; each scored 0 or 1)
Spelling/grammar/length: “He miss spelled ‘cell’ and spelled ‘cel’ it is the shortest answer.”
Circular reasoning, lack of explanation, or faulty reasoning: “I disagree because spindal fibers dont keep the cell together they just duplicate them”; “disagrea with student e because makse no sens”
Conceptual explanation (role of spindle fibers): “Because the spindle fibers don’t hold or support the cell in any way, they just pull the chromosomes apart.”
Awareness of other options: “I disagree with student C because I don’t think the spindle fibers keep the cell together. I think the cell membrane does that. And all the other students have good answers. (In my opinion)”

Are students’ justifications correct?
Score 0: Selects Student A, B, D or E; OR selects Student C but does not correctly explain the function of spindle fibers, or explains something else altogether. Example: “The new idea will help them by giving them more things to talk about in there easay. It will also be very interesting to read about what they say.”
Score 1: Selects Student C and states that this is not the function of spindle fibers, AND/OR explains the actual function of spindle fibers. Example: “well now they will have another idea to include to their list of why spindle fibers are so important.also it tells you in what phase the spindle fibers appear.”
RQ3: Analyzing the relationship between idea diversity and explanation quality

To explore our third research question, which concerns the relationship between the diversity of students’ ideas and the quality of their explanations, we coded each of students’ Private Basket ideas according to whether it was unique or redundant relative to the rest of the ideas in the Basket. A unique idea added information that was not already present among the ideas in the Private Basket. For instance, if a student pair had only one idea that mentioned that cancer consists of cells that divide uncontrollably, then that idea would be scored as unique, because it differs in content from all the other ideas. Meanwhile, if a student pair’s Private Basket contained more than one idea, albeit worded differently, regarding the role of spindle fibers in dividing a cell’s DNA into two equal parts, then each of those ideas would be scored as redundant, because each restates already existing ideas. Having thus coded their private ideas, each student workgroup was associated with a number of unique and a number of redundant ideas. We also scored students’ written explanations to a question within the unit: “Maya heard that her mother’s hair might fall out during her cancer treatment. Why would this happen?” To do this, we developed a rubric that scored students’ explanations based on the number of links made between key ideas relevant to the question and presented earlier in the unit (Table 5). Two coders independently scored approximately 20

Table 3 Scoring rubric for item 4 of the pre and posttest (Fig. 5)

Question posed to students:
“Explain the effect your drug would have on the different parts of the cell in that phase, and how this would help keep cancer growth under control.”

Key ideas:
• Identifies a cell organelle to be affected by the medicine (e.g., chromosomes, spindle fibers).
• Explains the process by which the medicine will disrupt the function/action/process related to the organelle (e.g., will prevent chromosomes from dividing).
• Mentions the need for the medicine to stop cell division/cancer growth in order to treat cancer.

Score 0 (No answer): (Blank)
Score 1 (Off-task or uninterpretable): “i think that u can effect drug difffrent calll”
Score 2 (Irrelevant, incorrect or ambiguous): “the phase will help the cancer growth by getting both cell organism and treating both sides of the cell.”
Score 3 (Any ONE of the three key ideas is correctly explained): “the medicine will affect because the spindle fibers are making aline going down and the crimatic are gonna spread out each side on going right and one going left.” (Organelle = spindle fibers)
Score 4 (Any TWO of the three key ideas are correctly explained): “My cancer medicine would slow down the process of mitosis and there might be a few less spindle fibers and less chromosomes that a regular cell would.” (Organelle = spindle fibers, chromosomes; Process = not present; Need = “slowing down”); “In this phase the cells DNA is seperating and that is a esential part of mitosis. If the cell sucseeds in splitting it’s DNA then a new cell can be formrd” (Organelle = DNA; Process = splitting; Need = not present)
Score 5 (All THREE key ideas are correctly explained. Key ideas must not be incorrectly explained; if they are, consider a score of 4. It is OK if there are incorrect ideas IN ADDITION to the correct key ideas, unless those ideas lead you to believe the student didn’t correctly understand the key ideas after all.): “If the drug effected metaphase, the chromatids would line up, but would scatter before they could split. This method is both safe and quick and can be used to stop cancer. Another way that the drug would effect the cell is that it would prevent the spindle fibers from doing their job. Once the chromatids are scattered, you are back where you started in the cell cycle.” (Organelle = chromatids; Process = lining up; Need = can’t finish cell cycle)

student explanations before coming together to discuss and resolve discrepancies. We repeated this process until we had refined our rubric and achieved a Cohen’s kappa value of 0.82, which represents very good agreement between raters. Afterward, one coder scored the rest of students’ explanations. We grouped student workgroups according to their explanation scores (low = 1–3; medium = 4; high = 5) and used an ANOVA to test for significant differences in idea diversity across levels of explanation quality. To further determine how collecting unique vs. redundant ideas related to the quality of students’ explanations, we divided students into two groups: those who had selected 50% or more of their ideas from their peers, and those who had generated more than 50% of their ideas themselves. We then performed a Chi-square test to see whether there was a significant association between the main source of students’ ideas (peer or self) and the quality of their explanations.
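The final comparison can be sketched as a contingency-table test. The counts below are invented for illustration; they are not the study’s data:

```python
from scipy import stats

# Hypothetical counts of workgroups cross-classified by main idea
# source (rows) and explanation quality (columns: low, medium, high).
# Invented numbers for illustration only.
table = [
    [12, 8, 4],   # mostly peer-copied ideas
    [6, 10, 14],  # mostly self-generated ideas
]

# Chi-square test of independence between idea source and quality.
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")
```

A 2 × 3 table like this yields df = (2 − 1)(3 − 1) = 2, matching the degrees of freedom the paper reports for its own test.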

Table 4 Scoring rubric for the quality of ideas in the classroom’s Public Basket

Score 0 (No response): (None found)
Score 1 (Off task): (None found)
Score 2 (Uninterpretable; nonnormative ideas or observations): “They divide during mitosis because cancer cell make them spread the sickness.”
Score 3 (Declarative or factual statements or definitions; unconnected ideas; or a mixture of normative and nonnormative ideas): “its bad, you can die, theres many types of cancer.”
Score 4 (A normative observation or claim, but lacks interpretation, explanation, or sufficient supporting evidence): “The medicine stops the centrioles and spindle fibers from pulling the choromosomes apart.”
Score 5 (Integrates more than one concept into a well-supported, normative claim; offers a causal explanation for an observation): “The cell was able to undergo mitosis, but one of the cell’s chromosomes was obliterated by the Zingiber zerumbet, possibly making that cell useless.”

Table 5 Scoring rubric for students’ written explanations to the embedded assessment question: “Maya heard that her mother’s hair might fall out during her cancer treatment. Why would this happen?”

Key ideas:
• Cancer is when cells divide rapidly/out of control.
• Cancer treatment stops cell division to treat cancer.
• Chemotherapy targets rapidly dividing cells.
• Chemotherapy also stops normal cells from dividing.
• Skin/hair cells are rapidly dividing cells.
• When hair cells aren’t replaced with new ones, hair falls out.
• When skin/hair follicle cells die, they can no longer hold hair in the scalp and the hair falls out.

Score 0 (Blank): (no response)
Score 1 (Off-task, uninterpretable): “these are the most likle to cure cancer”
Score 2 (Non-normative, lacks explanation, doesn’t address the question): “we think that chemotherapy can cause you to loose hair because of the radiation in the medicine can make you loose hair.”
Score 3 (1 normative idea): “During the treatment, a lot of cells die, including bone cells, hair cells, stomach cells, and skin cells. Her hair cells stops dividing because of the treatment. That is why her hair is falling out.”
Score 4 (2 linked normative ideas): “Maya’s mother may lose her hair because chemotherapy and other cancer treatment prevent cells from dividing so quickly as they usually do, so they do not divide according to the body’s needs. Because of the cells unable to go through mitosis so easily, normal body cells will be affected in such a way that they cannot divide so quickly.”
Score 5 (Elaborated response with 3+ linked normative ideas): “The chemotherapy stops the cells from dividing, but it affects the regular cells. The hair cells need to be replaced, and when the cells aren’t dividing, the hair doesn’t ‘replenish’ itself so the hair falls out...”

RQ4: Analyzing teachers’ experiences with technology-supported peer idea exchange

To understand the teachers’ perspectives on their uses of the Idea Manager, one researcher conducted individual interviews with the two teachers following their instruction with the unit. These interviews, which lasted approximately 45 min, were intended to be widely informative to our broader goals for improving our curriculum units and platform. Thus, some of the


interview questions asked the teachers to describe their general impressions of this version of the unit compared to the versions they had used in prior years, enhancements they would like to see in the future, their approaches to grading and giving feedback on students’ work, how they used various teacher tools, and how our summer professional development workshops may have informed their approaches. For the purposes of this study, we focused our analysis on teachers’ responses to those interview questions that asked what they liked and disliked about the Idea Manager. We performed a qualitative descriptive analysis of these interviews to obtain teachers’ own perspectives (Miles et al., 2014; Sandelowski, 2010), and to triangulate these with our other analyses in order to strengthen the validity of our findings (Cypress, 2017; Morse et al., 2002).

Results and discussion

RQ1: Impact of the unit on students’ learning

Analyses of students’ pretests (N = 280) and posttests (N = 277) found that students showed significant gains by the end of the unit (M gain = 1.38, SD = 1.09, t(219) = −12.88, p < .0001). This finding suggests that the unit was generally successful at enhancing students’ understanding of, and ability to reason about, cancer and cell division.

RQ2: How did students exchange ideas?

Analyses of the logged data on students’ uses of the Idea Manager revealed a number of findings that, together, suggest that students deliberately engaged with their peers’ ideas. Students collected a mean of 18.7 ideas (SD = 6.8) in their Private Baskets by the end of the unit. Students generated significantly more of these ideas by themselves (M = 14.53, SD = 5.52) than they chose from among the ideas that their peers shared in the Public Basket (M = 4.17, SD = 3.42). This finding suggests that in spite of access to the Public Basket, students demonstrated an ability to articulate their own ideas without being reliant on their peers. Thus, the unit was effective in eliciting students’ own ideas, which the KI perspective suggests prepares students most effectively for learning. Students were selective in deciding which ideas to contribute to, and which to copy from, the Public Basket. Overall, they contributed slightly more ideas to the Public Basket (M = 5.11) than they copied from their peers (M = 4.26), which suggests that students were selective in deciding which of their peers’ ideas to put into their own Private Baskets. They also appeared to be selective in terms of which of their own ideas to contribute, as only 22.8–27.3% of the ideas in students’ Private Baskets ever made it into the Public Basket, such that the final collection of public ideas had a fairly high mean quality score of 3.77/5. Students moreover distinguished high quality ideas from among the public ideas. Public ideas were copied a mean of 0.83 times, and higher quality ideas were more frequently copied than lower quality ideas (F(2, 721) = 10.76, p < .0001) (Fig. 6). Only 52% of the 722 public ideas were ever copied by students; the rest were contributed but never copied (Fig. 7). This finding may be due to the potentially overwhelming number of ideas, 722 in total, contributed by students themselves and their peers.
Rather than give equal attention to each idea in turn, students may have developed shortcut strategies for deciding which ideas to select. For example, students may have paid more attention to, and have been more likely to copy, ideas shown to have been highly copied by others in their class. Thus, it is possible that while certain students were selecting peers’ ideas based on their quality, other students may have been selecting ideas that appeared most popular. Below, we examine the reasons that students expressed for selecting the public ideas they chose.

[Bar chart: mean number of times ideas were chosen from the Public Basket, by quality score. High [5] (N = 198): 1.11; Medium [4] (N = 216): 0.73; Low [2, 3] (N = 308): 0.72.]

Fig. 6 Students more frequently chose higher quality ideas than they chose medium or low quality ideas from the Public Basket. Quality scores are shown in square brackets

Students’ reasons for exchanging public ideas

Fig. 7 Visualization of the number of times that ideas in the Public Basket were copied (orange bar graph), organized by their quality score (x-axis)

[Bar chart: number of mentions of each reason for choosing public ideas. Recoverable values include Valid (52), Helpful (31), and Similar (18); the remaining categories (Relevant, Interesting, Blank, Others’ certainty, No reason, Last resort) received between 1 and 31 mentions.]

Fig. 8 Reasons for choosing public ideas (n = 150)

[Bar chart: number of mentions of each reason for sharing ideas. The most frequently cited reason, validity, received 63 mentions; the remaining categories (Helpful, No reason, Altruistic, Relevant, Blank, Last resort, Recognition, Interesting, Similar, Obligation, Teacher prompted) received between 1 and 27 mentions.]

Fig. 9 Reasons for sharing ideas with others (n = 150)

To explore how students were evaluating ideas, we examined their written responses to the prompts that asked them to justify their selections of peer ideas. In these justifications, students expressed a variety of reasons for exchanging ideas in the Public Basket (Figs. 8 and 9). Most commonly, students based their decisions to share and to copy ideas on their perception of the idea’s validity, without elaborating on what constituted that validity. For example, one student pair explained that they chose a particular idea “… because we agree with it the most,” while another student pair noted that their chosen idea “…was a very smart answer and it seemed like something that would be

true.”). Approximately 20.67% of students reported choosing public ideas due to their helpfulness; that is, because they added new information to their thinking (“… because it was well written and explained part of the lesson.” “… because we thought that it was similar to our idea but different enough to provide food for thought.”). Regarding students’ decisions to contribute their own ideas, one category of reason stated was a desire for peers to recognize that their ideas were helpful. For example, one student pair noted that “we wanted to… see what [our classmates] thought,” implying that they wanted to see whether their peers would choose to copy their idea if they were to make it public. To better understand students’ reasoning in action, we examined a video recording of two student partners, whom we named Ariel (A) and Noah (N). Earlier, the students had contributed one of their ideas to the Public Basket, and in the excerpt below, they are looking for their idea among those of their peers to determine from the number beside it whether anyone else had copied it. They discuss how the idea they shared was originally one that they had earlier copied from the Public Basket, and whether this implies that they had stolen it:

A: Where’s our idea?
N: No one likes our idea.
A: [pretends to cry] We’re up here, “Cancer…”
N: Where’s our idea?
A: We stole that idea from someone. Wait we didn’t steal, we copied it.
N: You stole that idea…. ohhhhh!
A: It even said to copy, so technically I didn’t.

The excerpt above demonstrates how Ariel and Noah reasoned about the value of ideas (how many times others had copied it) and the question of ownership. Recall that the tool displayed the idea and the certainty of the contributors; it did not reveal the contributors’ identity. Some of their reasoning no doubt arose due to the information that the tool displayed and kept hidden, the actions that the tool allowed and did not allow, and the terminology used to describe those actions. For example, students press a button that reads “copy” in order to duplicate and add a public idea into their Private Baskets. Although “copy” is an accurate description of the button’s function, a term such as “cite” may have better framed this action in terms of knowledge sharing. Arguably, using terminology to better frame the task may have led students to have different conversations about what it means to exchange ideas. Nevertheless, this example from Ariel and Noah demonstrates conscious reasoning about the public value of their contributions, and matters of intellectual ownership. Such reasoning has relevance to professional science discourse. A further reason stated in students’ written responses for sharing their ideas was a desire to improve the Public Basket. For instance, one student pair noted that the idea they had chosen to share “seemed acurate [sic] compared to the other (students’) ideas.” Another student pair said of their idea that “we thought it was a good idea and nobody has it yet.” Still another student pair observed that their idea “…was a general statement that other classmates would understand.” Each of these examples suggests that students were consciously choosing to add public ideas that would benefit their peers in some manner, whether by contributing information that no one else had thought of, or by contributing ideas that they believed would help their peers to understand something. Essentially, students contributed ideas with the goal of diversifying the available information. Notably, to make such decisions, students needed to review the existing public ideas so they could make an overall evaluation of them.
Students reported that the idea they chose to share would add value relative to the ideas already present.


Being critically evaluative is core to scientific thinking (Bailin, 2002; Duschl & Osborne, 2002; Kuhn & Pearsall, 2000; Lombardi et al., 2018) and to reasoning across other disciplines (e.g., McGraw et al., 2018). This reported practice of comparing ideas in the Idea Manager to their own ideas, with the goal of adding value to the Idea Manager, aligns with the KI process of distinguishing ideas. Thus, understanding how instructional activities such as those facilitated by the Idea Manager might support students in honing this skill can inform improved instruction. To better understand students’ processes for evaluating ideas, we examined another episode between Ariel and Noah. In it, the two student partners were captured looking through the Public Basket and discussing which ideas to copy into their Private Baskets. As illustrated in the dialogue below, Ariel and Noah both considered a range of criteria in selecting from the list of peers’ ideas:

A: Which one do you want to copy… Eenie meenie miny moe’s not going to work, Noah.
N: All right, all right, I’m going to choose one [N closes his eyes and waves his finger over the screen before resting it on one of the ideas in the list]. Probably this one.
A: [presses the “copy idea” button]
N: Wait no, that has a red star. They’re [i.e., the students who shared the idea are] not sure at all.
A: Oh, haha.
N: Delete. Go back, it says “delete.”
A: But maybe it’s [i.e., the idea is] right but they’re [the students who shared it are] just not sure.
N: I don’t care. They’re not certain about it.
A: OK. Let’s do “Plant A” ’cause that’s what we think. This one right here: “Plant A restrains the spindle fibers and…”

Our observations of Ariel and Noah demonstrate both the challenges they faced, and their abilities, in critically evaluating information.
While we cannot confirm whether their process was representative of all students in this study, their behavior resonates with prior research on youths’ struggles with evaluating information found in internet searches. For example, students tend to assume that results that appear higher in a list are more trustworthy (Hargittai et al., 2010; Westerwick, 2013), and will select these even when relevant results appear at the bottom of the list (Pan et al., 2007). Moreover, rather than consider the source’s credentials, or the evidence to support the source’s claim, students tend to use irrelevant heuristics to judge the credibility of information, such as its ease of access, aesthetic presentation, and consistency with their current information needs (Barzilai & Zohar, 2012; Iding et al., 2009; Walraven et al., 2009). In line with these prior findings, Noah first randomly selected an idea by closing his eyes and seeing where his finger landed. However, he quickly reconsidered his choice when he noticed that the contributors had indicated their low certainty about the idea via the star rating that appeared next to it. This certainty rating, the only information the Idea Manager provided about an idea’s contributors, was Noah’s basis for judging the credibility of the idea’s source. Ariel then proposed that the idea itself might be good, in spite of the contributors’ uncertainty about it. This shows her ability to consider alternative scenarios, which is one of the core skills of critical thinking (Facione, 1990). Having failed to agree on this point, however, Ariel and Noah ultimately decided to select a different idea that was similar to one of their own existing ideas. Their decision reflects a common heuristic in information evaluation: seeking ideas that confirm one’s existing understanding (Metzger et al., 2010).
Indeed, 12% of students in our study stated that they chose particular peer ideas because of their similarity to their existing private ideas (“…because we wrote a private idea similar to that…”).

Matuk C., Linn M.C.

RQ3 How do the sources and diversity of ideas relate to the quality of students’ explanations? Our third research question concerned how the ideas that students collected related to their abilities to construct explanations using those ideas. In particular, we asked whether the origin of students’ ideas (generated by themselves or copied from their peers), as well as the diversity of students’ final collections of ideas, were related to the quality of students’ explanations. In terms of the relationship between idea diversity and explanation quality, we found that students’ Private Idea Baskets generally contained a greater proportion of unique ideas than redundant ideas (Fig. 10), which suggests that, overall, students were focused on adding new ideas rather than on finding new ways to state their existing ideas. However, those students who collected a greater proportion of unique ideas by the unit’s end tended to have also produced poorer quality explanations, while those who collected a greater proportion of redundant ideas tended to have produced higher quality explanations (F(2, 142) = 6.04, p < .005) (Fig. 11). Further analysis showed that students who wrote better quality explanations also tended to have self-generated more of the redundant ideas in their Private Baskets. In contrast, students who wrote poorer explanations tended to have mostly chosen, or to have equally chosen and self-generated, their redundant ideas (χ2 = 9.511, df = 2, p < .01) (Fig. 12). From a KI perspective, self-generating redundant ideas may involve an effort to refine reasoning. In contrast, continuing to add redundant ideas may signal that students are unsure about some aspect of the explanation and not yet ready to refine their thoughts. Alternatively, in preparing to write a scientific explanation, there may be a benefit to restating one’s ideas in different words over focusing on collecting further unique ideas.
We explore these interpretations further in the discussion section below.
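For readers who wish to reproduce this kind of analysis, a chi-square test of association such as the one reported above can be computed from a contingency table of observed counts. The sketch below illustrates the computation in Python; the counts, the `chi_square` helper, and its naming are hypothetical illustrations, not the study’s data or analysis code.

```python
# Illustrative computation of a chi-square test of independence, as used
# to relate explanation quality to the source of students' redundant ideas.
# NOTE: the counts below are hypothetical placeholders, not the study's data.

def chi_square(table):
    """Return (chi2, df) for a 2-D contingency table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            chi2 += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, df

# Rows: explanation quality (lower, higher); columns: redundant ideas
# mostly self-generated / mostly chosen / equally both (hypothetical counts).
stat, df = chi_square([[10, 25, 20], [30, 12, 15]])
print(f"chi2 = {stat:.3f}, df = {df}")
```

With SciPy available, `scipy.stats.chi2_contingency` performs the same computation and additionally returns the associated p-value.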

RQ4 What were teachers’ experiences with technology-supported peer idea exchange? To understand the teachers’ views on the impacts of the Idea Manager, we turned to the portions of their interviews during which we asked them to reflect on how they and their students used the tool. In her interview, Teacher A recalled her use of previous versions of the same unit, which did not include the Idea Manager, but rather a discussion board on which students would post attributed

Fig. 10 Proportion of unique and redundant ideas by source

Intern. J. Comput.-Support. Collab. Learn

Fig. 11 Proportion of unique and redundant ideas in students’ Private Baskets, by explanation quality

ideas. In those cases, Teacher A described how she struggled to keep her students from simply relying on “the smart kids” to give the answers. “I like [the Idea Manager]. I especially like that the Public Basket, [students] don’t know whose ideas they are. That’s probably my favourite thing. Because there used to be a discussion [forum in a prior version of the unit] which sort of maybe served the same purpose. Um, the discussion [forum] was problematic for a lot of reasons, because [students] could see each other’s names. One problem was the ‘hi how are you doing?’ kind of thing. And the other was the, um, if they see the names, they believe that certain kids are smart, and that those kids have the right answers. They just wait, and whatever that kid says, then the whole class says, ‘Oh yes, yes, yes, that’s the right idea.’” Teacher A went on to explain how she observed more interesting discussions than during the previous years that she had taught the Mitosis unit, and attributed this to the fact that the Idea Manager anonymized students’ contributions. Without being able to infer the quality of an idea from their knowledge of its contributor, students were left to be more intentional in evaluating their peers’ ideas on their own merits:

Fig. 12 Sources of redundant ideas by quality of explanation


“I like that [the Idea Manager] is actually anonymous, because [the students] really, it’s interesting, when [students] read [the ideas], watching [students] reading [their peers’ ideas], ‘cause they really are thinking like ‘I don’t know, is that right? Is that not right?’ So the discussions are more interesting.” Our classroom observations show how the Idea Manager moreover helped teachers to provide their students with formative feedback. For example, when most of her students had reached, or had just completed, the Explanation Builder activity, we observed Teacher B scanning through their responses in the WISE grading interface during class. There, she could see how students were organizing the ideas they had collected, and could identify students in need of assistance. When she noticed that certain student workgroups had not collected relevant or sufficient ideas to construct their explanations, she called them to her desk to offer individualized assistance, such as directing them to return to prior steps in the unit to collect further ideas about particular topics. We also observed both teachers reinforce the unit’s instruction by tailoring the next day’s opening discussions to address the ideas that students had documented during the previous days. For example, through a whole-class activity, teachers asked students to analyze a set of responses and articulate their strengths and weaknesses. This suggests that at the end of these days, the teachers had visited the WISE grading interface to examine students’ ideas and to identify topics with which their students appeared to struggle. Each of these examples shows how the Idea Manager allowed teachers to more easily monitor their students’ ideas, and to respond with just-in-time guidance.

Discussion Summary of findings This study explores how students engaged with their peers’ ideas to support their construction of scientific explanations. It illustrates how a curriculum can integrate technology aligned with the Knowledge Integration framework to promote learning. Our findings show how students approached the activity of exchanging ideas. They moreover reveal some surprising relationships between these approaches and the quality of students’ explanations. In particular, we used a technology called the Idea Manager, which allows researchers to trace specific self-generated and copied ideas over time, and to see how these become integrated into students’ explanations. We discuss some of our main findings, and consider their implications in light of existing research on computer-supported collaborative learning. With regard to our first research question on the impact of the unit on students’ learning, we found that students gained significantly from pretest to posttest. This suggests that the unit as a whole was effective in improving students’ conceptual understanding and ability to reason about mitosis, cancer, and cell division. With regard to our second research question on how students exchanged ideas, we found that students were no less inclined to develop their own ideas, in spite of their ability to copy their peers’ ideas from the Public Basket. Indeed, students’ collections of ideas by the end of the unit consisted primarily of their own contributions rather than of those copied from their peers. This suggests that the Idea Manager was successful at promoting KI’s focus on eliciting ideas. Students’ written justifications, and the example of Ariel and Noah’s discussion over which of their peers’ ideas to select, offered evidence of students’ abilities to evaluate their own and others’ ideas. Specifically, students tended to contribute their higher


quality ideas to the Public Basket, which implies that they were successful in distinguishing their promising from their less promising ideas. They moreover tended to copy the better quality ideas from their peers, which suggests that they were effective in distinguishing among promising ideas in the Public Idea Basket. The video excerpt from Ariel and Noah’s discussion offered a specific illustration of the various criteria that student pairs consider in deciding the value of their own and of others’ ideas. It is possible that visiting the Public Basket and seeing how others articulated their ideas may have allowed students to make better judgements about their own ideas. As reported by a number of students, their decisions about which ideas to share and which to select were based on the already existing ideas in the Public Basket. Their desire to contribute ideas that would be helpful to their peers in some way may have driven them to scrutinize their own and their peers’ ideas more closely. This finding resonates with related research on knowledge building environments, which finds that focusing students on collaboratively building collective understanding can impact their views of knowledge as something that can be improved through shared contributions (Hong & Chiu, 2016; Hong et al., 2016). With regard to our third research question on the relationships between the sources and diversity of students’ ideas and the quality of their explanations, we found some unexpected patterns. Notably, students who had generated more redundant ideas also tended to have constructed more coherent explanations, while students who generated more unique ideas tended to have constructed less coherent explanations. This finding was surprising given prior research pointing to the benefits of encountering diverse ideas, as reviewed earlier. One explanation for these results is that generating redundant ideas has cognitive benefits. 
It may help students to refine their understanding, as through the self-explanation effect (Chi et al., 1994; Siegler, 2002). Rearticulating one’s ideas may also have metacognitive benefits, as it involves actively creating links between new and prior knowledge (Mayer, 1984, 2002; Weinstein & Mayer, 1986). As with rewriting and revision, students who generate redundancy may be re-evaluating and clarifying their thinking (Ladd, 2003; Fitzgerald, 1992; Scardamalia & Bereiter, 1994). Another explanation for our observations is that they reflect individual students’ knowledge integration trajectories that were not captured by our measures. For example, some students may have more quickly identified the key ideas in the unit, and essentially did not require prompts to add ideas to the Idea Manager before they transitioned from the elicitation to the distinguishing phase of knowledge integration. When they were prompted to add ideas to the Idea Manager, these students may have simply rephrased these same key ideas that they had already identified. Spending more time in the phase of distinguishing and refining ideas may have helped these students to ultimately produce better, more refined explanations. Meanwhile, other students may have struggled to recognize and converge upon relevant, normative ideas, and so moved relatively slowly from the phase of eliciting ideas to the phase of distinguishing ideas. The KI perspective expects students to have multiple diverse ideas, and encourages them to articulate these at the start of their inquiry. However, when students do not eventually move from elicitation to distinguishing and refining a set of relevant ideas, this may indicate their need for further support. Our findings may thus strengthen arguments for curriculum materials that can adapt to students’ diverse needs. More specifically, our findings pinpoint the need to support students in distinguishing among their ideas. 
For example, certain students may benefit from explicit prompts to relate their own ideas to those of their peers; to reconcile contradictions; and to periodically review, reflect, and converge on a set of ideas most relevant to the driving inquiry question.


Yet another explanation for the relationship between generating redundant ideas and constructing higher quality explanations has to do with the nature of the unit’s subject matter. The reasons for which learners hold diverse ideas across different domains may vary, and knowledge of these reasons has implications for how to address those ideas, and how to support students’ learning and conceptual change in a domain. For example, students’ conceptions of the physics of matter and motion have developmental origins associated with well-researched naïve intuitions (e.g., Piaget & Inhelder, 1974; Smith et al., 1985; Tirosh & Stavy, 1999) and developmental pathways (e.g., Johnson, 1998; Liu & Lesniak, 2006). Addressing students’ conceptions of density, for example, has benefited from considering conceptual change and socio-cognitive theory (e.g., Herrenkohl et al., 1999; Kang et al., 2004; Kang et al., 2010; Skoumios, 2009; Smith et al., 1992), formative assessment (Furtak & Ruiz-Primo, 2008; Kennedy et al., 2005; Shavelson et al., 2008), metacognitive instruction (Mittlefehldt & Grotzer, 2003), and conceptual systems (Carey, 2009; Hashweh, 2016; Smith et al., 1997). Likewise, college students’ diverse perspectives on socio-scientific issues such as stem cell research tended to derive not from scientific views, but from views based in, among other things, ethics, politics, religion, and human rights (Halverson et al., 2009). This implies that instruction on such topics might be improved with more intentional integration of scientific and non-scientific perspectives. While our study shows that students indeed have multiple diverse ideas about cancer and mitosis, too little is known about the nature of those ideas and about the process of students’ conceptual change in this domain. Thus, it cannot be concluded based on this study alone whether the value of refining consensually shared ideas vs.
the value of resolving conflicting ideas is generalizable to other domains, or even whether this value is maximized through the intervention in this study. Hence, future research might further explore the nature of students’ ideas about cancer and mitosis, as well as attempt to replicate this study’s findings in other domains. With regard to our fourth research question on teachers’ experiences with technology-supported peer idea exchange, the Idea Manager appeared to enhance their instruction. Specifically, the Idea Manager provided a record of students’ processes for building explanations. This allowed teachers to monitor students’ work in progress, and to provide timely, formative guidance. Moreover, the teachers’ ability to use the Idea Manager to review students’ ideas at the end of the day informed their approach to the next day’s instruction, during which they could address ideas with which they observed their students to struggle. Finally, one teacher noted that compared to prior years teaching the same unit without the Idea Manager, her students appeared to engage in richer conversations about one another’s ideas, which she attributed to the way that the Idea Manager anonymized students’ contributions. Altogether, these findings suggest that technology can have valuable impacts on teachers’ instruction. Future research might explore effective visual displays of students’ ideas so that teachers might understand their trajectories throughout the inquiry process, and the impacts of peers’ ideas, in order to better inform their guidance.

Limitations and areas for further research This study contributes to the overall mixed conclusions regarding the relative value of divergent and convergent ideas for learning. In particular, the relationship we found between idea redundancy and producing high quality explanations adds evidence for the value of considering convergent ideas in generating explanations. However, this relationship also


refines understanding of the potential value of divergent ideas. In one previous study with the Idea Manager, for example, Matuk and Linn (2015) found that students who were prompted to seek peers’ divergent, as opposed to convergent ideas showed greater learning gains by the posttest. The discrepancies between these and other research findings suggest that there remains more to be discovered about how students engage with divergent and convergent ideas. Below, we outline several research directions toward this goal.

Individual characteristics and inquiry processes While our choice to study the distinctions between divergent and convergent ideas is grounded in distinctions made in prior literature, this dichotomy may not capture the more nuanced criteria that students actually have for selecting peers’ ideas, nor the individual characteristics and experiences that influence their decisions. For instance, some research indicates that the benefits of divergent and convergent ideas depend on the stage of one’s overall thinking on a topic. In brainstorming across domains, for example, convergent and divergent thinking, wherein learners seek and develop ideas that either narrow down or diverge from their current thinking, respectively, have varying benefits depending on one’s phase in problem solving (e.g., Cropley, 2006). Hence, future research might investigate at which points in the process of explanation (e.g., when gathering ideas, distinguishing among ideas, or organizing ideas in preparation for writing an explanation) students may benefit from encountering divergent or convergent peer ideas. Other research notes that individuals’ different responses to information depend not only on the quality of that information, but also on those individuals’ characteristics. For instance, one study found that novel ideas tended to be considered more cognitively stimulating by individuals who craved structure or autonomy (De Jonge, Rietzschel, & Van Yperen, 2018). Together with other such findings, our study suggests that in order to fully understand the ways that students engage with information, we cannot ignore personal characteristics. Future research might therefore explore what distinguishes students who ultimately collect many diverse ideas from students who ultimately collect many convergent ideas (e.g., differences in prior knowledge), and for whom these strategies are more or less effective.

The role of teachers’ guidance The influence of contextual factors in explaining the discrepancy between our findings and those of prior research also deserves further consideration. While our study focused on observing the impacts on learning of a technology-enhanced learning environment, teachers’ interactions with their students during their work in this environment likely also contributed to our observations. Prior research finds that the manner in which teachers respond to and build upon their students’ ideas is related to improvements in student learning (Furtak et al., 2016; Ruiz-Primo & Furtak, 2007). In one study, teachers’ support was found to be essential in helping students to develop their conceptual understanding, in spite of the availability of peer assistance and computer scaffolding (Furberg, 2016). Likewise, the guidance that the teachers in our study provided while circulating the classroom may have had an important role in developing students’ ideas during their inquiry, and in explaining the patterns that we observed. One possible effect of teachers’ guidance may have been to ensure that our technology had any impacts at all. For instance, the ways that teachers motivated the use of the Idea Manager upon introducing the unit and during whole class


discussions may have helped to instill in their classrooms a culture of valuing others’ ideas. The teachers also clarified the role of peer contributions for building understanding. Furthermore, the discussions teachers had with their students as they circulated to offer help may have assisted students in critiquing their peers’ ideas, and in articulating their own. Another possible effect of teachers’ guidance is that it had different effects on students in each condition. As our teachers were well versed in constructivist classroom practices—both had attended more than a few professional development workshops on Knowledge Integration—we expected their interactions with students to feature the kinds of questioning patterns designed to draw out students’ ideas, and to scaffold their elaboration (Mortimer & Scott, 2003). Strategies such as rephrasing and revoicing (Stanford et al., 2016), for instance, are in line with convergent thinking, and may thus have been especially helpful to students in the Reinforce group, leading to their greater success compared to the Diversify group. Without replicating this study’s findings with other students and in other teachers’ classrooms, and without a closer examination of how teachers enacted the unit in their classrooms, we cannot conclusively claim that the patterns we observed are entirely due to the technology design, nor can we generalize these findings to other populations. What can be assumed, however, is that human and computer resources can work in concert to support learning. In certain cases, a critical role for computer-based support may be in creating opportunities for more targeted learning interactions among peers and their instructor (cf. Furberg, 2016). 
Future research may therefore investigate the kinds of guidance that different teachers give in conjunction with collaborative learning technologies, and particularly, how different guidance may support reinforcing or diversifying ideas toward strengthening students’ explanations.

Orchestrating engagement with peer ideas This study was concerned with exploring the ways that students use their peers’ ideas to enrich their own, and how these ideas impact their learning. However, research indicates that learners require explicit guidance in order to successfully construct scientific explanations (Herrenkohl et al., 1999; McNeill et al., 2006; Osborne et al., 2004). Specifically, students’ learning improves when they are given the rationale for the ways that science is practiced (Chen & Klahr, 1999). Research finds that different kinds of guidance can address the different challenges that students experience (Wang, 2015). For example, guidance that offers criteria by which students can evaluate the quality of peers’ contributions, that supports productive sharing and critique of peers’ work (e.g., Sampson et al., 2011), and that prompts students to consider alternative perspectives (e.g., Chin & Osborne, 2010) can help to instill in students critical thinking practices and an appreciation for scientific evidence. One study on middle school students’ explanation building during biology inquiry activities found that both cognitive and metacognitive guidance improved students’ content knowledge and the quality of their explanations (Wang, 2015). Specifically, metacognitive instruction that addressed standards for evaluating the quality of explanations improved students’ abilities to evaluate their peers’ ideas, and this ability was linked to better quality explanations. Thus, an additional question to explore is how to use technology to guide students’ engagement with peers’ ideas. Other studies have manipulated the kinds of guidance given to students on how to add and integrate new ideas to improve their explanations, and have yielded recommendations for the design of effective guidance (e.g., Gerard et al., 2016; Ryoo & Linn, 2016).
Following these examples, future studies with the Idea Manager might explore the use of metacognitive guidance to support more deliberative choices around the exchange of


ideas, cognitive guidance for applying strategies for evaluating ideas, and social guidance to promote more productive discourse between students. Comparing promising kinds of guidance, and comparing this guidance to a control group, would further advance our understanding of the best ways to support students in using their peers’ ideas to improve their understanding.

Conclusions Science inquiry entails participation in a global knowledge community. This involves developing and practicing various collaborative skills, including expressing scientifically informed ideas, sharing ideas with peers, and evaluating multiple sources of information. This study described a curriculum-integrated tool that supports students in exchanging ideas throughout the process of constructing scientific explanations. We presented classroom findings on how middle school students used the tool during a life sciences unit, and how the sources and diversity of their ideas related to the quality of their explanations. Our findings suggest that the Idea Manager was successful at breaking down the steps of Knowledge Integration into discrete activities, and in offering a more fine-grained glimpse into students’ processes for engaging with ideas during science inquiry. As confirmed by our classroom observations, the ability to view students’ ideas in progress allowed teachers to attend to their students’ ideas throughout the inquiry activities. Teachers could track students’ ideas as they were being developed, and offer timely guidance both to individual students and to the whole class, rather than needing to wait until students had already completed their explanations. These findings offer an example of how technologies such as the Idea Manager, which enable and support new ways of engaging with peers’ ideas, also present students with opportunities to participate in authentic collaborative science inquiry. Acknowledgements A preliminary version of this analysis was presented at ICLS 2014. Portions of this manuscript are owned by the International Society for the Learning Sciences (ISLS). ISLS has granted the authors permission to reuse these portions for publication.

Funding Funding for this research was provided by a DR K-12 award from the National Science Foundation [grant #1119670].

References

American Library Association. (2015). Framework for Information Literacy for Higher Education. Document ID: b910a6c4-6c8a-0d44-7dbc-a5dcbd509e3f. Retrieved from http://www.ala.org/acrl/standards/ilframework. Accessed 20 July 2018.
Amigues, R. (1988). Peer interaction in solving physics problems: Sociocognitive confrontation and metacognitive aspects. Journal of Experimental Child Psychology, 45(1), 141–158.
Andriessen, J. (2006). Collaboration in computer conferencing. In A. M. O'Donnell, C. E. Hmelo-Silver, & G. Erkens (Eds.), Collaborative learning, reasoning, and technology (pp. 197–230). Mahwah: Erlbaum.
Asterhan, C. S., & Schwarz, B. B. (2009). Argumentation and explanation in conceptual change: Indications from protocol analyses of peer-to-peer dialog. Cognitive Science, 33(3), 374–400.
Bailin, S. (2002). Critical thinking and science education. Science & Education, 11(4), 361–375.
Bargh, J. A., & Schul, Y. (1980). On the cognitive benefits of teaching. Journal of Educational Psychology, 72(5), 593.
Barron, B. (2003). When smart groups fail. The Journal of the Learning Sciences, 12(3), 307–359.

Barzilai, S., & Zohar, A. (2012). Epistemic thinking in action: Evaluating and integrating online sources. Cognition and Instruction, 30, 39–85. https://doi.org/10.1080/07370008.2011.636495.
Bell, P. (1997). Using argument representations to make thinking visible for individuals and groups. In Proceedings of the 2nd international conference on Computer Support for Collaborative Learning (pp. 10–19). International Society of the Learning Sciences.
Bell, P. (2013). Promoting students' argument construction and collaborative debate in the science classroom. In Internet environments for science education (pp. 143–172). Routledge.
Bell, P., & Linn, M. C. (2000). Scientific arguments as learning artifacts: Designing for learning from the web with KIE. International Journal of Science Education [Special Issue], 22(8), 797–817.
Bereiter, C. (2002). Education and mind in the knowledge age. Mahwah: Lawrence Erlbaum Associates.
Bereiter, C., & Scardamalia, M. (1989). Intentional learning as a goal of instruction. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 361–392). Hillsdale: Lawrence Erlbaum Associates.
Bereiter, C., & Scardamalia, M. (1993). Surpassing ourselves: An inquiry into the nature and implications of expertise. Chicago and La Salle: Open Court.
Berland, L. K., & Reiser, B. J. (2009). Making sense of argumentation and explanation. Science Education, 93(1), 26–55.
Blatchford, P., Kutnick, P., Baines, E., & Galton, M. (2003). Toward a social pedagogy of classroom group work. International Journal of Educational Research, 39(1), 153–172.
Carey, S. (2009). The Origin of Concepts. Oxford: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780195367638.001.0001.
Chen, Z., & Klahr, D. (1999). All other things being equal: Acquisition and transfer of the control of variables strategy. Child Development, 70, 1098–1120.
Chi, M. (2000). Self-explaining expository texts: The dual processes of generating inferences and repairing mental models. Advances in Instructional Psychology, 5, 161–238.
Chi, M. T., Bassok, M., Lewis, M. W., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. Cognitive Science, 13(2), 145–182.
Chi, M. T., De Leeuw, N., Chiu, M. H., & LaVancher, C. (1994). Eliciting self-explanations improves understanding. Cognitive Science, 18(3), 439–477.
Chi, M. T. H., Siler, S. A., Jeong, H., Yamauchi, T., & Hausmann, R. G. (2001). Learning from human tutoring. Cognitive Science, 25(4), 471–533.
Chin, C., & Osborne, J. (2010). Students' questions and discursive interaction: Their impact on argumentation during collaborative group discussions in science. Journal of Research in Science Teaching, 47, 883–908. https://doi.org/10.1002/tea.20385.
Chinn, C. A., & Brewer, W. F. (1998). An empirical test of a taxonomy of responses to anomalous data in science. Journal of Research in Science Teaching, 35(6), 623–654.
Cohen, E. G. (1994). Restructuring the classroom: Conditions for productive small groups. Review of Educational Research, 64(1), 1–35.
Coirier, P., Andriessen, J., & Chanquoy, L. (1999). From planning to translating: The specificity of argumentative writing. Foundations of argumentative text processing, 1–28.
Cropley, A. (2006). In praise of convergent thinking. Creativity Research Journal, 18(3), 391–404.
Cypress, B. S. (2017). Rigor or reliability and validity in qualitative research: Perspectives, strategies, reconceptualization, and recommendations. Dimensions of Critical Care Nursing, 36(4), 253–263.
De Jonge, K. M., Rietzschel, E. F., & Van Yperen, N. W. (2018). Stimulated by novelty? The role of psychological needs and perceived creativity. Personality and Social Psychology Bulletin, 44(6), 851–867.
Derry, S. J., Pea, R. D., Barron, B., Engle, R. A., Erickson, F., Goldman, R., et al. (2010). Conducting video research in the learning sciences: Guidance on selection, analysis, technology, and ethics. The Journal of the Learning Sciences, 19(1), 3–53.
Dillenbourg, P. (2002). Over-scripting CSCL: The risks of blending collaborative learning with instructional design. In P. A. Kirschner (Ed.), Three worlds of CSCL: Can we support CSCL? (pp. 61–91). Heerlen: Open Universiteit Nederland.
diSessa, A. A. (2000). Changing minds: Computers, learning, and literacy. Cambridge: MIT Press.
Duschl, R. A., & Osborne, J. (2002). Supporting and promoting argumentation discourse in science education. Studies in Science Education, 38(1), 39–72. https://doi.org/10.1080/03057260208560187.
Eylon, B. S., & Linn, M. C. (1988). Learning and instruction: An examination of four research perspectives in science education. Review of Educational Research, 58(3), 251–301.
Facione, P. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction (The Delphi Report).
Fischer, F., & Mandl, H. (2005). Knowledge convergence in computer-supported collaborative learning: The role of external representation tools. The Journal of the Learning Sciences, 14(3), 405–441.
Fitzgerald, J. (1992). Variant views about good thinking during composing: Focus on revision. In M. Pressley, K. R. Harris, & J. T. Guthrie (Eds.), Promoting Academic Competence and Literacy in School (pp. 337–358). New York: Academic.
Forman, E. A., Cazden, C. B., & Wertsch, J. V. (1985). Exploring Vygotskian perspectives in education. In D. Faulkner, K. Littleton, & M. Woodhead (Eds.), Learning Relationships in the Classroom. London and New York: Routledge.
Furberg, A. (2016). Teacher support in computer-supported lab work: Bridging the gap between lab experiments and students' conceptual understanding. International Journal of Computer-Supported Collaborative Learning, 11, 89–113. https://doi.org/10.1007/s11412-016-9229-3.
Furtak, E. M., Kiemer, K., Circi, R. K., Swanson, R., de León, V., Morrison, D., & Heredia, S. C. (2016). Teachers' formative assessment abilities and their relationship to student learning: Findings from a four-year intervention study. Instructional Science, 44(3), 267–291.
Furtak, E. M., & Ruiz-Primo, M. A. (2008). Making students' thinking explicit in writing and discussion: An analysis of formative assessment prompts. Science Education, 92(5), 799–824. https://doi.org/10.1002/sce.v92:5.
Gerard, L. F., Ryoo, K., McElhaney, K. W., Liu, O. L., Rafferty, A. N., & Linn, M. C. (2016). Automated guidance for student inquiry. Journal of Educational Psychology, 108(1), 60.
Gillies, R. M., & Haynes, M. (2011). Increasing explanatory behaviour, problem-solving, and reasoning within classes using cooperative group work. Instructional Science, 39(3), 349–366.
Glachan, M., & Light, P. (1982). Peer interaction and learning: Can two wrongs make a right. Social cognition: Studies of the development of understanding, 238–262.
Halatchliyski, I., Kimmerle, J., & Cress, U. (2011). Divergent and convergent knowledge processes on Wikipedia. In Proceedings of the Computer Supported Collaborative Learning conference (pp. 566–570).
Halverson, K., Siegel, M., & Freyermuth, S. (2009). Lenses for framing decisions: Undergraduates' decision making about stem cell research. International Journal of Science Education, 31(9), 1249–1268. https://doi.org/10.1080/09500690802178123.
Hargittai, E., Fullerton, L., Menchen-Trevino, E., & Thomas, K. Y. (2010). Trust online: Young adults' evaluation of web content. International Journal of Communication, 4, 468–494.
Hashweh, M. Z. (2016). The complexity of teaching density in middle school. Research in Science & Technological Education, 34(1), 1–24.
Hatano, G. (1993). Time to merge Vygotskian and constructivist conceptions of knowledge acquisition. In E. A. Forman, N. Minick, & C. A. Stone (Eds.), Contexts for learning: Sociocultural dynamics in children's development (pp. 153–166). New York: Oxford University Press.
Hausmann, R. G., Chi, M. T., & Roy, M. (2004). Learning from collaborative problem solving: An analysis of three hypothesized mechanisms. In Proceedings of the Cognitive Science Society (Vol. 26, No. 26).
Herrenkohl, L. R., Palincsar, A. S., DeWater, L. S., & Kawasaki, K. (1999). Developing scientific communities in classrooms: A sociocognitive approach. The Journal of the Learning Sciences, 8, 451–493. https://doi.org/10.1080/10508406.1999.9672076.
Hmelo-Silver, C., Jeong, H., Faulkner, R., & Hartley, K. (2017). Computer-supported collaborative learning in STEM domains: Towards a meta-synthesis. In Proceedings of the 50th Hawaii International Conference on System Sciences.
Hogan, K., Nastasi, B. K., & Pressley, M. (1999). Discourse patterns and collaborative scientific reasoning in peer and teacher-guided discussions. Cognition and Instruction, 17(4), 379–432.
Hong, H. Y., Chen, B., & Chai, C. S. (2016). Exploring the development of college students' epistemic views during their knowledge building activities. Computers & Education, 98, 1–13.
Hong, H. Y., & Chiu, C. H. (2016). Understanding how students perceive the role of ideas for their knowledge work in a knowledge-building environment. Australasian Journal of Educational Technology, 32(1).
Howe, C., Tolmie, A., & Rodgers, C. (1990). Physics in the primary school: Peer interaction and the understanding of floating and sinking. European Journal of Psychology of Education, 5(4), 459–475.
Howe, C., Tolmie, A., & Rodgers, C. (1992). The acquisition of conceptual knowledge in science by primary school children: Group interaction and the understanding of motion down an incline. British Journal of Developmental Psychology, 10(2), 113–130.
Hsi, S., & Hoadley, C. M. (1997). Productive discussion in science: Gender equity through electronic discourse. Journal of Science Education and Technology, 6(1), 23–36.
Hynd, C., & Alvermann, D. E. (1986). The role of refutation text in overcoming difficulty with science concepts. Journal of Reading, 29(5), 440–446.
Iding, M. K., Crosby, M. E., Auernheimer, B., & Klemm, E. B. (2009). Web site credibility: Why do people believe what they believe? Instructional Science, 37, 43–63. https://doi.org/10.1007/s11251-008-9080-7.
Johnson, P. (1998). Progression in children's understanding of a 'basic' particle theory: A longitudinal study. International Journal of Science Education, 20(4), 393–412. https://doi.org/10.1080/0950069980200402.
Kang, H., Scharmann, L. C., Kang, S., & Noh, T. (2010). Cognitive conflict and situational interest as factors influencing conceptual change. International Journal of Environmental and Science Education, 5(4), 383–405.
Kang, S., Scharmann, L. C., & Noh, T. (2004). Reexamining the role of cognitive conflict in science concept learning. Research in Science Education, 34(1), 71–96. https://doi.org/10.1023/B:RISE.0000021001.77568.b3.
Kapur, M., Voiklis, J., & Kinzer, C. K. (2008). Sensitivities to early exchange in synchronous computer supported collaborative learning (CSCL) groups. Computers and Education, 51(1), 54–66.
Kazemi, E., & Stipek, D. (2001). Promoting conceptual thinking in four upper-elementary mathematics classrooms. Elementary School Journal, 102, 59–80.
Keil, F. C. (2006). Explanation and understanding. Annual Review of Psychology, 57, 227–254.
Kennedy, C. A., Brown, N. J. S., Draney, K., & Wilson, M. (2005). Using progress variables and embedded assessment to improve teaching and learning. Paper presented at the annual meeting of the American Educational Research Association, Montreal, Canada.
Knudson, R. E. (1992). The development of written argumentation: An analysis and comparison of argumentative writing at four grade levels. Child Study Journal, 22(3), 167–184.
Koschmann, T. (2003). CSCL, argumentation, and Deweyan inquiry. In Arguing to learn (pp. 261–269). Dordrecht: Springer.
Kuhn, D. (1989). Children and adults as intuitive scientists. Psychological Review, 96(4), 674.
Kuhn, D. (1991). The skills of argument. Cambridge, England: Cambridge University Press.
Kuhn, D., Garcia-Mila, M., Zohar, A., & Andersen, C. (1995). Strategies of knowledge acquisition. Monographs of the Society for Research in Child Development, 60(4, Serial No. 245).
Kuhn, D., & Pearsall, S. (2000). Developmental origins of scientific thinking. Journal of Cognition and Development, 1(1), 113–129.
Ladd, B. C. (2003). It's all writing: Experience using rewriting to learn in introductory computer science. Journal of Computing Sciences in Colleges, 18(5), 57–64.
Latour, B., & Woolgar, S. (2013). Laboratory life: The construction of scientific facts. Princeton University Press.
Leitão, S. (2003). Evaluating and selecting counterarguments: Studies of children's rhetorical awareness. Written Communication, 20(3), 269–306.
Lemke, J. (1990). Talking science: Language, learning and values. Norwood: Ablex.
Linn, M. C., & Eylon, B.-S. (2011). Science learning and instruction: Taking advantage of technology to promote knowledge integration. New York: Routledge.
Linn, M. C., & Hsi, S. (2000). Computers, teachers, peers: Science learning partners. Psychology Press.
Liu, O. L., Lee, H. S., & Linn, M. C. (2011). Measuring knowledge integration: Validation of four-year assessments. Journal of Research in Science Teaching, 48(9), 1079–1107.
Liu, X., & Lesniak, K. (2006). Progression in children's understanding of the matter concept from elementary to high school. Journal of Research in Science Teaching, 43(3), 320–347. https://doi.org/10.1002/(ISSN)1098-2736.
Lombardi, D., Bickel, E. S., Bailey, J. M., & Burrell, S. (2018). High school students' evaluations, plausibility (re)appraisals, and knowledge about topics in Earth science. Science Education, 102(1), 153–177.
Matuk, C. F., & King Chen, J. (2011). The WISE Idea Manager: A tool to scaffold the collaborative construction of evidence-based explanations from dynamic scientific visualizations. In Proceedings of the 9th International Conference on Computer Supported Collaborative Learning (CSCL2011): Connecting computer supported collaborative learning to policy and practice, July 4–8, 2011. The University of Hong Kong, Hong Kong, China.
Matuk, C. F., & Linn, M. C. (2013, April 27–May 1). Technology integration to scaffold and assess students' use of visual evidence in science inquiry. Paper presented at the American Educational Research Association Meeting (AERA2013): Education and Poverty: Theory, Research, Policy and Praxis, San Francisco, CA, USA.
Matuk, C., & Linn, M. C. (2015). Examining the real and perceived impacts of a public idea repository on literacy and science inquiry. In CSCL'15: Proceedings of the 11th International Conference for Computer Supported Collaborative Learning (Vol. 1, pp. 150–157). Gothenburg, Sweden: International Society of the Learning Sciences.
Matuk, C., McElhaney, K. W., Chen, J. K., Lim-Breitbart, J., Kirkpatrick, D., & Linn, M. C. (2016). Iteratively refining a science explanation tool through classroom implementation and stakeholder partnerships. International Journal of Designs for Learning, 7(2).
Matuk, C., McElhaney, K., Miller, D., King Chen, J., Lim-Breitbart, J., Terashima, H., Kwan, G., & Linn, M. C. (2013). Reflectively prototyping a tool for exchanging ideas. In CSCL'13: Proceedings of the 10th International Conference on Computer Supported Collaborative Learning (Vol. 2, pp. 101–104). Madison, WI: International Society of the Learning Sciences.
Mayer, R. E. (1984). Aids to text comprehension. Educational Psychologist, 19(1), 30–42.
Mayer, R. E. (2002). Multimedia learning. Psychology of Learning and Motivation, 41, 85–139.
McElhaney, K., Miller, D., Matuk, C., & Linn, M. C. (2012). Using the Idea Manager to promote coherent understanding of inquiry investigations. In ICLS'12: Proceedings of the 10th International Conference for the Learning Sciences (Vol. 1, pp. 323–330). Sydney: International Society of the Learning Sciences.
McGrew, S., Breakstone, J., Ortega, T., Smith, M., & Wineburg, S. (2018). Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory & Research in Social Education, 1–29.
McNeill, K. L., & Krajcik, J. (2008). Inquiry and scientific explanations: Helping students use evidence and reasoning. Science as inquiry in the secondary setting, 121–134.
McNeill, K. L., Lizotte, D. J., Krajcik, J., & Marx, R. W. (2006). Supporting students' construction of scientific explanations by fading scaffolds in instructional materials. The Journal of the Learning Sciences, 15(2), 153–191.
McNeill, K. L., & Pimentel, D. S. (2010). Scientific discourse in three urban classrooms: The role of the teacher in engaging high school students in argumentation. Science Education, 94(2), 203–229.
Mercer, N., Wegerif, R., & Dawes, L. (1999). Children's talk and the development of reasoning in the classroom. British Educational Research Journal, 25(1), 95–111.
Metzger, M. J., Flanagin, A. J., & Medders, R. B. (2010). Social and heuristic approaches to credibility evaluation online. Journal of Communication, 60(3), 413–439.
Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Qualitative data analysis: A methods sourcebook. CA, US: Sage Publications.
Mittlefehldt, S., & Grotzer, T. (2003). Using metacognition to facilitate the transfer of causal models in learning density and pressure. National Association for Research in Science Teaching Conference, Philadelphia, PA.
Mortimer, E., & Scott, P. (2003). Meaning making in secondary science classrooms. Maidenhead: Open University Press.
Morse, J. M., Barrett, M., Mayan, M., Olson, K., & Spiers, J. (2002). Verification strategies for establishing reliability and validity in qualitative research. International Journal of Qualitative Methods, 1(2), 13–22.
Moskaliuk, J., Kimmerle, J., & Cress, U. (2012). Collaborative knowledge building with wikis: The impact of redundancy and polarity. Computers & Education, 58(4), 1049–1057. https://doi.org/10.1016/j.compedu.2011.11.024.
NGSS Lead States. (2013). Next Generation Science Standards: For States, By States. Washington, DC: The National Academies Press.
Nussbaum, E. M., & Kardash, C. M. (2005). The effects of goal instructions and text on the generation of counterarguments during writing. Journal of Educational Psychology, 97, 157–169.
O'Connor, M. C., Michaels, S., & Chapin, S. H. (2015). "Scaling down" to explore the role of talk in learning: From district intervention to controlled classroom study. In L. B. Resnick, C. Asterhan, & S. N. Clarke (Eds.), Socializing Intelligence Through Talk and Dialogue. Washington, DC: American Educational Research Association.
O'Donnell, A. M., & Dansereau, D. F. (1992). Scripted cooperation in student dyads: A method for analyzing and enhancing academic learning and performance. Interaction in cooperative groups: The theoretical anatomy of group learning, 120–141.
O'Keefe, D. J. (1999). How to handle opposing arguments in persuasive messages: A meta-analytic review of the effects of one-sided and two-sided messages. Annals of the International Communication Association, 22(1), 209–249.
Osborne, J. (2010). Arguing to learn in science: The role of collaborative, critical discourse. Science, 328(5977), 463–466.
Osborne, J., Erduran, S., & Simon, S. (2004). Enhancing the quality of argumentation in school science. Journal of Research in Science Teaching, 41, 994–1020.
Palincsar, A. S., & Brown, A. L. (1984). Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities. Cognition and Instruction, 1(2), 117–175.
Pan, B., Hembrooke, H., Joachims, T., Lorigo, L., Gay, G., & Granka, L. (2007). In Google we trust: Users' decisions on rank, position, and relevance. Journal of Computer-Mediated Communication, 12, 801–823. https://doi.org/10.1111/j.1083-6101.2007.00351.x.
Perkins, D. N. (1987). Reasoning as it is and could be: An empirical perspective. In D. M. Topping, D. C. Crowell, & V. N. Kobayashi (Eds.), Thinking across cultures: The third international conference on thinking (pp. 175–194). Hillsdale: Erlbaum.
Perkins, D. N., Farady, M., & Bushey, B. (1991). Everyday reasoning and the roots of intelligence. In J. F. Voss, D. N. Perkins, & J. W. Segal (Eds.), Informal reasoning and education (pp. 83–105). Hillsdale: Lawrence Erlbaum Associates, Inc.
Piaget, J., & Inhelder, B. (1974). The child's construction of quantities: Conservation and atomism. London: Routledge and Kegan Paul.
Rau, M. A., Bowman, H. E., & Moore, J. W. (2017). An adaptive collaboration script for learning with multiple visual representations in chemistry. Computers & Education, 109, 38–55.
Roscoe, R. D., & Chi, M. (2008). Tutor learning: The role of instructional explaining and responding to questions. Instructional Science, 36(4), 321–350.
Rowe, M. B. (1974). Relation of wait-time and rewards to the development of language, logic, and fate control: Part II—Rewards. Journal of Research in Science Teaching, 11(4), 291–308.
Ruiz-Primo, M. A., & Furtak, E. M. (2007). Exploring teachers' informal formative assessment practices and students' understanding in the context of scientific inquiry. Journal of Research in Science Teaching, 44(1), 57–84.
Rummel, N., & Spada, H. (2005). Learning to collaborate: An instructional approach to promoting collaborative problem solving in computer-mediated settings. The Journal of the Learning Sciences, 14(2), 201–241.
Ryoo, K., & Linn, M. C. (2016). Designing automated guidance for concept diagrams in inquiry instruction. Journal of Research in Science Teaching, 53(7), 1003–1035.
Sampson, V., Grooms, J., & Walker, J. P. (2011). Argument-driven inquiry as a way to help students learn how to participate in scientific argumentation and craft written arguments: An exploratory study. Science Education, 95, 217–257. https://doi.org/10.1002/sce.20421.
Sandelowski, M. (2010). What's in a name? Qualitative description revisited. Research in Nursing & Health, 33(1), 77–84.
Sandoval, W. A., & Millwood, K. A. (2005). The quality of students' use of evidence in written scientific explanations. Cognition and Instruction, 23(1), 23–55.
Scardamalia, M. (2002). Collective cognitive responsibility for the advancement of knowledge. In B. Smith (Ed.), Liberal education in a knowledge society (pp. 76–98). Chicago: Open Court.
Scardamalia, M., & Bereiter, C. (1994). Computer support for knowledge-building communities. Journal of the Learning Sciences, 3(3), 265–283.
Scardamalia, M., & Bereiter, C. (2006). Knowledge building: Theory, pedagogy, and technology. The Cambridge Handbook of the Learning Sciences, 97–115.
Scardamalia, M., & Bereiter, C. (2014). Knowledge building and knowledge creation: Theory, pedagogy, and technology. Cambridge Handbook of the Learning Sciences, 397–417.
Scardamalia, M., Bereiter, C., McLean, R. S., Swallow, J., & Woodruff, E. (1989). Computer supported intentional learning environments. Journal of Educational Computing Research, 5, 51–68.
Schauble, L. (1996). The development of scientific reasoning in knowledge-rich contexts. Developmental Psychology, 32(1), 102.
Schellens, T., Van Keer, H., De Wever, B., & Valcke, M. (2007). Scripting by assigning roles: Does it improve knowledge construction in asynchronous discussion groups? International Journal of Computer-Supported Collaborative Learning, 2, 225–246.
Schwarz, B. B., Neuman, Y., & Biezuner, S. (2000). Two wrongs may make a right… if they argue together! Cognition and Instruction, 18(4), 461–494.
Shavelson, R. J., Young, D. B., Ayala, C. C., Brandon, P. R., Furtak, E. M., Ruiz-Primo, M. A., Tomita, M. K., & Yin, Y. (2008). On the impact of curriculum-embedded formative assessment on learning: A collaboration between curriculum and assessment developers. Applied Measurement in Education, 21(4), 295–314. https://doi.org/10.1080/08957340802347647.
Siegler, R. S. (2002). Microgenetic studies of self-explanation. Microdevelopment: Transition processes in development and learning, 31–58.
Silver, E. A., Ghousseini, H., Gosen, D., Charalambous, C., & Strawhun, B. T. F. (2005). Moving from rhetoric to praxis: Issues faced by teachers in having students consider multiple solutions for problems in the mathematics classroom. The Journal of Mathematical Behavior, 24(3–4), 287–301.
Simon, D., & Holyoak, K. J. (2002). Structural dynamics of cognition: From consistency theories to constraint satisfaction. Personality and Social Psychology Review, 6(4), 283–294.
Simon, S., Erduran, S., & Osborne, J. (2006). Learning to teach argumentation: Research and development in the science classroom. International Journal of Science Education, 28(2–3), 235–260.
Skoumios, M. (2009). The effect of sociocognitive conflict on students' dialogic argumentation about floating and sinking. International Journal of Environmental and Science Education, 4(4), 381–399.
Slotta, J. D., Chi, M. T., & Joram, E. (1995). Assessing students' misclassifications of physics concepts: An ontological basis for conceptual change. Cognition and Instruction, 13(3), 373–400.
Slotta, J. D., & Linn, M. C. (2009). WISE science: Web-based inquiry in the classroom. Teachers College Press.
Smith, C., Carey, S., & Wiser, M. (1985). On differentiation: A case study of the development of the concepts of size, weight, and density. Cognition, 21(3), 177–237. https://doi.org/10.1016/0010-0277(85)90025-3.
Smith, C., Maclin, D., Grosslight, L., & Davis, H. (1997). Teaching for understanding: A study of students' preinstruction theories of matter and a comparison of the effectiveness of two approaches to teaching about matter and density. Cognition and Instruction, 15(3), 317–393.
Smith, C., Snir, J., & Grosslight, L. (1992). Using conceptual models to facilitate conceptual change: The case of weight-density differentiation. Cognition and Instruction, 9(3), 221–283. https://doi.org/10.1207/s1532690xci0903_3.
Stanford, C., Moon, A., Towns, M., & Cole, R. (2016). Analysis of instructor facilitation strategies and their influences on student argumentation: A case study of a process oriented guided inquiry learning physical chemistry classroom. Journal of Chemical Education, 93(9), 1501–1513.
Tao, P.-K., & Gunstone, R. F. (1999). Conceptual change in science through collaborative learning at the computer. International Journal of Science Education, 21(1), 39–57.
Tirosh, D., & Stavy, R. (1999). Intuitive rules: A way to explain and predict students' reasoning. Educational Studies in Mathematics, 38(1/3), 51–66. https://doi.org/10.1023/A:1003436313032.
Toth, E. E., Suthers, D. D., & Lesgold, A. M. (2002). "Mapping to know": The effects of representational guidance and reflective assessment on scientific inquiry. Science Education, 86(2), 264–286.
Toulmin, S. (1958). The uses of argument. Cambridge: Cambridge University Press.
van Boxtel, C., van der Linden, J., & Kanselaar, G. (2000). Collaborative learning tasks and the elaboration of conceptual knowledge. Learning and Instruction, 10, 311–330.
Veenman, S., Denessen, E., van den Akker, A., & van der Rijt, J. (2005). Effects of a cooperative learning program on the elaborations of students during help seeking and help giving. American Educational Research Journal, 42, 115–151.
Vitale, J., Applebaum, L., & Linn, M. (2017). Individual versus shared design goals in a graph construction activity. Philadelphia: International Society of the Learning Sciences.
Vogel, F., Wecker, C., Kollar, I., & Fischer, F. (2017). Socio-cognitive scaffolding with computer-supported collaboration scripts: A meta-analysis. Educational Psychology Review, 29(3), 477–511.
Walraven, A., Brand-Gruwel, S., & Boshuizen, H. (2009). How students evaluate information and sources when searching the world wide web for information. Computers & Education, 52, 234–246. https://doi.org/10.1016/j.compedu.2008.08.003.
Wang, C. Y. (2015). Scaffolding middle school students' construction of scientific explanations: Comparing a cognitive versus a metacognitive evaluation approach. International Journal of Science Education, 37(2), 237–271.
Warner, L. B. (2008). How do students' behaviors relate to the growth of their mathematical ideas? Journal of Mathematical Behavior, 27(3), 206–227.
Webb, N. M., Franke, M. L., Ing, M., Wong, J., Fernandez, C. H., Shin, N., & Turrou, A. C. (2014). Engaging with others' mathematical ideas: Interrelationships among student participation, teachers' instructional practices, and learning. International Journal of Educational Research, 63, 79–93.
Webb, N. M., Troper, J. D., & Fall, R. (1995). Constructive activity and learning in collaborative small groups. Journal of Educational Psychology, 87(3), 406–423.
Wecker, C., & Fischer, F. (2014). Where is the evidence? A meta-analysis on the role of argumentation for the acquisition of domain-specific knowledge in computer-supported collaborative learning. Computers & Education, 75, 218–228.
Weinberger, A., Ertl, B., Fischer, F., & Mandl, H. (2005). Epistemic and social scripts in computer-supported collaborative learning. Instructional Science, 33(1), 1–30.
Weinberger, A., Stegmann, K., & Fischer, F. (2007). Knowledge convergence in collaborative learning: Concepts and assessment. Learning and Instruction, 17(4), 416–426.
Weinstein, C. E., & Mayer, R. E. (1986). The teaching of learning strategies. In M. Wittrock (Ed.), Handbook of Research on Teaching (pp. 315–327). New York: Macmillan.
Westerwick, A. (2013). Effects of sponsorship, web site design, and Google ranking on the credibility of online information. Journal of Computer-Mediated Communication, 18, 194–211. https://doi.org/10.1111/jcc4.12006.
Whitebread, D., Bingham, S., Grau, V., Pino Pasternak, D., & Sangster, C. (2007). Development of metacognition and self-regulated learning in young children: Role of collaborative and peer-assisted learning. Journal of Cognitive Education and Psychology, 6(3), 433–455.
Wittrock, M. C. (1990). Generative processes of comprehension. Educational Psychologist, 24(4), 345–376.
Yang, S. J., & Chen, I. Y. (2008). A social network-based system for supporting interactive collaboration in knowledge sharing over peer-to-peer network. International Journal of Human-Computer Studies, 66(1), 36–50.
Zhang, M., & Quintana, C. (2012). Scaffolding strategies for supporting middle school students' online inquiry processes. Computers & Education, 58(1), 181–196.