Read "Read-Only Participants: A Case for Student Communication in Online Classes" by Nagel, Blignaut, and Cronjé, which is located in the Resource section of Topic 1.
After reading the Nagel, Blignaut, and Cronjé article, write a 250-500-word summary of it.
Be sure to include a discussion of the research problem, questions, method, findings, and implications discussed by the authors.
To cite this article: L. Nagel, A. S. Blignaut & J. C. Cronjé (2009) Read-only participants: a case for student communication in online classes, Interactive Learning Environments, 17:1, 37-51, DOI: 10.1080/10494820701501028
To link to this article: https://doi.org/10.1080/10494820701501028
Read-only participants: a case for student communication in online classes
L. Nagel (a)*, A.S. Blignaut (b) and J.C. Cronjé (c)
(a) University of Pretoria, South Africa; (b) North-West University, South Africa; (c) Cape Peninsula University of Technology, South Africa
(Received 5 April 2007; final version received 25 May 2007)
The establishment of an online community is widely held as the most important prerequisite for successful course completion and depends on an interaction between a peer group and a facilitator. Beaudoin reasoned that online students sometimes engage and learn even when not taking part in online discussions. The context of this study was an online course on web-based education for a Masters degree in computer-integrated education at the University of Pretoria. We used a mixed methodology approach to investigate how online activity and discussion postings relate to learning and course completion. We also investigated how student collaborative behaviour and integration into the community related to success. Although the quantitative indices measured showed highly significant differences between the stratifications of student performance, there were notable exceptions unexplained by the trends. The class harboured a well-functioning online learning community. We also uncovered the discontent that students in the learning community felt towards invisible students who were absent without reason from group assignments or who made shallow and insufficient contributions. Student online visibility and participation can take many forms, like read-only participants who skim over or deliberately harvest others' discussions. Other students can be highly visible without contributing. Students who anticipate limited access due to poor connectivity, high costs or other reasons can manage their log-in time effectively and gain maximum benefit. Absent and seldom-contributing students risk forsaking the benefits of the virtual learning community. High-quality contributions, rather than sheer quantity, build trust among mature students. We suggest how to avoid read-only participation: communicate the required number of online classroom postings; encourage submission of high quality, thoughtful postings; grade discussions and give formative feedback; award individual grades for group projects and rotate members of groups; augment facilitator communication with Internet-independent media to convey important information. Read-only participants disrupt the formation of a virtual community of learners and compromise learning.
Keywords: higher education; web-based learning; participation; lurkers; virtual community of learners
Background
As more formal education courses are available online, quality and non-completion remain problems:
While online course enrolments continue to climb, retention and success rates in such courses and programs are frequently reported as typically lower than those delivered in
a traditional classroom format; those of us in roles that support online students have a role in reversing that trend! (Schreck, 2006)
Researchers often measure the success of online learning as students' perception of learning and course throughput rates. Drop-out rates for online courses range from 20 to 50%, often 10–20% higher than for equivalent contact courses (Bernard, Brauer, Abrami, & Surkes, 2004). Searching for a model to predict student success in online learning, Bernard et al. (2004) found that students' frame of mind can predict readiness for learning and affect course outcomes, while "prior achievement is still the best predictor of future achievement" (Bernard et al., 2004, p. 44).
Research shows that online participation is necessary to ensure successful course completion (Klemm, 1998; Rovai & Barnum, 2003; Swan, Shea, Fredericksen, Pickett, & Pelz, 2000). Clark and Feldon (2005) concluded that a facilitator who participates and interacts with students prevents them from abandoning their course. Better cognitive outcomes occur when students engage and form a virtual community of learners. The development of such a community depends on students' online interaction with their peers and the facilitator. Learner satisfaction, perseverance, and cognitive outcomes characterize the formation of a virtual learning community. Some contest participation as a prerequisite to learning, claiming students learn sufficiently by observation (Beaudoin, 2002; Sutton, 2001), and lobby for leniency towards lurking or read-only participation. This article responds to Beaudoin's (2002) article "Learning or lurking? Tracking the 'invisible' online student." He reasoned that students sometimes engage and learn even when not taking part in online discussions with faculty and other students and showed that low profile students:
spend a significant amount of time in learning-related tasks, including logging on, even when not visibly participating, and they feel they are still learning and benefiting from this low-profile approach to their online studies. (p. 147)
We investigated the importance of student online "visibility" apparent in the quantity and quality of participation. We explored as a case study the successful completion of a postgraduate online course by asking the following research questions.
(1) How did online participation relate to learning and successful course completion?
(2) How did participation influence the learning community?
Literature
The debate on online participation
Taking part in discussions
A learning management system (LMS) tracks progress and performance and reveals students who do not log in to their online classroom or who log in without participating. Klemm (1998) blamed classroom-based teaching where students expect entertainment for conditioning them to passive learning. Therefore, they seldom realize the benefits of participating actively in online discussions, naturally
lurking. Well-facilitated online discussions can be more inclusive than classroom discussions by including introverted students and enabling better quality interaction (Cox, Carr, & Hall, 2004; Prammanee, 2003). Rovai and Barnum (2003) claimed that passive online learning through "listening" without participation produces no measurable increase in knowledge, as they could predict perceived learning through the number of messages posted. Others have also reported that distributed students who participate in dynamic discussions had better course completion rates and that failing students interacted less frequently (Davies & Graff, 2005; Swan et al., 2000). Active online participation also benefits learning.
Improved learning
Deep cognitive learning (Prammanee, 2003) and high levels of interactivity are possible in online discussions, as students can prepare well-considered contributions (Kettner-Polley, 2005). According to Carr, Cox, Eden, and Hanslo (2004), students who focused on building knowledge and collaborative interactions had a superior average performance, as challenging online interactions promote understanding. Interactive learning provides an instructor with insight into student misconceptions, difficulties, conceptual problems, and verbal pitfalls. Asking leading questions elicits insights into what students understand, more than simply telling them the answer. Immediate feedback from their peers and instructors and social interaction built into the online discussions contribute to learning (Collins, Brown, & Holum, 1991). Collaborative learning activities contribute to deep learning, critical thinking skills, a shared understanding, and long-term retention (Garrison, Anderson, & Archer, 2001).
Consistency in course design, interaction with course instructors, and active discussion have been consistently shown to significantly influence the success of online courses. It is posited that the reason for these findings relates to the importance of building community in online courses. (Swan et al., 2000, p. 513)
Community of learners
Interaction is conducive to the emergence of a community of practice (Collins et al., 1991) and a virtual community of learners (Collison, Elbaum, Haavind, & Tinker, 2000). Learning from your peers in a structured way can ameliorate the social isolation online students often experience (Boud, Cohen, & Sampson, 1999). Collaborative learning groups solve problems while sharing and clarifying ideas (Cox et al., 2004). In a collaborative learning environment students develop critical thinking skills and a shared understanding and deep learning, while retaining learning over the long term (Garrison et al., 2001). In a community of practice novices learn from experts by observing authentic tasks and executing progressively more advanced tasks themselves under an expert eye (Johnson, C. S., 2001). Complex tasks can be learnt in a community of practice wherein "participants actively communicate about and engage in the skills involved in expertise" (Collins et al., 1991, p. 16). Frequent, meaningful, valued, and dynamic discussions in an online course lead to the formation of a virtual learning community where students interact and support each other. According to Collison et al. (2000), members of a healthy online community of learners post regularly and collaborate with other participants, as well as teach and moderate the
online discussions spontaneously. Group cohesion, trust, respect, and belonging further characterize a community of learning (Kreijns, Kirschner, & Jochems, 2003). The formation of a community cannot be taken for granted. Some students do not participate fully.
The case for read-only participation
Legitimate non-participation
Non-participation may initially be legitimate, as peripheral online learners make limited entrances into the community, remaining on the outskirts, observing the activities of more advanced participants and learning from it (Collins et al., 1991). Sutton (2001, p. 223) also reasons that "direct interaction is not necessary for all students and that those who observe and actively process interactions between others will benefit through the process of vicarious interaction." As students increase their expertise, they move from the periphery to the centre (Carr et al., 2004), with increasing visibility. Beaudoin (2002) found that invisible students sometimes "spend a significant amount of time in learning-related tasks, including logging on, even when not visibly participating, and they feel they are still learning and benefiting from this low-profile approach to their online studies" (p. 147). Williams (2004) advocated using the term read-only participants (ROP) rather than the derogatory lurker for non-participatory students and vicarious interactors. He cautioned that while the ROPing students may be satisfied that their learning needs are met, they do not contribute to the larger community.
Inadvertent non-participation
Students do not actively participate in online discussions because they procrastinate, they feel isolated, or they are unfamiliar with the technology. They may also miss the course structure or control of discussions and therefore remain unconvinced of the course's benefits (Miller, Rainer, & Corley, 2003). Patterns of online participation and interaction can vary across cultural groups. In many developing countries the digital divide is increasing, due to an inadequate infrastructure and few Internet subscriptions (Roycroft & Anantho, 2003). The exclusive use of English in non-English-speaking cultures, economic development, and available bandwidth also affect student success.
Facilitator participation
Student interaction is not the only factor influencing collaboration, learning, and successful course completion. Students become more involved in an online conference when the facilitator participates as guide, providing extensive critique, feedback, and encouragement (Collison et al., 2000). An effective learning community requires an instructor with integrated social, cognitive, and teaching presence (Cox et al., 2004). Facilitators should teach critical thinking, effective communication, and problem-solving skills (Shavelson & Huang, 2003). The current vogue to embrace a constructivist pedagogy where the instructor withdraws from the online learning environment, allegedly to promote discovery and experimental learning activities, is unsubstantiated (Kirschner, Sweller, & Clark, 2006).
Automated e-learning or a lurking instructor presents an even greater impediment to learning than do lurking students.
Context of this study
We presented an 8-week course on web-based distance learning to Masters students on a computer-integrated education course at the University of Pretoria. This was an elective course in a programme usually presented in blended contact and online mode. We delivered this course entirely online using the WebCT™ Campus Edition as the LMS. The delivery mode enabled enrolment of a diverse cohort of 22 geographically distributed students with ages ranging from nearly 30 to nearly 50. The student ages represent baby boomers and generation X (Oblinger, 2003). The course followed a constructivist approach and consisted equally of theoretical and practical applications structured around eight salient online learning topics. Each week the students had to research online scholarly literature on the topic and post their contribution to the LMS discussions area, where they also posted peer reviews. Concurrently, students had to create web-based artefacts applying the theory. We provided formative feedback during the course and assessed students using integrated assessment of authentic tasks, focusing on outcomes.
In the latter half of the course students also created two rounds of group assignments in teams of five to seven, as experience of collaborative online work was a course outcome. One of these was a rubric to score online collaborative behaviour, strongly taking into account their contributions to group assignments. Participating in discussions, replying to pleas for help and offering tips and advice completed the tally. Students used this rubric to allocate a collaboration score for each student that contributed 10% to their year mark. The other 90% derived from research postings, web artefacts, peer review, and collaborative assessment. The final course grade also included their reflective examination essays, depicting their writing skills. Unlike Davies and Graff (2005), we did not use their final course grade as an indication of success. Instead, we used the ongoing year mark that reflected a wider spectrum of mastery and application.
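The weighting described above can be written out explicitly. The following is a minimal illustrative sketch, not the authors' marking spreadsheet: the function and variable names are ours, and the article specifies only the 10%/90% split, not how the remaining components were weighted among themselves.

```python
def year_mark(collaboration_pct, other_components_pct):
    """Combine the peer-assessed collaboration score (10% of the year mark)
    with the remaining course work: research postings, web artefacts, peer
    review and collaborative assessment (90%). Both inputs are percentages."""
    return 0.10 * collaboration_pct + 0.90 * other_components_pct

# Hypothetical example: 80% for collaboration and 65% on the other components.
print(year_mark(80, 65))  # 66.5
```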
We observed students’ experiences with online learning through multiple windows. These consisted of their private blogs (only shared with the facilitator) for reflection and self-assessment, open paragraph questions included in an online quiz, a reflective essay, and feedback questions e-mailed to the students about one month after completion of the course. The facilitator also documented observations in a diary.
Methodology
The course presenters simultaneously conducted research, using a mixed methodology (Sharp & Fretchling, 1997). A qualitative methodology allowed us to probe the context of the non-participating students and the class's perceptions and reactions. We conducted content analysis using ATLAS.ti™ software on the following primary documents: students' blog postings, 1615 discussion posts, an online quiz, and examination essays. Representative quotes from student postings are in their original form, reflecting their use of English as a second language. We validated the findings against the facilitator's field notes and used multiple
documents and perspectives. The researchers also facilitated the online course and, as participant observers, ensured the reliability of the findings.
The student tracking tool in the LMS provided a quantitative view of student activity in the course, including the numbers of original postings and replies. The WebCT Campus Edition student tracking tool maintains a record of the number of times a student accesses the various course areas. The term "hits" is defined in the WebCT help pages as "the number of times the student accessed the Homepage, a tool [including the items read or posted in discussions], or a content module page." We calculated their reply ratio by dividing the number of replies to others by their own original posts. Table 1 ranks students according to their year mark and shows the students' numbers of hits in the LMS and discussion messages posted, their reply ratio, collaboration score, and whether they returned the voluntary post-course feedback. Unlike the rest, the collaboration score is a qualitative measurement obtained by using a rubric to assess each student's collaborative behaviour.
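A minimal sketch of that reply-ratio calculation follows. The function name and the guard for students with no original posts are ours; the article defines the ratio only as replies to others divided by the student's own original posts.

```python
def reply_ratio(replies_to_others, original_posts):
    """Replies to other students' posts divided by the student's own original
    posts, as reported per student in Table 1."""
    if original_posts == 0:
        return 0.0  # assumption: no original posts is treated as a ratio of zero
    return replies_to_others / original_posts

# Hypothetical example: 45 replies against 30 original posts gives a ratio of 1.5.
print(reply_ratio(45, 30))
```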
We represent student online activities using the assumptions of Davies and Graff (2005), who categorized students according to final course grades. Our grade categories reflected the assessment stratification used in South African Higher Education. One student abandoned the course very early, and we did not include these data. We stratified the rest of the class into three grade group categories: a Fail group for students who did not complete the entire course or achieved less than 50%; a Pass group of students who aggregated between 50% and 74%; and a Distinction group of students with 75% or more. One student (subject 6) changed categories after the final essay and passed the course. We used this stratification for all statistics, as sketched below.
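The stratification rule can be expressed directly. This is a short sketch assuming only the boundaries stated in the text; the function name is ours.

```python
def grade_group(year_mark, completed=True):
    """Fail: did not complete the course or scored below 50%.
    Pass: 50% to 74%. Distinction: 75% or more."""
    if not completed or year_mark < 50:
        return "Fail"
    if year_mark < 75:
        return "Pass"
    return "Distinction"

# Examples from Table 1: subject 5 (38.8) fails, subject 8 (60.2) passes,
# subject 16 (80) earns a distinction.
print(grade_group(38.8), grade_group(60.2), grade_group(80))
```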
Table 1. Summary of individual student grades and participation profile.

Subject no. | Year mark | Hits | Messages posted | Reply ratio | Collaboration score | Feedback submitted
1 (a)  | –    | 424  | 24  | 0.8 | 0  |
2 (a)  | –    | 244  | 14  | 1.8 | 0  | Yes
3 (a)  | –    | 1161 | 30  | 0.4 | 2  |
4 (a)  | –    | 1706 | 50  | 0.9 | 6  |
5      | 38.8 | 871  | 50  | 1.4 | 3  |
6      | 48   | 223  | 10  | 0.1 | 0  |
7      | 53   | 1406 | 68  | 1.5 | 3  |
8      | 60.2 | 966  | 54  | 1.3 | 7  |
9      | 60.9 | 776  | 30  | 1.0 | 8  |
10     | 61.6 | 844  | 36  | 1.3 | 5  | Yes
11     | 63.1 | 1503 | 73  | 1.1 | 8  | Yes
12     | 64   | 1758 | 58  | 1.5 | 3  |
13     | 66   | 1093 | 37  | 1.5 | 9  |
14     | 66.3 | 1487 | 104 | 2.7 | 9  | Yes
15     | 70.2 | 1675 | 53  | 2.3 | 8  | Yes
16     | 80   | 1810 | 126 | 3.2 | 10 | Yes
17     | 80.3 | 963  | 43  | 1.0 | 8  |
18     | 80.9 | 1165 | 68  | 1.8 | 9  | Yes
19     | 83.8 | 1226 | 68  | 2.0 | 9  | Yes
20     | 85.4 | 1853 | 148 | 2.7 | 10 | Yes
21     | 88.5 | 2980 | 112 | 1.7 | 9  | Yes

(a) Student voluntarily abandoned the course before submitting the final examination essay.
Like Davies and Graff (2005), we used the Kruskal–Wallis non-parametric test to investigate the significance of differences in online activities among these grade groups. We also calculated the significance of the difference in return rates of the voluntary feedback questions using χ2 with two degrees of freedom, as shown in Table 2. Figure 1 shows a graphical representation of the values given in Table 2. We show the average value for each criterion for each of the grade groups.
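As an illustration of the tests named above, the sketch below runs SciPy's Kruskal–Wallis and chi-square routines on the hits and feedback-return columns of Table 1. The grouping of subjects 1–6, 7–15 and 16–21 into Fail, Pass and Distinction is our reading of Table 1 (it reproduces the group sizes and mean hits of Table 2); this is a reconstruction, not the authors' analysis code, and the statistics it prints need not match the H and χ2 values reported in Table 2.

```python
from scipy.stats import kruskal, chi2_contingency

# LMS hits per student from Table 1, grouped by grade category.
hits = {
    "Fail":        [424, 244, 1161, 1706, 871, 223],            # subjects 1-6
    "Pass":        [1406, 966, 776, 844, 1503, 1758,
                    1093, 1487, 1675],                           # subjects 7-15
    "Distinction": [1810, 963, 1165, 1226, 1853, 2980],          # subjects 16-21
}
h_stat, p_value = kruskal(hits["Fail"], hits["Pass"], hits["Distinction"])
print(f"Kruskal-Wallis on hits: H = {h_stat:.2f}, p = {p_value:.4f}")

# Voluntary feedback returns per grade group: [returned, not returned].
feedback = [
    [1, 5],  # Fail: 1 of 6 returned
    [4, 5],  # Pass: 4 of 9 returned
    [5, 1],  # Distinction: 5 of 6 returned
]
chi2, p, dof, _ = chi2_contingency(feedback)  # 3x2 table, two degrees of freedom
print(f"Chi-square on feedback returns: chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```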
Discussion
Student online visibility and learning success
Like Beaudoin (2002), we did some tracking of our "invisible" students, trying to pinpoint reasons for their invisibility, as well as its effects. We compared their online participation profiles and indicators of their integration into the virtual community with their success in completing the course. Interested in improving course completion rates, we first identified the unsuccessful students, to see if their participation differed from the others.
Student LMS hits
One can approximate students' participation in the online classroom quantitatively by the number of times they open pages, read discussions, or post, as shown in Figure 1a. It shows that the student group that aggregated a failing grade or did not complete the course opened significantly fewer pages than the successful students. Their average of less than 800 implied that they saw only about half the online material in the course. Students who achieved distinctions read even more than did the average students.
Table 2. Average number of hits, posts and follow-up posts per student in grade groups.

Grade group  | N | Hits   | Posts | Reply ratio | Collaboration | Feedback (%)
Fail         | 6 | 771.5  | 30    | 1.06        | 2.2           | 17
Pass         | 9 | 1278.7 | 57    | 1.43        | 6             | 44
Distinction  | 6 | 1666.2 | 94    | 2.06        | 9.2           | 83
H value/χ2   |   | H=26.3 | H=34.5 | H=24.7     | H=52.8        | χ2=47
Significance |   | <.001  | <.001 | <.001       | <.001         | <.01
Figure 1. Average dimensions for each grade group.
Learning success depends on the interaction with reliable technology (Swan, 2003). The digital divide running through the infrastructure and economic and cultural dimensions (Roycroft & Anantho, 2003) influences connectivity and participation. Students whose infrequent log ins rendered them invisible compromised their success. The blogs revealed that students employed in the e-learning industry had practically unlimited bandwidth with state of the art computers and software. Others made do with much less and singled it out in the quiz as their biggest challenge:
Costly and demanding financially, time consuming, stressful . . . . Not for the poor people, under privileged students can be dropouts (Q).
Students experienced other technical problems that compounded their infrequent Internet connectivity:
Sometimes my (dial-up) connection was not reliable. (Q)
There are moments during this module where in my area I experienced a number of electricity cuts and this kept me anxious and waiting to get started with work. (E)
Some students showed resilience in coping with poor infrastructure, regular electricity cuts, and poor connectivity; they managed successfully without compromising their studies. For others, technological problems were overwhelming.
. . . first three weeks of the course I couldn’t work productively because of constant trouble with my PC (wrong Internet Explorer program, needed Java program to read and send information and finally got the Blaster virus). This made me very aware of the high-dependency on technology in the e-learning world. No Computer—No learning— No success. (E)
It is not always clear why some students persist against enormous odds while others give up. Motivation possibly played a role for the last two students, as the student with the electricity problems required the credits to graduate. Students perceived connectivity as the reason for erratic peer contributions, as they did not ‘‘see’’ the lurkers, but noticed that some withheld contributions.
When some of the peers are struggling for access, their level of contribution is hampered.
We are a nice bunch enrolled for this course. Some learners easily share and are spontaneous, while others hold back.
Even opening numerous online pages (Table 1) does not always indicate participation. Rovai and Barnum (2003) cautioned that attending courses without participation produces no measurable increase in knowledge and students who wish to pass just through attendance do not succeed. Learning requires interaction not only with the content, but also with co-learners (Swan, 2003).
The number of discussion posts
The majority of Discussion posts were compulsory and provided a view on peer group contributions. Figure 1b depicts the extent of student participation. Like hits, there was a significant difference between the numbers of postings from the students
in different grade groups. Students who failed or abandoned the course posted on average significantly fewer discussions than their successful counterparts, confirming Davies and Graff's (2005) results. We also observed a significant difference between average and excellent students, a trend Davies and Graff could not demonstrate.
On average, the high performing students were also most active in the discussions. There were also average performing students (subjects 7 and 14) who posted a proliferation of messages, constituting "noise" in the discussions (Williams, 2004), and an excellent performer (subject 17) who posted few messages (Table 1), reminiscent of vicarious or read-only participation. The number of posts alone, therefore, does not reflect student involvement.
Ratio of replies to original posts per student grade group
This metric indicated a student's style of participation, whether peer-focused or self-focused, and is independent of participation quantities. From Figure 1c it is evident that the more successful students more readily interacted with their peers. Successful students replied two or three times more often to other posts than they initiated original posts. The less successful students replied less often than they originally posted. The difference between all groups is highly significant (Table 2). These observations confirm that, after a minimum interaction establishing the necessary support, the quality and dynamics of interaction further influenced online learning (Davies & Graff, 2005). This metric still does not indicate the real quality of contributions. To encourage rational discourse Klemm (1998) urged facilitators to grade on the quality of the postings and not to settle for mere opinions. Absent students and those who contribute little of value or virtually "nod" their approval in threaded discussions do not deceive their peers (Collison et al., 2000).
Quality participation
Klemm (1998) proposed using peer groups to grade the value of each person's contribution. Therefore, we designed one team assignment to develop a rubric for scoring online collaborative behaviour. The collaboration score (Figure 1d) is an average of assessments by two peers and the facilitator using this rubric. While rudimentary, it indicates how students rated others' participation. Like all previously discussed quantitative measurements of student activity in the online classroom, the collaboration score showed highly significant differences among the three stratifications of students, as unsuccessful students had low collaboration scores and the highly successful ones scored highest. Interpretation of the scores is problematic, as again there are notable exceptions. Subjects 6 and 12 (Table 1) logged in often, but they did not score high on collaboration and presented themselves as classic read-only participants.
We also used peer review extensively as a mechanism to improve interaction and learn collaboratively (Boud et al., 1999). The transparent learning gave students insight into each other’s work. Most students were positive about the peer assessment process and realized the advantages:
With traditional learning, nobody really has access to your assignments, except if you want them to. To me e learning proofed to be a very transparent way of learning. For the first time in my life I had freely access to everybody else’s assignments. I were able to
position myself, to compare my own writing and most important learn from others. I was intrigued by the differing viewpoints from which the assignments were approached.
Peer assessment sharpens a student’s responses—the student knows he cannot ‘‘get away’’ with lazy work.
While the non-contributing students may be satisfied that their learning needs are met (Beaudoin, 2002), they do not contribute to the benefit of the community. We contend that the quality of a student’s contributions to the course reflects integration into the community.
Group participation
Cooperative group assignments encourage students to participate online. As previous teamwork in this programme resulted in much unresolved conflict, we scheduled group assignments in the latter half of the course and allocated a small portion of the grades to these activities. The rationale for using gro