An exploratory evaluation of student engagement through content annotation in a large, asynchronous, graduate-level online course at a predominantly Hispanic-Serving Institution.
General description of the project
Social annotation platforms such as Perusall and Hypothesis are emerging as instructional tools. They allow students to anchor comments and responses to assigned readings, ask questions, and react to peers’ comments. When coupled with Open Educational Resources (OER), these tools can significantly reduce students’ total cost of instruction while at the same time increasing reading comprehension and knowledge construction through peer interaction. The full description below provides references demonstrating the effectiveness of annotation tools for a general population; there is a paucity of descriptions of their effects at predominantly Hispanic-Serving Institutions. The current exploratory analysis took place at such an institution. Implications for our student bodies, and for students for whom English is a second language (L2), will be discussed.
Technologies
The social annotation tool, Perusall (https://perusall.com), was used to facilitate the discussion of OER materials in a large online course. Integration with the Blackboard LMS will be discussed.
Explain project results
The project assisted approximately 100 students in comprehension and knowledge construction using a peer-mediated, social learning methodology in a research methods course. With a few exceptions, previous evaluations have been conducted in settings that did not serve predominantly Hispanic students.
Why it should be considered a best practice
This project adds to the knowledge base and demonstrates opportunities for English Language Learners to use peer-to-peer knowledge construction and content acquisition in a highly cost-effective manner.
Highlights of your proposed presentation
Student interaction (with peers and content) in asynchronous online courses must be carefully planned so that instructor and peer responses can be managed, valued, and additive to the course content. Face-to-face (in the same physical location) or online synchronous courses allow for discussion and clarification in real time. In contrast, at least during the nascent stages of online asynchronous courses, student questions and discussions regarding course content were conducted through email or online discussion boards. Responses using these methods can be delayed and can require repetitive replies by the instructor and/or students. For example, referencing a specific portion of a reading or another student’s comment can be laborious. Grading comments and responses, even with rubrics, is time consuming. Arguably, these tools are still the dominant methods of student interaction.
Some instructors have used social media platforms such as Facebook™ and Twitter™ to increase student-to-student and student-to-instructor interaction. These platforms do allow quasi-real-time interaction between students and instructors. However, because of linked profiles, student privacy has been identified as one problem with using these publicly available platforms. Additionally, students must specify what course content they are referencing in any comment. This can be accomplished by copying and pasting the material being read, but again is cumbersome. Finally, these platforms are not integrated with Learning Management Systems (LMS) (e.g., Blackboard™, Canvas™), which makes grading and grade management more difficult.
More recently, technologies have been developed to increase interaction with peer students, instructors, and the course content. Indeed, there is a growing research base using K-12 and undergraduate students as participants. Two of those platforms, Hypothesis™ (https://hypothes.is) and Perusall™ (https://perusall.com), are used for annotation of material in educational settings (Porter, 2022). Both programs have comprehensive analytics that provide instructors with insight into student understanding of content. A full description of the software and video tutorials can be found at their respective websites. The current evaluation report uses Perusall™ to examine the annotation behavior of graduate students.
Perusall™ provides student engagement solutions that address several of these issues. First, it integrates with several Learning Management Systems through the Learning Tools Interoperability (LTI) standard. Second, a customizable grading tool automatically scores student work within instructor-selected parameters. These features lessen the grading workload in large courses. The interface allows students to highlight and comment on text passages of their choosing or as directed by the instructor, which spares students from having to retype text. Students can also reply to an annotation/post and/or upvote it, and can address peers or the instructor using the conventional “@” symbol.
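As an illustration of the kind of parameterized automatic scoring described above (a minimal sketch only; the function name, thresholds, and scoring rule are assumptions for exposition, not Perusall’s actual, proprietary algorithm), instructor-selected parameters such as a minimum annotation length and a target annotation count might be applied as follows:

```python
# Hypothetical sketch of parameterized annotation scoring; the names and
# thresholds below are illustrative assumptions, not Perusall's algorithm.

def score_annotations(annotations, min_words=10, target_count=5, max_score=10.0):
    """Award credit for each substantive annotation, up to a target count.

    annotations  -- list of annotation texts by one student
    min_words    -- instructor-selected minimum length to count as substantive
    target_count -- number of qualifying annotations needed for full credit
    max_score    -- maximum points for the assignment
    """
    qualifying = [a for a in annotations if len(a.split()) >= min_words]
    credit = min(len(qualifying), target_count) / target_count
    return round(credit * max_score, 2)

# Example: three substantive posts against a five-annotation target.
posts = [
    "This passage suggests the sampling frame excludes part-time students, which could bias results.",
    "Agreed!",
    "How does internal validity relate to the matched-pairs design described here in section two?",
    "The authors conflate reliability with validity; a measure can be reliable yet not valid at all.",
]
print(score_annotations(posts))  # 3 of 5 qualify -> 6.0
```

The design point is simply that once the parameters are instructor-selected, scoring every annotation in a 100-student course is mechanical, which is what makes automatic grading feasible at scale.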
This pilot evaluation describes and analyzes 2,200 student annotations from a 100-student, asynchronous research methods course taught in a 7-week format. Findings describing student engagement with course content, other class members, and instructors will be presented. Comments will be categorized into two groups (Plevinski et al., 2017): Knowledge Construction Activities and Coordination Activities. A further delineation of this categorization schema will use the work of Plevinski and colleagues as a guide. Implications for teaching, including teaching English Language Learners (L2), will be discussed.
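To illustrate how a first pass over such a two-category schema might be operationalized (a hypothetical keyword-overlap sketch; the keyword lists and function are assumptions for exposition, not the coding procedure used in this evaluation or by Plevinski et al.):

```python
# Hypothetical first-pass coder for annotations; keyword lists are
# illustrative assumptions, not the study's actual coding procedure.

KNOWLEDGE_CONSTRUCTION = {"because", "evidence", "interpret", "why", "means", "implies"}
COORDINATION = {"due", "deadline", "rubric", "assignment", "submit", "page"}

def code_annotation(text):
    """Assign a provisional category to one annotation by keyword overlap."""
    words = set(text.lower().split())
    kc = len(words & KNOWLEDGE_CONSTRUCTION)
    co = len(words & COORDINATION)
    if kc > co:
        return "Knowledge Construction"
    if co > kc:
        return "Coordination"
    return "Uncoded"  # tie or no match: flag for manual review

print(code_annotation("Is this assignment due before the rubric is posted?"))
print(code_annotation("I interpret this result as evidence of selection bias."))
```

In practice a keyword pass like this would only pre-sort annotations; ambiguous or uncoded items would still require manual coding against the published schema.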
This evaluation is exploratory and descriptive in nature, drawn from one instructor’s course. Audience participation will be solicited during the presentation and incorporated into future research studies. The author has no affiliation with nor funding from any social annotation company mentioned in the presentation.
References
Knowledge Construction. Making a Difference: Prioritizing Equity and Access in CSCL, 12th International Conference on Computer Supported Collaborative Learning (CSCL), 1, 111–118.
Porter, G. W. (2022). Collaborative Online Annotation: Pedagogy, Assessment and Platform Comparisons. Frontiers in Education, 7. https://www.frontiersin.org/articles/10.3389/feduc.2022.852849