A Rubric for Peer Evaluation of Multi-User Virtual Environments for Education and Training
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-02-01 |
| Series: | Information |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2078-2489/16/3/174 |
| Summary: | In a media-saturated online ecosystem, educational technology that fosters virtual interactions and learning opportunities, unlike those taking place face-to-face, must have special characteristics that facilitate the way we build connections with others or access, consume, and produce new information. The present study focuses on the design and implementation of a rubric for the peer assessment of collaborative educational Virtual Reality (VR) environments built to provide meaningful learning experiences triggered by immersion. It presents the methodology employed to create the tool, its use in peer evaluation processes, and the implementation findings. The methodology involved reviewing existing tools, establishing the rationale for creating this particular tool, and recruiting educators and/or trainers to pilot test it. To this end, participants were purposefully recruited from a postgraduate program in immersive technologies, with diverse demographics and from different disciplines, and were invited to work collaboratively, in pairs or groups of three, to design and develop an educational intervention of their choice in Spatial.io software. The methodology further involved microteaching sessions with other groups, peer evaluation based on the quality criteria provided, and self-reflection on and evaluation of their educational interventions. The study outcomes revealed (i) the key evaluation criteria that proved critical for the design of quality immersive experiences, (ii) the usefulness of the rubric in facilitating the pilot testing of the prototypes, and (iii) the challenges and benefits that arise from peer evaluation practices. In the context of peer evaluation among interdisciplinary participants of diverse ages and professional experience, digital, content-related, and pedagogical concerns arose, providing fruitful feedback to peers for refining the design of their VR environments. Challenges and recommendations regarding the peer review process are also discussed. |
|---|---|
| ISSN: | 2078-2489 |