A Multisensor Dataset of South Asian Post-Graduate Students Working on Mental Rotation Tasks

Bibliographic Details
Main Authors: Ashwin T. S., Suraj Ranganath, Kabyashree Khanikar, Karishma Khan, Ramkumar Rajendran, Ritayan Mitra
Format: Article
Language: English
Published: Nature Portfolio 2025-04-01
Series: Scientific Data
Online Access:https://doi.org/10.1038/s41597-025-04865-5
Description
Summary: Spatial thinking in general, and mental rotation in particular, have seen sustained research attention because such abilities play a critical role in STEM (science, technology, engineering and mathematics) learning. The recent development of sensor-based approaches to identify, understand, and measure cognition and affect opens up new possibilities to study such topics. We collected galvanic skin response, electroencephalography, screen recording, facial expressions, manual emotion logging, task performance logs (including response times, correctness, and question difficulty), gaze, and self-reports of 38 participants as they solved mental rotation tasks under three conditions: (i) no time restriction and no feedback, (ii) no time restriction with feedback, and (iii) a time restriction with no feedback. The availability of such a dataset will help researchers in the spatial thinking community study questions related to strategy selection, flexibility, affective response, and group differences in mental rotation tasks. Furthermore, the learning analytics community could gain valuable insights into how providing feedback might change learning and engagement during such tasks.
ISSN:2052-4463