
      Detecting non-verbal speech and gaze behaviours with multimodal data and computer vision to interpret effective collaborative learning interactions


          Abstract

          Collaboration is argued to be an important skill, not only in schools and higher education contexts but also in the workplace and other aspects of life. However, simply asking students to work together as a group on a task does not guarantee success in collaboration. Effective collaborative learning requires meaningful interactions among individuals in a group. Recent advances in multimodal data collection tools and AI provide unique opportunities to analyze, model and support these interactions. This study proposes an original method to identify group interactions in real-world collaborative learning activities and investigates the variations in interactions of groups with different collaborative learning outcomes. The study was conducted in a 10-week-long post-graduate course involving 34 students, with data collected from groups' weekly collaborative learning interactions lasting ~60 min per session. The results showed that groups with different levels of shared understanding exhibited significant differences in the time spent on, and the maximum duration of, referring and following behaviours. Further analysis using process mining techniques revealed that groups with different outcomes exhibited different patterns of group interactions. A loop between students' referring and following behaviours and resource management behaviours was identified in groups with better collaborative learning outcomes. The study indicates that the nonverbal behaviours studied here, which can be auto-detected with advanced computer vision techniques and multimodal data, have the potential to distinguish groups with different collaborative learning outcomes. The insights generated can also support the practice of collaborative learning for learners and educators. Further research should explore the cross-context validity of the proposed distinctions and the approach's potential to be developed into a real-world, real-time support system for collaborative learning.
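          The group-level statistics the abstract compares (total time spent on a behaviour and the maximum duration of a single episode) can be derived from coded behaviour intervals. A minimal sketch, assuming each detected episode is represented as a (label, start, end) tuple in seconds produced by an upstream detector; the tuple format and function name are illustrative, not the authors' implementation:

```python
from collections import defaultdict

def behaviour_stats(intervals):
    """Aggregate total time spent and maximum single-episode duration
    (in seconds) per behaviour label from (label, start, end) intervals."""
    total = defaultdict(float)
    longest = defaultdict(float)
    for label, start, end in intervals:
        duration = end - start
        total[label] += duration
        longest[label] = max(longest[label], duration)
    return {label: {"time_spent": total[label], "max_duration": longest[label]}
            for label in total}

# Example: one group's coded referring/following episodes in a session
coded = [
    ("referring", 12.0, 20.5),
    ("following", 20.5, 31.0),
    ("referring", 45.0, 49.0),
]
stats = behaviour_stats(coded)
```

          Statistics of this kind, computed per group and per session, could then be compared across groups with different collaborative learning outcomes, as the study does.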


          Most cited references: 53

          The eyes have it: the neuroethology, function and evolution of social gaze

          N.J. Emery (2000)
            Brain-to-Brain Synchrony Tracks Real-World Dynamic Group Interactions in the Classroom.

            The human brain has evolved for group living [1]. Yet we know so little about how it supports dynamic group interactions that the study of real-world social exchanges has been dubbed the "dark matter of social neuroscience" [2]. Recently, various studies have begun to approach this question by comparing brain responses of multiple individuals during a variety of (semi-naturalistic) tasks [3-15]. These experiments reveal how stimulus properties [13], individual differences [14], and contextual factors [15] may underpin similarities and differences in neural activity across people. However, most studies to date suffer from various limitations: they often lack direct face-to-face interaction between participants, are typically limited to dyads, do not investigate social dynamics across time, and, crucially, they rarely study social behavior under naturalistic circumstances. Here we extend such experimentation drastically, beyond dyads and beyond laboratory walls, to identify neural markers of group engagement during dynamic real-world group interactions. We used portable electroencephalogram (EEG) to simultaneously record brain activity from a class of 12 high school students over the course of a semester (11 classes) during regular classroom activities (Figures 1A-1C; Supplemental Experimental Procedures, section S1). A novel analysis technique to assess group-based neural coherence demonstrates that the extent to which brain activity is synchronized across students predicts both student class engagement and social dynamics. This suggests that brain-to-brain synchrony is a possible neural marker for dynamic social interactions, likely driven by shared attention mechanisms. This study validates a promising new method to investigate the neuroscience of group interactions in ecologically natural settings.
              Social signal processing: Survey of an emerging domain


                Author and article information

                Journal
                Education and Information Technologies (Educ Inf Technol)
                Springer Science and Business Media LLC
                ISSN: 1360-2357 (print); 1573-7608 (electronic)
                Published online: November 28 2023; issue date: January 2024
                Volume 29, Issue 1, pp. 1071-1098
                DOI: 10.1007/s10639-023-12315-1
                © 2024

                https://creativecommons.org/licenses/by/4.0

