
      Five mechanisms of sound symbolic association

      Psychonomic Bulletin & Review (Springer Nature America, Inc)



          Most cited references (118)


          Crossmodal correspondences: a tutorial review.

          In many everyday situations, our senses are bombarded by many different unisensory signals at any given time. To gain the most veridical, and least variable, estimate of environmental stimuli/properties, we need to combine the individual noisy unisensory perceptual estimates that refer to the same object, while keeping those estimates belonging to different objects or events separate. How, though, does the brain "know" which stimuli to combine? Traditionally, researchers interested in the crossmodal binding problem have focused on the roles that spatial and temporal factors play in modulating multisensory integration. However, crossmodal correspondences between various unisensory features (such as between auditory pitch and visual size) may provide yet another important means of constraining the crossmodal binding problem. A large body of research now shows that people exhibit consistent crossmodal correspondences between many stimulus features in different sensory modalities. For example, people consistently match high-pitched sounds with small, bright objects that are located high up in space. The literature reviewed here supports the view that crossmodal correspondences need to be considered alongside semantic and spatiotemporal congruency, among the key constraints that help our brains solve the crossmodal binding problem.

            Inhibiting and facilitating conditions of the human smile: A nonobtrusive test of the facial feedback hypothesis.


              Visual statistical learning in infancy: evidence for a domain general learning mechanism.

              The rapidity with which infants come to understand language and events in their surroundings has prompted speculation concerning innate knowledge structures that guide language acquisition and object knowledge. Recently, however, evidence has emerged that by 8 months, infants can extract statistical patterns in auditory input that are based on transitional probabilities defining the sequencing of the input's components (Science 274 (1996) 1926). This finding suggests powerful learning mechanisms that are functional in infancy, and raises questions about the domain generality of such mechanisms. We habituated 2-, 5-, and 8-month-old infants to sequences of discrete visual stimuli whose ordering followed a statistically predictable pattern. The infants subsequently viewed the familiar pattern alternating with a novel sequence of identical stimulus components, and exhibited significantly greater interest in the novel sequence at all ages. These results provide support for the likelihood of domain general statistical learning in infancy, and imply that mechanisms designed to detect structure inherent in the environment may play an important role in cognitive development.

                Author and article information

                Journal: Psychonomic Bulletin & Review (Psychon Bull Rev)
                Publisher: Springer Nature America, Inc
                ISSN: 1069-9384 (print); 1531-5320 (electronic)
                Published online: August 24, 2017
                Issue: October 2018, Volume 25, Issue 5, pp. 1619-1643
                DOI: 10.3758/s13423-017-1361-1
                PMID: 28840520
                © 2018. Open access under the Creative Commons Attribution 4.0 License (http://creativecommons.org/licenses/by/4.0)
