Authors: Andrea R. Halpern, Jeffrey S. Martin, Tara D. Reed
Keywords:
Abstract: COMPOSERS COMMONLY USE MAJOR OR MINOR SCALES to create different moods in music. Nonmusicians show poor discrimination and classification of this musical dimension; however, they can perform these tasks if the decision is phrased as happy vs. sad. We created pairs of melodies that were identical except for mode; the first major or minor third or sixth was the critical note that distinguished the mode. Musicians and nonmusicians judged each melody while we collected ERP waveforms, triggered at the onset of the critical note. Musicians showed a late positive component (P3) only to minor melodies, in both tasks. Nonmusicians could adequately classify melodies as happy or sad but showed little evidence of processing the mode information. Major appears to be the default mode in music, and musicians apparently process the modes differently.