This is the code repository for the project "Emotion-guided Music Accompaniment Generation".
The training and validation sets are derived from the POP909 dataset, and the emotion-guided music generation test set is a part of the Nottingham Dataset.
The harmonization module was updated (2023/2/2).
The UI module will be completed in the future.
1. Put the melodies (MIDI format) in the `original` folder;
2. Run `demo.py` (a minimal usage sketch is shown after this list);
3. Wait a while; the accompaniment will be saved in the `generate_midi` folder.
4. For more emotion-flow-guided generation samples, please refer to this website.
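
A minimal sketch of steps 1-3, assuming `demo.py` reads every MIDI file from the `original` folder and writes its output to `generate_midi` as described above; the melody file name below is only a placeholder:

```python
# Sketch of the workflow described in this README. The folder names
# ("original", "generate_midi") and the script name ("demo.py") come from
# the steps above; "my_melody.mid" is a hypothetical input file.
import shutil
import subprocess
from pathlib import Path

melody = Path("my_melody.mid")       # placeholder: your own melody in MIDI format
original_dir = Path("original")      # input folder expected by demo.py
output_dir = Path("generate_midi")   # folder where the accompaniment is written

# Step 1: place the melody in the input folder.
original_dir.mkdir(exist_ok=True)
shutil.copy(melody, original_dir / melody.name)

# Step 2: run the demo script.
subprocess.run(["python", "demo.py"], check=True)

# Step 3: the generated accompaniment should appear in generate_midi/.
print(sorted(output_dir.glob("*.mid")))
```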