AccoMontage is a piano accompaniment arrangement system. It introduces a novel hybrid pathway in which rule-based optimization (for high-level structure) and learning-based style transfer (for fine-grained coherence) complement each other for high-quality generation. Our paper, AccoMontage: Accompaniment Arrangement via Phrase Selection and Style Transfer, was accepted to ISMIR 2021. This repository stores the code and demos of our work.
AccoMontage now supports a few new features as follows:
- Generation with MIDI velocity and pedal control messages.
- Transitions among phrases of any length (1-bar, 2-bar, ..., 16-bar).
- Input of general MIDI with arbitrary tracks (besides melody) and arbitrarily complex chords (e.g., 9th chords). The chord sequence is still quantized at 1-beat granularity (see the sketch after this list). If the melody is not on the first track, the melody track index is also required.
- Whole-piece piano arrangement with intro, interlude, and outro.
- Spotlight on specific reference pieces as the donor of piano textures. Currently supported spotlight options include the POP909 song index (e.g., 905), song name (e.g., '小城故事'), and/or artist name (e.g., '邓丽君'). For the complete list of supported options, refer to the table at ./checkpoints/pop909_quadraple_meters_index.xlsx.
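As a concrete illustration of the 1-beat chord quantization mentioned above, the sketch below snaps a chord annotation given as (onset in beats, chord label) pairs onto a per-beat grid. The function name and data layout are illustrative assumptions for this README, not the repository's actual internal representation.

```python
# Hypothetical sketch of 1-beat chord quantization (not AccoMontage's internal code).
# Complex qualities (e.g., 9th chords) are kept as-is -- only their timing is
# snapped to the beat grid.

def quantize_chords(chords, n_beats):
    """chords: list of (onset_beat, label) sorted by onset; returns one label per beat."""
    grid = []
    current = "N"   # "N" = no chord sounding yet
    idx = 0
    for beat in range(n_beats):
        # Take the latest chord whose onset falls on or before this beat.
        while idx < len(chords) and chords[idx][0] <= beat:
            current = chords[idx][1]
            idx += 1
        grid.append(current)
    return grid

print(quantize_chords([(0.0, "Cmaj9"), (3.5, "G7"), (4.0, "Am7")], 8))
# ['Cmaj9', 'Cmaj9', 'Cmaj9', 'Cmaj9', 'Am7', 'Am7', 'Am7', 'Am7']
```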
- Data and checkpoints required to run AccoMontage can be downloaded here (updated Feb 27, 2024). After extraction, you should have a ./checkpoints/ folder with the relevant .pt and .npz files inside.
- Our code is now arranged in a portable manner. You can follow the guidance in ./AccoMontage_inference.ipynb to run AccoMontage (a minimal configuration sketch follows this list).
- Alternatively, AccoMontage is now accessible on Google Colab, where you can quickly test it online.
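The configuration below is a minimal, hypothetical sketch of the information the inference notebook asks for (input MIDI path, melody track index, phrase segmentation, spotlight, checkpoint location). The variable names and the segmentation format are assumptions made for illustration; the actual interface is defined in ./AccoMontage_inference.ipynb.

```python
# Hypothetical configuration sketch -- not the notebook's real variable names.
# It only illustrates what information AccoMontage needs; follow
# ./AccoMontage_inference.ipynb for the actual interface.

config = {
    "midi_path": "./demo/my_lead_sheet.mid",    # melody + optional extra tracks + chords
    "melody_track_idx": 0,                      # needed only if the melody is not the first track
    "segmentation": "A8A8B8B8",                 # assumed phrase annotation: label + length in bars
    "spotlight": [905, "小城故事", "邓丽君"],     # optional POP909 index / song name / artist name
    "checkpoint_dir": "./checkpoints/",         # .pt and .npz files from the download above
}
```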
Generated demos are listed in ./demo. AccoMontage was also applied as the backbone to rearrange the Alma Mater Music of East China Normal University (ECNU). Performance demos can be accessed here.
Thanks to Prof. Gus Xia, Yixiao Zhang, Liwei Lin, Junyan Jiang, Ziyu Wang, and Shuqi Dai for their generous help with this work. Thanks to all NYUSH Music X Lab citizens for their encouragement and companionship. This work refers to the following repositories:
- https://github.com/music-x-lab/POP909-Dataset
- https://github.com/ZZWaang/polyphonic-chord-texture-disentanglement
- https://github.com/ZZWaang/PianoTree-VAE
- https://github.com/Dsqvival/hierarchical-structure-analysis
- https://github.com/music-x-lab/ISMIR2019-Large-Vocabulary-Chord-Recognition
- https://github.com/buggyyang/Deep-Music-Analogy-Demos
If you find our paper and this repository helpful, please consider citing our work:
@inproceedings{zhao2021accomontage,
  author    = {Jingwei Zhao and Gus Xia},
  title     = {AccoMontage: Accompaniment Arrangement via Phrase Selection and Style Transfer},
  booktitle = {Proceedings of the 22nd International Society for Music Information Retrieval Conference ({ISMIR} 2021)},
  pages     = {833--840},
  year      = {2021}
}
For inquiries about our work, feel free to contact me at [email protected].
Jingwei Zhao (Ph.D. Student in Data Science, NUS Graduate School)
May 29, 2023