0. Official Code

Official PyTorch implementation of AROMA | Accepted at NeurIPS 2024

To cite our work:

@article{serrano2024aroma,
  title={AROMA: Preserving Spatial Structure for Latent PDE Modeling with Local Neural Fields},
  author={Serrano, Louis and Wang, Thomas X and Naour, Etienne Le and Vittaut, Jean-No{\"e}l and Gallinari, Patrick},
  journal={38th Conference on Neural Information Processing Systems (NeurIPS 2024)},
  year={2024}
}

1. Code installation and setup

aroma installation

conda create -n aroma python=3.9.0
conda activate aroma
pip install -e .

wandb config setup example

Add the following to your ~/.bashrc:

export WANDB_API_TOKEN=your_key
export WANDB_DIR=your_dir
export WANDB_CACHE_DIR=your_cache_dir
export MINICONDA_PATH=your_anaconda_path

2. Data

Most of the datasets used in this paper are uploaded on Hugging Face (https://huggingface.co/sogeeking), and the folder download_dataset contains scripts to download them directly from there. You can use these scripts to fetch the data efficiently.
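As an alternative to the provided scripts, here is a minimal sketch of a direct download with the Hugging Face CLI; the dataset repository name shown below (sogeeking/burgers) and the local directory are assumptions, so check https://huggingface.co/sogeeking for the exact names:

# hypothetical dataset repository; list the actual ones at https://huggingface.co/sogeeking
pip install -U "huggingface_hub[cli]"
huggingface-cli download sogeeking/burgers --repo-type dataset --local-dir ./data/burgers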

3. Run experiments

The code runs only on GPU. We provide sbatch configuration files to launch the training scripts; they are located in bash and are organized by dataset. We expect the user to have wandb installed in their environment, as it eases the 2-step training. For all tasks, the first step is to launch an inr.py training; the weights of the INR model are automatically saved under its run_name. For the second step, i.e. training the dynamics or inference model, the previous run_name must be passed in the config file so that the INR model can be loaded. The run_name can be set in the config file, but by default it is generated randomly by wandb. We provide examples of the python scripts that need to be run in each bash folder.

For instance, for Burgers we first train the VAE with sbatch bash/burgers/inr_burgers.sh, and then, once the correct run_name is specified in the config, run sbatch bash/burgers/refiner_burgers.sh.
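A minimal sketch of this two-step workflow for Burgers; the exact name of the config field that holds the INR run_name depends on the config files shipped in the repository:

# step 1: train the INR/VAE; wandb assigns a run_name under which the weights are saved
sbatch bash/burgers/inr_burgers.sh
# step 2: copy the run_name from step 1 into the refiner config, then train the dynamics model
sbatch bash/burgers/refiner_burgers.sh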

Acknowledgements

This project would not have been possible without these awesome repositories:
