PyTorch model weights
Description
This repository contains a collection of model weight files produced during extensive hyperparameter testing for climate prediction models. Each file results from training one particular configuration, with the goal of optimizing model performance across different simulation scenarios. The models are primarily aimed at forecasting global temperature change using different sets of hyperparameters.
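A minimal sketch for inspecting one of the released weight files is shown below. The file name is hypothetical (substitute one from this release), and it is assumed the checkpoints are plain state dicts saved with torch.save(model.state_dict(), path); adjust accordingly if the files wrap the weights in a larger checkpoint dictionary.

```python
import torch

# Hypothetical file name; replace with an actual file from this release.
state_dict = torch.load("lstm_2x100_mse.pt", map_location="cpu")

# List parameter names and shapes to confirm layer depth and hidden dimension
# before loading the weights into a matching model definition.
for name, tensor in state_dict.items():
    print(f"{name}: {tuple(tensor.shape)}")
```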
Hyperparameter Configurations
The hyperparameters tested include combinations of model architecture, layer depth, hidden dimension, loss function, batch size, training epochs, patience, learning rate, and sequence length. Here’s an overview of the parameters (a sketch of enumerating the full grid follows the list):
• Models: Long Short-Term Memory (LSTM), Recurrent Neural Network (RNN), Gated Recurrent Unit (GRU), Multi-Layer Perceptron (MLP), Attention Model (Attn)
• Number of Layers: 2, 4, 6
• Hidden Dimensions: 100, 200
• Loss Functions: Mean Squared Error (MSE), Mean Absolute Error (MAE), Huber Loss
• Batch Size: 4096
• Epochs: 40
• Patience: 10
• Learning Rates: 0.01, 0.001
• Sequence Lengths: 12, 24, 48
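The sketch below enumerates the grid defined by the values above. The dictionary keys and variable names are illustrative rather than taken from the release's training scripts; assuming every combination was trained, the grid contains 5 × 3 × 2 × 3 × 2 × 3 = 540 configurations (batch size, epochs, and patience are fixed).

```python
from itertools import product

models = ["LSTM", "RNN", "GRU", "MLP", "Attn"]
num_layers = [2, 4, 6]
hidden_dims = [100, 200]
loss_fns = ["MSE", "MAE", "Huber"]
batch_sizes = [4096]
epochs = [40]
patience = [10]
learning_rates = [0.01, 0.001]
seq_lengths = [12, 24, 48]

# Build one configuration dictionary per point in the hyperparameter grid.
keys = ["model", "num_layers", "hidden_dim", "loss", "batch_size",
        "epochs", "patience", "lr", "seq_len"]
configs = [
    dict(zip(keys, combo))
    for combo in product(models, num_layers, hidden_dims, loss_fns,
                         batch_sizes, epochs, patience,
                         learning_rates, seq_lengths)
]

print(len(configs))  # 540 configurations in the full grid
```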
These parameters were varied methodically to assess their impact on the accuracy and efficiency of temperature projections under different climate scenarios, helping to identify the most effective model configurations for long-term climate prediction.
For more detailed information and to access the full suite of tools and resources, visit the Climate Prediction 2100 GitHub Repository.