
Defaults from yaml not populating if not given directly on CLI #103

Closed
jbrightuniverse opened this issue Nov 24, 2021 · 2 comments
Labels
bug Something isn't working

Comments

@jbrightuniverse

jbrightuniverse commented Nov 24, 2021

From the pytorch-lightning Slack: under jsonargparse==4.0.3, the following code

import sys
import torch.nn as nn
import pytorch_lightning as pl
from typing import Optional
from pytorch_lightning.utilities.cli import LightningCLI
from jsonargparse import lazy_instance

from econ_layers.layers import FlexibleSequential


class TestModel(pl.LightningModule):
    def __init__(
        self,
        ml_model: FlexibleSequential = lazy_instance(FlexibleSequential),
    ):
        # LightningModule subclasses should call super().__init__()
        super().__init__()


if __name__ == "__main__":
    sys.argv = [
        "cli.py",
        "--trainer.max_epochs=5",
        "--model.ml_model.init_args.layers=4",
    ]
    cli = LightningCLI(
        TestModel,
        run=False,
        parser_kwargs={
            "error_handler": None,
            "default_config_files": ["defaults.yaml"],
        },
    )
    print(cli.config)

with defaults.yaml containing

model:
  ml_model:
    class_path: econ_layers.layers.FlexibleSequential
    init_args:
      n_in: 1
      n_out: 1
      layers: 5
      hidden_dim: 122
      hidden_bias: true
      last_bias: true

should populate the yaml values into cli.config, with layers then overridden to 4 by the CLI argument. Instead, cli.config contains the defaults from econ_layers.layers.FlexibleSequential's class definition with only layers overridden to 4, e.g. the output has hidden_dim: 128 instead of the expected hidden_dim: 122.
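
To make the discrepancy concrete, a minimal check that could be appended after the print(cli.config) line in the script above (a sketch; the key path follows the nested layout of defaults.yaml, and dict-style access assumes pytorch-lightning < 1.6, where cli.config is a dict):

init_args = cli.config["model"]["ml_model"]["init_args"]

# Expected: defaults.yaml is merged first, then the CLI overrides layers, so
#   init_args["layers"] == 4 and init_args["hidden_dim"] == 122
# Observed with jsonargparse 4.0.3: only the class defaults of
# FlexibleSequential are used, with layers overridden, so
#   init_args["layers"] == 4 but init_args["hidden_dim"] == 128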

Current workaround: remove "default_config_files": ["defaults.yaml"] from parser_kwargs and change the arguments to:

sys.argv = [
    "cli.py",
    "--config=defaults.yaml",
    "--trainer.max_epochs=5",
    "--model.ml_model.init_args.layers=4",
]

Also see https://github.com/HighDimensionalEconLab/econ_layers/blob/main/tests/test_jsonargparse.py.

mauvilsa added the bug label on Nov 25, 2021
mauvilsa added a commit that referenced this issue Nov 25, 2021
- Only add subclass defaults on defaults merging #103.
@mauvilsa
Member

With commits 85a6836 and c5119c6 this is now fixed.

@mauvilsa
Member

@jbrightuniverse a side note: your test in https://github.com/HighDimensionalEconLab/econ_layers/blob/main/tests/test_jsonargparse.py will break with pytorch-lightning>=1.6 once it is released. The cli.config object will no longer be a dict but a jsonargparse.Namespace, which can be trivially converted to a dict with cli.config.as_dict().
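
A minimal sketch of that conversion, assuming a cli object constructed as in the report above (the key path mirrors the structure of defaults.yaml):

# Under pytorch-lightning >= 1.6, cli.config is a jsonargparse.Namespace.
config_dict = cli.config.as_dict()  # convert to a plain nested dict

# Dict-style lookups then work as before, e.g. checking the value merged from defaults.yaml:
assert config_dict["model"]["ml_model"]["init_args"]["hidden_dim"] == 122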
