PNN adapter argument missing in the __init__() method ContinualAI#1625
The issue was caused by the num_prev_modules constructor argument not being stored as an attribute in the LinearAdapter class within the pnn.py module.
Chillthrower authored Apr 15, 2024
1 parent 6e5e3b2 commit 9637aa0
Showing 1 changed file with 1 addition and 0 deletions.
1 change: 1 addition & 0 deletions avalanche/models/pnn.py
@@ -20,6 +20,7 @@ def __init__(self, in_features, out_features_per_column, num_prev_modules):
         :param num_prev_modules: number of previous modules
         """
         super().__init__()
+        self.num_prev_modules = num_prev_modules
         # Eq. 1 - lateral connections
         # one layer for each previous column. Empty for the first task.
         self.lat_layers = nn.ModuleList([])
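The fix is a one-line attribute assignment. A minimal sketch (not the actual Avalanche class; the lateral nn.ModuleList and linear layers are omitted here to avoid a torch dependency) of the bug and its repair:

```python
# Hypothetical stand-in for avalanche.models.pnn.LinearAdapter, showing
# why the missing assignment mattered: __init__ accepted num_prev_modules
# but never stored it, so later reads of self.num_prev_modules raised
# AttributeError.

class LinearAdapterSketch:
    def __init__(self, in_features, out_features_per_column, num_prev_modules):
        self.in_features = in_features
        self.out_features_per_column = out_features_per_column
        # The one-line fix: keep the argument as an instance attribute.
        self.num_prev_modules = num_prev_modules
        # In the real class, one lateral layer per previous column follows
        # (empty for the first task).
        self.lat_layers = []


adapter = LinearAdapterSketch(in_features=32,
                              out_features_per_column=10,
                              num_prev_modules=2)
print(adapter.num_prev_modules)  # → 2
```

Without the assignment, any code that later inspects `adapter.num_prev_modules` (e.g. when wiring lateral connections for a new column) would fail with AttributeError.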
