How can I "freeze" some layers? #17

Closed

EncodeTS opened this issue Nov 4, 2016 · 4 comments

Comments

@EncodeTS

EncodeTS commented Nov 4, 2016

How can I exclude some layers from training when I want to fine-tune a model? In Keras I can simply set trainable=False. Is there a way to do this in TensorLayer?

@zsdonghao
Member

Hi @EncodeTS,
You need to get the list of variables you want to update; TensorLayer provides two ways to build that list.

The first way is to use the all_params of a network, which by default stores the variables in order.
You can print the variable information via
tl.layers.print_all_variables(train_only=True) or network.print_params(details=False)
To choose which variables to update, you can slice the list as below (see the sketch after the snippet).

train_params = network.all_params[3:]
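For instance, here is a minimal sketch of this first way, assuming the TensorLayer 1.x / TensorFlow 1.x API of the time; the network shape and layer names below are just illustrative, and the slice index depends on your own architecture:

```python
import tensorflow as tf
import tensorlayer as tl

x = tf.placeholder(tf.float32, shape=[None, 784])
network = tl.layers.InputLayer(x, name='input')
network = tl.layers.DenseLayer(network, n_units=256, act=tf.nn.relu, name='dense1')
network = tl.layers.DenseLayer(network, n_units=10, act=tf.identity, name='output')

# all_params is ordered: [dense1/W, dense1/b, output/W, output/b]
network.print_params(details=False)

# Skip the first two entries (dense1's W and b) to freeze dense1
# and train only the output layer.
train_params = network.all_params[2:]
```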

The second way is to get the variables by a given name. For example, if you want to get all variables whose layer name contains dense, you can do as below.

train_params = tl.layers.get_variables_with_name('dense', train_only=True, printable=True)
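This name-based selection does not depend on the order of the layers, so it is more robust when the architecture changes. Continuing the illustrative network above, you could select only the classifier head like this:

```python
# Hypothetical: collect only the variables of the layer named 'output'.
train_params = tl.layers.get_variables_with_name('output', train_only=True, printable=True)
```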

After you get the variable list, you can define your optimizer as below so that it updates only that part of the variables.

train_op = tf.train.AdamOptimizer(0.001).minimize(cost, var_list=train_params)
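Putting it together, a hedged end-to-end sketch (TF 1.x API; the placeholder y_ and the choice of cost function are illustrative):

```python
y = network.outputs
y_ = tf.placeholder(tf.int64, shape=[None])
cost = tl.cost.cross_entropy(y, y_, name='cost')

# Gradients are computed and applied only for the variables in var_list;
# all other variables keep their current values, i.e. those layers are frozen.
train_op = tf.train.AdamOptimizer(0.001).minimize(cost, var_list=train_params)
```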

@EncodeTS
Author

EncodeTS commented Nov 5, 2016

Thanks!!

@dorbarber79

@zsdonghao
Hello, will
train_op = tf.train.AdamOptimizer(0.001).minimize(cost, var_list=train_params)
freeze the layer weights, or will it only leave them out when calculating the loss?
I want to fix the weights of some layers.

@zsdonghao
Member

It will only update train_params, but it will still take all weights into account when calculating the loss.
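One quick way to confirm this behaviour (a hypothetical check; sess, X_batch, and y_batch are assumed to come from your own training loop):

```python
import numpy as np

# Snapshot a frozen variable, run one training step, and verify it is unchanged.
frozen_var = network.all_params[0]   # e.g. dense1/W, which is not in var_list
before = sess.run(frozen_var)
sess.run(train_op, feed_dict={x: X_batch, y_: y_batch})
after = sess.run(frozen_var)
assert np.allclose(before, after)    # the frozen layer received no update
```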
