
[Nano HPO] global hpo configuration, enable/disable tf hpo #4486

Merged
merged 1 commit into from
Apr 25, 2022

Conversation

shane-huang
Contributor

  • add a global hpo configuration
  • use the global hpo configuration to disable/enable hpo for tf and pytorch respectively
  • when tf hpo is enabled, decorated layers (e.g. tf.keras.layers), activations (tf.keras.activations), tf functions (e.g. tf.cast), and tf.keras functions (e.g. tf.keras.Input) are added to the corresponding nano.tf modules.
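The global-configuration pattern described above can be sketched as follows. This is a minimal illustration of the mechanism, not BigDL Nano's actual API: the class, function, and attribute names here (`HPOConfig`, `hpo_config`, `register_decorated`) are assumptions chosen for clarity.

```python
# Hypothetical sketch of a process-wide HPO configuration that gates
# whether decorated (HPO-aware) objects or the plain originals are
# exposed in a nano.tf-style namespace. Names are illustrative only.

class HPOConfig:
    """Global switches for enabling/disabling HPO per framework."""

    def __init__(self):
        self.hpo_tf = False       # TensorFlow HPO is off by default
        self.hpo_pytorch = False  # PyTorch HPO is off by default

    def enable_hpo_tf(self):
        self.hpo_tf = True

    def disable_hpo_tf(self):
        self.hpo_tf = False

    def enable_hpo_pytorch(self):
        self.hpo_pytorch = True

    def disable_hpo_pytorch(self):
        self.hpo_pytorch = False


# Single module-level instance consulted at decoration time.
hpo_config = HPOConfig()


def register_decorated(namespace, name, decorated, original):
    """Expose the HPO-decorated object in `namespace` only when tf HPO
    is enabled; otherwise fall back to the undecorated original."""
    namespace[name] = decorated if hpo_config.hpo_tf else original
```

With tf HPO enabled, a lookup in the proxy namespace resolves to the decorated object; after disabling it, re-registration restores the plain one:

```python
ns = {}
hpo_config.enable_hpo_tf()
register_decorated(ns, "cast", "decorated_cast", "plain_cast")
# ns["cast"] is now "decorated_cast"
```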

@yangw1234
Contributor

Hi Shane, would you mind adding a documentation check here: https://github.com/intel-analytics/BigDL/blob/main/python/nano/test/run-nano-codestyle-test.sh#L11?
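For context, the requested change might look like the fragment below. This is a hypothetical sketch of the CI script: the source path and the choice of pydocstyle for the documentation check are assumptions, not the actual contents of run-nano-codestyle-test.sh.

```shell
#!/usr/bin/env bash
# Hypothetical fragment of run-nano-codestyle-test.sh.

# existing code-style check (illustrative path)
python -m pycodestyle python/nano/src

# new: documentation check; a non-zero exit fails the CI step
python -m pydocstyle python/nano/src
```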

@shane-huang shane-huang deleted the hpo-hpoconfig branch May 19, 2022 10:41
@shane-huang shane-huang mentioned this pull request Jun 20, 2022