
Disable tensor cache set to True #88

Merged: 1 commit merged into habana_next on Jul 4, 2024

Conversation

michalkuligowski

This PR sets the disable_tensor_cache flag to True when creating HPU graphs, which decreases the memory usage of HPU graphs in both the Prompt and Decode phases.
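A minimal sketch of how the flag is typically passed when wrapping a model in an HPU graph. The exact call site changed by this PR is not shown here; the use of `habana_frameworks.torch.hpu.wrap_in_hpu_graph` and the toy model below are assumptions for illustration, and the code only runs on a Gaudi/HPU host.

```python
# Sketch (assumes habana_frameworks is installed and an HPU device is available).
import torch
import habana_frameworks.torch as htorch


class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(16, 16)

    def forward(self, x):
        return self.linear(x)


model = TinyModel().to("hpu")

# disable_tensor_cache=True asks the HPU graph wrapper not to keep cached
# intermediate tensors alive, reducing the memory footprint of the captured
# graph; with the default (False), more memory is retained per graph.
graphed_model = htorch.hpu.wrap_in_hpu_graph(model, disable_tensor_cache=True)

out = graphed_model(torch.randn(4, 16, device="hpu"))
```

In the vLLM HPU backend this wrapping happens once per captured graph, so the saving applies to every graph captured for the prompt and decode paths.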


@madamczykhabana left a comment


LGTM

@madamczykhabana merged commit 1dc6cb2 into habana_next on Jul 4, 2024
@michalkuligowski deleted the michalkuligowski-disable-tensor-cache branch on Jul 4, 2024 at 13:23
madamczykhabana added a commit that referenced this pull request Jul 4, 2024
madamczykhabana added a commit that referenced this pull request Jul 4, 2024
michalkuligowski added a commit that referenced this pull request Jul 5, 2024
madamczykhabana pushed a commit that referenced this pull request Jul 5, 2024
@kzawora-intel added the habana (Issues or PRs submitted by Habana Labs) label on Nov 8, 2024
Labels: habana (Issues or PRs submitted by Habana Labs)
3 participants