
Sequential and Parallel processing of Hypernetworks #4334

Closed · wants to merge 15 commits

Conversation

@aria1th (Collaborator) commented Nov 5, 2022

[image: sample generation]

masterpiece, 1girl, schoolgirl uniform, drawn by a***, t***
Steps: 20, Sampler: Euler a, CFG scale: 7, Seed: 2506480929, Size: 512x512, Model hash: 925997e9, ENSD: 31337

This allows processing multiple hypernetworks (HNs) sequentially or in parallel.

How?

The .hns file extension is used for Hypernetwork Structures. A structure itself cannot be trained.

A .hns file uses standard Python literal syntax. Only str, list, set, dict, and tuple are allowed.

Example file

[{('a-1', 0.45):0.5, ('ta', 0.45):0.5}, 'p']
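Since a .hns file is restricted to Python literals, it can be read with `ast.literal_eval`, which refuses anything that is not a plain literal. A minimal sketch of such a loader (`load_hns` is a hypothetical name, not necessarily the PR's actual function):

```python
import ast

def load_hns(text):
    """Parse .hns text into a Python structure.

    ast.literal_eval only evaluates literals (str, list, set, dict,
    tuple, numbers), so arbitrary code in the file cannot execute.
    """
    return ast.literal_eval(text)

# The example file above parses into a list of a dict and a string:
structure = load_hns("[{('a-1', 0.45): 0.5, ('ta', 0.45): 0.5}, 'p']")
```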

Definition

Tuples ( ) are used only to specify hypernetwork strength.

('HYPER', 0.1) applies the HYPER hypernetwork at 0.1 strength.

('HYPER') is equivalent to the singleton forms 'HYPER', {'HYPER'}, ['HYPER'], or ('HYPER', 0.1). (Note that in Python, ('HYPER') without a trailing comma is just the string 'HYPER'.)

Sets are used when parallel processing with an equal-weight sum is desired.

{"HN A", "HN B", "HN C"} processes HN A(x) + HN B(x) + HN C(x), then averages the result.
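The set form reduces to an element-wise average of the individual outputs. A sketch with plain lists standing in for tensors (the function name and shapes are illustrative only):

```python
def parallel_equal(hn_outputs):
    """Element-wise average of several hypernetwork outputs.

    hn_outputs: equal-length sequences, e.g. [HN_A(x), HN_B(x), HN_C(x)].
    """
    n = len(hn_outputs)
    return [sum(vals) / n for vals in zip(*hn_outputs)]

# Three outputs averaged with equal weight:
avg = parallel_equal([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # [3.0, 4.0]
```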

Dictionaries (maps) are used when parallel processing with custom weights is desired.
NOTE: The weights should sum to 1, or the result will look strange.

{"HN A": 0.1, "HN B": 0.2, "HN C": 0.7} processes HN A(x)*0.1 + HN B(x)*0.2 + HN C(x)*0.7.
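The dictionary form is the same idea with per-network weights instead of a uniform 1/n. A sketch under the same list-for-tensor assumption:

```python
def parallel_weighted(weighted_outputs):
    """Weighted element-wise sum of hypernetwork outputs.

    weighted_outputs: (output, weight) pairs, e.g.
    [(HN_A(x), 0.1), (HN_B(x), 0.2), (HN_C(x), 0.7)].
    """
    result = [0.0] * len(weighted_outputs[0][0])
    for out, weight in weighted_outputs:
        for i, value in enumerate(out):
            result[i] += weight * value
    return result

# 10.0*0.1 + 20.0*0.2 + 30.0*0.7 = 26.0
mix = parallel_weighted([([10.0], 0.1), ([20.0], 0.2), ([30.0], 0.7)])
```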

String-wrapped dictionaries, etc.
For more complex processing, a dictionary can be wrapped in ' or " quotes and used as a key.

{'{"First A" : 0.5, "First B" : 0.5}' : 0.2, "Next" : 0.8} processes First A and First B in parallel, then takes the weighted sum of that result and Next with weights 0.2 and 0.8.
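One way to interpret such nesting is a recursive walk that re-parses any string key that is itself a structure. This is only a sketch of the idea, restricted to the dictionary form; `apply_hn` is a hypothetical callback returning a scalar stand-in for a hypernetwork's output:

```python
import ast

def mix_weights(node, apply_hn):
    """Recursively evaluate the dictionary form of a .hns structure."""
    if isinstance(node, str):
        try:
            parsed = ast.literal_eval(node)
        except (ValueError, SyntaxError):
            return apply_hn(node)  # plain hypernetwork name
        return mix_weights(parsed, apply_hn)  # quoted nested structure
    if isinstance(node, dict):
        return sum(w * mix_weights(k, apply_hn) for k, w in node.items())
    raise TypeError(f"unsupported node in this sketch: {node!r}")

# First A and First B are mixed 0.5/0.5, then that result and Next 0.2/0.8:
result = mix_weights(
    {'{"First A": 0.5, "First B": 0.5}': 0.2, "Next": 0.8},
    apply_hn=lambda name: {"First A": 1.0, "First B": 3.0, "Next": 10.0}[name],
)  # 0.2 * (0.5*1.0 + 0.5*3.0) + 0.8 * 10.0 = 8.4
```

A real implementation would also need to handle the tuple, list, and set forms, and to disambiguate hypernetwork names that happen to parse as literals.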

[image: selection menu]

As you'd expect, these files show up in the selection menu, and they are processed as if they were normal Hypernetwork files.

Compatibility with Python 3.7 has been fixed and tested.

Memory management still needs to be checked.

@captin411 (Contributor) commented

Really cool. Do the weights have to add up to 1.0, or are they scaled relative to one another? For example, are a 3.0 weight on one and a 1.0 weight on another automatically scaled to 0.75 and 0.25 respectively?

@aria1th (Collaborator, Author) commented Nov 5, 2022

@captin411 Currently no, you have to make the values sum to 1.0. I'm not getting proper results with a non-1 sum anyway, so I'll add auto-normalization very soon.

Within 20 minutes, hold on!

And finished.
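The auto-normalization described here can be as simple as dividing each weight by the total, which also answers the earlier question about relative scaling. A sketch (not necessarily the PR's actual code):

```python
def normalize_weights(weights):
    """Scale a name->weight mapping so the weights sum to 1.0."""
    total = sum(weights.values())
    if total == 0:
        raise ValueError("weights sum to zero; cannot normalize")
    return {name: weight / total for name, weight in weights.items()}

# A 3.0/1.0 split becomes 0.75/0.25 automatically:
scaled = normalize_weights({"HN A": 3.0, "HN B": 1.0})
```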

@aria1th (Collaborator, Author) commented Nov 6, 2022

[images: test results]

Note: HNs were tested at strength 0.45 with clip skip 2.
The prompt wasn't NSFW, but I censored the images for safety.

Subsequent commits:

- why those k-v pair was shuffled?
- revert assertion
- fix and cleanup
- Fixes and reduces memory usage too