Flux Speed Up with PipeFusion #230

Open
mali-afridi opened this issue Aug 26, 2024 · 2 comments
Labels
help wanted (Extra attention is needed)

Comments

@mali-afridi
Contributor

Hello, you implemented the sequence parallel approach for Flux, but the inference speed with 3 or 4 GPUs is slower than with a single GPU. Do you have any ideas for speeding it up using DistriFuser or PipeFusion? Why is PipeFusion not possible here?

@Eigensystem
Collaborator

Could you share your command? In our tests on a 4-card L40 device with ulysses_degree = 2 and ring_degree = 2, Flux runs about 2x faster than on a single card.
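
For context, here is a minimal sketch (not xDiT code) of the constraint behind this configuration: assuming only sequence parallelism is used (no PipeFusion or CFG parallelism), ulysses_degree and ring_degree jointly split the sequence dimension, so their product has to match the number of participating GPUs. The helper name `check_parallel_config` is hypothetical, but it illustrates why a 3-GPU run cannot reuse the 2x2 configuration above.

```python
# Sketch only: the helper below is hypothetical and not part of xDiT's API.
# Assumption: with pure hybrid sequence parallelism, the product of
# ulysses_degree and ring_degree must equal the number of GPUs used.

def check_parallel_config(world_size: int, ulysses_degree: int, ring_degree: int) -> None:
    """Raise if the parallel degrees do not cover the available GPUs."""
    if ulysses_degree * ring_degree != world_size:
        raise ValueError(
            f"ulysses_degree ({ulysses_degree}) * ring_degree ({ring_degree}) "
            f"= {ulysses_degree * ring_degree}, but {world_size} GPUs were requested; "
            "choose degrees whose product matches the GPU count."
        )

# The 4-card L40 setup mentioned above is valid:
check_parallel_config(world_size=4, ulysses_degree=2, ring_degree=2)

# A 3-GPU run cannot use ulysses_degree=2, ring_degree=2:
# check_parallel_config(world_size=3, ulysses_degree=2, ring_degree=2)  # raises ValueError
```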

@feifeibear
Collaborator

@mali-afridi We have a detailed performance report on Flux.1: https://github.com/xdit-project/xDiT/blob/main/docs/performance/flux.md. The results can be reproduced with the scripts in the examples directory.

Could you please provide more information about your devices and environment?

Regarding PipeFusion, we are planning to support it. Would you like to help us with this project?

feifeibear changed the title from "Flux Speed Up" to "Flux Speed Up with PipeFusion" on Aug 27, 2024
feifeibear added the help wanted (Extra attention is needed) label on Aug 27, 2024