
Use rdma-core package (instead of CDT) & add to linux_aarch64 #14

Merged: 10 commits merged from the drop_cdt branch into conda-forge:main on Oct 24, 2024

Conversation

@jakirkham (Member) commented Mar 12, 2024

Part of issue: conda-forge/cuda-feedstock#28

Previously we used the CDT for rdma-core. However, it would be better to use the package, especially as this CDT is not provided on AlmaLinux 8. This PR makes that change. It also aligns with how we use rdma-core elsewhere.

Additionally, it looks like rdma-core is needed on linux_aarch64, but we didn't have it there before. So this PR adds it there as well.


Checklist

  • Used a personal fork of the feedstock to propose changes
  • Bumped the build number (if the version is unchanged)
  • Reset the build number to 0 (if the version changed)
  • Re-rendered with the latest conda-smithy (Use the phrase @conda-forge-admin, please rerender in a comment in this PR for automated rerendering)
  • Ensured the license file is being packaged.

@jakirkham jakirkham requested a review from a team as a code owner March 12, 2024 18:29
@conda-forge-webservices (Contributor)

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipe) and found it was in an excellent condition.

@jakirkham (Member, Author)

Not seeing Azure here. Going to try restarting

@conda-forge-admin , please restart CI

@jakirkham (Member, Author)

Looks like that fixed it

@jakirkham (Member, Author)

Am seeing the following on CI

sed: can't read /home/conda/feedstock_root/build_artifacts/libcufile_1710268372716/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_p/lib/pkgconfig/cufile*.pc: No such file or directory

Based on looking locally, this appears to be happening because ${PREFIX}/lib/pkgconfig already exists. So the line below moves the pkg-config directory into ${PREFIX}/lib/pkgconfig, which gives us ${PREFIX}/lib/pkgconfig/pkg-config, whereas we actually want all of the .pc files directly in ${PREFIX}/lib/pkgconfig.

[[ -d pkg-config ]] && mv pkg-config ${PREFIX}/lib/pkgconfig

IOW this looks like a bug in build.sh
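
For reference, something along these lines in build.sh would probably do it (a minimal sketch, assuming the .pc files live in a local pkg-config directory as the existing line suggests; not necessarily the final change):

# Move the .pc files into ${PREFIX}/lib/pkgconfig instead of moving the
# pkg-config directory itself, which gets nested as
# ${PREFIX}/lib/pkgconfig/pkg-config when the destination already exists.
if [[ -d pkg-config ]]; then
    mkdir -p "${PREFIX}/lib/pkgconfig"
    mv pkg-config/*.pc "${PREFIX}/lib/pkgconfig/"
fi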

@jakirkham (Member, Author)

@conda-forge-admin , please re-render

@conda-forge-webservices (Contributor)

Hi! This is the friendly automated conda-forge-webservice.

I tried to rerender for you, but it looks like I wasn't able to push to the drop_cdt branch of jakirkham-feedstocks/libcufile-feedstock. Did you check the "Allow edits from maintainers" box?

NOTE: Our webservices cannot push to PRs from organization accounts or PRs from forks made from organization forks because of GitHub permissions. Please fork the feedstock directly from conda-forge into your personal GitHub account.

This message was generated by GitHub actions workflow run https://github.com/conda-forge/libcufile-feedstock/actions/runs/8626284824.

@jakirkham (Member, Author)

@conda-forge-admin , please re-render

@jakirkham (Member, Author)

@conda-forge-admin , please lint

@conda-forge-webservices (Contributor)

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipe) and found it was in an excellent condition.

@jakirkham (Member, Author)

@conda-forge-admin , please re-render

@conda-forge-webservices (Contributor)

Hi! This is the friendly automated conda-forge-webservice.

I tried to rerender for you, but it looks like there was nothing to do.

This message was generated by GitHub actions workflow run https://github.com/conda-forge/libcufile-feedstock/actions/runs/8626433350.

@adibbley (Contributor)

I'm not sure the rdma-core package is providing what we originally needed from the CDT package.

One example I see from the 12.4.1 build log:

INFO (libcufile,targets/x86_64-linux/lib/libcufile_rdma.so.1.9.1): Needed DSO x86_64-conda-linux-gnu/sysroot/usr/lib64/libmlx5.so.1 found in CDT/compiler package conda-forge/noarch::libibverbs-cos7-x86_64==22.4=h9b0a68f_1105

And here I see:

WARNING (libcufile,targets/x86_64-linux/lib/libcufile_rdma.so.1.9.1): $RPATH/libmlx5.so.1 not found in packages, sysroot(s) nor the missing_dso_whitelist.
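
For context, a quick way to compare where libmlx5.so.1 comes from in each setup is a local check like this (a rough sketch, run inside the build environment; with the CDT it lives under the compiler sysroot, while the rdma-core package should install it directly into ${PREFIX}/lib):

# Show every copy of libmlx5.so.1 visible to conda-build in either prefix.
find "${BUILD_PREFIX}" "${PREFIX}" -name 'libmlx5.so.1' 2>/dev/null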

@jakirkham (Member, Author)

@conda-forge-admin , please re-render

@conda-forge-webservices (Contributor)

Hi! This is the friendly automated conda-forge-webservice.

I tried to rerender for you, but it looks like I wasn't able to push to the drop_cdt branch of jakirkham-feedstocks/libcufile-feedstock. Did you check the "Allow edits from maintainers" box?

NOTE: Our webservices cannot push to PRs from organization accounts or PRs from forks made from organization forks because of GitHub permissions. Please fork the feedstock directly from conda-forge into your personal GitHub account.

This message was generated by GitHub actions workflow run https://github.com/conda-forge/libcufile-feedstock/actions/runs/9654546805.

@leofang (Member) commented Jun 25, 2024

It seems the rerendering failed 😅

@conda-forge-webservices (Contributor)

Hi! This is the friendly automated conda-forge-linting service.

I was trying to look for recipes to lint for you, but it appears we have a merge conflict.
Please try to merge or rebase with the base branch to resolve this conflict.

Please ping the 'conda-forge/core' team (using the @ notation in a comment) if you believe this is a bug.

@conda-forge-webservices (Contributor)

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipe) and found it was in an excellent condition.

@jakirkham (Member, Author)

Not sure why it wouldn't find the libraries; they are all there:

rdma-core-52.0-hcccb83c_0/lib
├── libefa.so -> libefa.so.1.3.52.0
├── libefa.so.1 -> libefa.so.1.3.52.0
├── libefa.so.1.3.52.0
├── libhns.so -> libhns.so.1.0.52.0
├── libhns.so.1 -> libhns.so.1.0.52.0
├── libhns.so.1.0.52.0
├── libibmad.so -> libibmad.so.5.3.52.0
├── libibmad.so.5 -> libibmad.so.5.3.52.0
├── libibmad.so.5.3.52.0
├── libibnetdisc.so -> libibnetdisc.so.5.1.52.0
├── libibnetdisc.so.5 -> libibnetdisc.so.5.1.52.0
├── libibnetdisc.so.5.1.52.0
├── libibumad.so -> libibumad.so.3.2.52.0
├── libibumad.so.3 -> libibumad.so.3.2.52.0
├── libibumad.so.3.2.52.0
├── libibverbs.so -> libibverbs.so.1.14.52.0
├── libibverbs.so.1 -> libibverbs.so.1.14.52.0
├── libibverbs.so.1.14.52.0
├── libmana.so -> libmana.so.1.0.52.0
├── libmana.so.1 -> libmana.so.1.0.52.0
├── libmana.so.1.0.52.0
├── libmlx4.so -> libmlx4.so.1.0.52.0
├── libmlx4.so.1 -> libmlx4.so.1.0.52.0
├── libmlx4.so.1.0.52.0
├── libmlx5.so -> libmlx5.so.1.24.52.0
├── libmlx5.so.1 -> libmlx5.so.1.24.52.0
├── libmlx5.so.1.24.52.0
├── librdmacm.so -> librdmacm.so.1.3.52.0
├── librdmacm.so.1 -> librdmacm.so.1.3.52.0
└── librdmacm.so.1.3.52.0
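
One way to narrow this down might be to check what search path the library actually carries (a sketch; the file name and version are taken from the warning above and may differ):

# List the DSOs libcufile_rdma needs and the RPATH/RUNPATH it was built
# with; if ${PREFIX}/lib (or an equivalent $ORIGIN-relative entry) is not
# on that path, conda-build cannot resolve libmlx5.so.1 from rdma-core.
readelf -d "${PREFIX}/targets/x86_64-linux/lib/libcufile_rdma.so.1.9.1" | grep -E 'NEEDED|RPATH|RUNPATH'
ls -l "${PREFIX}/lib/libmlx5.so.1"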

@leofang (Member) commented Jun 25, 2024

I seem to recall we encountered a similar situation elsewhere (but can't find the example now), where a CTK component in $PREFIX/targets/<arch>/lib depends on non-CTK packages in $PREFIX/lib. Maybe we need to patch rpath manually for the cuFILE libraries?
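
If we go that route, the patch might look something like this (a sketch of the idea only; --set-rpath overwrites any existing RPATH, and patchelf would need to be in the build environment):

# Give the cuFile libraries in ${PREFIX}/targets/<arch>/lib an
# $ORIGIN-relative entry that reaches ${PREFIX}/lib, where rdma-core and
# libnuma are installed.
for lib in "${PREFIX}"/targets/x86_64-linux/lib/libcufile*.so*; do
    if [[ ! -L "${lib}" ]]; then    # patch the real files, not the symlinks
        patchelf --set-rpath '$ORIGIN:$ORIGIN/../../../lib' "${lib}"
    fi
done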

@conda-forge-webservices (Contributor) commented Jul 30, 2024

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipe/meta.yaml) and found it was in an excellent condition.

@jakirkham (Member, Author) commented Jul 31, 2024

Looks like we need libnuma based on this CI warning:

WARNING (gds-tools,gds/tools/gdsio): $RPATH/libnuma.so.1 not found in packages, sysroot(s) nor the missing_dso_whitelist.

However, that is already listed here:

- libnuma # [linux]

Or an RPATH fix?

@jakirkham (Member, Author) commented Aug 5, 2024

Yeah, it looks like there is something wrong with the RPATH for gds-tools, based on this line from CI:

WARNING (gds-tools,gds/tools/gdsio): $RPATH/libcufile.so.0 not found in packages, sysroot(s) nor the missing_dso_whitelist.

This is despite libcufile already being present in the dependencies:

- {{ pin_subpackage("libcufile", max_pin="x") }}
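
If this does turn out to be an RPATH problem on the tool binaries rather than a missing dependency, a fix along these lines might work (a sketch; gds/tools/gdsio is the path from the warnings above, and the $ORIGIN-relative entries are assumptions about where libnuma.so.1 and libcufile.so.0 end up):

# gdsio sits at ${PREFIX}/gds/tools/gdsio, so ../../lib reaches
# ${PREFIX}/lib (libnuma) and ../../targets/x86_64-linux/lib reaches the
# cuFile libraries; requires patchelf in the build environment.
patchelf --set-rpath '$ORIGIN/../../lib:$ORIGIN/../../targets/x86_64-linux/lib' "${PREFIX}/gds/tools/gdsio"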

@jakirkham (Member, Author)

Yep, this is related to issue: conda-forge/cuda-feedstock#10

@jakirkham (Member, Author)

@conda-forge-admin , please restart CI

@billysuh7 (Contributor)

@conda-forge-admin, please restart ci

@billysuh7 (Contributor) left a review comment


Sorry, @h-vetinari, I forgot to approve this. LGTM.

@jakirkham (Member, Author)

Thanks Billy! 🙏

Were you meaning PR: conda-forge/cuda-nvcc-impl-feedstock#25 ?

@billysuh7 (Contributor)

> Thanks Billy! 🙏
>
> Were you meaning PR: conda-forge/cuda-nvcc-impl-feedstock#25 ?

That is all set as well; I approved it and left another comment. Thanks.

@jakirkham (Member, Author)

Thanks Billy! 🙏

@jakirkham (Member, Author) commented Oct 24, 2024

The conclusion Billy and I reached after looking into the warnings above more closely is that there is likely a remaining RPATH issue in this package, which we will need to look into. This will be follow-up work.

xref: conda-forge/cuda-feedstock#10 (comment)

@jakirkham jakirkham merged commit 55609a7 into conda-forge:main Oct 24, 2024
4 checks passed
@jakirkham jakirkham deleted the drop_cdt branch October 24, 2024 04:12