LAPACK, LAPACKE and CLAPACK #1
@iurisegtovich Thanks a lot for your comment, and I'm happy to see my humble repo has been noticed. I did this almost a year ago while working on a home project, and when I started it I had no idea about these APIs/SDKs. Later I became more experienced and figured these issues out. You may see my comment on the page you mentioned :D
> Hi, thank you for replying. I noticed your repository while I was studying this stuff and creating a repository of my own with tests and conclusions: [iurisegtovich/fortranLAB](https://github.com/iurisegtovich/fortranLAB). What were your conclusions? Are you successfully using BLAS/LAPACK, via ATLAS/OpenBLAS/MKL? What about GSL? Any other nice library that you would like to recommend?
Was my repo any help? I would be happy to know that :)
I took a look; it seems like a great piece of work. Some points I noticed:

I did successfully use them in the home project I was working on at the time; I was developing a program to solve mechanical simulations, but right now I'm busy with other things.

As far as I know, BLAS and LAPACK are APIs, not implementations per se, although the original reference implementations from Netlib still exist. ATLAS and OpenBLAS are some of the implementations I know of, and if you write your code against the API it makes no difference which one you use.

Sorry, but I'm strongly against any closed-source, vendor-biased program/library such as MKL and CUDA. Why should I write code that only works on a specific type of hardware?!
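To illustrate the API-vs-implementation point with a minimal sketch: the example below uses NumPy, which is an assumption on my part (the thread doesn't mention it), but it makes the separation concrete. `numpy.linalg.solve` delegates to LAPACK's `dgesv`, and the same script runs unchanged whether the NumPy build was linked against the Netlib reference, OpenBLAS, ATLAS, or MKL.

```python
# Sketch: solving A x = b through the LAPACK API, here via NumPy's
# linalg.solve, which calls LAPACK's dgesv (LU factorization plus
# back-substitution). Which BLAS/LAPACK implementation actually does
# the work is decided when NumPy is built/linked, not in this code.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.solve(A, b)
print(x)  # -> [1. 3.]  (since 2*1 + 3 = 5 and 1 + 3*3 = 10)
```

The same portability holds one level down: C or Fortran code calling the standard BLAS/LAPACK routine names directly can be relinked against a different implementation without source changes.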
I found it near the end of my studying day, and at first I was focused on running GSL rather than calling BLAS/LAPACK directly. But it was motivating to see someone struggling with the same things I was, and I saved your URL so I can read your files when I try BLAS/LAPACK directly later.
For me it was exactly the other way around: I mainly use Fortran, but GSL has only a native C interface.

I listed GPU there as an intention; I am attending some lectures on GPU programming this month, but I won't be using it for a while.

MKL is indeed "closed-source and vendor-biased", but unlike CUDA it does work on non-Intel hardware, although without the Intel-specific optimizations. It recently became free to use, and Anaconda's distribution of Python comes with it. I plan to study both OpenBLAS and MKL (which should be simple, as they both follow the BLAS/LAPACK standards).
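Because OpenBLAS and MKL both implement the same BLAS/LAPACK interface, switching between them is a build or install choice rather than a code change. Assuming NumPy again as an illustration (it is not mentioned in the thread), one way to see which backend a given installation was linked against:

```python
# Sketch: inspecting which BLAS/LAPACK backend this NumPy build is linked
# against (OpenBLAS, MKL, ATLAS, or the Netlib reference). The numerical
# code itself is identical either way; only the build configuration differs.
import numpy as np

np.show_config()  # prints the BLAS/LAPACK libraries NumPy was built with
```

With Anaconda's MKL-backed NumPy this would report MKL; a pip-installed NumPy typically reports OpenBLAS.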
Quoting your README.md:

> You might want to check this page for clarifications:
> http://nicolas.limare.net/pro/notes/2014/10/31_cblas_clapack_lapacke/