Automatic Differentiation #531
Comments
As for prior art: see https://sourceforge.net/p/flibs/svncode/HEAD/tree/trunk/src/computing/automdiff.f90 (and quite possibly others as well ;)). I would say it is a useful addition to have, but you do not refer to your Fortran implementation.
On Sun, 19 Sep 2021 at 04:02, Dalon Work ***@***.***> wrote:
Motivation
Would anyone find an automatic differentiation module interesting? I'm not an expert in it, but I did once create a simple Fortran module that worked by overloading the various arithmetic operators and math functions. The idea is simple enough, and I personally find it fascinating that you can compute a function and its derivative simultaneously.
I also did a prototype in C++ (https://github.com/dalon-work/AutoDiff), this time using expression templates to allow the compiler to inline everything. It does worry me that Fortran won't allow the same level of inlining, and that the extra copies will limit the attainable performance of this operator-overloading method.
Perhaps there are many other issues and edge conditions that I'm unaware of too, since my experience is limited mainly to the two prototypes that I've made. Thoughts?
Prior Art
*No response*
Additional Information
*No response*
Unfortunately I have lost the Fortran implementation that I wrote; I made it before I discovered GitHub. It's easy enough to do (at least the basics), so I wouldn't be surprised if there are lots of them out there. I've been out of Fortran for a while; how difficult would it be to use parameterized derived types to express multiple variables? My simple version only had a single variable.
Hello and welcome @dalon-work! Auto-diff (AD) is certainly something on the radar, but I fear it would prove quite difficult to put into stdlib.
Concerning Fortran AD packages, many are summarized at the autodiff tools page: http://www.autodiff.org/?module=Tools
Speaking of inclusion in a library, forward-mode AD using derived types and operator overloading would be the most logical choice. The following packages offer this:
@sgeard has also published a Fortran autodiff code. Another implementation is the one from @lauvergn in this file.
I'd also invite you to take a look at the following thread over at Fortran Discourse:
We've also discussed automatic differentiation before at the monthly Fortran calls. One of the goals we discussed was to contribute an article or minibook to fortran-lang.org (see fortran-lang/webpage#63) offering a summary of the available tools with some usage examples. This would help promote AD in Fortran and bring us new collaborators. If you'd like to help with this, I'd be more than happy to collaborate.
In the past I've used DNAD quite successfully to differentiate some code. The code can be found here. I've been meaning to experiment with a parameterized derived type (PDT) to set the number of differentiation variables. Support for PDTs has become quite complete in Intel Fortran, so perhaps this is now a viable approach.
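For readers who have not used this approach, here is a minimal sketch of forward-mode AD with a derived type and operator overloading, in the spirit of the packages listed above; the module, type, and procedure names, and the fixed nvar = 2, are purely illustrative and are not taken from DNAD or any other package mentioned here.

```fortran
! Minimal forward-mode dual-number module (illustrative only).
module dual_mod
  implicit none
  private
  public :: dual, operator(+), operator(*), sin

  integer, parameter :: dp = kind(1.0d0)
  integer, parameter :: nvar = 2   ! number of differentiation variables, fixed at compile time

  type :: dual
    real(dp) :: f                  ! function value
    real(dp) :: g(nvar) = 0.0_dp   ! partial derivatives w.r.t. the independent variables
  end type dual

  interface operator(+)
    module procedure add_dd
  end interface
  interface operator(*)
    module procedure mul_dd
  end interface
  interface sin                    ! extend the intrinsic to dual arguments
    module procedure sin_d
  end interface

contains

  elemental function add_dd(a, b) result(c)
    type(dual), intent(in) :: a, b
    type(dual) :: c
    c%f = a%f + b%f
    c%g = a%g + b%g                ! sum rule
  end function add_dd

  elemental function mul_dd(a, b) result(c)
    type(dual), intent(in) :: a, b
    type(dual) :: c
    c%f = a%f * b%f
    c%g = a%f * b%g + b%f * a%g    ! product rule
  end function mul_dd

  elemental function sin_d(a) result(c)
    type(dual), intent(in) :: a
    type(dual) :: c
    c%f = sin(a%f)
    c%g = cos(a%f) * a%g           ! chain rule
  end function sin_d

end module dual_mod

program demo
  use dual_mod
  implicit none
  type(dual) :: x, y, r
  x = dual(1.5d0, [1.0d0, 0.0d0])  ! seed: dx/dx = 1, dx/dy = 0
  y = dual(0.7d0, [0.0d0, 1.0d0])  ! seed: dy/dx = 0, dy/dy = 1
  r = x*y + sin(x)
  print *, 'value =', r%f          ! x*y + sin(x)
  print *, 'dr/dx =', r%g(1)       ! y + cos(x)
  print *, 'dr/dy =', r%g(2)       ! x
end program demo
```

Each independent variable is seeded with a unit derivative in its own slot, and every overloaded operation applies the matching differentiation rule, so the derivatives arrive together with the function value.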
@ivan-pi As it turns out, it was DNAD where I first learned about AD, as it was written at my alma mater. It was what motivated my own explorations; I couldn't remember the name when I originally posted. I was unaware that Josh Hodson had put it up on GitHub. Unfortunately, Josh recently and unexpectedly passed away, but I'd be willing to take over as a maintainer of DNAD (he was a good friend and mentor to me). We could use it as a basis for a module in stdlib, and try extending it to use parameterized types instead of a compile-time constant, as you suggest.
I read through some of the links you provided, thank you for that. The history is nice to know. If we have so many third-party implementations of this one technique, shouldn't the "standard library" provide a canonical version so we can all stop reinventing the wheel? Being a library implementation, it wouldn't be able to do ALL the awesome things everyone would want it to do, but it could be useful for simple situations.
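To illustrate the parameterized-derived-type idea raised above (a sketch only, with made-up names; compiler support for PDTs is still uneven), the compile-time constant for the number of differentiation variables could become a length type parameter:

```fortran
! Sketch of the same dual-number idea with a parameterized derived type (PDT):
! the number of differentiation variables is a length type parameter instead of
! a module-wide constant.  Illustrative only.
module dual_pdt_mod
  implicit none
  private
  public :: dual, operator(+)

  integer, parameter :: dp = kind(1.0d0)

  type :: dual(n)
    integer, len :: n       ! number of differentiation variables, chosen per declaration
    real(dp) :: f
    real(dp) :: g(n)
  end type dual

  interface operator(+)
    module procedure add_dd
  end interface

contains

  function add_dd(a, b) result(c)
    type(dual(*)), intent(in) :: a, b   ! a%n and b%n are assumed to agree
    type(dual(a%n)) :: c
    c%f = a%f + b%f
    c%g = a%g + b%g
  end function add_dd

end module dual_pdt_mod
```

A variable would then be declared with its own number of variables, e.g. type(dual(3)) :: x, instead of recompiling the library for every problem size; the trade-off is that the derivative arrays are sized at run time.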
Hi! As mentioned by @ivan-pi, I'm developing and using an AD library, mainly for scalars and matrices (soon for vectors). Right now the Fortran module (dnS) is available inside a code (QML), but I'm currently extracting it into a standalone library (it is not public on GitHub yet). The scalar library works through operator overloading, and its features are the following:
I started to work on this a long time ago (more than 20 years, before I knew AD existed; yes, reinventing the wheel, and several times!), so the terms used in the library are unusual, but that can be changed in the new version. Anyway, I'll make the AD library public on GitHub sooner if you are interested in testing it.
I'd be more than willing to collaborate as well. How should we proceed?
That's really sad to hear about Joshua Hodson. I learned how to use DNAD from his thesis, and I have been using his upgraded version. @dalon-work (and others): if you send me an email I can add you to a shared folder where I have collected the papers and source code of the existing packages (beware of potential license-compatibility issues for the packages from Computer Physics Communications; we might need to do a clean-room design if we are to stick with the current stdlib license). My own preference would be to write up a summary of the existing packages, identify the common design choices and gaps/issues, and use this information to start designing a specification and test implementation.
From the list that @Beliavsky posted, I learned of one more code, prepared by Pauli Virtanen (SciPy BDFL). It comes with an AD version of FFTPACK, so you can take derivatives of functions which depend upon FFTs. To achieve this, the FFTPACK sources are reworked, by textual substitution, to operate on a derived type that carries derivative information instead of plain reals.
I guess this is one of the arguments for having stdlib implemented in Fortran, since it provides the ability to do such text manipulations with derived types. Maybe when the standard includes some form of generics, a more elegant solution will open up.
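To make that point concrete, here is a hedged sketch (illustrative names only, not the code referred to above), reusing the hypothetical dual_mod module from the earlier comment: a routine originally written for plain reals keeps working once its declarations are textually switched to the AD type, because the overloaded operators take over the arithmetic.

```fortran
! A "legacy" routine whose declarations were switched (by hand or by a script)
! from real(dp) to the dual type of the earlier sketch.  The arithmetic itself
! is untouched; the overloaded operators propagate the derivatives.
module legacy_mod
  use dual_mod, only: dual, operator(+), operator(*)
  implicit none
contains
  function poly(x) result(p)
    type(dual), intent(in) :: x    ! was: real(dp), intent(in) :: x
    type(dual) :: p                ! was: real(dp) :: p
    ! p(x) = 2 + 3*x + 4*x**2, written with + and * only.  A complete library
    ! would also overload real/dual combinations, so that literal constants
    ! would not need the dual(...) wrapping shown here.
    p = dual(2.0d0) + x*(dual(3.0d0) + x*dual(4.0d0))
  end function poly
end module legacy_mod

program use_legacy
  use dual_mod,   only: dual
  use legacy_mod, only: poly
  implicit none
  type(dual) :: x, p
  x = dual(2.0d0, [1.0d0, 0.0d0])  ! seed d/dx in the first derivative slot
  p = poly(x)
  print *, 'p(2)  =', p%f          ! 2 + 3*2 + 4*2**2 = 24
  print *, "p'(2) =", p%g(1)       ! 3 + 8*2          = 19
end program use_legacy
```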
CppCon 2021 recently posted a talk about implementing AD in LLVM as a plugin, using the compiler directly, instead of any language-level constructs. I assume this would affect Flang as well. https://youtu.be/1QQj1mAV-eY?t=2240 At the end he describes an effort to make this part of the C++ standard somehow.