The output from a McSAS optimisation is a set of uncorrelated contributions, the sum of which comprises the scattered intensity.
As discussed during the SasView Code camp McSAS session, defining a parameter's polydispersity using either a classical definition or a freeform (e.g. McSAS) definition would require that the SasModels calculation can handle a range of freeform distributions.
While a freeform distribution of sorts is already implemented using a set of points and scaling factors (with interpolation between them), the McSAS definition makes no assumption about the relationship between the points. For example, a set of ten McSAS cylinder contributions with a fixed length might (conceptually) look like this:
- cylinder, diameter 3.4211
- cylinder, diameter 1.1235
- cylinder, diameter 2.1098
- cylinder, diameter 2.0983
- cylinder, diameter 4.0917
- cylinder, diameter 3.0918
- cylinder, diameter 1.091
- cylinder, diameter 2.901
- cylinder, diameter 2.998
- cylinder, diameter 3.116
Note that a typical scattering pattern can easily be described using about 200 to 300 such contributions, with each contribution scaled by its surface or volume rather than the usual volume-squared scaling. This suppresses the scattering from the large contributions so that the smaller ones become visible, and it is taken into account when visualising the result as a number-, volume- or surface-weighted distribution.
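To make that weighting concrete, here is a minimal NumPy sketch (not actual McSAS code; the fixed cylinder length of 10 and the number of bins are illustrative assumptions) that histograms the ten example contributions with number, volume and surface weights:

```python
import numpy as np

# Ten cylinder contributions with a fixed length, as in the example above.
diameters = np.array([3.4211, 1.1235, 2.1098, 2.0983, 4.0917,
                      3.0918, 1.091, 2.901, 2.998, 3.116])
length = 10.0  # assumed fixed cylinder length (illustrative)

volume = np.pi * (diameters / 2) ** 2 * length                      # V ~ d^2 * L
surface = np.pi * diameters * length + np.pi * diameters ** 2 / 2   # side + end caps

bins = np.linspace(diameters.min(), diameters.max(), 6)             # 5 bins (illustrative)
number_hist, _ = np.histogram(diameters, bins=bins)                   # each contribution counts once
volume_hist, _ = np.histogram(diameters, bins=bins, weights=volume)   # volume-weighted
surface_hist, _ = np.histogram(diameters, bins=bins, weights=surface) # surface-weighted
```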
Anyway, back to the topic. The idea during the SasView code camp was to enable a workflow that looked like this (a rough sketch of the sequence follows the list):
- optimise a set of 1D or 2D model parameters using a classical optimisation
- pick one to three parameters to be optimised using a McSAS optimisation core, fixing all other parameters except the background and scaling parameters (which are least-squares optimised at every McSAS iteration)
- get a coffee
- allow re-optimisation of the remaining model parameters using classical optimisation, fixing the McSAS-optimised parameter distributions
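A purely hypothetical sketch of that sequence in Python; `classical_fit()`, `mcsas_fit()` and the `freeform=` hook do not exist in SasView or sasmodels today, and the parameter names are illustrative:

```python
# 1) Classical optimisation of the full parameter set.
pars = classical_fit(model="cylinder", data=data,
                     free=["radius", "length", "scale", "background"])

# 2) McSAS optimisation of one parameter; everything else is fixed except
#    scale and background, which are least-squares fitted each iteration.
radius_set = mcsas_fit(model="cylinder", data=data,
                       mc_parameters=["radius"],
                       fixed={"length": pars["length"]})

# 3) (get a coffee while step 2 runs)

# 4) Re-optimise the remaining parameters classically, keeping the
#    McSAS-derived discrete radius set fixed.
pars = classical_fit(model="cylinder", data=data,
                     free=["length", "scale", "background"],
                     freeform={"radius": radius_set})
```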
Another aspect to note is that the uncertainties on the McSAS parameter distributions come from the variance of repeated, independent MC results, analysed in the optional histogramming (visualisation) phase. So, theoretically, you'd automatically repeat the above optimisation sequence a number of times to get a nice mean and standard error on the mean.
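As a minimal illustration of that last step (plain NumPy, not McSAS code; the array shapes are placeholders), the per-bin mean and standard error on the mean over a stack of repeated histograms could be computed like this:

```python
import numpy as np

# One histogram per independent MC repetition: shape (n_repetitions, n_bins).
# Random placeholder data stands in for the repeated McSAS results.
hist_stack = np.random.rand(50, 20)

mean = hist_stack.mean(axis=0)
# Standard error on the mean across the independent repetitions.
sem = hist_stack.std(axis=0, ddof=1) / np.sqrt(hist_stack.shape[0])
```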
Back (again) to the topic at hand, for starters we would need a method that returns a calculated intensity as the sum (or average) of a set of individual contributions, each with its own parameters.
That's at least as far as I can imagine for now. This could be in SasModels or in SasView.
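As a rough sketch of what such a method might look like, the loop below sums per-contribution intensities through the existing sasmodels direct-model interface; the looping and averaging is the proposed addition, and the cylinder length of 100 and the parameter-dict layout are illustrative assumptions:

```python
import numpy as np
from sasmodels.core import load_model
from sasmodels.data import empty_data1D
from sasmodels.direct_model import DirectModel

def average_intensity(q, contributions, model_name="cylinder"):
    """Average I(q) over a set of discrete contributions, each given as a
    dict of model parameters, e.g. {"radius": 1.71, "length": 100.0}."""
    calculator = DirectModel(empty_data1D(q), load_model(model_name))
    total = np.zeros_like(q)
    for pars in contributions:
        total += calculator(**pars)  # one kernel call per contribution
    return total / len(contributions)

# The ten fixed-length cylinder contributions from the example above.
q = np.logspace(-2, 0, 100)
diameters = [3.4211, 1.1235, 2.1098, 2.0983, 4.0917,
             3.0918, 1.091, 2.901, 2.998, 3.116]
Iq = average_intensity(q, [{"radius": d / 2, "length": 100.0} for d in diameters])
```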
{
"status": "new",
"changetime": "2018-09-12T13:51:43",
"_ts": "2018-09-12 13:51:43.031167+00:00",
"description": "The output from a McSAS optimisation is a set of uncorrelated contributions, the sum of which comprises the scattered intensity. \n\nAs discussed during the SasView Code camp McSAS session, defining a parameter's polydispersity using either/or a classical definition and a freeform (e.g. McSAS) definition would require that the SasModels calculation can handle a range of freeform distributions. \n\nWhile there is some sort of freeform distribution already implemented using a set of points and scaling factors (between which there is interpolation going on), The McSAS definition makes no assumption of the relationship between the points. For example, a set of ten McSAS cylinder contributions with a fixed length might (conceptually) look like this:\n - cylinder, diameter 3.4211\n - cylinder, diameter 1.1235\n - cylinder, diameter 2.1098\n - cylinder, diameter 2.0983\n - cylinder, diameter 4.0917\n - cylinder, diameter 3.0918\n - cylinder, diameter 1.091\n - cylinder, diameter 2.901\n - cylinder, diameter 2.998\n - cylinder, diameter 3.116\n\nNote that a typical scattering pattern can easily be described using about 200 or 300 such contributions that make up a scattering pattern, when each contribution is scaled by its surface or volume, not the normal volume-squared scaling. This has the effect of suppressing the scattering of large contributions so that the smaller ones become visible, which is taken into account when visualising the result in a number- volume- or surface-weighted distribution. \n\nAnyway, back to the topic. The idea during the SasView code camp was to enable a workflow that looked like this:\n - optimize a set of 1D or 2D model parameters using a classical optimisation\n - pick one to three parameters to be optimised using a McSAS optimisation core, fixing all parameters except for the background- and scaling parameters (which are least-squares optimised for every McSAS iteration). \n - get a coffee \n - allow for re-optimization of the remaining model parameters using classical optimisation, fixing the McSAS-optimized parameter distributions \n\nAnother aspect to note is that the uncertainties on the McSAS parameter distributions come from the analysis of variance from repeated, independent MC results in the optional histogramming (visualisation) phase. So, theoretically, you'd automatically repeat the above optimisation sequence a number of times to get a nice mean and standard error on the mean.\n\nBack (again) to the topic at hand, for starters we would need a method that returns a calculated intensity as the sum (or average) of a set of individual contributions, each with its own parameters. \n\nThat's at least as far as I can imagine for now. This could be in SasModels or in SasView..",
"reporter": "toqduj",
"cc": "",
"resolution": "",
"workpackage": "McSAS Integration Project",
"time": "2018-09-08T14:25:07",
"component": "sasmodels",
"summary": "Allow \"polydispersity\" to be defined by series/sets of uncorrelated, discrete points",
"priority": "major",
"keywords": "mcsas parameterset",
"milestone": "SasView 5.1.0",
"owner": "",
"type": "enhancement"
}
Trac update at 2018/09/08 14:35:52: toqduj commented:
One additional comment on this from Paul Kienzle:
"
It would be possible to have an alternative interface which takes in an array of parameter sets rather than individual parameter dimensions in a regular mesh. That way you don't have the overhead of data transfer on each call to the evaluator. You could also move the random generator directly into the kernel.
"
This is a possibility to consider, in particular if the kernel can return an averaged intensity from the set.
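As a hypothetical shape for such an interface (no batched entry point like this exists in sasmodels today; the name and signature are invented for illustration), the evaluator would receive all parameter sets as one array and return the averaged intensity:

```python
import numpy as np

def evaluate_parameter_sets(kernel, q, par_names, par_sets):
    """Hypothetical batched evaluator: `par_sets` has shape
    (n_contributions, n_parameters) and is handed over in a single call;
    the averaged I(q) is returned. In a real implementation the loop
    below would run inside the compute kernel after that one transfer."""
    total = np.zeros_like(q, dtype=float)
    for row in par_sets:
        total += kernel(q, **dict(zip(par_names, row)))
    return total / len(par_sets)
```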
Note that during the actual MC optimisation, for speed's sake, we compute only the proposed "swap": the contribution to (optionally) remove from the total intensity and the new contribution to add to it. So 299 of the 300 contributions are not recalculated if they don't change...
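A minimal sketch of that bookkeeping (illustrative only; `intensity_of` stands in for a single-contribution kernel evaluation):

```python
def propose_swap(I_total, I_old_contribution, intensity_of, new_pars):
    """Form the trial total intensity by removing one contribution and
    adding a newly drawn one; the other ~299 contributions are untouched."""
    I_new_contribution = intensity_of(new_pars)  # the only kernel evaluation
    I_trial = I_total - I_old_contribution + I_new_contribution
    return I_trial, I_new_contribution
```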
Trac update at 2018/09/12 13:51:43: richardh commented:
If McSAS is doing 40 or 50 repetitions of the optimisation of lists of ~300 particles, then presumably the repetitions could be run in parallel when using a GPU?
Migrated from http://trac.sasview.org/ticket/1172