Accommodate for two different EKOs in Evolution #289

Merged: 42 commits from `extend_eko_convolution` into `master`, Jul 18, 2024

Changes from 31 commits

Commits (42)
1962fe2
implement some brute force skeleton
Radonirinaunimi May 30, 2024
95b3bc3
reflect changes to python interface
Radonirinaunimi May 30, 2024
a4bfa50
just makes things work for the time being
Radonirinaunimi May 31, 2024
2b58ae2
address changes in old evolve
Radonirinaunimi Jun 2, 2024
c11b388
propagate different info in places that should matter
Radonirinaunimi Jun 2, 2024
16244d2
propagate different info in places that should matter
Radonirinaunimi Jun 2, 2024
e2ab5a6
reflect changes from master
Radonirinaunimi Jun 2, 2024
97aa73e
fixes issues during merging
Radonirinaunimi Jun 2, 2024
d2e6e1f
fixed some ambiguities and clean up a bit
Radonirinaunimi Jun 3, 2024
4222163
apply some rewordings
Radonirinaunimi Jun 3, 2024
6549de8
add convolve_with_two method for py interface
Radonirinaunimi Jun 4, 2024
685303d
fix a typo in passing the info
Radonirinaunimi Jun 5, 2024
80d9739
working version: remove pending todos and add a few notes
Radonirinaunimi Jun 7, 2024
baca6c4
Merge branch 'master' into extend_eko_convolution
Radonirinaunimi Jun 12, 2024
5630f70
extend CLI and try tests
Radonirinaunimi Jun 12, 2024
b455328
move downloading of pPDF into rust-yml
Radonirinaunimi Jun 13, 2024
e236b3c
fix minor inconsistencies in tests
Radonirinaunimi Jun 13, 2024
76b6f8f
reflect parts of the changes
Radonirinaunimi Jul 1, 2024
9c66218
account for multi-pdfs convolution feature
Radonirinaunimi Jul 1, 2024
57a996f
remove external download of polarized set
Radonirinaunimi Jul 5, 2024
fed2a2b
Fix release workflow
cschwan Jul 5, 2024
d007dc6
Fix CIs
cschwan Jul 5, 2024
6978daa
Restore old methods and add new ones with different names
cschwan Jul 12, 2024
cc82f9e
Add missing documentation to `Grid::evolve2`
cschwan Jul 12, 2024
246cadb
Change `unimplemented` to an error
cschwan Jul 12, 2024
1fb7e81
Prepare evolution for more than two EKOs
cschwan Jul 12, 2024
d0b8951
Fix warning and compilation error
cschwan Jul 12, 2024
b7d55f8
Fix panic message
cschwan Jul 12, 2024
8153abe
Undo whitespace changes
cschwan Jul 12, 2024
3bf44d0
Shorten CI job name
cschwan Jul 12, 2024
99dd18a
Merge branch 'master' into extend_eko_convolution
cschwan Jul 12, 2024
6081b97
Call the right evolution method
cschwan Jul 12, 2024
65a8bfb
Fix wrong assertion statement
cschwan Jul 12, 2024
4a8f7a4
Propagate new changes from master to `evolve_slice_with_two2`
cschwan Jul 12, 2024
6585ab6
Generalize `ndarray_from_subgrid_orders_slice` function
cschwan Jul 12, 2024
e8ecba7
Add Python bind for `Grid::evolve_with_slice_iter2`
cschwan Jul 12, 2024
a4395e2
Fix clippy warning
cschwan Jul 12, 2024
ad99bcc
Remove Node 20 workaround in CI
cschwan Jul 18, 2024
3361048
Add back accidentally removed environment variable in CI
cschwan Jul 18, 2024
bb6453c
Clarify wording of error message
cschwan Jul 18, 2024
744c417
Add changelog entry
cschwan Jul 18, 2024
c1ba7ab
remove Grid::evolve2 method and its python wrapper
Radonirinaunimi Jul 18, 2024
5 changes: 5 additions & 0 deletions .github/workflows/capi.yaml
@@ -6,6 +6,11 @@ on:
- pycli
- bump-pyo3-version

env:
# our GLIBC version is too old to support Node 20:
# https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
ACTIONS_ALLOW_USE_UNSECURE_NODE_VERSION: true

jobs:
capi:
runs-on: ubuntu-latest
3 changes: 3 additions & 0 deletions .github/workflows/msrv.yml
@@ -8,6 +8,9 @@ on:

env:
CARGO_TERM_COLOR: always
# our GLIBC version is too old to support Node 20:
# https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
ACTIONS_ALLOW_USE_UNSECURE_NODE_VERSION: true

jobs:
build:
3 changes: 3 additions & 0 deletions .github/workflows/release.yml
@@ -9,6 +9,9 @@ on:
env:
# this makes the `gh` binary work
GH_TOKEN: ${{ github.token }}
# our GLIBC version is too old to support Node 20:
# https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
ACTIONS_ALLOW_USE_UNSECURE_NODE_VERSION: true

jobs:
# create a release on github
8 changes: 7 additions & 1 deletion .github/workflows/rust.yml
@@ -12,6 +12,9 @@ defaults:

env:
CARGO_TERM_COLOR: always
# our GLIBC version is too old to support Node 20:
# https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
ACTIONS_ALLOW_USE_UNSECURE_NODE_VERSION: true

jobs:
build:
@@ -25,7 +28,7 @@ jobs:
uses: actions/cache@v4
with:
path: test-data
key: test-data-v11
key: test-data-v12
- name: Download test data
if: steps.cache-test-data.outputs.cache-hit != 'true'
run: |
@@ -55,6 +58,9 @@ jobs:
curl -s -C - -O 'https://ploughshare.web.cern.ch/ploughshare/db/applfast/applfast-h1-dijets-appl-arxiv-0010054/grids/applfast-h1-dijets-appl-arxiv-0010054-xsec000.appl'
curl -s -C - -O 'https://ploughshare.web.cern.ch/ploughshare/db/applfast/applfast-h1-incjets-fnlo-arxiv-0706.3722/grids/applfast-h1-incjets-fnlo-arxiv-0706.3722-xsec000.tab.gz'
curl -s -C - -O 'https://ploughshare.web.cern.ch/ploughshare/db/atlas/atlas-atlas-wpm-arxiv-1109.5141/grids/atlas-atlas-wpm-arxiv-1109.5141-xsec001.appl'
curl -s -C - -O 'https://data.nnpdf.science/pineappl/test-data/STAR_WMWP_510GEV_WM-AL-POL.pineappl.lz4'
curl -s -C - -O 'https://data.nnpdf.science/pineappl/test-data/STAR_WMWP_510GEV_WM-AL-POL_PolPDF.tar'
curl -s -C - -O 'https://data.nnpdf.science/pineappl/test-data/STAR_WMWP_510GEV_WM-AL-POL_UnpolPDF.tar'

- name: Set RUSTDOCFLAGS
run: |
3 changes: 3 additions & 0 deletions maintainer/generate-coverage.sh
@@ -27,6 +27,9 @@ wget --no-verbose --no-clobber -P test-data 'https://ploughshare.web.cern.ch/plo
wget --no-verbose --no-clobber -P test-data 'https://ploughshare.web.cern.ch/ploughshare/db/applfast/applfast-h1-dijets-appl-arxiv-0010054/grids/applfast-h1-dijets-appl-arxiv-0010054-xsec000.appl'
wget --no-verbose --no-clobber -P test-data 'https://ploughshare.web.cern.ch/ploughshare/db/applfast/applfast-h1-incjets-fnlo-arxiv-0706.3722/grids/applfast-h1-incjets-fnlo-arxiv-0706.3722-xsec000.tab.gz'
wget --no-verbose --no-clobber -P test-data 'https://ploughshare.web.cern.ch/ploughshare/db/atlas/atlas-atlas-wpm-arxiv-1109.5141/grids/atlas-atlas-wpm-arxiv-1109.5141-xsec001.appl'
wget --no-verbose --no-clobber -P test-data 'https://data.nnpdf.science/pineappl/test-data/STAR_WMWP_510GEV_WM-AL-POL.pineappl.lz4'
wget --no-verbose --no-clobber -P test-data 'https://data.nnpdf.science/pineappl/test-data/STAR_WMWP_510GEV_WM-AL-POL_PolPDF.tar'
wget --no-verbose --no-clobber -P test-data 'https://data.nnpdf.science/pineappl/test-data/STAR_WMWP_510GEV_WM-AL-POL_UnpolPDF.tar'

# we compile with different flags and don't want to destroy the other target directory
export CARGO_TARGET_DIR="$(mktemp -d)"
2 changes: 1 addition & 1 deletion maintainer/pineappl-ci/script.sh
@@ -42,7 +42,7 @@ ldconfig
cd ..

# install PDF sets
for pdf in NNPDF31_nlo_as_0118_luxqed NNPDF40_nnlo_as_01180 NNPDF40_nlo_as_01180; do
for pdf in NNPDF31_nlo_as_0118_luxqed NNPDF40_nnlo_as_01180 NNPDF40_nlo_as_01180 NNPDF40_nlo_pch_as_01180; do
curl "https://lhapdfsets.web.cern.ch/current/${pdf}.tar.gz" | tar xzf - -C /usr/local/share/LHAPDF
done

119 changes: 119 additions & 0 deletions pineappl/src/evolution.rs
@@ -633,3 +633,122 @@ pub(crate) fn evolve_slice_with_two(
.collect(),
))
}

/// Evolves a slice of `grid` with two independent operator slices, one per
/// convolution, and returns the FK-table subgrids together with the evolved channels.
pub(crate) fn evolve_slice_with_two2(
grid: &Grid,
input_operator_a: &ArrayView4<f64>,
input_operator_b: &ArrayView4<f64>,
info_a: &OperatorSliceInfo,
info_b: &OperatorSliceInfo,
order_mask: &[bool],
xi: (f64, f64),
alphas_table: &AlphasTable,
) -> Result<(Array3<SubgridEnum>, Vec<Channel>), GridError> {
let gluon_has_pid_zero = gluon_has_pid_zero(grid);

let (pid_indices_a, pids_a) =
pid_slices(input_operator_a, info_a, gluon_has_pid_zero, &|pid1| {
grid.channels()
.iter()
.flat_map(Channel::entry)
.any(|&(a, _, _)| a == pid1)
})?;
let (pid_indices_b, pids_b) =
pid_slices(input_operator_b, info_b, gluon_has_pid_zero, &|pid1| {
grid.channels()
.iter()
.flat_map(Channel::entry)
.any(|&(_, b, _)| b == pid1)
})?;

let channels0 = channels0_with_two(&pids_a, &pids_b);
let mut sub_fk_tables = Vec::with_capacity(grid.bin_info().bins() * channels0.len());

let mut last_x1a = Vec::new();
let mut last_x1b = Vec::new();
let mut operators_a = Vec::new();
let mut operators_b = Vec::new();

for subgrids_ol in grid.subgrids().axis_iter(Axis(1)) {
// NOTE: `info_a.x0.len()` and `info_b.x0.len()` are the same
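// one FK-table accumulator per evolved channel; filled below while iterating
// over the grid's original channels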
let mut tables = vec![Array2::zeros((info_a.x0.len(), info_a.x0.len())); channels0.len()];

for (subgrids_o, channel1) in subgrids_ol.axis_iter(Axis(1)).zip(grid.channels()) {
let (x1_a, x1_b, array) = ndarray_from_subgrid_orders_slice(
info_a,
&subgrids_o,
grid.orders(),
order_mask,
xi,
alphas_table,
)?;

if (last_x1a.len() != x1_a.len())
|| last_x1a
.iter()
.zip(x1_a.iter())
.any(|(&lhs, &rhs)| !approx_eq!(f64, lhs, rhs, ulps = EVOLUTION_TOL_ULPS))
{
operators_a = operator_slices(input_operator_a, info_a, &pid_indices_a, &x1_a)?;
last_x1a = x1_a;
}

if (last_x1b.len() != x1_b.len())
|| last_x1b
.iter()
.zip(x1_b.iter())
.any(|(&lhs, &rhs)| !approx_eq!(f64, lhs, rhs, ulps = EVOLUTION_TOL_ULPS))
{
operators_b = operator_slices(input_operator_b, info_b, &pid_indices_b, &x1_b)?;
last_x1b = x1_b;
}

// buffer for the two-step contraction below: `tmp = array · opbᵀ` evolves the
// second convolution, then `fk_table += factor · opa · tmp` evolves the first
let mut tmp = Array2::zeros((last_x1a.len(), info_a.x0.len()));

for &(pida1, pidb1, factor) in channel1.entry() {
for (fk_table, opa, opb) in channels0.iter().zip(tables.iter_mut()).filter_map(
|(&(pida0, pidb0), fk_table)| {
pids_a
.iter()
.zip(operators_a.iter())
.find_map(|(&(pa0, pa1), opa)| {
(pa0 == pida0 && pa1 == pida1).then_some(opa)
})
.zip(pids_b.iter().zip(operators_b.iter()).find_map(
|(&(pb0, pb1), opb)| (pb0 == pidb0 && pb1 == pidb1).then_some(opb),
))
.map(|(opa, opb)| (fk_table, opa, opb))
},
) {
linalg::general_mat_mul(1.0, &array, &opb.t(), 0.0, &mut tmp);
linalg::general_mat_mul(factor, opa, &tmp, 1.0, fk_table);
}
}
}

sub_fk_tables.extend(tables.into_iter().map(|table| {
ImportOnlySubgridV2::new(
SparseArray3::from_ndarray(table.insert_axis(Axis(0)).view(), 0, 1),
vec![Mu2 {
// TODO: FK tables don't depend on the renormalization scale
//ren: -1.0,
ren: info_a.fac0,
fac: info_a.fac0,
}],
info_a.x0.clone(),
info_a.x0.clone(),
)
.into()
}));
}

Ok((
Array1::from_iter(sub_fk_tables)
.into_shape((1, grid.bin_info().bins(), channels0.len()))
.unwrap(),
channels0
.iter()
.map(|&(a, b)| channel![a, b, 1.0])
.collect(),
))
}
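
For reference, the per-channel operation above is, schematically, fk_table += factor · opa · subgrid · opbᵀ, with one independent evolution operator per convolution. The following is a minimal standalone sketch of that two-step contraction; it is not part of this PR, the 2×2 matrices are hypothetical placeholders chosen only to make it runnable, and only `ndarray`'s `general_mat_mul` is assumed.

use ndarray::{array, linalg, Array2};

fn main() {
    // hypothetical subgrid values on the (x1_a, x1_b) interpolation grid
    let subgrid: Array2<f64> = array![[1.0, 2.0], [3.0, 4.0]];
    // hypothetical evolution operators, each mapping grid x1 values to FK-table x0 values
    let opa: Array2<f64> = array![[0.5, 0.5], [0.25, 0.75]];
    let opb: Array2<f64> = array![[0.9, 0.1], [0.2, 0.8]];
    // channel factor
    let factor = 1.0;

    // step 1: tmp = subgrid · opbᵀ evolves the second convolution
    let mut tmp: Array2<f64> = Array2::zeros((subgrid.nrows(), opb.nrows()));
    linalg::general_mat_mul(1.0, &subgrid, &opb.t(), 0.0, &mut tmp);

    // step 2: fk_table += factor · opa · tmp evolves the first convolution
    let mut fk_table: Array2<f64> = Array2::zeros((opa.nrows(), opb.nrows()));
    linalg::general_mat_mul(factor, &opa, &tmp, 1.0, &mut fk_table);

    println!("{fk_table:?}");
}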