GSoC 2024 | Benchmark | First objective | Israel Roldan (#2565)
* Added a development environment to make testing the changes easier.

* Improved the ASV framework; added benchmarks derived from the unit tests.

* Added a script to run the ASV benchmarks for each release tag in Git.

* Added my contact information to the mailmap.

* Removed the Docker files.

* Updated the release script which runs the benchmarks.

* Updated the runtime timeout for the benchmarks.

* Updated the function that gets relative paths for the benchmarks.

* Added comments about the deprecated 'delim_whitespace' option.

* Fixed the usage of the regression data class.

* Added a proposal to divide the main functionality of the Benchmark Base class.

* Updated the benchmark for Gamma Ray.

* Renamed the NLTE class in the benchmarks.

* Applied the 'black' formatter to main_gamma_ray_loop.

* Revert "Applied the 'black' formatter to main_gamma_ray_loop."

This reverts commit abe3b0d.

* Added documentation for the benchmarks.

* Fixed the formatting warnings reported by black.

* Fixed the remaining formatting warnings reported by black.

* Deleted all benchmarks except montecarlo numba and tardis.

---------

Co-authored-by: Israel Roldan <[email protected]>
airvzxf and Israel Roldan authored May 6, 2024
1 parent 7db7ef0 commit 328ec77
Showing 29 changed files with 1,657 additions and 67 deletions.
9 changes: 9 additions & 0 deletions .gitignore
@@ -15,6 +15,7 @@ __pycache__
*/cython_version.py
htmlcov
.coverage
coverage.xml
MANIFEST
.ipynb_checkpoints

@@ -77,3 +78,11 @@ pip-wheel-metadata/

# Random files
.hypothesis/unicode_data/11.0.0/charmap.json.gz

# Data files
benchmarks/data/*.h5

# ASV
.asv/
pkgs/
release_hashes.txt
4 changes: 4 additions & 0 deletions .mailmap
@@ -270,3 +270,7 @@ Kim Lingemann <[email protected]> kimsina <[email protected]>
Kim Lingemann <[email protected]> kim <[email protected]>

Sumit Gupta <[email protected]>

Israel Roldan <[email protected]> Israel Roldan <[email protected]>
Israel Roldan <[email protected]> AirvZxf <[email protected]>
Israel Roldan <[email protected]> airv_zxf <[email protected]>
2 changes: 1 addition & 1 deletion asv.conf.json
@@ -11,7 +11,7 @@
],
"branches": ["master"],
"environment_type": "mamba",
"show_commit_url": "https://github.com/tardis-sn/tardis/commit",
"show_commit_url": "https://github.com/tardis-sn/tardis/commit/",
"conda_environment_file": "tardis_env3.yml",
"benchmark_dir": "benchmarks",
"env_dir": ".asv/env",
99 changes: 99 additions & 0 deletions benchmarks/asv_by_release.bash
@@ -0,0 +1,99 @@
#!/usr/bin/env bash
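# Collect the TARDIS release tags for 2023-2024, resolve each tag to its
# commit hash, write the hashes to release_hashes.txt, and time a full ASV
# benchmark run.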

RELEASE_LIST=$(git tag -l "release-202[34]*" | sort -r)

readarray -t RELEASE_TAGS <<<"${RELEASE_LIST}"
RELEASE_HASHES=()
for release_tag in "${RELEASE_TAGS[@]}"; do
  echo "Tag: ${release_tag}"
  HASH_COMMIT=$(git show-ref -s "${release_tag}")
  RELEASE_HASHES+=("${HASH_COMMIT}")
done
echo "RELEASE_HASHES: ${#RELEASE_HASHES[*]}"

ASV_CONFIG_PATH="/app/asv"
cd "${ASV_CONFIG_PATH}" || exit

rm -f release_hashes.txt
touch release_hashes.txt
for release_hash in "${RELEASE_HASHES[@]}"; do
  echo "${release_hash}" >>release_hashes.txt
done
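# Note: a hash list like the one written above can be fed to ASV with
# `asv run HASHFILE:release_hashes.txt`; the command at the bottom of this
# script benchmarks ALL commits instead.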

function show_timed_time {
  local time=${1}
  local milliseconds="${time: -3}"
  local seconds=$((time / 1000))
  local minutes=0
  local minutes_display=""
  local hours=0
  local hours_display=""
  local days=0
  local days_display=""

  if [[ "${seconds}" -gt 59 ]]; then
    minutes=$((seconds / 60))
    seconds=$((seconds % 60))
    minutes_display="${minutes}m "
  fi

  if [[ "${minutes}" -gt 59 ]]; then
    hours=$((minutes / 60))
    minutes=$((minutes % 60))
    minutes_display="${minutes}m "
    hours_display="${hours}h "
  fi

  if [[ "${hours}" -gt 23 ]]; then
    days=$((hours / 24))
    hours=$((hours % 24))
    hours_display="${hours}h "
    days_display="${days}d "
  fi

  echo "${days_display}${hours_display}${minutes_display}${seconds}.${milliseconds}s"
}
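# Example with a hypothetical elapsed time of 90061123 ms:
#   show_timed_time 90061123  # prints "1d 1h 1m 1.123s"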

start=$(date +%s%N | cut -b1-13)

# ASV has an argument called "bench", which filters the benchmarks.
# I had two problems with ASV regarding benchmark filtering.
# 1. If we want to run only the `time_read_stella_model` benchmark,
#    ASV will run 3 benchmarks when given the argument
#    `--bench time_read_stella_model`:
#    - `time_read_stella_model`
#    - `time_read_stella_model_meta`
#    - `time_read_stella_model_data`
#    This is because ASV uses regular expressions to match the name.
#    To run only the `time_read_stella_model` benchmark, we need to pass:
#    - `--bench "time_read_stella_model$"`
#    - The `$` anchors the match at the end of the string without
#      consuming any characters.
#    Note:
#    - If the benchmark doesn't have parameters (`@parameterize`),
#      then its name in ASV is the benchmark name without parameters.
# 2. The second problem appears when searching for a benchmark that
#    has parameters (`@parameterize`), because the benchmark name
#    includes parentheses with the parameters and their values.
#    One example is `time_get_inverse_doppler_factor`, which is the
#    prefix of 3 benchmarks and has parameters.
#    To handle this, we need to pass the argument with this syntax:
#    `--bench "time_benchmark_name([^_A-Za-z]|$)"`
#    This regular expression matches all the parameter variants
#    generated by ASV.
#time asv run \
#    --bench "time_read_stella_model$" \
#    --bench "time_get_inverse_doppler_factor([^_A-Za-z]|$)" \
#    --bench "time_get_inverse_doppler_factor_full_relativity([^_A-Za-z]|$)" \
#    release-2023.01.11..master
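# (For reference: a parameterized benchmark appears in ASV under a name such
# as `time_get_inverse_doppler_factor(0.5, 1)`, with hypothetical parameter
# values here, so `([^_A-Za-z]|$)` matches the opening parenthesis while
# still excluding `time_get_inverse_doppler_factor_full_relativity`.)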

# This command runs all benchmarks for all commits that have not been run.
time asv run \
  --skip-existing-commits \
  ALL
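# Once the run finishes, the results can be rendered and inspected with the
# standard ASV commands (not part of this script):
#   asv publish  # build the static HTML report
#   asv preview  # serve the report locally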

end=$(date +%s%N | cut -b1-13)
runtime=$((end - start))
display_time="$(show_timed_time ${runtime})"
echo ""
echo "Time: ${display_time}"
echo ""
