Dciborow/linux template testing #1008
Changes from 18 commits
New file (@@ -0,0 +1,43 @@):

```yaml
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.
#
# A Github Service Connection must also be created with the name "AI-GitHub"
# https://docs.microsoft.com/en-us/azure/devops/pipelines/process/demands?view=azure-devops&tabs=yaml

resources:
  repositories:
  - repository: aitemplates
    type: github
    name: microsoft/AI
    endpoint: AI-GitHub

schedules:
- cron: "7 0 * * *"
  displayName: Nightly build master
  branches:
    include:
    - master
  always: true
- cron: "7 12 * * *"
  displayName: Nightly build staging
  branches:
    include:
    - staging
  always: true

trigger: none

pr: none

variables:
- group: LinuxAgentPool

stages:
- template: stages/linux_test_stages.yml
  parameters:
    stage_name: "Nightly"
    job_name: nightly
    conda_env_root: nightly_reco
    Agent_Pool: $(Agent_Pool)
    smoke_tests: true
    integration_tests: true
```
New file (@@ -0,0 +1,42 @@):

```yaml
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.
#
# A Github Service Connection must also be created with the name "AI-GitHub"
# https://docs.microsoft.com/en-us/azure/devops/pipelines/process/demands?view=azure-devops&tabs=yaml

resources:
  repositories:
  - repository: aitemplates
    type: github
    name: microsoft/AI
    endpoint: AI-GitHub

# Pull requests against these branches will trigger this build
pr:
- master
- staging

# Any commit to these branches will trigger the build.
trigger:
- staging
- master

variables:
- group: LinuxAgentPool

stages:
- template: stages/linux_test_stages.yml
  parameters:
    stage_name: "Unit"
    job_name: unit
    conda_env_root: reco
    Agent_Pool: $(Agent_Pool)
    notebook_tests: true

- template: stages/linux_test_stages.yml
  parameters:
    stage_name: "Notebook"
    job_name: notebook
    conda_env_root: reco
    Agent_Pool: $(Agent_Pool)
    notebook_tests: true
```

**Review thread:**

> **Reviewer:** Where is this used?
>
> **Author:** https://dev.azure.com/AZGlobal/Azure%20Global%20CAT%20Engineering/_build?definitionId=134&_a=summary. We can create a pipeline off of it in the other ADO as well, but I didn't want to do too much there without everyone on the team understanding it first (I only did the minimum I needed to parametrize the agents).
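The two stage invocations in this pipeline differ only in `stage_name` and `job_name`. If more near-identical stages accumulate, one way to cut the duplication is Azure Pipelines' `${{ each }}` template expression, which expands a list of configurations into stages. A hedged sketch only, not part of this PR; the `test_stages` parameter name is invented here:

```yaml
# Hypothetical sketch: generate both stages from a list instead of
# repeating the template invocation. 'test_stages' is an invented name.
parameters:
- name: test_stages
  type: object
  default:
  - { stage_name: "Unit", job_name: unit }
  - { stage_name: "Notebook", job_name: notebook }

stages:
- ${{ each s in parameters.test_stages }}:
  - template: stages/linux_test_stages.yml
    parameters:
      stage_name: ${{ s.stage_name }}
      job_name: ${{ s.job_name }}
      conda_env_root: reco
      Agent_Pool: $(Agent_Pool)
      notebook_tests: true
```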
New file (@@ -0,0 +1,41 @@):

```yaml
parameters:
  Agent_Pool: #
  stage_name: #
  job_name: #
  conda_env: #
  spark: not spark # | spark
  gpu: not gpu # | gpu
  unit_tests: false
  smoke_tests: false
  integration_tests: false
  notebook_tests: false

stages:
- stage: ${{parameters.stage_name}}
  dependsOn: []

  jobs:
  - job: ${{parameters.job_name}}
    displayName: 'Nightly tests Linux CPU'
    timeoutInMinutes: 180 # how long to run the job before automatically cancelling
    pool:
      name: ${{parameters.Agent_Pool}}

    steps:
    - template: .ci/steps/reco_config_conda_linux.yml@aitemplates
      parameters:
        conda_env: ${{parameters.conda_env}}

    - template: ../steps/conda_pytest_linux.yml
      parameters:
        conda_env: ${{parameters.conda_env}}
        spark: ${{parameters.spark}}
        gpu: ${{parameters.gpu}}
        unit_tests: ${{parameters.unit_tests}}
        smoke_tests: ${{parameters.smoke_tests}}
        integration_tests: ${{parameters.integration_tests}}
        notebook_tests: ${{parameters.notebook_tests}}

    - template: .ci/steps/reco_conda_clean_linux.yml@aitemplates
      parameters:
        conda_env: ${{parameters.conda_env}}
```

**Review thread:**

> **Reviewer:** This file is similar, but not exactly the same, as …
>
> **Author:** linux_test_stage is the generic template for one stage, while linux_test_stages specifically calls that template 3 different ways, for CPU, GPU, and Spark tests.

> **Reviewer:**
> 1. First question: we have …
> 2. Second question: who is calling …?
> 3. Third question: the original YAML files are still there; are they still needed if we have (1)?
> 4. Fourth question: if we don't need the YAMLs in (3), is GitHub going to identify the 12 original pipelines independently, so they will appear when someone does a PR?

> **Reviewer:** As a general comment, I'm honestly making an effort to understand the value of all this parametrization, but to me it is increasing the complexity. I'm trying to understand how we are making things easier for everyone, but so far I can't.
>
> **Author:** I'm slowly trying to expand the test harness that we created for the HPs to the Best Practice repos. I'm looking to minimize the configuration differences, as well as the duplicated code, across all of the build YAMLs that I need to bring together to do this. I included the additional templates in this repo, for this PR, to try to make it clearer than my previous PR how the templates fit together. Then, as I look to the next BP repo, I'll start pulling common pieces into the central microsoft/AI location.
>
> For your questions:
>
> 2. I had a typo in linux_test_stage; it should have been calling conda_pytest_linux_steps. Updated it.
> 3. I did not want to touch these, because they are connected to your CI/CD system. We could do a separate PR to remove them, but only after the multistage blended versions are committed and a new ADO pipeline has been created, if you are interested in that.
> 4. If someone does a PR and you use the multistage pipeline, GitHub will roll this into a single check, but if the check fails, the user can check the logs to see which stage failed. "Stages" in ADO pipelines are a new feature that they haven't included in any type of badge yet.

> **Reviewer:** I see. I think we would ideally like to have the same 12 pipelines for unit tests + 6 for nightly builds, but made simpler. An idea that came from your development would be this. First, there will be some components in this repo and some in the common repo (microsoft/AI). The common components will be: … In this repo we will have: … These are the bricks; then for each of the unit and nightly tests, we will have a YAML file, and this will be connected to GitHub.
>
> A final thing: I think we would save a lot of lines of code if we don't use a true/false flag for each component but group them instead. An example, in this file: https://github.com/microsoft/recommenders/blob/master/tests/ci/azure_pipeline_test/dsvm_unit_linux_gpu.yml, this code: … could be parametrized with … Would this make sense?

> **Reviewer:** @anargyri proposed the following idea: we would have only 2 YAML files, one for unit tests and another for nightly builds. Similarly to the previous comment, we would parametrize all the content like … Now we have two possibilities: … If this idea makes sense to people, the next step would be to explore whether (1) is feasible.
>
> **Reviewer:** I would also add that if (1) fails, you could still use two templates and create the yml files with a script (and check them in, so that GitHub can find them).
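The diff above shows the single-stage template; the author describes linux_test_stages.yml as calling it "3 different ways, for CPU, GPU, and Spark tests". That wrapper file is not part of this diff, but based on the description it might look roughly like the following. This is a sketch: the exact parameter wiring and suffix names are assumed, not taken from the PR.

```yaml
# Hypothetical sketch of stages/linux_test_stages.yml, inferred from the
# author's description: one stage each for CPU, GPU, and Spark tests.
parameters:
  stage_name: #
  job_name: #
  conda_env_root: #
  Agent_Pool: #
  unit_tests: false
  smoke_tests: false
  integration_tests: false
  notebook_tests: false

stages:
- template: linux_test_stage.yml
  parameters:
    stage_name: ${{parameters.stage_name}}_CPU
    job_name: ${{parameters.job_name}}_cpu
    conda_env: ${{parameters.conda_env_root}}_cpu
    Agent_Pool: ${{parameters.Agent_Pool}}
    unit_tests: ${{parameters.unit_tests}}
    smoke_tests: ${{parameters.smoke_tests}}
    integration_tests: ${{parameters.integration_tests}}
    notebook_tests: ${{parameters.notebook_tests}}

- template: linux_test_stage.yml
  parameters:
    stage_name: ${{parameters.stage_name}}_GPU
    job_name: ${{parameters.job_name}}_gpu
    conda_env: ${{parameters.conda_env_root}}_gpu
    gpu: gpu
    Agent_Pool: ${{parameters.Agent_Pool}}

# ...and a third invocation with spark: spark for the Spark stage.
```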
> **Reviewer:** I'm trying to understand the workflow: this file will call tests/ci/azure_pipeline_test/steps/conda_pytest_linux.yml, right? Two questions: … Are PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON always set, even if they are not in the Spark environment?
>
> **Author:** Foot in mouth: I figured out how to only insert the params for Spark.
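"Only inserting the params for Spark" can be done with Azure Pipelines conditional insertion: an `${{ if ... }}` key inside a mapping adds its children only when the expression is true at template-expansion time. A sketch under assumed names (the parameter names and conda path are illustrative, not the PR's actual implementation):

```yaml
# Hypothetical sketch: set the PySpark variables only when the 'spark'
# parameter is enabled. Paths and parameter names are assumed.
parameters:
  conda_env: reco
  spark: not spark  # | spark

jobs:
- job: tests
  variables:
    ${{ if eq(parameters.spark, 'spark') }}:
      PYSPARK_PYTHON: /anaconda/envs/${{ parameters.conda_env }}/bin/python
      PYSPARK_DRIVER_PYTHON: /anaconda/envs/${{ parameters.conda_env }}/bin/python
  steps:
  - script: python -m pytest tests/unit
    displayName: Run unit tests
```

With this pattern, PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON would not be set at all for non-Spark runs, which would also address the reviewer's question above.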