
Detect deployments that cannot impact test classes and run only specified tests in that case to improve performance #444

Closed
nvuillam opened this issue Oct 19, 2023 · 4 comments · Fixed by #770

@nvuillam
Member

Detecting which types of metadata updates cannot break anything for sure, and if only those metadata types are updated in a deployment, dynamically running just a subset of specified tests (always the same list, defined by the release manager).

Except for production deployments, where we keep RunLocalTests.

Cc @pablogonzaleznetwork
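
For illustration, here is roughly how the two modes would map onto SF CLI deploy commands (the test class names are hypothetical placeholders for the list maintained by the release manager):

```bash
# Delta contains only "safe" metadata types: run the fixed subset of tests
sf project deploy start --test-level RunSpecifiedTests \
  --tests SmokeTest_Core --tests SmokeTest_Integrations \
  --target-org my-sandbox

# Production deployments keep running all local tests
sf project deploy start --test-level RunLocalTests --target-org my-production
```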

@EndritSino

Just wanted to point out that the notion of "metadata that cannot break anything for sure" is a bit loose, considering edge cases where a Static Resource, Report, or Custom Labels translation could be referenced somewhere in code.

Nevertheless, we could maybe achieve such a list if we enforce strong rules forbidding any references to such metadata types anywhere in our code. Then this safe list could be put to more effective use.

I guess we're still on safe ground saying that certain metadata, like CSS classes in LWCs or page layouts (hopefully phased out completely by Dynamic Forms), should under no circumstances cause any coded logic to break, and thus doesn't need to be validated by a comprehensive Apex test suite.

@nvuillam
Member Author

@EndritSino maybe we can accept the risk for some of them, because a negative impact would only occur in case of a dynamic reference (static references will be caught by the deployment simulation with a compile error).

Initial list from Macej Ptak:

  • aura, lwc, labels, staticresources, customMetadata, experiences, reports, communityTemplateDefinitions, communityThemeDefinitions, audience, sites, navigationMenus, dashboards, bots, flexipages

@nvuillam
Member Author

nvuillam commented Sep 16, 2024

Dev ready in future v5 :)

The only remaining item is a bug in the SF CLI -> forcedotcom/cli#3023

@EndritSino @0ptaq0 any feedback about the list? :)

Smart Deployment Tests

Not all metadata updates can break test classes. Use Smart Deployment Tests to skip running test classes when ALL of the following conditions are met (see the sketch after this list):

  • Delta deployment is activated and applicable to the source and target branches
  • All delta-deployed metadata types match the NOT_IMPACTING_METADATA_TYPES list (see below)
  • Target org is not a production org
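
As a rough sketch of that decision, not the actual sfdx-hardis implementation (the variable names and the shortened safe list are illustrative, and delta deployment is assumed to already be active):

```bash
# Illustration only: decide whether Apex tests can be skipped for a delta deployment
NOT_IMPACTING_METADATA_TYPES="Audience,AuraDefinitionBundle,Report,StaticResource"
DELTA_TYPES=("Report" "StaticResource")   # metadata types changed in this delta deployment
IS_PRODUCTION_ORG=false                   # Smart Deployment Tests never apply to production

skip_tests=true
for t in "${DELTA_TYPES[@]}"; do
  case ",${NOT_IMPACTING_METADATA_TYPES}," in
    *",${t},"*) ;;                        # type is in the safe list: keep checking
    *) skip_tests=false ;;                # any other type means tests must run
  esac
done
[ "$IS_PRODUCTION_ORG" = true ] && skip_tests=false

if [ "$skip_tests" = true ]; then
  echo "Smart Deployment Tests: skipping Apex test classes"
else
  echo "Running Apex test classes as usual"
fi
```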

Activate Smart Deployment Tests with either of the following (example after this list):

  • env variable USE_SMART_DEPLOYMENT_TESTS=true
  • .sfdx-hardis.yml config property useSmartDeploymentTests: true
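
For example (the YAML property is shown as a comment because the rest of the .sfdx-hardis.yml content is project-specific):

```bash
# Option 1: set the environment variable in your CI pipeline
export USE_SMART_DEPLOYMENT_TESTS=true

# Option 2: add the property to your .sfdx-hardis.yml configuration file:
#   useSmartDeploymentTests: true
```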

Default list for NOT_IMPACTING_METADATA_TYPES (can be overridden with a comma-separated list in the NOT_IMPACTING_METADATA_TYPES env variable, as shown after this list):

  • Audience
  • AuraDefinitionBundle
  • Bot
  • BotVersion
  • ContentAsset
  • CustomObjectTranslation
  • CustomSite
  • CustomTab
  • Dashboard
  • ExperienceBundle
  • Flexipage
  • GlobalValueSetTranslation
  • Layout
  • LightningComponentBundle
  • NavigationMenu
  • ReportType
  • Report
  • SiteDotCom
  • StandardValueSetTranslation
  • StaticResource
  • Translations
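
For example, an override restricting the safe list could look like this (the selected types are purely illustrative):

```bash
# Override the default safe list with your own comma-separated value
export NOT_IMPACTING_METADATA_TYPES="StaticResource,Report,Dashboard,Layout"
```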

Note: if you want to disable Smart Deployment Tests for a PR, add nosmart in the text of the latest commit.
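
For instance (the commit message wording is arbitrary, only the nosmart keyword matters):

```bash
# Disable Smart Deployment Tests for this PR via the latest commit message
git commit --allow-empty -m "chore: run full test suite nosmart"
```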

nvuillam added the v5 label on Sep 16, 2024
@nvuillam
Member Author

Solved in latest alpha, ready to be tested :)
