1.7
Major Additions
OpenLineage Namespace Conventions
The conventions for setting namespaces when leveraging Data Lineage have been updated to better follow OpenLineage's guidelines. Moving forward, namespaces should be defined in the `data-lineage.properties` file so that Jobs are tied to pipelines and Datasets are tied to data sources. This is a departure from the old pattern of using a single namespace property (`data.lineage.namespace`) for an entire project. Refer to the GitHub docs for updated guidance. The `data.lineage.namespace` property in a project's `data-lineage.properties` file is still supported as a fallback but should not be used in practice.
Maven Build Cache
The Maven Build Cache is now enabled by default for new projects. Existing projects can reference the generation template to enable this functionality in their own projects.
Kafka Docker Image
The baseline Kafka Docker image has moved away from using the wurstmeister/kafka image (which was outdated and is no longer available) to using Bitnami's Kafka image as its base. If you are using the v2 Kafka chart managed by aiSSEMBLE, it will now pull the baseline Kafka image instead of directly using the Bitnami image. If you are still on the older v1 chart, it is already using the baseline image and will be the Bitnami flavor in 1.7.0. Kafka Connect support is still included in the baseline image.
Package Renaming
- Python modules were renamed to reflect aiSSEMBLE. These include the following:

| Old Python Module | New Python Module |
|---|---|
| foundation-core-python | aissemble-core-python |
| foundation-model-training-api | aissemble-foundation-model-training-api |
| foundation-versioning-service | aissemble-foundation-versioning-service |
| foundation-drift-detection-client | aissemble-foundation-drift-detection-client |
| foundation-encryption-policy-python | aissemble-foundation-encryption-policy-python |
| foundation-model-lineage | aissemble-foundation-model-lineage |
| foundation-data-lineage-python | aissemble-foundation-data-lineage-python |
| foundation-messaging-python-client | aissemble-foundation-messaging-python-client |
| foundation-pdp-client-python | aissemble-foundation-pdp-client-python |
| foundation-transform-core-python | aissemble-foundation-transform-core-python |
| extensions-model-training-api-sagemaker | aissemble-extensions-model-training-api-sagemaker |
| extensions-data-delivery-spark-py | aissemble-extensions-data-delivery-spark-py |
| extensions-encryption-vault-python | aissemble-extensions-encryption-vault-python |
| extensions-transform-spark-python | aissemble-extensions-transform-spark-python |
| test-data-delivery-pyspark-model | aissemble-test-data-delivery-pyspark-model |
| test-data-delivery-pyspark-model-basic | aissemble-test-data-delivery-pyspark-model-basic |
| machine-learning-inference | aissemble-machine-learning-inference |
| machine-learning-training | aissemble-machine-learning-training |
| machine-learning-training-base | aissemble-machine-learning-training-base |
| machine-learning-sagemaker-training | aissemble-machine-learning-sagemaker-training |
- Helm Charts and their relevant modules have been renamed to the following:

| Old Module Name | New Helm Chart and Module Name |
|---|---|
| extensions-helm-airflow | aissemble-airflow-chart |
| extensions-helm-data-access | aissemble-data-access-chart |
| extensions-helm-elasticsearch | aissemble-elasticsearch-chart |
| extensions-helm-elasticsearch-operator | aissemble-elasticsearch-operator-chart |
| extensions-helm-fastapi | aissemble-fastapi-chart |
| extensions-helm-hive-metastore-db | aissemble-hive-metastore-db-chart |
| extensions-helm-hive-metastore-service | aissemble-hive-metastore-service-chart |
| extensions-helm-inference | aissemble-inference-chart |
| extensions-helm-jenkins | aissemble-jenkins-chart |
| extensions-helm-kafka | aissemble-kafka-chart |
| extensions-helm-keycloak | aissemble-keycloak-chart |
| extensions-helm-lineage-http-consumer | aissemble-lineage-http-consumer-chart |
| extensions-helm-localstack | aissemble-localstack-chart |
| extensions-helm-metadata | aissemble-metadata-chart |
| extensions-helm-mlflow | aissemble-mlflow-chart |
| extensions-helm-pipeline-invocation | aissemble-pipeline-invocation-chart |
| extensions-helm-pipeline-invocation-lib | aissemble-pipeline-invocation-lib-chart |
| extensions-helm-policy-decision-point | aissemble-policy-decision-point-chart |
| extensions-helm-quarkus | aissemble-quarkus-chart |
| extensions-helm-sealed-secrets | aissemble-sealed-secrets-chart |
| extensions-helm-spark-application | aissemble-spark-application-chart |
| extensions-helm-spark-operator | aissemble-spark-operator-chart |
| extensions-helm-vault | aissemble-vault-chart |
| extensions-helm-versioning | aissemble-versioning-chart |
Breaking Changes
Note: instructions for adapting to these changes are outlined in the upgrade instructions below.
- The Maven property `version.clean.plugin` was changed to `version.maven.clean.plugin`, causing the `*-deploy/pom.xml` to be invalid.
- The specification of private Maven repositories has changed from prior releases.
- The specification of private PyPI repositories has changed from prior releases.
- The specification of private Docker repositories has changed from prior releases.
- The specification of Helm publishing repositories has changed from prior releases.
- The Kafka home directory in the `aissemble-kafka` image has changed from `/opt/kafka` to `/opt/bitnami/kafka`.
Known Issues
- There is currently a bug in the way the `pipeline-invocation-service` accesses the Spark application Helm chart. The chart cannot be accessed using the `--repo` flag and must instead be referenced directly.
Known Vulnerabilities
| Date identified | Vulnerability | Severity | Package | Affected versions | CVE | Fixed in |
|---|---|---|---|---|---|---|
How to Upgrade
The following steps will upgrade your project to 1.7. These instructions consist of multiple phases:
- Automatic Upgrades - no manual action required
- Precondition Steps - needed in all situations
- Conditional Steps (e.g., Python steps, Java steps, if you use Metadata, etc.)
- Final Steps - needed in all situations
Automatic Upgrades
To reduce the burden of upgrading aiSSEMBLE, the Baton project is used to automate the migration of some files to the new version. These migrations run automatically when you build your project and are included by default when you update the `build-parent` version in your root POM. Below is a description of all the Baton migrations included with this version of aiSSEMBLE.
| Migration Name | Description |
|---|---|
| upgrade-tiltfile-aissemble-version-migration | Updates the aiSSEMBLE version within your project's Tiltfile |
| upgrade-v2-chart-files-aissemble-version-migration | Updates the Helm chart dependencies within your project's deployment resources (`<YOUR_PROJECT>-deploy/src/main/resources/apps/`) to use the latest version of aiSSEMBLE |
| upgrade-v1-chart-files-aissemble-version-migration | Updates the Docker image tags within your project's deployment resources (`<YOUR_PROJECT>-deploy/src/main/resources/apps/`) to use the latest version of aiSSEMBLE |
| upgrade-mlflow-v2-external-s3-migration | Updates the MLflow v2 deployment (if present) in your project to utilize LocalStack for local development and SealedSecrets for remote deployments |
| upgrade-spark-application-s3-migration | Updates the pipeline SparkApplication(s) (if present) in your project to utilize LocalStack for local development and SealedSecrets for remote deployments |
| upgrade-foundation-extension-python-package-migration | Updates the pyproject.toml files within your project's pipelines folder (`<YOUR_PROJECT>-pipelines`) to use the updated aiSSEMBLE foundation and extension Python packages with the latest naming convention |
| upgrade-helm-chart-names-migration | Updates the Chart.yaml and values*.yaml files within your project's deploy folder (`<YOUR_PROJECT>-deploy`) to use the new Helm chart naming convention (`aissemble-<chart-name>-chart`) |
| upgrade-helm-module-names-migration | Updates the Chart.yaml and values*.yaml files within your project's deploy folder (`<YOUR_PROJECT>-deploy`) to use the new Helm module naming convention (`aissemble-<chart-name>-chart`) |
| upgrade-helm-chart-repository-url-migration | Updates the Helm repository URL within your project's deploy Chart.yaml file to point to ghcr.io. Only runs if the previous Helm chart repository URL is passed in through the `oldHelmRepositoryUrl` system property (using `-DoldHelmRepositoryUrl`) |
| upgrade-dockerfile-pip-install-migration | Updates Dockerfiles such that Python dependency installations fail during the build, rather than at runtime |
| enable-habushu-build-cache-migration | Updates the pom.xml file for any Habushu-managed modules to ensure that the build directory is specified |
| data-lineage-package-import-migration | Updates the package imports for all Java files that reference `com.boozallen.aissemble.data.lineage` |
| upgrade-spark-application-exec-migration | Fixes the exec-maven-plugin executions in pipeline POMs to use the new ghcr.io aissemble-spark-application-chart package |
| upgrade-project-specific-image-naming-convention-migration | Updates the project-specific aiSSEMBLE-generated image names by removing the `boozallen/` prefix |
To deactivate any of these migrations, add the following configuration to the `baton-maven-plugin` within your root `pom.xml`:
```diff
 <plugin>
     <groupId>org.technologybrewery.baton</groupId>
     <artifactId>baton-maven-plugin</artifactId>
     <dependencies>
         <dependency>
             <groupId>com.boozallen.aissemble</groupId>
             <artifactId>foundation-upgrade</artifactId>
             <version>${version.aissemble}</version>
         </dependency>
     </dependencies>
+    <configuration>
+        <deactivateMigrations>
+            <deactivateMigration>NAME_OF_MIGRATION</deactivateMigration>
+            <deactivateMigration>NAME_OF_MIGRATION</deactivateMigration>
+        </deactivateMigrations>
+    </configuration>
 </plugin>
```
Precondition Steps - Required for All Projects
Beginning the Upgrade
To start your aiSSEMBLE upgrade, update your project's pom.xml to use the 1.7.0 version of the build-parent:
```xml
<parent>
    <groupId>com.boozallen.aissemble</groupId>
    <artifactId>build-parent</artifactId>
    <version>1.7.0</version>
</parent>
```
Delete the Old maven-clean-plugin Version
In order to follow the standard naming conventions for Maven properties, the original property used for the `maven-clean-plugin` version no longer exists. To resolve the Maven failure this causes, delete the version from the plugin in `*-deploy/pom.xml`. This can be achieved with:

```shell
sed -i'' -e '/version.clean.plugin/d' *-deploy/pom.xml
```
Update Maven Repository Configuration
Update the following properties in your project's root `pom.xml` file with the appropriate Maven repository IDs and URLs for publishing and retrieving releases and snapshots. Adjust for your project as appropriate:
```xml
<properties>
    ...
    <maven.repo.id>maven-releases</maven.repo.id>
    <maven.repo.url>https://release-PLACEHOLDER/repository/maven-releases</maven.repo.url>
    <maven.snapshot.repo.id>maven-snapshots</maven.snapshot.repo.id>
    <maven.snapshot.repo.url>https://snapshot-PLACEHOLDER/repository/maven-snapshots</maven.snapshot.repo.url>
</properties>
```
Update Habushu to Point to Your Private PyPI Repository
Add the following property and plugin to your project's root `pom.xml` file with the appropriate PyPI repository URL (Nexus is used in this example; adjust for your project as appropriate):
```xml
<properties>
    ...
    <pypi.project.repository.url>https://nexus.yourdomain/repository/your-pypi-repo-name/</pypi.project.repository.url>
</properties>
...
<build>
    <pluginManagement>
        <plugins>
            ...
            <plugin>
                <groupId>org.technologybrewery.habushu</groupId>
                <artifactId>habushu-maven-plugin</artifactId>
                <configuration>
                    <!--
                        Ensure you have configured credentials for this repo, as explained in the following link:
                        https://github.com/TechnologyBrewery/habushu?tab=readme-ov-file#pypirepoid
                    -->
                    <pypiRepoUrl>${pypi.project.repository.url}</pypiRepoUrl>
                </configuration>
            </plugin>
        </plugins>
    </pluginManagement>
</build>
```
Add the Helm Publishing Repository Configuration
Add the following properties to your project's root `pom.xml` file with the appropriate Helm repository URL and the name you wish to publish your charts to (Nexus is used in this example; adjust for your project as appropriate):
```xml
<properties>
    ...
    <helm.publishing.repository.url>https://nexus.mydomain.com/repository</helm.publishing.repository.url>
    <helm.publishing.repository.name>my-helm-charts</helm.publishing.repository.name>
</properties>
```
Update the following plugin within your project's `-deploy/pom.xml` file. Adjust for your project as appropriate:
```xml
<plugin>
    <groupId>${group.helm.plugin}</groupId>
    <artifactId>helm-maven-plugin</artifactId>
    <executions>
        ...
        <execution>
            <id>deploy</id>
            <phase>deploy</phase>
            <goals>
                <goal>push</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        ...
        <uploadRepoStable>
            <name>${helm.publishing.repository.name}</name>
            <url>${helm.publishing.repository.url}</url>
        </uploadRepoStable>
        <uploadRepoSnapshot>
            <name>${helm.publishing.repository.name}</name>
            <url>${helm.publishing.repository.url}</url>
        </uploadRepoSnapshot>
        ...
    </configuration>
</plugin>
```
For further configuration options to suit your specific Helm repository needs, please see the helm-maven-plugin documentation.
Update Docker Repository Configuration
Update the Docker repository in your project's root `pom.xml` file with the appropriate Docker repository URL for publishing and retrieving Docker images. Adjust for your project as appropriate:
```xml
<properties>
    ...
    <docker.project.repository.url>docker-registry-PLACEHOLDER/repository/</docker.project.repository.url>
</properties>
```
Additionally, add the Docker `repoId` to the `orphedomos-maven-plugin` configuration in the project's `-docker/pom.xml` file.
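As a sketch, the configuration could look like the following (the `repoId` value here is a placeholder, and the parameter name should be verified against the orphedomos-maven-plugin documentation for your version):

```xml
<plugin>
    <groupId>org.technologybrewery.orphedomos</groupId>
    <artifactId>orphedomos-maven-plugin</artifactId>
    <configuration>
        <!-- placeholder repo ID; it should match a <server> entry in your settings.xml -->
        <repoId>docker-registry-PLACEHOLDER</repoId>
    </configuration>
</plugin>
```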
Update Tiltfile with New Docker Repository
Update the `build_args` defined in your `Tiltfile` to point `DOCKER_BASELINE_REPO_ID` to the `ghcr.io/` Docker repository:
```diff
+ build_args = { 'DOCKER_BASELINE_REPO_ID': 'ghcr.io/',
                 'VERSION_AISSEMBLE': aissemble_version}
```
Update Integration Test Base Image
Update `<project>-tests/<project>-tests-docker/src/main/resources/docker/Dockerfile` to use the public `openjdk` image:
```diff
- ARG DOCKER_BASELINE_REPO_ID
- FROM ${DOCKER_BASELINE_REPO_ID}boozallen/openjdk:11-slim
+ FROM openjdk:11-slim
```
Update v1 Helm Charts with New Docker Repository
Update the `values.yaml` file for any v1 Helm charts in your `-deploy/apps` directory to point to the correct repository.
Github Container Registry Images
If you have a v1 chart that is included in this list:
- hive-metastore-db
- hive-metastore-service
- jenkins
- kafka
- metadata
- spark-infrastructure

then you will need to update the `dockerRepo` value in the corresponding `values.yaml` file to point to `"ghcr.io/"`:
```diff
 image:
+  dockerRepo: "ghcr.io/"
```
Model Training
If you leverage `model-training-api` and/or `model-training-api-sagemaker`, update the image name to point to `ghcr.io/` like so:
```diff
 image:
+  name: ghcr.io/boozallen/aissemble-model-training-api
```

For SageMaker:

```diff
 image:
+  name: ghcr.io/boozallen/aissemble-model-training-api-sagemaker
```
Conditional Steps
Point Habushu to the Correct Publishing Repository
[REQUIRED]
The Jenkins pipeline defaults to test-pypi as the publishing repository. If you intend to publish your artifacts elsewhere, create the following profile in the root `pom.xml` file:
```xml
<profiles>
    <profile>
        <id>ci</id>
        <build>
            <plugins>
                <plugin>
                    <groupId>org.technologybrewery.habushu</groupId>
                    <artifactId>habushu-maven-plugin</artifactId>
                    <executions>
                        <execution>
                            <phase>deploy</phase>
                            <goals>
                                <goal>publish-to-pypi-repo</goal>
                            </goals>
                            <configuration>
                                <pypiRepositoryId><YOUR_PYPI_REPO_ID></pypiRepositoryId>
                                <pypiRepositoryUrl><YOUR_PYPI_REPO_URL></pypiRepositoryUrl>
                                <useDevRepository>false</useDevRepository>
                            </configuration>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </build>
    </profile>
</profiles>
```
Upgrade Steps for Projects Leveraging the aiSSEMBLE Kafka Image
[REQUIRED]
If you are leveraging the aissemble-kafka image and add custom files to the Kafka home directory, either through a Docker build or a Helm configuration, you will need to change the location of these files from `/opt/kafka/` to `/opt/bitnami/kafka/`.
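For example, a downstream Dockerfile that copies custom configuration into the Kafka home directory might change like this (the file name and `config/` subdirectory here are illustrative, not prescribed by the release):

```diff
 FROM ghcr.io/boozallen/aissemble-kafka:1.7.0
-COPY custom-connect.properties /opt/kafka/config/
+COPY custom-connect.properties /opt/bitnami/kafka/config/
```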
For Projects leveraging Pipeline Invocation Service
[REQUIRED]
In order to use the `pipeline-invocation-service` in your downstream project, you will need to downgrade the service to version 1.6.1. The following steps will guide you through this process:

1. Run `mvn clean install` with 1.7.0 in your root pom.xml (allowing the Baton migrations to update the project).
2. Update the `pipeline-invocation-service` Chart.yaml and values*.yaml files within your project's deploy folder (`<YOUR_PROJECT>-deploy`). Replace the following values and names accordingly:

   | Current Value (1.7.0) | Workaround Value |
   |---|---|
   | aissemble-pipeline-invocation-chart | aissemble-pipeline-invocation |
   | aissemble-pipeline-invocation-lib-chart | aissemble-pipeline-invocation-lib |
   | aissemble-quarkus-chart | aissemble-quarkus |
   | oci://ghcr.io/boozallen | https://nexus.boozallencsn.com/repository/aiops-helm-internal/ |
   | 1.7.0 | 1.6.1 |

3. Deactivate the necessary migrations by adding the following configuration to the `baton-maven-plugin` within your project root `pom.xml`:

   ```xml
   <deactivateMigrations>
       <deactivateMigration>upgrade-v2-chart-files-aissemble-version-migration</deactivateMigration>
       <deactivateMigration>upgrade-v1-chart-files-aissemble-version-migration</deactivateMigration>
       <deactivateMigration>upgrade-helm-chart-names-migration</deactivateMigration>
   </deactivateMigrations>
   ```
Upgrade Steps for Projects Leveraging Data Lineage
DataLineage and ModelLineage Event Customization
[REQUIRED]
Customization of lineage events has been adjusted so that events can be customized based on their event type. If you have overridden any of the functions in the table below, they will need to be updated to the type-specific variants outlined in the What Gets Generated section of the Lineage documentation.
| Python Method Signature | Java Method Signature |
|---|---|
| create_run(self) → Run | Run createRun() |
| create_job(self) → Job | Job createJob() |
| create_run_event(self, run: Run, job: Job, status: str) → RunEvent | RunEvent createRunEvent(Run run, Job job, String status) |
Updated Namespace Conventions with Data Lineage
In order to follow standards for defining namespaces for OpenLineage Jobs and Datasets, the default behavior around namespace values has changed. Namespace values should be configured in the `data-lineage.properties` file. If your project does not already have this file, it will be generated automatically the next time you build. The following namespace configuration changes should be made.
[REQUIRED]
If your pipeline leverages any lineage Datasets, you must define a namespace for each dataset, per the configuration guidance in the Lineage documentation:
```properties
data.lineage.<dataset-name>.namespace=<dataset's-source-name>
```
[OPTIONAL]
If you are already setting the `data.lineage.namespace` value in your `<project-name>-docker/<project-name>-spark-worker-docker/src/main/resources/krausening/base/data-lineage.properties` file, it is recommended that you follow the configuration documentation and set `data.lineage.<pipeline>.namespace` and `data.lineage.<pipeline>.<step>.namespace` instead.
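As an illustrative sketch of the new conventions, a `data-lineage.properties` file might look like the following. The pipeline, step, and dataset names are hypothetical, and the dataset namespace value is one plausible source identifier; consult the Lineage configuration documentation for the authoritative format:

```properties
# Jobs are namespaced per pipeline, and optionally per step
data.lineage.MyPipeline.namespace=my-pipeline
data.lineage.MyPipeline.IngestStep.namespace=my-pipeline

# Datasets are namespaced by the data source they belong to
data.lineage.CustomerRecords.namespace=postgres://warehouse-db
```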
Associate Step Lineage Events to Pipeline
[OPTIONAL]
Data lineage now supports pipeline-level lineage run events, which provide the parent run facet for all step-level lineage events, preserving the pipeline-step job hierarchy and tying the step-level lineage events' jobs together. To leverage this functionality, modify your pipeline's driver class to import `PipelineBase` and use it to record the start and end lineage events of the pipeline execution.
PySpark pipeline driver class:

```diff
 from krausening.logging import LogManager
+from <pipeline_name>.generated.pipeline.pipeline_base import PipelineBase

 if __name__ == "__main__":
     logger.info("STARTED: FirstProcess driver")
+    PipelineBase().record_pipeline_lineage_start_event()
     FirstStep().execute_step()
     ...
     LastStep().execute_step()
+    PipelineBase().record_pipeline_lineage_complete_event()
```
Spark pipeline driver class:

```diff
+import <base.package>.pipeline.PipelineBase;
 import org.slf4j.Logger;
 ...
 public static void main(String[] args) {
     logger.info("STARTED: {} driver", "SparkPipeline");
     SparkPipelineBaseDriver.main(args);
+    PipelineBase.getInstance().recordPipelineLineageStartEvent();
     ...
     final Step2 step2 = CDI.current().select(Step2.class, new Any.Literal()).get();
     CompletionStage<Void> step2Result = step2.executeStep();
     ...
+    PipelineBase.getInstance().recordPipelineLineageCompleteEvent();
```
Final Steps - Required for All Projects
Finalizing the Upgrade
- Run `mvn org.technologybrewery.baton:baton-maven-plugin:baton-migrate -DoldHelmRepositoryUrl=<old-helm-repo>` to apply the automatic migrations
- Run `mvn clean install` and resolve any manual actions that are suggested
  - NOTE: This will update any aiSSEMBLE dependencies in `pyproject.toml` files automatically
- Repeat the previous step until all manual actions are resolved
What's Changed
- #1 📝 adding notes about documentation that is being worked as pa… by @d-ryan-ashcraft in #7
- #1 add missed files during migration by @ewilkins-csi in #10
- #5 📝 some initial documentation tweaks by @d-ryan-ashcraft in #11
- #13 🐛 fix broken surefire variable name by @d-ryan-ashcraft in #14
- #5 📝 GitHub Pages setup and tranche 1 of docs by @d-ryan-ashcraft in #16
- #5 📝 Tranche 2 of documentation migration by @d-ryan-ashcraft in #18
- #5 📝 Tranche 3 of documentation migration by @d-ryan-ashcraft in #21
- #5 📝 Tranche 4 of documentation migration by @d-ryan-ashcraft in #29
- #5 📝 Tranche 5 of documentation migration by @d-ryan-ashcraft in #32
- #22 Removed maven-clean-plugin version from pom generation by @cwoods-cpointe in #33
- #9 enable ConfigLoader to read/write config to vault by @csun-cpointe in #19
- correct log content by @csun-cpointe in #35
- #23:As a data engineer, I want the default memory for Spark applicatio… by @tianliang0038 in #30
- #5 📝 Tranche 6 of documentation migration by @d-ryan-ashcraft in #36
- #9 fix the build issue by @csun-cpointe in #39
- #8 Require docker builds to fail if Python installation fails + migration by @Cho-William in #34
- #15: Adding Poetry Python package migration in foundation-upgrade by @jacksondelametter in #31
- #2 Add Github Action build workflow by @aaron-gary in #40
- #2 💚 update OCI configuration to get helm chart publishing by @d-ryan-ashcraft in #42
- #5 📝 Tranche 7 of documentation migration by @d-ryan-ashcraft in #37
- #5 🔧 fix branch reference causing failing publish job by @d-ryan-ashcraft in #43
- Resolve SageMaker CVEs by @ewilkins-csi in #44
- #24 Create Baton migration for Helm chart name changes by @chang-annie in #46
- #51 update kafka base image to bitnami by @ewilkins-csi in #53
- #1 🔧 working on CI build process with test.pypi.org config and… by @d-ryan-ashcraft in #54
- #1 👷 test removal of old ghcr.io containers via a… by @d-ryan-ashcraft in #57
- #28 Change credentials for LocalStack between environments without replicating all of my environment properties by @carter-cundiff in #56
- #59 🐛 fix helm template references in downstream builds so they w… by @d-ryan-ashcraft in #61
- #62 Migrate custom open lineage schemas to new repo by @carter-cundiff in #64
- #58 Enable maven build cache for newly created downstream projects. by @peter-mcclonski in #63
- 49-downstream-maven-publishing-repo by @colinpalmer-pro in #67
- #65 ✨ add innate support for downstream projects using a dif… by @d-ryan-ashcraft in #68
- #50 bumped to latest version by @ahartwellCpointe in #69
- #58 Fix a bug when attempting to migrate pom files with no build sect… by @peter-mcclonski in #71
- #72 resync test pyspark poms by @ewilkins-csi in #73
- #45 Incorrect service name reference in generated ingress for Spark H… by @tianliang0038 in #77
- #48 - Rename Data Lineage module with latest aiSSEMBLE landscape PR by @habibimoiz in #60
- #47 create configuration store service by @csun-cpointe in #78
- #52 Create Baton migration for Helm Chart repository URLs and old mo… by @chang-annie in #79
- #82 Behave Test reporting in our downstream projects by @aaron-gary in #83
- #85 ⬇️ downgrade requests temporarily to work around breaki… by @d-ryan-ashcraft in #86
- #97 ⬆️ update to latest habushu; #98 ⬆️ update to han… by @d-ryan-ashcraft in #99
- #84 Set Helm publishing repo for downstream projects by @chang-annie in #100
- #75 Set Docker repo in downstream projects by @colinpalmer-pro in #94
- #91 Create Github Issue Templates by @nartieri in #101
- 104-Update-SECURITY.md by @nartieri in #105
- Spark history v2 chart by @peter-mcclonski in #107
- #82 Address caching defects in downstream html report generation by @cpointe-ibllanos in #109
- #108 vault property dao class can be injected when use the extensions-configuration-store-vault module by @csun-cpointe in #113
- #2 Set up GitHub Actions for building aissemble (GitHub Runner) by @aaron-gary in #112
- Remove habushu flag for pyenv by @aaron-gary in #114
- Add config to build workflow by @aaron-gary in #119
- #88: Rename Foundation Core Java module with latest aiSSEMBLE landscape by @habibimoiz in #117
- #103 thrift server chart by @peter-mcclonski in #121
- #87 As a devops engineer, I want a combined manifest for the config s… by @tianliang0038 in #120
- #123 add baton migration to update spark app exec by @ewilkins-csi in #124
- #123 Fix NPE when running in Tests module by @ewilkins-csi in #125
- #111 Baton Script to Update Image Names by @colinpalmer-pro in #126
- #128 Replace occurrences of 'mvn' command with maven wrapper './mvnw' in the documentations and READMEs. by @J-Clingerman in #131
- #128 Removed maven wrapper from the archetype.adoc and foundation-ar… by @J-Clingerman in #135
New Contributors
- @d-ryan-ashcraft made their first contribution in #7
- @ewilkins-csi made their first contribution in #10
- @cwoods-cpointe made their first contribution in #33
- @csun-cpointe made their first contribution in #19
- @tianliang0038 made their first contribution in #30
- @Cho-William made their first contribution in #34
- @jacksondelametter made their first contribution in #31
- @aaron-gary made their first contribution in #40
- @chang-annie made their first contribution in #46
- @carter-cundiff made their first contribution in #56
- @colinpalmer-pro made their first contribution in #67
- @habibimoiz made their first contribution in #60
- @nartieri made their first contribution in #101
- @cpointe-ibllanos made their first contribution in #109
- @J-Clingerman made their first contribution in #131
Full Changelog: https://github.com/boozallen/aissemble/commits/aissemble-root-1.7.0