#395 Upgrade IT infrastructure for JDK 17
#395 Upgrade IT infrastructure for JDK 17
Update IT test Dockerfile to use JDK 17
Update IT helm Tiltfile to be up to date with recent V2 chart changes
Correct IT jar shading and copy
Add migration to update the test Docker JDK
cwoods-cpointe committed Oct 10, 2024
1 parent 0a0c902 commit e78342c
Showing 18 changed files with 182 additions and 151 deletions.
17 changes: 9 additions & 8 deletions DRAFT_RELEASE_NOTES.md
@@ -26,13 +26,14 @@ The following steps will upgrade your project to 1.10. These instructions consis
## Automatic Upgrades
To reduce the burden of upgrading aiSSEMBLE, the Baton project is used to automate the migration of some files to the new version. These migrations run automatically when you build your project, and are included by default when you update the `build-parent` version in your root POM. Below is a description of all the Baton migrations included with this version of aiSSEMBLE.

| Migration Name                                     | Description                                                                                                                                                                  |
|----------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| upgrade-tiltfile-aissemble-version-migration       | Updates the aiSSEMBLE version within your project's Tiltfile                                                                                                                 |
| upgrade-v2-chart-files-aissemble-version-migration | Updates the Helm chart dependencies within your project's deployment resources (`<YOUR_PROJECT>-deploy/src/main/resources/apps/`) to use the latest version of aiSSEMBLE     |
| upgrade-v1-chart-files-aissemble-version-migration | Updates the Docker image tags within your project's deployment resources (`<YOUR_PROJECT>-deploy/src/main/resources/apps/`) to use the latest version of aiSSEMBLE           |
| spark-version-upgrade-migration                    | Updates the Spark Application executor failure parameters to their new key names to ensure compatibility with Spark `3.5`                                                    |
| it-infrastructure-java-upgrade-migration           | Updates the Java Docker image version in the integration test Docker module to JDK 17                                                                                        |
| log4j-maven-shade-plugin-migration                 | Updates the Maven Shade Plugin with the new Log4j dependency information                                                                                                     |

To deactivate any of these migrations, add the following configuration to the `baton-maven-plugin` within your root `pom.xml`:

@@ -78,4 +79,4 @@ To start your aiSSEMBLE upgrade, update your project's pom.xml to use the 1.10.0
3. Repeat the previous step until all manual actions are resolved

# What's Changed
- `pyproject.toml` files updated to allow for Python version `>=3.8`.
@@ -110,6 +110,7 @@
<artifactId>${project.parent.artifactId}-java</artifactId>
<version>${project.version}</version>
<type>jar</type>
<classifier>shaded</classifier>
<overWrite>true</overWrite>
<destFileName>${project.artifactId}.jar</destFileName>
<outputDirectory>${project.build.directory}</outputDirectory>
@@ -124,4 +125,4 @@
</build>
</profile>
</profiles>
</project>
@@ -36,6 +36,8 @@
</goals>
<configuration>
<finalName>${parentArtifactId}-tests-java</finalName>
<shadedClassifierName>shaded</shadedClassifierName>
<shadedArtifactAttached>true</shadedArtifactAttached>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
@@ -190,4 +192,4 @@
</build>
</profile>
</profiles>
</project>
@@ -21,7 +21,7 @@
<dependency>
<groupId>com.boozallen.aissemble</groupId>
<artifactId>aissemble-quarkus-bom</artifactId>
<version>${project.version}</version>
<version>${version.aissemble}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
@@ -14,18 +14,11 @@
<dependencyManagement>
<dependencies>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-bom</artifactId>
<version>${version.quarkus}</version>
<groupId>com.boozallen.aissemble</groupId>
<artifactId>aissemble-quarkus-bom</artifactId>
<version>${version.aissemble}</version>
<type>pom</type>
<scope>import</scope>
<exclusions>
<!-- Quarkus 3.6 uses SmallRye Messaging 4.10, which is incompatible with 4.24. This causes issues for in-memory import -->
<exclusion>
<groupId>io.smallrye.reactive</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.smallrye.reactive</groupId>
@@ -148,4 +141,4 @@
</plugin>
</plugins>
</build>
</project>
@@ -3,8 +3,7 @@
# GENERATED DOCKERFILE - please ***DO*** modify.
#
# Generated from: ${templateName}
ARG DOCKER_BASELINE_REPO_ID
FROM openjdk:11-slim
FROM openjdk:17-jdk-slim

COPY ./target/specifications/ ./specifications
COPY ./target/${artifactId}.jar ./
@@ -1,6 +1,8 @@
@pipeline
Feature: Run Spark Operator job
Feature: Example feature file
Generated sample BDD specification/feature file - PLEASE ***DO*** MODIFY.

Scenario: Executes a pipeline
When the job is triggered
Then the pipeline is executed
Scenario: Update me
Given a precondition
When an action occurs
Then a postcondition results
@@ -1,127 +1,42 @@
package ${basePackage}.tests;

import static org.junit.Assert.assertTrue;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import io.cucumber.java.After;
import io.cucumber.java.Before;
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import io.fabric8.kubernetes.api.model.HasMetadata;
import io.fabric8.kubernetes.api.model.Pod;
import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientBuilder;
import io.fabric8.kubernetes.client.informers.ResourceEventHandler;
import org.junit.Assert;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;
import java.util.concurrent.TimeUnit;

public class PipelineSteps {

private static final Logger logger = LoggerFactory.getLogger(PipelineSteps.class);
List<File> listOfFiles = new ArrayList<>();
Boolean jobRequestFailed;
List<Pod> podList;
List<HasMetadata> hasMetadata;

@Before("@pipeline")
public void setup() {
this.jobRequestFailed = false;
this.podList = new ArrayList<>();
this.hasMetadata = new ArrayList<>();
}

@After("@pipeline")
public void cleanup() {
}

@When("the job is triggered")
public void theJobIsTriggered() throws IOException, InterruptedException {
getFiles();
}

@Then("the pipeline is executed")
public void thePipelineIsExecuted() {
Assert.assertTrue(String.format("Pod list size should be greater than %d", podList.size()), podList.size() > 0);
}

private void getFiles() throws IOException, InterruptedException {

File dir = new File("./pipelines");
addFilesForFolder(dir);

logger.info(String.format("Got %d application.yaml file(s) to process", listOfFiles.size()));

for (File file : listOfFiles) {

try (KubernetesClient client = new KubernetesClientBuilder().build()) {

//Initializing Informer
this.initializeInformer(client);

List<HasMetadata> metadata = client
.load(new FileInputStream(file))
.get();

this.hasMetadata.addAll(metadata);
logger.info(String.format("has meta data size is %d", hasMetadata.size()));

//Creating/Replacing resource here
metadata.forEach(o -> logger.info("Creating resource " + o.getMetadata().getName()));
client.resourceList(metadata).inNamespace("default").createOrReplace();
TimeUnit.MINUTES.sleep(2);

}
}
}

private void initializeInformer(KubernetesClient client) {

client.pods().inNamespace("default").inform(new ResourceEventHandler<>() {

@Override
public void onAdd(Pod pod) {
addToPodList(pod);
}

@Override
public void onUpdate(Pod oldPod, Pod newPod) {
addToPodList(oldPod);
}

@Override
public void onDelete(Pod pod, boolean deletedFinalStateUnknown) {
addToPodList(pod);
}

}, 30 * 1000L);

logger.info("Informer initialized.");
@Given("a precondition")
public void a_precondition() {
// code the items you need before performing your action
}


private void addFilesForFolder(final File folder) {
for (final File fileEntry : Objects.requireNonNull(folder.listFiles())) {
if (fileEntry.isDirectory()) {
addFilesForFolder(fileEntry);
} else {
listOfFiles.add(fileEntry);
}
}
@When("an action occurs")
public void an_action_occurs() {
// execute your action
}

private void addToPodList(Pod pod) {

boolean anyMatch = this.hasMetadata.stream().anyMatch(o -> pod.getMetadata().getName().contains(o.getMetadata().getName()));
if (anyMatch) {
logger.info("Pod " + pod.getMetadata().getName() + " added to pod list");
this.podList.add(pod);
}
@Then("a postcondition results")
public void a_postcondition_results() {
// check for expected postconditions - continue to use normal assert pattern within tests
assertTrue(true);
}

}
}
@@ -41,22 +41,6 @@ parent_dir + '${parentArtifactId}-deploy/src/main/resources/apps/spark-operator/
)
k8s_yaml(yaml)


# Hive Metastore DB
yaml = helm(
parent_dir + '${parentArtifactId}-deploy/src/main/resources/apps/hive-metastore-db',
values=[parent_dir + '${parentArtifactId}-deploy/src/main/resources/apps/hive-metastore-db/values.yaml']
)
k8s_yaml(yaml)

# Zookeeper Alert
yaml = helm(
parent_dir + '${parentArtifactId}-deploy/src/main/resources/apps/zookeeper-alert',
values=[parent_dir + '${parentArtifactId}-deploy/src/main/resources/apps/zookeeper-alert/values.yaml',
parent_dir + '${parentArtifactId}-deploy/src/main/resources/apps/zookeeper-alert/values-dev.yaml']
)
k8s_yaml(yaml)

# Kafka
yaml = helm(
parent_dir + '${parentArtifactId}-deploy/src/main/resources/apps/kafka-cluster',
@@ -67,7 +51,7 @@ k8s_yaml(yaml)

# Policy Decision Point
docker_build(
ref='boozallen/${parentArtifactId}-policy-decision-point-docker',
ref='${parentArtifactId}-policy-decision-point-docker',
context=parent_dir + '${parentArtifactId}-docker/${parentArtifactId}-policy-decision-point-docker',
build_args=build_args,
dockerfile=parent_dir + '${parentArtifactId}-docker/${parentArtifactId}-policy-decision-point-docker/src/main/resources/docker/Dockerfile'
@@ -1,8 +1,15 @@
#set ($index = $artifactId.indexOf('-tests'))
#set ($parentArtifactId = $artifactId.substring(0, $index))

deployment:
command: '[ "java" ]'
args: '[ "-jar", "${artifactId}.jar" ]'
args: '[ "-jar", "${parentArtifactId}-tests-docker.jar" ]'
initContainers:
- name: wait-for-spark-operator
image: busybox:latest
command: [ "/bin/sh","-c" ]
args: [ "until nc -vz spark-operator-webhook.default 443; do sleep 5; echo 'waiting for spark operator...'; done" ]
args: [ "until nc -vz spark-operator-webhook-svc.default 443; do sleep 5; echo 'waiting for spark operator...'; done" ]
- name: wait-for-spark-thrift-service
image: busybox:latest
command: [ "/bin/sh","-c" ]
args: [ "until nc -vz thrift-server.default 10001; do sleep 5; echo 'waiting for spark thrift service...'; done" ]
@@ -1,4 +1,7 @@
#set ($index = $artifactId.indexOf('-tests'))
#set ($parentArtifactId = $artifactId.substring(0, $index))

image:
name: boozallen/${artifactId}
name: ${parentArtifactId}-tests-docker
imagePullPolicy: IfNotPresent
dockerRepo: ""
@@ -0,0 +1,44 @@
package com.boozallen.aissemble.upgrade.migration.v1_10_0;

/*-
* #%L
* aiSSEMBLE::Foundation::Upgrade
* %%
* Copyright (C) 2021 Booz Allen
* %%
* This software package is licensed under the Booz Allen Public License. All Rights Reserved.
* #L%
*/

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

import org.technologybrewery.baton.BatonException;

import com.boozallen.aissemble.upgrade.migration.AbstractAissembleMigration;

public class ItInfrastructureJavaUpgradeMigration extends AbstractAissembleMigration {
private static final String OLD_JDK_VERSION = "FROM openjdk:11-slim";
private static final String NEW_JDK_VERSION = "FROM openjdk:17-jdk-slim";

@Override
protected boolean shouldExecuteOnFile(File file) {
try {
return (Files.readString(file.toPath()).contains(OLD_JDK_VERSION));
} catch (IOException e) {
throw new BatonException("Failed to read Dockerfile", e);
}
}

@Override
protected boolean performMigration(File file) {
try {
Files.writeString(file.toPath(), Files.readString(file.toPath())
.replace(OLD_JDK_VERSION, NEW_JDK_VERSION));
return true;
} catch (IOException e) {
throw new BatonException("Failed to update Dockerfile parent image", e);
}
}
}
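The migration's core logic is a literal string swap of the `FROM` line in a Dockerfile. A minimal, framework-free sketch of that behavior (the `JdkMigrationSketch` class and the temp-file setup below are illustrative only, not part of this commit or of Baton) might look like:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class JdkMigrationSketch {
    static final String OLD_JDK = "FROM openjdk:11-slim";
    static final String NEW_JDK = "FROM openjdk:17-jdk-slim";

    // Rewrites the Dockerfile in place; returns true if it was updated.
    static boolean migrate(Path dockerfile) throws IOException {
        String content = Files.readString(dockerfile);
        if (!content.contains(OLD_JDK)) {
            return false; // nothing to migrate
        }
        Files.writeString(dockerfile, content.replace(OLD_JDK, NEW_JDK));
        return true;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical IT Dockerfile on the old base image
        Path tmp = Files.createTempFile("Dockerfile", "");
        Files.writeString(tmp, "FROM openjdk:11-slim\nCOPY ./target/app.jar ./\n");
        System.out.println(migrate(tmp));                          // true
        System.out.println(Files.readString(tmp).contains(NEW_JDK)); // true
    }
}
```

In the real migration, the `shouldExecuteOnFile`/`performMigration` split lets Baton skip files that do not contain the old base image before attempting any rewrite.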
11 changes: 11 additions & 0 deletions foundation/foundation-upgrade/src/main/resources/migrations.json
@@ -15,6 +15,17 @@
}
]
},
{
"name": "it-infrastructure-java-upgrade-migration",
"implementation": "com.boozallen.aissemble.upgrade.migration.v1_10_0.ItInfrastructureJavaUpgradeMigration",
"fileSets": [
{
"includes": [
"*-tests/**/Dockerfile"
]
}
]
},
{
"name": "log4j-maven-shade-plugin-migration",
"implementation": "com.boozallen.aissemble.upgrade.migration.v1_10_0.Log4jMavenShadePluginMigration",
